Global events can have serious economic ramifications, as we witnessed with the coronavirus pandemic. An extended global Internet interruption would cause an even bigger economic crisis, leading to a financial sector meltdown, business bankruptcies, mass layoffs and unemployment, and many other unforeseen consequences.

Global Internet Interruption Will Cause the Worst Economic Crisis

In a time of heavy reliance on global online connectivity, an event in one part of the world can affect the entire world. The COVID-19 pandemic demonstrated how severely a large-scale event can affect the global economy. The pandemic slowed economies and forced businesses to change their operating procedures. Brick and mortar businesses felt the full effect: restrictions on movement and gatherings, imposed to minimize contact, forced many of them to close, resulting in huge losses and layoffs. Smaller businesses were hit hardest, while digital businesses like Amazon and Zoom flourished. The biggest beneficiaries in these challenging times have been businesses that had invested in online infrastructure, enjoying greater numbers of new customers and increasing revenues.

Looking to the future, many people will continue to work from home, increasing online activity, and businesses will invest more in their online presence and strategy as most transactions are likely to move online. However, an extended Internet disruption would have a much bigger effect on all areas of business and personal life. While digitization is a progressive step, it also exposes businesses, and the world economy at large, to a new kind of risk.

Technology and the Risk of Cyberattacks

Presently, many businesses are making efforts to make their brands and products accessible online. Market trends and customer preferences indicate a rise in online transactions. From mobile money transfers to online payment methods, many economies across the world are relying more on internet connectivity. The innovations in technology continue to make transactions convenient for both businesses and their customers. However, the same technology that benefits global businesses and economies also exposes them to the risk of collapse. Increased dependence on the internet for transactions increases the risk of cyberattacks. A cyberattack has the potential to bring business operations to a standstill. If it goes on for long enough, it can cripple a country’s economy and bring affected businesses to their knees. The economic effects in one country could spiral into other countries across the world, depending on the scale of the attacks and connections with the affected targets. A cyberattack on a global scale could have devastating effects across the world.

The Impact of a Global Internet Interruption on the Global Economy

The world economies have faced economic crises and overcome them many times. However, there is a need to reevaluate current strategies to prepare for an eventual Internet interruption crisis. “Our world has become digitally inter-connected to a point of no return making many countries and businesses targets of counterattacks,” says Henry Bagdasarian. “At some point, an attack on the Internet or select infrastructure will succeed. What will businesses do? Traditional disaster recovery plans will not save affected businesses. Will businesses revert to non-digital operations if at all possible? How will businesses and consumers communicate with one another?” Bagdasarian asks. Businesses have benefited immensely from technological innovations and advancements. For instance, businesses and people can complete transactions instantly regardless of the distance between the parties involved. Online payment methods and mobile money transfers have been key technological developments that have enhanced global trade. These advancements are highly dependent on a robust internet infrastructure. Thus, a global internet interruption would have a massive impact on businesses as well as governments.

A couple of decades ago, it was possible for people to go about life without the internet. However, today, many things rely on the internet. Many people are constantly online working, transacting, studying, and communicating with others. This puts the world in a precarious position where a lot would be at stake if there were to be a critical infrastructure cyberattack or even accidental disruption.

Financial institutions have always been a common target for cybercriminals. The biggest motivation for most cyberattacks has historically been financial gain, and criminals target these institutions to siphon money out of them. Such attacks are rarely on a large scale. However, as cybercrime advances, financial institutions will be casualties in most cyberattacks. Where a cyberattack is of significant magnitude, the effects would spiral into financial systems whether or not they were the direct target.

The financial world today has put up infrastructure to facilitate faster transactions. ATMs have reduced the need to visit banking halls for cash transactions, and debit and credit cards have become highly popular in trade and daily transactions. All of the platforms that facilitate financial transactions rely on the internet to function: to withdraw cash from an ATM or charge a transaction to a credit card, there must be internet connectivity, or the transaction will not go through. As such, using modern payment platforms depends on the availability of the internet. A global internet disruption would have a major impact on people’s ability to transfer cash or make payments. “To the extent that businesses offer digital services and people embrace them, an internet disruption would have a direct and proportionate effect on businesses and their consumers,” says Bagdasarian.

Internet Disruption Scenarios

A global internet disruption could develop in a few ways. First, it could result from an attack on the internet backbone by a terrorist group or rogue nation. For instance, a terrorist group may disrupt internet infrastructure to advance its agenda, using the blackout as cover to carry out other criminal activity unabated while nations work to restore connectivity. Nation states may also resort to cyberattacks to settle geopolitical conflicts: in an ongoing conflict, a nation may try to cripple a rival by causing an internet blackout. However, such an action may prove counterproductive, because an attack with global impact would eventually affect the perpetrator state as well.

Another way the world could face a massive internet disruption is through the actions of hacktivists. Hackers may use malicious programs to make a point or settle scores with a selected target, yet fail to foresee the full effect of their activity. A chain reaction of events may go beyond what the hackers intended, spreading to several countries or even the entire world.

If hackers were to release malicious software or shut down servers, the result would be a massive internet interruption and crisis, disconnecting people and businesses across the world and causing massive panic and chaos.

Impact of Global Events on Economies

Financial services and online businesses will not be the only casualties of an internet disruption. Internet business disruption could also result from cyberattacks on critical infrastructure. For instance, an attack on the electric grid could cause a massive national electricity blackout, which would be debilitating to any economy.

Further, an internet disruption would affect communication channels globally, throwing the entire world into a communication blackout. That could have a huge impact on economies because of the panic and chaos likely to follow. Crippling communication lines together with financial systems is a sure recipe for economic disaster; people are likely to react to such a situation with radical measures that may throw the world economy into turmoil.

In the event of an internet disruption of global magnitude, the companies bearing the biggest impact are likely to be in the financial sector and online retail. These businesses deal with virtual money most of the time: most transactions involve the movement of cash from one account to another, and those transfers are only possible if the internet is up and running. Disrupting internet communication would inflict massive disruption on financial sector operations. Similarly, online retail businesses face a huge risk of lost revenue because customers would not be able to reach them. Unlike brick and mortar businesses, where a customer can walk in to make cash purchases, online stores do not necessarily have a physical location; if they were inaccessible online, their services would be completely unavailable.

Regardless of the cause of an internet disruption, the impact would be substantial. The inability to transact or access information would cause widespread panic. Many businesses would grind to a standstill and incur major losses, patients would lose access to medical services, and basic services such as water, electricity, gas, and telecommunications would be disrupted. The effects would be far-reaching, and recovery could take a long time. People’s faith in online businesses would be shaken, and some businesses may never recover from the catastrophe.

It is imperative that organizations and nations put in place robust infrastructure to mitigate the risks of an internet disruption. Companies need an effective online business continuity plan that will ensure they can weather the impact of an internet cyberattack. While financial institutions have used their experience with minor cyberattacks to develop response plans, an attack on the internet itself would require a different approach altogether. Organizations and countries need to collaborate to mitigate the impact of a global internet disruption; a coordinated response would be key to containing such an attack quickly and minimizing the damage.


Most businesses today have adopted technology and digitized their operations, which means most of their operations are online. The internet has made it possible for businesses to reach a bigger market, and the recent trend has been to move away from traditional brick and mortar stores toward virtual online stores. Given the interconnectivity of businesses on the internet, an online business disruption would have a catastrophic impact. First, customers would not be able to access online stores to make their purchases. Second, payment methods would be frozen: most online businesses depend on the internet to complete transactions, and online payments require an internet connection for approval. Delays in payments from customers and to suppliers would greatly hamper a company’s operations, and businesses are bound to lose revenue during such a blackout. Depending on how long the internet blackout lasts, the resulting online business interruption could cause an economic crisis of global significance.

“As digital businesses benefited from the coronavirus pandemic, can brick and mortar businesses with barter transactions benefit from an Internet interruption?” asks Bagdasarian.

Identity and Access Management blog, articles, news, analysis and reports
Visit our blog to read other articles.

Whether it is a new product implementation, a system replacement or upgrade, or a massive system access audit, completed projects are how many businesses measure progress. The ability to lead teams and complete project tasks successfully is a critical skill that project managers must possess. In the world of technology, project management has its own unique challenges, such as evolving customer expectations, fast-paced changes in technology, and communicating with non-technical parties. Project managers in the technology industry must adapt the traditional best practices of project management. Below you will find some project management best practices in technology, as well as challenges, styles, and skills.

Technology project management best practices, challenges, styles, and skills.

The Challenges of Technology

Technology impacts every industry and every unit of an organization. Information grows and moves faster than ever before, users are dispersed, and new devices with increased capabilities hit the market regularly. This can make project management a difficult task. Below are some common challenges in technology projects:

Evolving Needs

One of the major challenges of completing a technology project is that client needs evolve, including changes to project requirements while the project is in progress. Clients may suddenly want a new platform or changes to existing systems. For example, they may realize that the website they manage is no longer the website the business requires and needs a major overhaul. Or a security incident involving passwords may require the sudden implementation of a multi-factor authentication solution. Clients will expect the project manager to adapt to rapid changes.

Amateurs and Experts

Clients may know what they want a product to do, but they may not have the language of technology to express it well. In some ways, a project manager must be a bridge between his or her team of experts and the less-knowledgeable client.

A Changing Landscape

A team may be in the middle of working on a mobile app when a new device hits the market. Should they incorporate the new device in the initial release or wait for the next update? The project manager will be a key figure in making this decision.

Technology Project Management Styles

Because of the changing nature of technology, a number of project management organizational styles are popular in the tech sector. Some are similar to project organization schemes used in non-IT projects; other models were developed specifically for technical projects but are being adopted in non-tech projects as well. The project manager must decide which model is best for a given project.

Linear Project Management

Linear project management is sometimes referred to as waterfall management. As part of the planning stage, the manager and team define specific project phases. All of their efforts go into completing the current stage, and the team does not start a new phase until the previous one is complete. In this model, the client may only see the finished project.

Iterative Project Management

The iterative model involves a repeated process of planning, designing and analyzing. This is a common process when upgrading versions of an app or other software. The company releases the first version, analyzes its performance and then looks for ways to improve the functionality of the software.

Adaptive Project Management

The most widely known form of adaptive management is the agile process. It was created as a means for software developers to be more responsive to changing client needs. In this process, the first step is to plan achievable goals for a phase of the project. The team then engages in a work period such as a two-week sprint. After this period, the team analyzes and shares the results with the client, and the goal-setting process begins again.

Project Management Best Practices

Technology project management requires a number of abilities. In addition to the common practices of communication and scheduling, technology project managers must know how to translate the language of tech so that clients can understand the process. Due to the broad nature of technology, project managers also need in-depth knowledge of their team members’ abilities.

Determining Client Needs and Expectations

Often, a client, who may be the manager of a unit within an organization, must rely on a technical project manager who has the specialized skills necessary to put together a project. The client may know what he or she wants to see but not how to make it happen. A project manager needs to do a great deal of clarification at the beginning of the process to establish realistic expectations and fully understand the client’s needs.

The Planning Phase

People with technical project management training tend to be good at breaking a project into smaller parts. Depending on the project, there may be software and hardware needs. If it involves a mobile app or website, there will be front-end and back-end programming involved. The project manager has to balance and prioritize these individual pieces while maintaining a view of the big picture. Eventually, all these pieces will need to fit together.

Asset Management

There are different programming languages for a variety of software needs. Some programmers specialize in one or two languages; others have broad knowledge of several. A project manager needs extensive knowledge of each team member’s skills. Unless the manager is deliberately trying to expand someone’s skill set, putting a front-end programmer on a back-end part of the project will cost time.

Clear Scheduling and Goal-Setting

A common issue for technology professionals is the feeling that a project is never quite done. Many programmers want to create the cleanest code possible. While clean code makes things easier in the long run, it can slow things down when a client is looking for results. The project manager needs to set a firm timeline and establish clear goals so that the client can see a working project. Cleanup can always happen in a subsequent phase.


Budget Management

Like all project managers, a technology project manager needs to maintain a clear budget for the task at hand. However, this can be a challenge if the team is working with new technologies where there are several unknowns. This is another reason that communication skills are critical in technical projects. If the project manager must ask for more time or funding, he or she must be able to explain the reasons and present a realistic timeline.

Testing and Reporting

Testing is an important part of any technology project. For a mobile app, the programmers need to be certain that their software works on a wide range of devices with different screen sizes. They need to run the program through many different scenarios in order to find errors or conflicts, which frequently surface when the pieces of a program are integrated.

Clients need to know that the project is moving forward. Unlike a manufacturer who can display a physical prototype, technology projects do not always have an easy way to demonstrate progress. Lines of code on a computer are not that impressive to a non-technical person. The project manager will look for ways to give examples of progress and communicate how the process is going in language that makes sense to a non-specialist.

Closing and Delivering the Project

Just as individual programmers can have difficulty declaring their piece of the project finished, a team may struggle to know when the project is ready to move forward. Because of the pace of technology, it is not unusual for a “finished” project to go through several iterations before it reaches a final form. An important task of a project manager is declaring that the work phase of the current project is over.

The project manager is also responsible for delivering the completed project to the client. Once the project goes live, the next task for the team may be a new version or upgrade of the same software. One of the unusual aspects of technology projects such as mobile apps is that there may be thousands of users working with the software within days. These users can discover minor issues that snuck through the original testing phase and will require repair.

An Upgraded Version of Traditional Skills

The basic tasks of a project manager have not changed much over time. Communication with clients and team members, organizing and scheduling, budget and asset management have always been part of getting things done. In the new era of technology, faster access to information and rapid technological changes require new road maps for project management. Managing a technology project combines traditional business skills with the power of the digital age.


Data has been described as “the new currency,” and many businesses are still learning how to harness its power. At the same time, consumers are becoming more aware of the amount of data businesses collect. Concerns over privacy and safety have led governments around the world to begin passing data privacy laws.

California Consumer Privacy Act (CCPA) regulation applies to companies and organizations doing business in California that meet certain requirements.

The California Consumer Privacy Act (CCPA), passed on June 28, 2018, brings regulation similar to the EU’s GDPR to businesses in the state of California and is considered to be the most comprehensive privacy law in the USA. Effective as of January 1, 2020, the regulation is designed to minimize unauthorized data use, provide transparency and give consumers more control over their personal information.

How should businesses handle data privacy?

Consumer protection is the main goal of data privacy laws. For businesses, this means making privacy an integral part of company goals, objectives and processes. Because data plays a role in everything from marketing initiatives to customer service, this could require restructuring common business processes to comply with privacy regulations.

However, true protection goes beyond basic compliance. Complying simply to avoid fines ignores the larger objective of helping customers feel safe when conducting transactions. By committing to a data privacy policy that prevents the use of customer information for anything beyond the purpose for which it was shared, businesses avoid overstepping privacy boundaries, reduce breach risk and establish trust with both customers and stakeholders.

Why is data privacy important to individuals?

Data privacy laws clarify the rights of anyone who shares information with a business or organization and put data back under personal control. Customers, students and patients have the power to decide who sees and uses:

• Names and aliases
• Account names
• Addresses
• Email addresses
• Social security numbers
• Driver’s license numbers
• Passport numbers
• Personal health records

Other identifying information may also be protected as per the wording of specific regulations.

Under such laws, owners of data can grant or revoke the privilege of seeing, sharing, selling and otherwise handling identifying information. Individuals are also assured that only data needed for transactions and interactions is collected. Together, these regulations create better relationships and establish a safer digital environment.

The CCPA accomplishes these goals by allowing California residents to:

• Know what data is being collected about them
• Know whether data is being sold or disclosed and to whom
• Opt out of all data sales
• Access their own personal data at any time
• Request deletion of collected data
• Not be subject to discrimination for taking actions to preserve data privacy

Who should care about CCPA compliance?

CCPA may impact any organization collecting personal data for the purpose of selling goods or providing services. This includes businesses, educational institutions and healthcare providers, as well as groups engaged in market research, social research and research and development activities.

The CCPA defines personal data as “information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” This includes:

• Personal property records
• Past purchasing behaviors
• Browser and search history
• Website interactions
• Geolocation information
• Professional and/or employment data
• Non-public education information
• IP addresses
• Cookie IDs
• Biometric data

Businesses often use such data to build customer personas—semi-fictitious “psychographic profiles” of their audiences—for marketing purposes. The CCPA classifies even these profiles as protected information, which could have a significant effect on how companies research, track and market to their prospects.

Which businesses must comply with CCPA?

CCPA data privacy regulation applies to companies and organizations doing business in California that also meet at least one of the following criteria:

• Earning a gross annual revenue of $25 million or more
• Buying, receiving or selling the personal information of 50,000 or more consumers and/or households
• Earning more than half of annual revenue from selling consumers’ personal information
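The three thresholds above can be expressed as a simple decision rule. The sketch below is an illustration of the applicability test as described in this article, not legal advice; the function and parameter names are invented for the example:

```python
def ccpa_applies(does_business_in_california: bool,
                 gross_annual_revenue: float,
                 consumers_or_households_processed: int,
                 share_of_revenue_from_selling_data: float) -> bool:
    """Sketch of the CCPA applicability test described above.

    An entity doing business in California falls under the CCPA if it
    meets at least ONE of the three thresholds. Illustration only.
    """
    if not does_business_in_california:
        return False
    meets_revenue_threshold = gross_annual_revenue >= 25_000_000
    meets_volume_threshold = consumers_or_households_processed >= 50_000
    meets_sales_threshold = share_of_revenue_from_selling_data > 0.5
    return (meets_revenue_threshold
            or meets_volume_threshold
            or meets_sales_threshold)

# Example: a small California retailer processing 60,000 consumer records
# meets the volume threshold even though its revenue is well under $25M.
print(ccpa_applies(True, 2_000_000, 60_000, 0.0))  # True
```

Note that the criteria are disjunctive: a low-revenue business can still be covered purely on the volume of personal information it handles.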

“Doing business” encompasses any activity in which a business “determine[s] the purposes or means of processing” data on its own or in collaboration with other entities. This includes, but isn’t limited to, deciding:

• To collect or process personal data
• What types of personal data to collect
• Whose data to collect
• To process data as part of a contract

Online businesses must be particularly diligent in assessing data collection activities to determine whether CCPA compliance is required. Because online entities collect and process consumer data on a regular basis, any business meeting CCPA criteria and interacting with customers in the state of California is likely to be subject to the regulation.

Compliance costs and penalties

According to an official publication from the California State Department of Justice, initial CCPA compliance is expected to cost $55 billion. Ongoing costs could range from $467 million to $16 billion from 2020 to 2030. Failure to comply also carries a hefty price tag: Businesses could be charged up to $7,500 per violation in the event of a data breach.

Businesses and organizations notified of noncompliance have 30 days to address the issue before penalties are applied. Should an entity fail to respond, the state attorney general is authorized to initiate a civil case to resolve the matter.
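The 30-day cure window and per-violation maximum described above can be combined into a rough worst-case exposure estimate. This is a hedged sketch under the figures quoted in this article (actual penalties vary by violation type and enforcement discretion), with invented function names:

```python
def ccpa_penalty_exposure(violations: int,
                          days_since_notice: int,
                          cured: bool,
                          per_violation_fine: int = 7_500) -> int:
    """Illustrative worst-case CCPA penalty exposure.

    Per the 30-day cure window described above, no penalty applies if
    the business cures the noncompliance within 30 days of notice.
    The $7,500 figure is the per-violation maximum quoted in the text.
    """
    if cured and days_since_notice <= 30:
        return 0
    return violations * per_violation_fine

print(ccpa_penalty_exposure(100, 20, cured=True))   # 0
print(ccpa_penalty_exposure(100, 45, cured=False))  # 750000
```

The example makes the scale visible: even a modest 100 uncured violations reaches three-quarters of a million dollars at the maximum rate.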

What are the CCPA compliance requirements?

Entities falling under CCPA jurisdiction are required to create and post a privacy policy on their websites. This policy must include:

• What information is collected and processed and why
• How collection and processing is conducted
• Details of data sales
• How individuals can request data access and change, move or delete their data
• A method to verify identities in the event of such requests
• An option to opt out of personal data sales

Annual policy updates are required to reflect any changes in the way data is collected, handled, processed or sold. In addition, website homepages must include a “do not sell my personal information” link leading to a page where consumers can prevent the sale of personal data for 12 months or more. If consumers don’t opt out, businesses are still required to obtain consent to sell data from 13- to 16-year-olds and from the parents of anyone under the age of 13.
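The age-based consent rules above amount to a three-way branch. The following sketch encodes them as described in this article (function name and return strings are invented for illustration; it is not a compliance implementation):

```python
def data_sale_rule(age: int, opted_out: bool) -> str:
    """Sketch of the CCPA rules on selling a consumer's personal data.

    - Any consumer may opt out of data sales.
    - Under 13: a parent or guardian must opt in on the child's behalf.
    - Ages 13-16: the consumer must personally opt in before any sale.
    - Otherwise: sale is permitted unless the consumer has opted out.
    """
    if opted_out:
        return "no sale permitted"
    if age < 13:
        return "parental opt-in consent required"
    if age <= 16:
        return "consumer's own opt-in consent required"
    return "sale permitted unless consumer opts out"

print(data_sale_rule(10, False))  # parental opt-in consent required
print(data_sale_rule(15, False))  # consumer's own opt-in consent required
```

In other words, for minors the default flips from opt-out to opt-in: silence does not authorize a sale.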

Preparing for compliance provides businesses with greater data visibility and reduces or eliminates unnecessary data, which not only improves data privacy but may also reduce breach risk. While compliance alone isn’t enough to keep hackers at bay, incorporating compliance measures into cybersecurity protocols results in a more robust approach to breach prevention and mitigation.

Is the CCPA a forerunner of federal data protection regulations?

California isn’t the only state introducing data privacy laws. In 2019, New York, Massachusetts, Texas and Washington were also focused on developing legislation to give consumers greater control over personal information. However, no federal regulation similar to the GDPR yet exists in the U.S.

This may change in the future in response to growing consumer demand for data privacy. A single nationwide privacy law would unify compliance requirements, provide consumers with clarity and simplify enforcement across state lines.

For now, businesses required to comply with CCPA should already have measures in place to enable greater consumer control over data. For companies outside the state, CCPA provides a potential framework for future data privacy regulations. Starting preparations now will ensure proper systems and protocols are in place to support compliance with eventual federal regulations.


Cloud computing has become popular ever since the concept was introduced. As more data and applications move to the cloud from traditional systems, it becomes paramount for businesses and their management to secure their data from threats and attacks as they store, process, and access their data in the cloud. Cloud security and access management concepts addressed in this article cover a set of technologies, rules, and regulations that collectively help businesses protect their data and customers’ private information.

Cloud security and access management concepts presented by Identity Management Institute

Brief History of Cloud Computing

Many would think that cloud computing was invented as part of 21st-century technological advancement. However, the foundation of cloud computing was laid more than 60 years ago. In the mid-1960s, computer scientist J.C.R. Licklider proposed a system of interconnected computers. This idea helped Bob Taylor and Larry Roberts develop ARPANET (the Advanced Research Projects Agency Network), the first network that allowed separate, distant computers to communicate, often called the “predecessor of the Internet.”

As time passed and technology advanced, modern cloud computing emerged. For instance, IBM released its VM (Virtual Machine) operating system in 1972, and by 1996 cloud computing had become a growing resource for companies, educational institutions, and many others.

Benefits of Cloud Computing

Cloud computing has many advantages for businesses and their users. It enables the setup of a virtual office that allows flexible connections to the business anytime and anywhere. Some of the benefits of cloud computing include:

1. Flexibility of work activities: Cloud computing provides flexibility to workers in many ways. For example, easy access to data from anywhere: home, outside the country, or on another continent.
2. Security: This is one of the best advantages of cloud computing. It ensures that data files remain available and secure in cases of local natural disasters, crises, or damage to servers.
3. Saves Cost: Cloud computing typically offers a pay-as-you-use pricing model. There is no excessive upfront capital investment in software or hardware, and there is no need for in-house trained personnel for maintenance.
4. Availability: There is an easy expansion of data storage at little cost, and cloud capabilities can be modified or expanded per requirements of the business.
5. Automation: Hosts can monitor, control, and report usage of cloud computing, which provides transparency in operation.
6. Easy Maintenance: Cloud computing systems are upgraded frequently, and this makes it compatible with newer technology. The servers are also maintained easily, and the probability of disruption is minimized. Businesses are also placed at an advantage over competitors because they receive updates on information and applications quickly.
7. Accessibility: Users can easily and quickly access stored data with the use of an internet connected device such as phones, tablets, laptops, or workstations thus increasing productivity.

There are some disadvantages to cloud computing, for example, accidental third-party access, the sharing of sensitive information with third-party cloud service providers, and the need for Internet connectivity to conduct business. However, the fact that cloud computing is widely used and accepted cannot be denied.

Types of Cloud Computing

Cloud computing has gained popularity because companies need massive amounts of data storage space. Also, small and large businesses can benefit from the advanced security and access management that cloud platform providers offer. A cloud environment may offer Private, Federated or Hybrid, and Public options.

Private Cloud

Gartner defines private cloud as “a form of computing that is used by only one organization or ensures that an organization is completely isolated from others.” It is designed to meet an organization’s essential needs. It offers flexibility, security, and many other benefits. It is usually based on a monthly lease.

Public Cloud

A public cloud computing system is scalable and elastic. IT capabilities are provided as a service to external customers using Internet technologies. The benefits of the public cloud include improved security systems, additional storage capacity beyond traditional on-premises capabilities, and savings in time and money.

Hybrid/Federated Cloud

Hybrid cloud is a computing environment that combines a public cloud and a private cloud by allowing data and applications to be shared between them. According to Gartner, it refers to the “policy-based and coordinated service provisioning, use and management across a mixture of internal and external cloud service.” The hybrid cloud gives businesses the needed agility for competitive advantage.

Models of Cloud Computing

Cloud computing is divided into three primary cloud service models: software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS).

Software as a Service – SaaS

This is one of the most widely known models of cloud computing. In this model, applications are hosted and made available to customers over the Internet. Users can access an application on the Internet instead of downloading, updating, and running it locally.

Benefits of SaaS

Due to its easy accessibility, this model of software has become popular for many different business applications. The subscription fee is paid monthly or annually.

One of the benefits of SaaS is that the model runs well on all devices – from computers to mobile devices – and it also supports all major browsers. Furthermore, SaaS customers receive frequent automatic updates.

These updates spare customers the cost of buying new software releases when they become available, which is greatly beneficial for a company with limited IT staff. The SaaS platform also reduces software licensing costs.

However, there are some challenges faced by SaaS customers. The user may face difficulties when moving large files from one software platform to another. This is especially true if the user or IT department decides to replace the SaaS software all together.

Also, SaaS users need Internet connectivity to access their files, and SaaS applications tend to run more slowly than client or server applications.

The most suitable use cases of SaaS include:

• Newly established companies that desire to run e-commerce applications quickly
• Short term projects that need quick attention
• Applications like Turbo Tax software that are used during peak seasons

Platform as a Service – PaaS

This is a cloud computing model in which a third-party provider delivers, over the Internet, the hardware and software tools needed for application development, both hosted on the provider's own infrastructure.

PaaS offerings can be delivered through public, private, and hybrid clouds to provide application hosting, and many PaaS products are directed toward software development. Payment is on a per-user basis, which eliminates expenses on hardware and software.

Infrastructure as a Service – IaaS

This is a model of cloud computing that provides virtual computing resources over the Internet. It delivers vital resources to consumers on a pay-as-you-go basis.

Benefits of IaaS include:

• Saves cost: There is lower infrastructure cost, and it is an economical choice for a new business.
• Flexibility: Workers can access and connect to the server for data retrieval quickly.
• Availability: Services can keep running when a server fails, with less chance of damage to the infrastructure.

Challenges of IaaS include:

• Security: Enterprises do not have full control over cloud security.
• Accessibility: Technical problems may restrict access to applications and data while relying on a third party for resolution.


Cloud Identity Management

Cloud computing is a combination of various computing resources like servers, storage, applications, and services that provide on-demand access to cloud users and customers.

The data stored in the cloud are maintained by Cloud Service Providers (CSPs). Thus, identity and access management is an important concern for cloud-based services, because many cases of data leakage are due to poor identity and access management.

Identity and Access Management (IAM) is a way of building security and authentication gates into distributed resources, since the wide distribution of resources (services, storage, etc.) cannot be avoided and no single software product can secure all the systems.

Identity and Cloud Access Management SaaS is expanding every day as many are migrating to cloud applications. The latest Identity and Access Management Market Report estimates that the IAM market as of 2019 is worth $18.3 billion.

Standards and Protocols for Identity Management

Physical Security Mechanisms

These include access cards and biometrics that secure access to cloud physical resources.

Chip and PIN

A chip-and-PIN card is a type of credit card that requires the cardholder to authorize a transaction by inserting the card and entering a personal identification number (PIN). The square-shaped chip, visible on the card, stores information. The combination of chip and PIN prevents fraud better than older types of credit cards.

Single Sign-On

SSO is an authentication service that allows the user to access multiple applications using one set of login credentials (e.g. name and password). They can be used by companies to avoid managing usernames and passwords for thousands of users and systems.

SSO is a form of Federated Identity Management (FIM), and the use of this system is called identity federation. Examples of this service are Kerberos and Security Assertion Markup Language (SAML).


Benefits of SSO include:

• Fewer passwords and usernames for each application
• Fewer complaints to IT about access issues

Drawbacks of SSO include:

• Users are locked out of multiple systems connected to the Single Sign-On (SSO) mechanism when they cannot access the network
• Unauthorized users can gain access to more than one application once they access the system
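To make the SSO idea concrete, here is a minimal Python sketch, not a specific product's API: a hypothetical identity provider signs one token, and every participating application verifies that signature instead of keeping its own password store. The HMAC-based token format, the key, and all names are illustrative assumptions.

```python
import base64
import hashlib
import hmac
import json
import time

IDP_SECRET = b"shared-idp-key"  # hypothetical key shared with trusted apps


def issue_token(username: str) -> str:
    """Identity provider signs one token the user presents to every app."""
    payload = json.dumps({"sub": username, "iat": int(time.time())}).encode()
    sig = hmac.new(IDP_SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig


def verify_token(token: str):
    """Any participating app checks the signature; no local password store."""
    encoded, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(encoded)
    expected = hmac.new(IDP_SECRET, payload, hashlib.sha256).hexdigest()
    if hmac.compare_digest(sig, expected):
        return json.loads(payload)
    return None  # tampered or forged token


token = issue_token("alice")
assert verify_token(token)["sub"] == "alice"
assert verify_token(token[:-1] + "x") is None  # altered signature rejected
```

Production SSO protocols such as SAML or Kerberos add nonces, expirations, audience restrictions, and asymmetric signatures, but the core trade-off is the same: one credential check at the identity provider instead of one per application.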


OpenID

This is a decentralized authentication protocol based on the OAuth 2.0 family of specifications that allows users to authenticate to a resource provider through third-party identity vendors. It supports SSO services, and users can easily log in to websites that support OpenID authentication. The latest version of OpenID is OpenID Connect (OIDC), which allows authentication for native and mobile applications and provides a communication link between participants.
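As a hedged illustration of two relying-party steps in OpenID Connect, the Python sketch below builds an authorization request (the scope must include `openid`) and performs the basic ID-token claim checks done after signature verification. The endpoint, client ID, and claim values are hypothetical, and real libraries handle many more details.

```python
import time
from urllib.parse import parse_qs, urlencode, urlparse

# Hypothetical identity provider and client registration, for illustration.
AUTH_ENDPOINT = "https://idp.example.com/authorize"
ISSUER = "https://idp.example.com"
CLIENT_ID = "demo-client"


def build_auth_request(redirect_uri: str, state: str, nonce: str) -> str:
    """OIDC authorization-code request; 'openid' scope marks it as OIDC."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": redirect_uri,
        "scope": "openid profile",
        "state": state,  # CSRF protection, echoed back on the redirect
        "nonce": nonce,  # binds the eventual ID token to this request
    }
    return AUTH_ENDPOINT + "?" + urlencode(params)


def validate_id_token_claims(claims: dict, expected_nonce: str) -> bool:
    """Minimal claim checks after the token signature has been verified."""
    return (
        claims.get("iss") == ISSUER
        and claims.get("aud") == CLIENT_ID
        and claims.get("nonce") == expected_nonce
        and claims.get("exp", 0) > time.time()
    )


url = build_auth_request("https://app.example.com/cb", "st123", "n456")
qs = parse_qs(urlparse(url).query)
assert qs["scope"] == ["openid profile"]
assert validate_id_token_claims(
    {"iss": ISSUER, "aud": CLIENT_ID, "nonce": "n456",
     "exp": time.time() + 600},
    "n456",
)
```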

Zero Trust

Zero Trust was introduced in 2010 by John Kindervag and is based on the belief that organizations must thoroughly verify and scrutinize any user or device, whether internal or external, that wants to gain access to their systems before it is allowed.

Zero Day

Zero-day vulnerability management aims to prevent a cyberattack on the same day a weakness is found in software. When software issues are detected and updates become available, reports are sent to companies so they can patch the software immediately.

Lightweight Directory Access Protocol

LDAP allows users to find data and information about organizations and individuals either on a public or corporate network. It can be used in different applications or services to authenticate users.

Content Security Policy (CSP)

Content Security Policy is a security layer that mitigates certain risks related to Cross-Site Scripting (XSS) and data injection attacks. These types of attacks are used to gain unauthorized access, steal data, insert malware, and deface websites.
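A rough Python sketch of how a CSP policy constrains content sources: it parses a Content-Security-Policy header string and checks whether a script origin is permitted by `script-src` (falling back to `default-src`). Real browser matching is far richer (wildcards, nonces, hashes), and the origins below are hypothetical.

```python
def parse_csp(header: str) -> dict:
    """Parse a Content-Security-Policy header into directive -> sources."""
    policy: dict = {}
    for directive in header.split(";"):
        if directive.strip():
            name, *sources = directive.split()
            policy[name] = sources
    return policy


def allows_script(policy: dict, origin: str) -> bool:
    """Toy check: is a script origin allowed by script-src / default-src?"""
    sources = policy.get("script-src", policy.get("default-src", []))
    return origin in sources


header = "default-src 'self'; script-src 'self' https://cdn.example.com"
policy = parse_csp(header)
assert allows_script(policy, "https://cdn.example.com")
assert not allows_script(policy, "https://evil.example.com")  # injection blocked
```

The second assertion is the point of CSP: even if an attacker injects a script tag, the browser refuses to load code from an origin the policy does not list.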

Challenge Handshake Authentication Protocol

CHAP is a security protocol used for authenticating a user to a network entity, such as a server or Internet Service Provider (ISP).
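The CHAP handshake is simple to sketch. Per RFC 1994, the response value is an MD5 hash over the message identifier, the shared secret, and the server's random challenge, so the secret itself never travels over the network. The sketch below is illustrative, not a full PPP implementation, and the secret is a placeholder.

```python
import hashlib
import os


def chap_response(identifier: int, secret: bytes, challenge: bytes) -> bytes:
    """RFC 1994 CHAP response: MD5(identifier byte || secret || challenge)."""
    return hashlib.md5(bytes([identifier]) + secret + challenge).digest()


# Server side: issue a fresh random challenge for each authentication.
secret = b"shared-secret"  # provisioned out of band, never sent on the wire
challenge = os.urandom(16)
identifier = 1

response = chap_response(identifier, secret, challenge)  # computed by peer
# Server recomputes with its own copy of the secret and compares.
assert response == chap_response(identifier, secret, challenge)
assert response != chap_response(identifier, b"wrong-secret", challenge)
```

Because the challenge changes every time, a captured response cannot simply be replayed later, which is CHAP's main improvement over plaintext password schemes.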

Authentication Mechanisms

Authentication is the process of validating the identity of a person or device based on something they know (such as a password), something they have (such as a card or token), or something they are (such as a voice, gesture, or fingerprint).


Authorization

This is a method for allowing or rejecting access to a specific resource based on the authenticated user's entitlements or rights. The process determines what the user can do once they are inside the system.

Sometimes, these authorization rights are given by third-party vendors and the applications can access certain private information of the business or individual. Authorization in cloud computing is gained by either access control policies or access right delegations.

Mandatory Access Control

MAC is a mechanism used to define the access rights of users. It grants access permissions through the operating system and restricts the ability of data owners to allow or deny access rights for clients in the file system. Clients have no right to change these permissions, and MAC requires careful planning and frequent monitoring.

Discretionary Access Control

DAC is a mechanism that manages access permissions through data owners. It provides more flexibility than Mandatory Access Control (MAC); however, Discretionary Access Control is less secure and can pose a greater access risk.
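The contrast between the two models can be sketched in a few lines of Python: under MAC, a system-wide label policy decides access and owners cannot override it; under DAC, each owner maintains an access list and may grant rights at will. The labels, usernames, and class names are hypothetical.

```python
# MAC: a system-wide policy compares clearance labels; owners cannot override.
LEVELS = {"public": 0, "internal": 1, "secret": 2}


def mac_can_read(user_clearance: str, file_label: str) -> bool:
    """Read allowed only when the clearance dominates the file's label."""
    return LEVELS[user_clearance] >= LEVELS[file_label]


# DAC: each owner keeps an access list and may extend it at will.
class DacFile:
    def __init__(self, owner: str):
        self.owner = owner
        self.readers = {owner}

    def grant(self, granter: str, user: str) -> None:
        if granter == self.owner:  # only the owner may extend access
            self.readers.add(user)

    def can_read(self, user: str) -> bool:
        return user in self.readers


assert mac_can_read("secret", "internal")
assert not mac_can_read("public", "secret")  # no owner can change this

doc = DacFile(owner="alice")
doc.grant("alice", "bob")
doc.grant("mallory", "eve")  # ignored: mallory is not the owner
assert doc.can_read("bob") and not doc.can_read("eve")
```

The DAC flexibility shown here is also its weakness: a careless or compromised owner can grant access that a MAC policy would have forbidden outright.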

The Requirements of Regulatory Bodies


HIPAA

The Health Insurance Portability and Accountability Act of 1996 (HIPAA), also known as the Kennedy-Kassebaum Act, was signed by President Bill Clinton in 1996. It was established to improve the flow of healthcare information, and it stipulates how personal information maintained by the healthcare industry should be protected from fraud and theft.

HIPAA allows physicians and healthcare professionals to use mobile devices to access medical data in the cloud, as long as physical and administrative measures are in place to protect the confidentiality and availability of medical data on the device used. Also, if a cloud service provider experiences a data security breach, reports must be made to the covered entities and business associates.


FedRAMP

The Federal Risk and Authorization Management Program (FedRAMP) is a government-wide program that provides a standard approach to security assessment, authorization, and continuous monitoring for cloud products and services. Included in this program are provisions for cloud systems that provide extra security to protect and encrypt government information.

Gramm-Leach-Bliley Act

The Financial Modernization Act, also known as the Gramm-Leach-Bliley Act (GLBA), is a United States federal law that requires financial institutions to explain how they share and protect their clients' private information.

Financial institutions must discuss with their clients how they share sensitive data, and the company must also explain the opt-out option for clients when they are not satisfied.

The role of the act is to ensure financial institutions protect the confidentiality and security of their customers' private information, such as bank account numbers, addresses, phone numbers, income, and credit history.

GLBA requires financial institutions, or CSPs holding financial data, to:

• Create a written Information Security Plan.
• Design and implement a safeguards program to be monitored and tested regularly.
• Adjust their services according to the pressing challenges and circumstances.

FIPS 200

This is the second mandatory security standard required by the Federal Information Security Management Act (FISMA), and it enumerates minimum security requirements for federal data and information systems. Federal agencies must meet certain minimum requirements in these seventeen areas:

• Access Control
• Awareness and Training
• Audit and Accountability
• Certification, Accreditation, and Security Assessments
• Configuration Management
• Contingency Planning
• Identification and Authentication
• Incident Response
• Maintenance
• Media Protection
• Physical and Environmental Protection
• Planning
• Personnel Security
• Risk Assessment
• System and Service Acquisition
• Systems and Communication Protection
• System and Information Integrity

The Future of Cloud Computing

The era of the 21st century has experienced great technological advancement and cloud computing growth. The future of cloud computing has been a debated topic among many technology scientists and researchers.

By 2045, it is estimated that the world’s population will increase to 9 billion, and cloud computing will provide a digital infrastructure for future cities. Furthermore, elevators, drone taxis, and self-driving cars will be better managed through cloud computing.

The cloud will be a transformative tool for companies, especially small and medium-sized companies. Artificial intelligence and other cloud computing aspects will be part of the services rendered.

The cloud will also help society adapt to a growing volume of data. In-car technology will continue to advance, and driverless cars will come with sensors and cameras that generate large amounts of data. Cloud computing will support emerging technologies like artificial intelligence by helping them adapt to new mobile platforms and devices.


Cloud computing will expand to make resources, applications, and data available anytime and anywhere, regardless of location and distance. Security threats will increase, and new cloud security technologies will emerge to protect and secure vast amounts of data.

The three cloud service models, IaaS, PaaS, and SaaS, are essential aspects of cloud computing, and they all make significant contributions to business operations while presenting unique challenges and risks. The features and benefits of cloud computing are extensive: accessibility, cost savings, simple maintenance, and more.

Cloud security best practices, standards, and protocols include identification, authentication, and authorization controls to limit exposure to threats.

Cloud security and access management concepts and mechanisms include OAuth, OpenID, Lightweight Directory Access Protocol (LDAP), Zero Trust, Zero Day, and Content Security Policy (CSP). Although various regulations provide guidance for data protection, they also pose a risk to organizations that fail to comply and may be liable to regulators and consumers for security incidents.


The future of cloud computing is exciting and will bring a new era of technological advancement. It is important to formulate robust cloud security policies and solutions that can prevent hacking threats and unauthorized access to data. It is time to prepare for the advanced technological endeavors that will propel humanity to a digital and robotic future.

Approaches to identity and access management are continually evolving in response to a changing threat landscape and the growing financial impact of breach activity. The amount of data organizations manage jumped 40% between 2018 and 2019, and the total cost of losing that data increased almost $14,000 during the same period.

With the stakes so high, enterprise businesses and large organizations can’t afford to continue using security protocols designed for self-contained internal systems. Extensive networks with diverse user bases require identity management and access control measures capable of executing adaptive responses to dynamic user interactions. The CARTA framework introduced by Gartner offers one such solution.

What does the CARTA acronym mean?

CARTA stands for Continuous Adaptive Risk and Trust Assessment and is based on Gartner’s Adaptive Security Architecture. As one of the company’s top security projects in 2019, CARTA seeks to address the changing world of identity and access management and provide solutions to emerging IAM challenges.

Today’s enterprises face the unique challenge of managing cloud-based networks that are always on and always accessible from a variety of devices. While network users often lack essential cybersecurity knowledge, innovative hackers take advantage of technologies like artificial intelligence to launch subtle attacks. It’s a tricky combination with the potential for serious consequences unless reliable IAM protocols are put in place.

The CARTA strategy is designed for continuous adaptation that goes beyond basic allow or deny models to provide contextually relevant access. By operating with context as a guide, CARTA can reduce bottlenecks, maximize system efficiency, improve workflow agility and improve user experiences. It enables granular access control beyond what standard IAM procedures are capable of and allows IT teams to manage networks without the constant burden of manual monitoring.

CARTA vs. zero trust: Is there a difference?

CARTA shares many characteristics with zero-trust frameworks. In traditional network settings where the perimeter can be clearly defined, the default position is often to trust anything “inside” and require verification only for “outside” requests and inputs. However, this stance becomes problematic as perimeters expand beyond the confines of a business or organization.

Today’s network perimeters may incorporate an enterprise’s physical location, numerous remote employees and multiple third-party vendors or partners. Zero trust addresses such an environment with a new default security position: Trust nothing in the network until its identity has been verified. Network access is only granted when genuine proof of identity is presented, usually as a combination of credentials and behavioral parameters.

CARTA takes the zero trust idea further by introducing:

• Continuous monitoring, assessment, discovery and risk prioritization
• Adaptive attack protection
• Contextual access control
• Continuous device visibility
• Automated device control
• Micro-segmented networks
• Ongoing cyber and operational risk assessment
• Security management for agentless devices
• Dynamic trust and risk assessments and responses

Both CARTA and zero trust encourage real-time assessments and monitoring. Trust is based on identity and verified continually using behavior and context instead of basic allow or deny rules. To achieve the best outcomes with either framework, both users and devices must be monitored on an ongoing basis. CARTA’s additional security measures not only reduce breach risk but also improve containment should a hacker gain network access.
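As a rough illustration of continuous, contextual assessment, the Python sketch below scores each access attempt from contextual signals and returns an adaptive decision: allow, challenge (step-up authentication), or deny. The signals, weights, and thresholds are illustrative assumptions, not values prescribed by Gartner's framework.

```python
def risk_score(ctx: dict) -> int:
    """Toy contextual risk score: each risky signal adds weight."""
    score = 0
    if not ctx.get("managed_device"):  # unmanaged or agentless device
        score += 40
    if ctx.get("new_location"):        # login from an unfamiliar location
        score += 30
    if ctx.get("off_hours"):           # activity outside normal hours
        score += 20
    return score


def decide(ctx: dict) -> str:
    """Adaptive response instead of a binary allow/deny rule."""
    score = risk_score(ctx)
    if score < 30:
        return "allow"
    if score < 70:
        return "challenge"  # step-up authentication, e.g. an MFA prompt
    return "deny"


assert decide({"managed_device": True}) == "allow"
assert decide({"managed_device": True, "new_location": True}) == "challenge"
assert decide({"managed_device": False, "new_location": True,
               "off_hours": True}) == "deny"
```

In a real deployment such scoring would run continuously on every interaction, so trust can be raised or revoked mid-session as the user's behavior and context change.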

Should businesses switch to the Gartner CARTA model?

Cybersecurity evolves quickly, and it can be difficult for businesses to keep up with the changes. However, hackers seek out vulnerabilities, and organizations with outdated security policies present easy targets. CARTA offers an approach that incorporates recent trends in IAM and security, so businesses should be able to adapt and expand existing security protocols to adhere to the updated principles.

CARTA implementation makes the most sense for enterprises and organizations with:

• Large numbers of agentless internet of things devices
• An extensive network of external vendors or partners requiring network access
• An active BYOD policy
• A large remote workforce
• A growing network perimeter
• Issues arising from silos within existing security systems
• Concerns regarding the use of unapproved third-party applications

Such complex networks have more users, including third parties, and require more oversight and automation than smaller network environments. Customer access introduces additional challenges for some enterprises and organizations, such as unsecured devices and access from private Wi-Fi connections, and CARTA can address these issues as well.

If a breach occurs, the CARTA model improves detection times and allows for faster responses. Instead of weeks or months passing between a breach event and its discovery, enterprises are able to shut down and mitigate hacker activity before extensive damage is done.

How can businesses implement CARTA?

Building on the idea of zero trust, CARTA assumes every potential network interaction poses a threat. It creates a security framework where identity management and breach response are both based on the assumption that hackers will eventually infiltrate the system.

It’s far from a paranoid assumption. Sixty-eight percent of business leaders see an increase in cybersecurity risk, and statistics show the perception is accurate. Breach incidents have increased 67% since 2014, and hacker attacks now occur once every 39 seconds. Such volume requires the kind of continual, automated monitoring and response CARTA can provide.

Management within a CARTA framework involves three interconnected phases:

• Run: Analytics are used to detect real-time behavioral anomalies. Automating the process reduces the need for direct intervention from IT teams, which improves overall security efficiency. Shorter threat response times and faster mitigation prevents costly breach consequences.
• Build: Security becomes part of the application development process. Risk evaluation takes a high priority as developers assess the tools and code used to create software solutions and take a critical approach to service partnerships.
• Plan: Risk tolerance levels can hold back enterprise development and growth.

Executives need to recognize that all growth—particularly in terms of technology—comes with a measure of risk. New opportunities must be evaluated based on whether benefits outweigh potential risks to the organization as a whole.

Successfully implementing and sustaining these phases requires a suite of automation tools to handle monitoring and responses. This enables advanced identity management and access control by introducing contextual elements, which equip networks to approve or deny access requests based on granular controls.

Finding the right tool starts with a thorough assessment of current security systems to identify gaps and see where CARTA can improve protocols. By evaluating known and potential vulnerabilities, enterprises can determine the best approach for moving away from traditional IAM and security frameworks to a new method designed for a growing perimeter.

Higher amounts of risk are inherent in larger businesses, which makes CARTA desirable for enterprises and institutions. These organizations should weigh implementation expenses against potential breach costs to determine if CARTA is superior to other security solutions.

As with any identity management and access control solution, CARTA must be approached with a solid knowledge of all hardware, software, devices and users accessing internal networks. Clearly defined metrics serve to guide ongoing monitoring activities and future assessments, which helps enterprises track whether new security protocols are working.

Although the design of the CARTA framework is suitable for enterprise IAM and security needs, companies must conduct internal assessments prior to implementation. Any protocol should support requirements for security and compliance while providing a seamless and efficient experience for all users.