Flaws and Risks of Cloud Computing

February 7, 2020


Cloud computing is a range of services that provide IT resources to customers over the Internet on an as-needed basis, with payment applied only for the services consumed.  Cloud computing is gaining popularity, especially with small and medium-sized enterprises, due to its relatively low implementation cost. Big companies, on the other hand, tend to run their own infrastructure, as it gives them a higher level of privacy, flexibility, and data security. What limitations and risks deter big enterprises from moving their IT infrastructure and software systems to the cloud, and are there any solutions for them?

Enterprises face several distinct issues when they move their IT infrastructure and software to the cloud.

Privacy and security 

Privacy and security are the foremost issues that enterprises have to consider before adopting cloud-computing services.

Because cloud resources are shared by many clients, data stored in the cloud is exposed to multiple threats.  An enormous amount of data resides in the cloud, and quite often it contains financial or otherwise confidential information; any unauthorized access, security breach, or malicious activity can cause irreversible damage to the companies that own it.

Although it is difficult to eliminate security issues completely, there are a number of measures that cloud providers can and do take to mitigate threats and better protect their customers. Commonly used privacy and security measures include:

  • Using certificates and encrypting all sensitive information.  Popular encryption algorithms include RSA and AES; the older DES is now considered insecure and has been phased out.
  • Deploying strong authentication for all remote users and never keeping vendor-supplied default passwords.
  • Using private IP address spaces and (virtual) networks.
  • Maintaining and using firewall technology at every point, and blocking unused protocols, ports, and services.
  • Installing anti-virus software on every device.
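To make the encryption point above concrete, here is a toy sketch of the RSA idea using deliberately tiny primes. This is an educational illustration only; real deployments rely on vetted cryptographic libraries and keys of 2048 bits or more.

```python
# Toy RSA with tiny primes (illustration only -- never use such
# small numbers, or hand-rolled crypto, in production).
p, q = 61, 53
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # Euler's totient: 3120
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e

message = 65
ciphertext = pow(message, e, n)    # encrypt: m^e mod n
recovered = pow(ciphertext, d, n)  # decrypt: c^d mod n
assert recovered == message
```

The same modular-exponentiation structure underlies real RSA; production systems simply use far larger primes and padding schemes.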

Vendor lock-in and interoperability

Another major issue with cloud computing is vendor lock-in and the lack of interoperability. Cloud providers’ software and hardware platforms vary from vendor to vendor. Since each vendor uses a different software and hardware architecture for its own system, migrating data from one vendor to another becomes a challenging task for the providers’ clients.

If a customer uses IT services from two separate vendors, these services often cannot be integrated with each other, and moving data or workloads between them is costly. This situation is described as vendor lock-in. As cloud computing expands, problems with data migration, data portability, and vendor lock-in are going to increase.  In addition, clients have no control over a cloud provider’s software and IT infrastructure, which means they become fully dependent on the provider and cannot control their own IT systems.

All well-established cloud providers such as Google and Amazon have proprietary database storage.  For instance, Amazon uses Dynamo, and Google uses BigTable. A common interface between these databases does not exist. Salesforce cannot migrate its data to Gmail, and vice versa, because there are no shared database systems or common interfaces.

Some IT professionals think that the solution to the vendor lock-in issue could be requiring all cloud vendors to support a standardized API. The downside is that this would likely reduce cloud vendors’ profits.
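As a rough sketch of what such a standardized API could look like, the hypothetical Python interface below lets application code target a single storage abstraction while per-vendor adapters hide the proprietary details. All class and method names here are invented for illustration.

```python
# Hypothetical "standardized API" sketch: application code depends
# only on CloudStore, so swapping vendors (or migrating data between
# them) no longer touches application logic.
from abc import ABC, abstractmethod

class CloudStore(ABC):
    @abstractmethod
    def put(self, key: str, value: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class VendorAStore(CloudStore):       # stand-in for a Dynamo-style backend
    def __init__(self): self._data = {}
    def put(self, key, value): self._data[key] = value
    def get(self, key): return self._data[key]

class VendorBStore(CloudStore):       # stand-in for a BigTable-style backend
    def __init__(self): self._rows = {}
    def put(self, key, value): self._rows[key] = value
    def get(self, key): return self._rows[key]

def migrate(source: CloudStore, target: CloudStore, keys):
    for k in keys:                    # migration becomes a simple copy loop
        target.put(k, source.get(k))

a, b = VendorAStore(), VendorBStore()
a.put("invoice-1", b"confidential record")
migrate(a, b, ["invoice-1"])
assert b.get("invoice-1") == b"confidential record"
```

In real systems the adapters would wrap each vendor's SDK; the point is that a shared interface turns cross-vendor migration into routine code.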

Performance instability and network lag

When the network load is high, cloud computing is subject to severe disruptions. These disruptions cannot be predicted in advance, because they depend on the intensity of the processing loads generated by many cloud users at the same time.

As reported on ResearchGate, a group of Australian researchers conducted stress tests on Microsoft, Amazon, and Google clouds, analyzing service availability and performance under varying process loads. Measuring the effect of a sudden workload of 2,000 simultaneous users, they found that response time varied by a factor of 20 at different points of the day.

These days, local networks and some parts of the wide-area network have been upgraded to fiber-optic cable, so data travels very fast. Quite often, however, the internet infrastructure is not capable of handling large volumes of data, and transmission bottlenecks occur.

For instance, computer scientists at the University of California, Berkeley calculated that it would take 45 days and US$1,000 in network transfer fees to send 10 terabytes of data over average internet bandwidth from the Bay Area to Amazon in Seattle. Alternatively, shipping ten 1-terabyte disks via any standard courier service would take one day and cost only US$400.
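The Berkeley figure can be checked with back-of-the-envelope arithmetic, assuming an average sustained bandwidth of about 20 Mbit/s (an assumed value; actual speeds vary widely):

```python
# Rough check of the 10-terabyte transfer estimate.
data_bits = 10e12 * 8      # 10 TB expressed in bits
bandwidth_bps = 20e6       # assumed average bandwidth: 20 Mbit/s
seconds = data_bits / bandwidth_bps
days = seconds / 86_400
print(f"{days:.1f} days")  # about 46 days, on the order of the quoted 45
```

At that bandwidth the transfer takes roughly four million seconds, which is why bulk shipping of physical disks can beat the network by a wide margin.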

Proper workload-scheduling techniques can help mitigate performance instability. Researchers keep working on solutions to these and other drawbacks of cloud computing, so that more enterprises will be willing to move their IT systems to the cloud.
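One simple idea behind such scheduling is sketched below: assign each incoming task to the currently least-loaded server. This is a minimal illustration; production schedulers weigh many more factors, and the task costs and server counts here are made up.

```python
# Least-loaded scheduling sketch: a min-heap tracks each server's
# accumulated load, and every new task goes to the lightest server.
import heapq

def schedule(task_costs, num_servers):
    heap = [(0.0, s) for s in range(num_servers)]  # (load, server_id)
    heapq.heapify(heap)
    assignment = []
    for cost in task_costs:
        load, server = heapq.heappop(heap)  # lightest server
        assignment.append(server)
        heapq.heappush(heap, (load + cost, server))
    return assignment

# Spread five tasks of varying cost across two servers.
print(schedule([5, 3, 2, 4, 1], 2))  # -> [0, 1, 1, 0, 1]
```

Even this greedy rule keeps the per-server load reasonably balanced, which smooths out some of the response-time swings described above.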