The Ultimate Guide to Cloud Computing for Beginners

LIMITATIONS OF CLOUD COMPUTING

Organizations that are ready to move to the cloud still insist on third-party risk assessments or ask cloud providers the following questions:
  1. Who will have access to the applications and data, and how will that access be monitored?
  2. What security methods are used for the storage and transmission of data?
  3. How are data and applications from different consumers kept separate?
  4. Where will the data be stored geographically? Will the choice of site affect us?
  5. Can these details and controls be specified in a service-level agreement?

Each of these consumer concerns is a chief obstacle to the adoption and growth of cloud computing. Some of the limitations of cloud computing are discussed next.

Availability of Services:

Availability of services is a primary concern for consumers. Sometimes they need to delete all their data from the cloud environment provided to them, while at other times they may want to recover all of it. The risk of disaster is higher here than with conventional services, since there are more ways to access an application or its data in cloud computing.

Data Lock-in:

Moving data and applications from one platform to another is a challenge for a large organization handling high volumes of data. Google is one cloud provider working toward a more standard environment; it ran an initiative known as the Data Liberation Front to help users move applications and data into and out of its platform.

Data Segregation:

It is not simple to isolate cloud users from each other. A direct effect of the multitenant operating model, in which virtual machines of different consumers are co-located on a single server, or their data on the same hard disks, is a major privacy concern. This set of risks includes the breakdown of mechanisms that separate memory or storage among different users.
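The segregation requirement can be illustrated with a minimal sketch (class and tenant names are hypothetical, not any provider's API): a multitenant key-value store that namespaces every record by tenant ID, so a request from one tenant can never address another tenant's data even when both live in the same underlying structure.

```python
class MultiTenantStore:
    """Toy multitenant key-value store that isolates records by tenant ID."""

    def __init__(self):
        self._data = {}  # maps (tenant_id, key) -> value

    def put(self, tenant_id, key, value):
        self._data[(tenant_id, key)] = value

    def get(self, tenant_id, key):
        # Lookups are always scoped to the caller's own tenant namespace,
        # so another tenant's keys raise KeyError instead of leaking data.
        return self._data[(tenant_id, key)]


store = MultiTenantStore()
store.put("tenant-1", "report", "confidential numbers")
store.put("tenant-2", "report", "other company's data")

print(store.get("tenant-1", "report"))   # each tenant sees only its own copy
try:
    store.get("tenant-3", "report")      # same key name, different tenant
except KeyError:
    print("access denied")
```

In a real cloud the same principle is enforced far lower in the stack, by the hypervisor and storage layer, but the failure mode the text describes is exactly this namespace check breaking down.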

The Amazon EC2 service treated this as a real threat and mitigated attacks that work by:
  1. Finding out where a particular virtual machine instance is positioned in the cloud infrastructure
  2. Determining whether two instances reside on the same physical machine

The confidentiality of the data should be guaranteed whether or not it is in transit. Ideally, the cloud should offer a closed-box execution environment in which the confidentiality and integrity of the data can be verified by its owner. In most circumstances, data should be encrypted whenever it is within the cloud. However, several operations are infeasible to perform on encrypted data, and computation over encrypted data consumes more computing resources.

One approach is for the user to encrypt the data before uploading it to the cloud. When specific data is needed, the user's token generator produces a token and a decryption key. The token is sent to the cloud, the matching encrypted file(s) are downloaded, and the files are then verified locally and decrypted with the key. Sharing is enabled by sending the decryption key and token to the other user with whom you wish to collaborate.
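The token-based workflow above can be sketched with Python's standard library. This is a toy illustration only: the XOR keystream below is not a vetted cipher (a real system would use something like AES-GCM from an audited library), and all names are made up for the example.

```python
import hashlib
import hmac
import secrets


def _keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the key (toy construction,
    NOT a real cipher -- for illustration of the workflow only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))


decrypt = encrypt  # an XOR stream cipher is its own inverse


def make_token(key: bytes, file_id: str) -> bytes:
    """Opaque token the cloud can match against a stored file
    without ever learning the user's key."""
    return hmac.new(key, file_id.encode(), hashlib.sha256).digest()


# --- user side: encrypt before upload ---
key = secrets.token_bytes(32)
ciphertext = encrypt(key, b"quarterly results")
token = make_token(key, "report.txt")   # sent to the cloud to select the file

# --- after download: verify and decrypt locally ---
assert hmac.compare_digest(token, make_token(key, "report.txt"))
print(decrypt(key, ciphertext).decode())   # prints: quarterly results
```

Sharing then amounts to giving the collaborator the token (so the cloud will serve them the file) plus the key (so they can decrypt it locally), exactly as the text describes.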

Privilege Abuse:

Companies sometimes take advantage of the freedom given to them and disclose sensitive company data to others for some benefit. The threat of a malicious insider with access to confidential data is a concern for any outsourced computation model. Miscreants can damage the consumer's reputation and brand or harm the consumer directly. Abuse of privilege not only spoils the brand name but may also place protected data in the hands of competitors. It must be noted that similar attacks can be carried out by in-house workers in a conventional infrastructure too.

Scaling Resources:

A web application developer who hosts a service on a cloud can watch the response time gradually increase as usage of the application rises if the cloud does not scale up resources rapidly enough. The capability of scaling resources up and down to match the workload is one of the chief benefits of cloud computing. Resource pooling through multitenancy is another important element managed by the cloud provider. In a single-tenant arrangement, a separate storage device is provided to every client on the cloud network; in a multi-tenant environment, a single storage device is shared by more than one cloud user, as shown in the figure. There, two consumers, 1 and 2, share a single storage device for their data, so there is a risk of data being interchanged or mismatched if proper isolation is not enforced.
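The scaling decision itself can be sketched as a naive reactive rule: add an instance when response time exceeds a target, release one when there is comfortable headroom. The thresholds and function name here are illustrative assumptions, not any provider's policy.

```python
def desired_instances(current, avg_latency_ms, target_ms=200,
                      min_instances=1, max_instances=10):
    """Naive reactive autoscaler: scale out when response time exceeds
    the target, scale in when latency is well under it."""
    if avg_latency_ms > target_ms:
        current += 1          # add capacity to bring latency back down
    elif avg_latency_ms < target_ms * 0.5:
        current -= 1          # release unused capacity to save cost
    return max(min_instances, min(max_instances, current))


print(desired_instances(3, 350))  # overloaded  -> 4
print(desired_instances(3, 80))   # underloaded -> 2
```

The limitation the text describes is the lag in this loop: latency must already be degraded before the rule fires, and new capacity takes time to come online.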


Data Location:

The geographical location of data is also a challenge. Knowing where data resides geographically is essential to protecting it, since regulatory policies can differ significantly between countries. The route the data travels also matters. It may be difficult for an application operator to deploy applications at the smallest 'distance' from the users.
At present, some cloud providers leave the choice of data-center site to the user. For example, Amazon offers one site in Europe and two in other countries.
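When the provider does expose the choice, site selection becomes a small optimization under policy constraints. A minimal sketch (region names and latencies are invented for illustration): pick the lowest-latency region that regulation permits.

```python
# Invented example data: measured latency per region, in milliseconds.
REGION_LATENCY_MS = {"eu-west": 25, "us-east": 110, "us-west": 160}
# Policy constraint, e.g. data-residency rules forbid storing in us-west.
ALLOWED = {"eu-west", "us-east"}


def choose_region(latency_ms, allowed):
    """Return the permitted region with the lowest measured latency."""
    candidates = {r: ms for r, ms in latency_ms.items() if r in allowed}
    return min(candidates, key=candidates.get)


print(choose_region(REGION_LATENCY_MS, ALLOWED))  # prints: eu-west
```

The tension the text raises is exactly this: the legally allowed set and the lowest-'distance' region do not always coincide.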

Recovery and Backup:

Keeping a consumer's data safe at different locations, for easy recovery and backup in case of failure, is a big challenge. Cloud providers must have a data-backup plan in the event of a disaster. This can be achieved by replicating data across various sites, and the plan should be specified in the service-level agreement.
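The replication idea can be sketched as follows (a toy in-memory model with invented names, not a real storage backend): every write goes to several sites, so losing any one site does not lose the data.

```python
class ReplicatedStore:
    """Toy store that writes every object to several sites, so data
    survives the failure of any single site."""

    def __init__(self, site_names):
        self.sites = {name: {} for name in site_names}

    def put(self, key, value):
        for site in self.sites.values():   # synchronous replication to all sites
            site[key] = value

    def get(self, key):
        for site in self.sites.values():   # read from the first surviving replica
            if key in site:
                return site[key]
        raise KeyError(key)

    def fail(self, name):
        self.sites[name].clear()           # simulate losing one site entirely


store = ReplicatedStore(["site-a", "site-b", "site-c"])
store.put("invoice", "data")
store.fail("site-a")                       # disaster at one site
print(store.get("invoice"))                # prints: data
```

Real systems replicate asynchronously and must handle partial failures and conflicts, which is why the text insists the backup plan be spelled out in the service-level agreement rather than assumed.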

Offline Clouds:

For users who require an application to be accessible around the clock, becoming entirely dependent on the Internet could prove highly risky or unfeasible. The trouble is greater when the user is on the move and connection quality keeps changing. Thus, in several cases, trusting the Internet service provider alone is not an option. At present, the web browser is the most widely used software application, and cloud applications can be accessed easily through it; locally, a large hard disk and a powerful processor are not necessary because customized services are available on the cloud.
Google launched Gears, a free browser add-on that allowed data to be saved locally in a fully searchable database while browsing. Gears addressed the 'offline problem' by permitting web applications to keep working while offline and then synchronizing when the connection became available again.
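The pattern Gears pioneered can be sketched as a write-behind queue (a simplified model with invented names, not the Gears API): writes always land in local storage immediately, and anything written while offline is replayed to the cloud when the link returns.

```python
class OfflineQueue:
    """Toy offline-first store: local writes always succeed, and writes
    made while offline are replayed to the cloud on reconnect."""

    def __init__(self):
        self.local = {}      # always-available local database
        self.pending = []    # writes not yet synchronized
        self.cloud = {}      # stand-in for the remote store
        self.online = False

    def write(self, key, value):
        self.local[key] = value
        if self.online:
            self.cloud[key] = value
        else:
            self.pending.append((key, value))   # defer until reconnect

    def reconnect(self):
        self.online = True
        while self.pending:                     # replay deferred writes in order
            key, value = self.pending.pop(0)
            self.cloud[key] = value


q = OfflineQueue()
q.write("draft", "v1")      # offline: stored locally, queued for later
q.reconnect()               # link restored: queue drains to the cloud
print(q.cloud["draft"])     # prints: v1
```

Gears itself has since been retired in favour of standard browser offline-storage APIs, but the queue-and-replay idea is the same.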

Unpredictable Performance:

A cloud consumer does not even know how many physical machines their application is running on. The only information the user has about these servers is the hardware specification the cloud provider publishes for each kind of service, and the end user has no control over most of these factors.

Note: These are not drawbacks; they are only limitations of some cloud service providers.








