Research paper topic on cloud computing and how it has made a huge change in the way data is stored and processed. 10 pages please.

Solution

Cloud Computing: Introduction

Cloud computing refers to an emerging model of computing in which machines in large data centers can be dynamically provisioned, configured, and reconfigured to deliver services in a scalable manner, for needs ranging from scientific research to video sharing to e-mail. The speed at which cloud computing has permeated Internet activities is astonishing. In practice, this means that instead of all the computer hardware and software you're using sitting on your desktop, or somewhere inside your company's network, it is provided for you as a service by another company and accessed over the Internet, usually in a completely seamless way.

Cloud computing is expanding rapidly as a service used by a great many individuals and organizations internationally. However, users may be unaware that they are using a cloud service. Cloud providers already offer a variety of services, with users employing cloud computing for storing and sharing information, database management and mining, and deploying Web services, which can range from processing vast datasets for complicated scientific problems to using clouds to manage and provide access to medical records.

When discussing cloud computing it is important to note that it encompasses several different components: cloud infrastructure, cloud platform, and cloud application. Cloud infrastructure (Infrastructure as a Service) refers to providing a computing infrastructure as a service; an example is Amazon's Elastic Compute Cloud (EC2). Cloud infrastructure includes computational resources as well as storage, such as Amazon's S3. Cloud platform (Platform as a Service) refers to providing a computing platform or software stack as a service; a platform is a higher level of abstraction than infrastructure. Examples of platforms are Google's App Engine and Salesforce. Cloud applications are Web services that run on top of a cloud computing component.
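To make the infrastructure-as-a-service idea concrete, the sketch below uses the boto3 Python SDK to request a single virtual machine from EC2. It is only an illustrative sketch: the AMI ID, instance type, and key-pair name are hypothetical placeholders, not values taken from this paper.

```python
# Minimal sketch of provisioning IaaS capacity with the boto3 SDK.
# The AMI ID, instance type, and key name below are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t3.micro",           # small, pay-as-you-go instance size
    KeyName="example-key",             # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Provisioned instance {instance_id}; it can be reconfigured or "
      f"terminated on demand, which is the essence of cloud infrastructure.")
```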

Many individual users already regularly use services that run on a cloud computing component, including e-mail services (e.g., Google Mail, Hotmail, and Yahoo!), photo and video services (e.g., Flickr and YouTube), and online applications (e.g., Google Docs and Adobe Express), as well as more targeted services like storing data files and processing large volumes of data. In fact, 69 percent of people online use at least one cloud computing-based service, whether or not they realize it is a cloud-based service. At the corporate level, cloud computing services are already available from Google, Amazon, Yahoo, Salesforce, Desktoptwo, and others. In late 2008, Google announced that it was able to sort one petabyte of data in roughly six hours using 4,000 computers via its cloud computing architecture. Educational uses of cloud computing are also being researched, with an academic-industrial collaboration spearheaded by Google and IBM, in conjunction with six major research universities in the United States, through which the companies are providing faculty and students with access to clouds for research and education.

What is the cloud?

The tremendous growth of the Web over the last decade has given rise to a new class of "Web-scale" problems: challenges such as supporting thousands of concurrent e-commerce transactions or millions of search queries a day. In response, technology companies have built increasingly large data centers, which consolidate a great number of servers with the associated infrastructure for storage, networking, and cooling, to handle this ever-increasing demand. Cloud computing can also serve as a means of delivering "utility computing" services, in which computing capacity is treated like any other metered, pay-as-you-go utility service. Over the years, technology companies, especially Internet companies such as Google, Amazon, eBay, and Yahoo, have acquired a tremendous amount of expertise in operating these large data centers in terms of technology development, physical infrastructure, and process management.

Cloud computing represents a commercialization of these developments. Prior to cloud computing, acquiring such resources required a large initial capital investment in the computers themselves and significant ongoing resources to maintain the infrastructure, making it an expensive and unlikely proposition for most organizations and simply impossible for individuals. Now, cloud computing has the potential to benefit both providers and users. Cloud providers gain additional sources of revenue and are able to commercialize their large data centers and their expertise in large-scale data management. Overall cost is reduced through consolidation, while capital investment in physical infrastructure is amortized across many customers. Individual cloud users can store, access, and share information in previously inconceivable ways.

The advent of cloud computing matches the need created by the large amount of information available in electronic format today, which creates data- and processing-intensive problems for a wide variety of organizations and individuals. Financial companies maintain mountains of information about clients; genomics research involves huge volumes of sequence data; and even the serious hobbyist may have more video footage than can reasonably be processed on available machines. However, the most commonly used forms of cloud computing are the more personal, and extremely popular, Web-based e-mail and information-sharing services.

Who uses the cloud?

There are three primary ways in which a cloud can be used. In the first, the cloud simply hosts a user's application, which is typically provided as a Web service accessible to anyone with an Internet connection; in such cases the cloud provider takes over the task of maintaining and running, for example, a company's inventory database or transaction processing system. The second way may be thought of as batch processing, in which the user transfers a large amount of data to the cloud along with the associated application code for manipulating the data. The third way is the temporary use of cloud computing services in conjunction with existing IT infrastructure; this method has been referred to as cloud bursting and is useful for handling temporary or seasonal peaks in traffic. In all cases, it is important to remember that the user's data and applications reside on the cloud cluster, which is owned and maintained by the cloud provider.

An underlying question is: where is "the cloud" located? As an application, a cloud service can be reached anywhere and everywhere one has access to a computer. However, the cloud itself is composed of networked computers, servers, and related infrastructure, so it physically has to be somewhere. One of the assumptions of cloud computing is that location does not matter, and that is both correct and incorrect. From a cloud user's point of view, location is irrelevant in many circumstances; from a cloud provider's point of view, it can be of critical importance in terms of geography, economics, and jurisdiction. Data centers must be somewhere, and they can be anywhere so long as the physical geography is right; national geography is often not the first consideration.

Where is the cloud?

Cloud computing is a metaphor whose origins lie in computer network diagrams. The cloud itself is an abstraction used to represent the Internet and all its complexity. When network administrators construct diagrams of computer networks, the image of a cloud is used to reference the Internet as a resource without needlessly illustrating its complexity. Therefore, the "cloud" in cloud computing represents a complex and powerful resource that is hidden from its users. Given that the term came from mapping and diagramming and was used as an abstraction, it is highly ironic that the implementation of cloud computing leads us back to talking about locations.

The power of the cloud derives from the countless computer servers that comprise it. Google, for example, is reported to own over one million servers spread across the globe to power everything from search to Web-based applications. Due to the benefits gained from economies of scale, the majority of these servers are concentrated in a handful of large data centers, each hosting tens if not hundreds of thousands of machines. The placement of data centers, each of which represents a significant investment, is a major issue for Internet companies as well as for the organizations and individuals that rely on the services run through those centers. "Data centers are essential to nearly every industry and have become as vital to the functions of society as power stations are". There are four primary considerations in deciding where a data center is constructed:

1. Suitable physical space in which to construct the warehouse-sized buildings

2. Proximity to high-capacity Internet connections

3. The abundance of affordable electricity and other energy resources

4. The laws, policies, and regulations of the jurisdiction

Beyond physical considerations, proximity to high-capacity Internet connections is also important, since a data center's value is measured by the number of users that can rapidly tap into its power. Thus, it is desirable to place data centers close to the "Internet backbone," the main trunks of the network that carry most of its traffic, although some companies may elect to build their own network links. Finally, since data centers consume vast amounts of energy, locations with cheap energy are highly attractive.

Cloud computing deployment models:

Cloud computing services can be private, public or hybrid.

Private cloud: These services are delivered from a business data center to internal users. This model offers versatility and convenience, while preserving the management, control and security common to local data centers. Internal users may or may not be billed for services through IT chargeback.

Public cloud: In this model, a third-party provider delivers the cloud service over the internet. Public cloud services are sold on demand, typically by the minute or hour, and customers pay only for the CPU cycles, storage, or bandwidth they consume. Leading public cloud providers include Amazon Web Services (AWS), Microsoft Azure, IBM SoftLayer, and Google Compute Engine.

Hybrid cloud: This is a combination of public cloud services and an on-premises private cloud, with orchestration and automation between the two. Companies can run mission-critical workloads or sensitive applications on the private cloud while using the public cloud for bursting workloads that must scale on demand. The goal of hybrid cloud is to create a unified, automated, scalable environment that takes advantage of all that a public cloud infrastructure can provide while still maintaining control over mission-critical data.
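As a purely illustrative sketch of the cloud-bursting decision at the heart of a hybrid deployment, the toy Python below routes jobs to the private cloud until its capacity is exhausted and then overflows to the public cloud. The capacity value and dispatch functions are invented for illustration and do not correspond to any real provider API.

```python
# Illustrative sketch of the cloud-bursting decision in a hybrid deployment.
# PRIVATE_CAPACITY and the dispatch functions are hypothetical placeholders.

PRIVATE_CAPACITY = 100  # max concurrent jobs the on-premises cloud can absorb

def run_on_private_cloud(job):
    print(f"{job}: running on the private cloud (sensitive / steady load)")

def run_on_public_cloud(job):
    print(f"{job}: bursting to the public cloud (temporary peak)")

def dispatch(job, current_private_load):
    """Route a job: prefer the private cloud, burst to public when it is full."""
    if current_private_load < PRIVATE_CAPACITY:
        run_on_private_cloud(job)
    else:
        run_on_public_cloud(job)

# Simulated seasonal peak: once load exceeds private capacity, later jobs burst.
for i, load in enumerate([50, 90, 100, 140]):
    dispatch(f"job-{i}", current_private_load=load)
```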

What makes cloud computing different?

It's managed: Most importantly, the service you use is provided by someone else and managed on your behalf. If you're using Google Documents, you don't have to worry about buying umpteen licenses for word-processing software or keeping them up-to-date. Nor do you have to worry about viruses that might affect your computer or about backing up the files you create. Google does all that for you. One basic principle of cloud computing is that you no longer need to worry how the service you're buying is provided: with Web-based services, you simply concentrate on whatever your job is and leave the problem of providing dependable computing to someone else.

It's "on-demand"

Cloud services are available on demand and are often bought on a "pay-as-you-go" or subscription basis. So you typically buy cloud computing the same way you'd buy electricity, telephone services, or Internet access from a utility company. Sometimes cloud computing is free or paid for in other ways (Hotmail is subsidized by advertising, for example). Just like electricity, you can buy as much or as little of a cloud computing service as you need from one day to the next. That's great if your needs vary unpredictably: it means you don't have to buy your own gigantic computer system and risk having it sit there doing nothing.
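To illustrate the utility-style billing described above, here is a small, self-contained calculation with made-up unit prices (they do not correspond to any real provider's price list): the bill is simply metered usage multiplied by a rate, whether the resource ran for an hour or a month.

```python
# Toy pay-as-you-go bill: metered usage times a unit rate.
# All prices below are invented for illustration, not real provider rates.

RATES = {
    "vm_hours":   0.05,   # $ per virtual-machine hour
    "storage_gb": 0.02,   # $ per GB stored per month
    "egress_gb":  0.09,   # $ per GB of outbound bandwidth
}

def monthly_bill(usage):
    """Sum metered usage * unit rate; you pay only for what you consumed."""
    return sum(usage[item] * RATES[item] for item in usage)

# A small workload one month, a burst the next: the bill tracks actual use.
print(monthly_bill({"vm_hours": 200,  "storage_gb": 50,  "egress_gb": 10}))
print(monthly_bill({"vm_hours": 2000, "storage_gb": 500, "egress_gb": 100}))
```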

It's public or private

Now that we all have PCs on our desks, we're used to having complete control over our computer systems, and complete responsibility for them as well. Cloud computing changes all that. It comes in two basic flavors, public and private, which are the cloud equivalents of the Internet and intranets. Web-based email and free services like the ones Google provides are the most familiar examples of public clouds. The world's biggest online retailer, Amazon, became the world's largest provider of public cloud computing in early 2006. When it found it was using only a fraction of its huge, global computing power, it started renting out its spare capacity over the Net through a new entity called Amazon Web Services. Private cloud computing works in much the same way, but you access the resources you use through secure network connections, much like an intranet. Companies such as Amazon also let you use their publicly accessible cloud to make your own secure private cloud, known as a Virtual Private Cloud (VPC), using virtual private network (VPN) connections.

Storage of Data in the Cloud:

Storing data in the cloud gives users the ability to access their data whenever and wherever they need it. In the cloud, data is stored in data centers on clusters of raw data servers, from which it can be retrieved on demand. Cloud users find it very convenient to move their data to the cloud because they no longer have to worry about large investments in hardware infrastructure or its deployment and maintenance. Storage services in the cloud deliver data storage as a service, and the resources are usually provided on a utility-computing basis: the user is charged only for the portion of cloud resources actually used, whether for a month, a week, or just a day. Optimal cloud storage management systems have also been proposed in the literature. The biggest advantage of cloud storage is the geographical independence it offers: data in the cloud can be accessed from any location, at any time, using any computing device. For example, a user can pay the cloud service provider to store important data and then retrieve it later, at some other location, using some other device.

Because data in the cloud is stored at an external location, away from the user, it is vulnerable to attacks from external sources, such as an unauthorized person, as well as from internal sources, such as an untrusted service provider. Systems such as PosixCloud, a storage cloud with enhanced metadata scalability, and efficient data storage auditing protocols have been developed to ensure proper hosting of data in the cloud. CloudZone is a cloud data storage architecture based on a multi-agent system (MAS) architecture and consists of two layers, a cloud resource layer and a MAS layer. Several protocols also exist to govern access to data stored in the cloud.
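As a concrete, hedged illustration of storage as a service, the sketch below uses the boto3 SDK to write an object to Amazon S3 and read it back from anywhere with credentials. The bucket and key names are hypothetical placeholders.

```python
# Minimal sketch of cloud object storage with boto3: put an object, get it back.
# Bucket and key names are hypothetical; credentials come from the environment.
import boto3

s3 = boto3.client("s3")

# Store a small piece of data in the cloud ...
s3.put_object(
    Bucket="example-research-bucket",
    Key="notes/experiment-log.txt",
    Body=b"results collected on the field laptop",
)

# ... and retrieve it later, from any device and any location.
obj = s3.get_object(Bucket="example-research-bucket", Key="notes/experiment-log.txt")
print(obj["Body"].read().decode())
```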

Cloud Database Management Systems:

DBMSs available in the cloud come in varied forms, not only relational: Google's Bigtable, for example, is not relational, while Microsoft SQL Azure is a fully relational DBMS. A cloud database management system is a distributed database that makes computing resources available as a service over an internet connection rather than as a product. In a cloud DBMS, applications connect to a database that is hosted in the cloud. Native cloud databases are considered to be more efficient, available, elastic, and scalable than traditional ones.
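To show what "applications connect to a database hosted in the cloud" can look like in practice, here is a minimal sketch using psycopg2 against a managed relational database endpoint. The hostname, credentials, and table are placeholders; any real managed service (SQL Azure, Amazon RDS, and so on) would supply its own connection details.

```python
# Minimal sketch: an application talking to a cloud-hosted relational DBMS.
# Hostname, credentials, and schema below are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(
    host="mydb.example-cloud-provider.com",  # endpoint supplied by the provider
    dbname="research",
    user="app_user",
    password="change-me",
)

with conn, conn.cursor() as cur:
    cur.execute("CREATE TABLE IF NOT EXISTS samples (id serial PRIMARY KEY, value real)")
    cur.execute("INSERT INTO samples (value) VALUES (%s)", (42.0,))
    cur.execute("SELECT count(*) FROM samples")
    print("rows stored in the cloud database:", cur.fetchone()[0])

conn.close()
```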

Data is stored in encrypted form, so no additional programming is required for its encryption or decryption. Some of the key responsibilities of the cloud service provider with respect to data are:

1. Scalability: Cloud database providers must be able to support very large databases with low latency.

2. Data replication: Issues regarding data replication must be handled by the cloud service provider.

3. Recovering from failure: In case of transactional failures or other failures (due to networks, etc.), the cloud service provider is responsible for detecting such failures and for rectifying them.

4. Allocation or reallocation of servers: It is the responsibility of the cloud service provider to increase or decrease the number of allocated servers according to the user's requirements.

5. Availability: The cloud database must be available at all times, irrespective of network failures or the unavailability of a data center; that is, cloud services must have a high failure-tolerance level.

6. Privacy: The privacy of the user's data must be ensured at all times by the service provider.

7. Securing data: Clients need only enter their data; once they have done so, it is the responsibility of the database provider to ensure the security of that data.

8. Metering: Cloud service providers must be able to accurately estimate the amount of usage of a particular client.

9. Service availability across geographical locations: Cloud services must be available across the world, irrespective of geographical location.

10. Simple APIs: Simple and user-friendly interfaces must be made available to clients.

11. Operational ease: Cloud-based systems must be easy to operate and must provide customers with a user-friendly interface.

Cloud Big Data Challenges:

The rise of cloud computing and cloud data stores has been a facilitator of, and a great advantage to, the emergence of Big Data. The cloud has significant advantages over traditional physical deployments. However, cloud platforms come in several forms and sometimes have to be integrated with traditional architectures.

1) A decade ago, an IT project or start-up that needed reliable, Internet-connected computing resources had to rent or place physical hardware in one or several data centers. Today, anyone can rent computing time and storage of any size. The range extends from virtual machines barely powerful enough to serve web pages to the equivalent of a small supercomputer. Cloud services are mostly pay-as-you-go, which means that for a few hundred dollars anyone can enjoy a few hours of supercomputer power. At the same time, cloud services and resources are globally distributed. This setup ensures a level of availability and durability unattainable by all but the largest organizations.

2) Cloud computing employs virtualization of computing resources to run numerous standardized virtual servers on the same physical machine. With this, cloud providers achieve economies of scale, which permit low prices and billing based on small time intervals.

This standardization makes the cloud an elastic and highly available option for computing needs. The availability is obtained not by spending resources to guarantee the reliability of a single instance, but through the interchangeability of instances and a virtually limitless pool of replacements. This impacts design decisions and requires applications to deal with instance failure gracefully.
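The point about interchangeable instances has a direct consequence for application design: code should expect an individual virtual server to disappear and simply retry the work elsewhere. The sketch below shows a generic retry-with-backoff pattern under that assumption; the call_instance function is a hypothetical stand-in for any remote request, not a real API.

```python
# Illustrative retry-with-backoff loop for handling instance failure gracefully.
# call_instance() is a hypothetical stand-in for any request to a cloud instance.
import random
import time

def call_instance(instance_pool):
    """Pretend to call one instance from a pool; it may have been replaced."""
    if random.random() < 0.3:                 # simulate an instance failing
        raise ConnectionError("instance gone")
    return f"handled by {random.choice(instance_pool)}"

def call_with_retries(instance_pool, attempts=5, base_delay=0.1):
    """Retry on failure: any instance in the pool is an acceptable replacement."""
    for attempt in range(attempts):
        try:
            return call_instance(instance_pool)
        except ConnectionError:
            time.sleep(base_delay * 2 ** attempt)   # exponential backoff
    raise RuntimeError("all retries exhausted")

print(call_with_retries(["vm-a", "vm-b", "vm-c"]))
```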

3) Companies have a large and growing requirement for cloud storage.

Three main cloud architecture models have developed over time: private, public, and hybrid cloud. They all share the idea of resource commodification and, to that end, usually virtualize computing and abstract the storage layers.

Organizations faced with architecture decisions should evaluate their security concerns and legacy systems ruthlessly before accepting a potentially and unnecessarily complex private or hybrid cloud deployment. A public cloud solution is often achievable. The questions to ask are which new processes can be deployed in the cloud and which legacy processes are feasible to transfer to the cloud. It may make sense to retain a core data set or process internally, but most big data projects are served well by the public cloud due to the flexibility it provides.

Getting started: Typical cloud big data projects focus on scaling up or adopting Hadoop for data processing. MapReduce has become a standard for large-scale data processing. Tools like Hive and Pig, which have emerged on top of Hadoop, make it feasible to process huge data sets easily; Hive, for example, transforms SQL-like queries into MapReduce jobs. This unlocks data sets of all sizes for data and business analysts, for reporting and greenfield analytics projects.
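As a small, hedged example of the MapReduce model that Hadoop executes (and that Hive generates from its SQL-like queries), the classic word count can be written in Python with the mrjob library. This is a generic textbook job, not code taken from any of the tools named above.

```python
# Classic MapReduce word count written with the mrjob library (illustrative only).
from mrjob.job import MRJob

class MRWordCount(MRJob):

    def mapper(self, _, line):
        # Map phase: emit (word, 1) for every word in the input line.
        for word in line.split():
            yield word.lower(), 1

    def reducer(self, word, counts):
        # Reduce phase: sum the counts emitted for each word.
        yield word, sum(counts)

if __name__ == "__main__":
    # Runs locally by default; mrjob can also submit the same job to a
    # Hadoop cluster or a cloud service such as Elastic MapReduce.
    MRWordCount.run()
```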

Data can either be transferred to or collected in a cloud data sink like Amazon's S3 or Microsoft Azure Blob Storage, e.g., to collect log files or export text-formatted data. Alternatively, database adapters can be used to access data from databases directly with Hadoop, Hive, and Pig. Qubole is a leading provider of cloud-based services in this space; they provide database adapters that can instantly unlock data which would otherwise be inaccessible or require significant development resources.

Ideally, a cloud service provider offers Hadoop clusters that scale automatically with the customer's demand. This provides maximum performance for large jobs and optimal savings when little or no processing is going on. Amazon Web Services Elastic MapReduce and Azure HDInsight, for example, allow scaling of Hadoop clusters. However, the scaling is not automatic with demand and requires user action. The scaling itself is also not optimal, since it does not utilize HDFS well and squanders one of Hadoop's strong points, data locality.
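To illustrate the "requires user action" point, resizing a running Elastic MapReduce cluster can be requested programmatically, for example through boto3's EMR client as sketched below. The cluster and instance-group IDs are hypothetical placeholders, and the exact call a given deployment uses may differ.

```python
# Hedged sketch: manually resizing an EMR Hadoop cluster with boto3.
# The cluster ID and instance-group ID are hypothetical placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

# Ask the service to grow the core instance group to 10 nodes; the scaling is
# an explicit user action, not an automatic reaction to load.
emr.modify_instance_groups(
    ClusterId="j-EXAMPLECLUSTER",
    InstanceGroups=[
        {
            "InstanceGroupId": "ig-EXAMPLECOREGROUP",
            "InstanceCount": 10,
        }
    ],
)
```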

Green computing:

'Green computing' is a discipline that looks at how to make efficient use of ICTs and minimize their environmental impact. Sustainability has become a major concern in recent years, as it is now calculated that CO2 emissions derived from these technologies account for 2% of the total pollution of this type, which is the same proportion produced by airplanes. However, as Jordi Torres explains, "The problem is that the increase in energy requirements is exponential, and there are no signs of a change in this trend". Computers already consume a huge amount of energy and the figure doubles every few years, which could ultimately strain the supply to breaking point.

Many other technologies will be incorporated into cloud computing in the near future.
