
Understanding the Cloud: A Primer for Prospective Users

Key technologies enabling clouds, the three types of cloud environments, and options for getting to the cloud.

By Chris Poelker, VP of Enterprise Solutions, FalconStor Software

Cloud computing -- like the ephemeral formations in the sky from which it draws its name -- is not a new concept. The idea of the cloud has blown in and out of the computing community, seeming within reach in the early 1990s and then dissipating for some time after the dot-com crash. The clouds are gathering again today, and this time they are poised to stay. There are numerous reasons to heed this forecast.

A Permanent Cloud Cover for the Computing Landscape

There are several key technologies that make cloud computing possible today that were not widely available in earlier eras. These include virtualization, data deduplication, continuous data protection (CDP), and advanced encryption.

Where dedicated physical servers and storage were once required to provide applications with computing resources and shared disks, the widespread use of virtualization today solves many of the problems those dedicated resources created. Virtualization turns a collection of physical components into an abstract pool of computing resources, so IT can run an application on a virtual server without regard to which physical device hosts it -- making server hardware a commodity. Applications can be run on, or moved across, physical servers from different vendors at will.
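
To make the abstraction concrete, the sketch below models a pool of hosts from different vendors and places or moves a virtual machine wherever capacity exists. It illustrates the idea only, not any particular hypervisor's API; the class, host, and VM names are hypothetical.

    # Illustrative only: heterogeneous hosts treated as one resource pool.
    from dataclasses import dataclass, field

    @dataclass
    class Host:
        name: str
        vendor: str
        cpus_free: int
        vms: list = field(default_factory=list)

    @dataclass
    class VM:
        name: str
        cpus: int

    class ResourcePool:
        def __init__(self, hosts):
            self.hosts = hosts

        def place(self, vm):
            # The application does not care which physical box it lands on.
            host = next(h for h in self.hosts if h.cpus_free >= vm.cpus)
            host.vms.append(vm)
            host.cpus_free -= vm.cpus
            return host

        def migrate(self, vm, src, dst):
            # Move a VM between hosts from different vendors at will.
            src.vms.remove(vm)
            src.cpus_free += vm.cpus
            dst.vms.append(vm)
            dst.cpus_free -= vm.cpus

    pool = ResourcePool([Host("h1", "VendorA", 8), Host("h2", "VendorB", 16)])
    vm = VM("app-server", 4)
    first = pool.place(vm)
    pool.migrate(vm, first, pool.hosts[1])   # hardware is interchangeable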

Storage virtualization provides the same function for storage resources, abstracting data from the physical devices that hold it and making storage a commodity as well. Data can be moved across different vendors’ storage offerings, moved between different tiers or classes of storage, or replicated between unlike storage resources within or across data centers, all while applications keep running.
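
The sketch below illustrates the same principle for storage under simplified assumptions: a logical volume presents a stable address to the application while the physical backend behind it is swapped out. The names are hypothetical, and plain dictionaries stand in for real arrays.

    # Illustrative only: a virtualization layer maps a logical volume to
    # whatever backend currently holds the data, so the backend can change
    # (tier, vendor, replica) without the application noticing.

    class LogicalVolume:
        def __init__(self, backend):
            self.backend = backend        # dict standing in for a physical array

        def read(self, block):
            return self.backend.get(block)

        def write(self, block, data):
            self.backend[block] = data

        def migrate(self, new_backend):
            # Copy data to the new array; the volume's identity and the
            # application's view of it never change.
            new_backend.update(self.backend)
            self.backend = new_backend

    vendor_a, vendor_b = {}, {}
    vol = LogicalVolume(vendor_a)
    vol.write(0, b"payroll record")
    vol.migrate(vendor_b)                 # moved between unlike arrays
    assert vol.read(0) == b"payroll record"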

Likewise, data deduplication eases the problem of moving massive amounts of data between locations while providing the means to store that data with limited resources. This innovation removes the need to move or store up to 97 percent of typical structured and unstructured data, clearing yet another hurdle to cloud permanence.
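
A toy content-addressed store shows the mechanism at work, under the usual assumptions of fixed-size chunks and a collision-resistant hash: identical chunks are stored (and would be transmitted) only once, and lightweight references stand in for the repeated data.

    # Illustrative only: deduplication via content hashing.
    import hashlib

    class DedupStore:
        def __init__(self, chunk_size=4096):
            self.chunk_size = chunk_size
            self.chunks = {}              # hash -> chunk bytes, stored once

        def put(self, data):
            refs = []
            for i in range(0, len(data), self.chunk_size):
                chunk = data[i:i + self.chunk_size]
                key = hashlib.sha256(chunk).hexdigest()
                self.chunks.setdefault(key, chunk)   # skip chunks already held
                refs.append(key)
            return refs                   # references replace repeated data

        def get(self, refs):
            return b"".join(self.chunks[k] for k in refs)

    store = DedupStore()
    monday = store.put(b"A" * 8192 + b"report v1")
    tuesday = store.put(b"A" * 8192 + b"report v2")   # shared chunks not re-stored
    assert store.get(monday).endswith(b"report v1")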

In backup operations, the advent of continuous data protection has killed the traditional backup window and shrunk the time and bandwidth required for protection and recovery. Unlike the delays of 24 hours or more that were common in the past, CDP recovery time objectives (RTO) are typically 15 minutes or less for a downed server, with recovery point objectives (RPO) all the way down to zero data loss.
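
The stripped-down sketch below conveys the CDP idea rather than any vendor's implementation: every write is journaled, so state can be rebuilt to any prior point instead of to last night's tape. The journal structure and names are hypothetical.

    # Illustrative only: a write journal that supports point-in-time recovery.

    class CDPJournal:
        def __init__(self):
            self.entries = []             # (sequence, block, data) per write
            self._seq = 0

        def write(self, block, data):
            self._seq += 1
            self.entries.append((self._seq, block, data))
            return self._seq              # a recoverable point in time

        def recover(self, point):
            # Replay the journal up to the chosen write -- RPO can approach
            # zero because every change is captured, not just nightly copies.
            state = {}
            for seq, block, data in self.entries:
                if seq <= point:
                    state[block] = data
            return state

    journal = CDPJournal()
    checkpoint = journal.write(0, b"balance=100")
    journal.write(0, b"balance=corrupted")
    restored = journal.recover(checkpoint)    # roll back to just before the bad write
    assert restored[0] == b"balance=100"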

Finally, the emergence of advanced encryption is the last sign that cloud computing is here to stay. Modern encryption enables secure data movement and access, so policy-based access to data resources can be adequately enforced.
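
As one illustration of the principle, data can be protected with authenticated encryption before it ever leaves the data center, with the key kept under the enterprise's own control. The sketch uses the third-party Python "cryptography" package and simplifies key management to a single in-memory key; function and tenant names are hypothetical.

    # Illustrative only: encrypt before handing data to a provider.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # held by the data owner, not the cloud
    aesgcm = AESGCM(key)

    def encrypt_for_cloud(plaintext, owner_id):
        nonce = os.urandom(12)
        ciphertext = aesgcm.encrypt(nonce, plaintext, owner_id)
        return nonce, ciphertext                # safe to store with any provider

    def decrypt_from_cloud(nonce, ciphertext, owner_id):
        # Fails unless the caller holds the key and the data is untampered,
        # which is what lets access policy be enforced rather than assumed.
        return aesgcm.decrypt(nonce, ciphertext, owner_id)

    nonce, blob = encrypt_for_cloud(b"customer ledger", b"tenant-42")
    assert decrypt_from_cloud(nonce, blob, b"tenant-42") == b"customer ledger"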

Three Types of Cloud Environments

With the enabling technologies in place to support them, three types of cloud environments have emerged. The infrastructure cloud abstracts applications from servers and servers from storage. The content cloud abstracts data from applications, and the information cloud abstracts access from clients to data. All three might be delivered via private, hybrid, or public cloud models, depending on the needs and resources of each enterprise.

An infrastructure cloud employs virtual abstraction so servers and storage can be managed as logical rather than physical resources. This is the most basic form of cloud computing. Organizations build infrastructure clouds either by migrating existing servers and storage to server and storage virtualization solutions or by migrating to a cloud provider. In the latter model, the provider creates modular cloud elements consisting of the software, servers, and storage required to provide infrastructure services for applications.

Content clouds are more advanced computing environments, with abstracted data management for all content, no matter the application or access method. The goal of a content cloud is to abstract the data from the applications so that different applications can be used to access the same data and applications can be changed without worrying about data structure or type.
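
A toy repository illustrates the abstraction: content is stored once with its metadata, and any application reaches it through the same interface rather than through application-specific files. The interface and field names here are hypothetical.

    # Illustrative only: data decoupled from the applications that use it.

    class ContentRepository:
        def __init__(self):
            self._objects = {}

        def store(self, object_id, payload, **metadata):
            self._objects[object_id] = {"payload": payload, "metadata": metadata}

        def fetch(self, object_id):
            return self._objects[object_id]["payload"]

        def search(self, **criteria):
            return [oid for oid, obj in self._objects.items()
                    if all(obj["metadata"].get(k) == v for k, v in criteria.items())]

    repo = ContentRepository()
    repo.store("doc-1", b"<contract text>", type="contract", owner="legal")
    # Created by one application, found and read by an entirely different one:
    assert repo.fetch(repo.search(type="contract")[0]) == b"<contract text>"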

The ultimate goal of cloud computing is realized in the information cloud model, which abstracts the client from the data. For example, with an information cloud, a user can access data stored in a database in Germany via a mobile device in New York or watch a video housed on a server in Singapore on a laptop in the U.K. With the Internet as its best-known exemplar, the information cloud does it all.

Getting to the Cloud

For those looking to move traditional data centers to cloud models, the steps involved depend upon the current infrastructure state. If an enterprise has recent investments in servers and storage, for example, it might introduce virtualization by creating a private cloud. For those with older equipment and expiring maintenance agreements, a cloud provider might be a more appropriate option for virtualization. In either case, the goal is to achieve better service levels and higher availability with lower costs.

Regardless of the specific path an enterprise chooses, all prospective cloud users should look to virtualization first. Once an organization gains the data mobility and operational efficiency inherent in virtualization, it can implement data deduplication to optimize storage and finally eliminate traditional backup by using continuous data protection. With these elements in place, the enterprise can outsource specific functions to the cloud.

The Time is Right for Cloud Computing

Organizations in nearly every sector are evaluating their data-center operations and determining that the elements have never been more inviting for cloud deployment. There are numerous paths by which to reach the cloud depending on individual goals, needs, and circumstances, but the journey is worth taking for any enterprise eager to increase efficiencies and decrease expenditures.

Chris Poelker is vice president of enterprise solutions at FalconStor Software. Poelker is the co-author of Storage Area Networking for Dummies (2009, Wiley) and is a deputy commissioner on the TechAmerica Cloud2 Commission examining the role of cloud computing in the federal government. You can contact the author at [email protected]
