In-Depth

Top Cloud Computing Trends in 2013

The cloud will set innovative trends for the entire tech industry. Here are three key trends to watch this year.

By Greg Arnette

Cloud computing has gone through a revolution over the past year. Not only did "cloud" become the buzzword of 2012, the technology also changed in ways that are paving its path to become one of the most significant technologies yet created. The cloud computing market will boom in 2013, setting innovative trends for the entire technology industry.

Prediction #1: The future of cloud applications will be “transformation,” not “migration”

A common misconception about cloud adoption is that existing on-premises or co-located applications can simply be "migrated" to the cloud, at which point cost savings and reliability gains are realized immediately. The primary benefits of cloud computing are indeed lower overall costs and increased reliability, a winning combination compared with the status quo, but migration alone does not deliver them.

The real challenge is understanding that moving an application to the cloud is an act of transformation, not migration. Migration will get you into trouble; transformation will set you up for success.

The cloud operates under different "natural laws" than virtualized hardware does. Enterprise applications need to be transformed (that is, rewritten) to take advantage of the cloud's positive attributes and avoid its pitfalls. Horizontal cluster scalability is one example. In the pre-cloud world, functional clusters for services such as distributed file systems were designed for physical infrastructure, which meant consistent bandwidth and hard failures. In the cloud, bandwidth is variable and failures can be transient. Software must be tuned for the cloud differently than for physical infrastructure; in many cases it must be transformed, not merely migrated. This is the single greatest epiphany about how the enterprise will consume the cloud.
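
To make that concrete, here is a minimal sketch in Python of one cloud-ready pattern: treating a failure as transient and retrying with exponential backoff, rather than assuming a hard failure as pre-cloud cluster software did. The file system client wrapped at the bottom is hypothetical; any call to a clustered service would fit.

    import random
    import time

    def call_with_retries(operation, max_attempts=5, base_delay=0.5):
        """Retry a cloud service call on the assumption that failures
        may be transient, not hard as on physical infrastructure."""
        for attempt in range(1, max_attempts + 1):
            try:
                return operation()
            except IOError:
                if attempt == max_attempts:
                    raise  # retries exhausted; treat as a real failure
                # Exponential backoff with jitter so many clients do not
                # retry in lockstep against a recovering service.
                delay = base_delay * (2 ** (attempt - 1)) * random.uniform(0.5, 1.5)
                time.sleep(delay)

    # Usage (hypothetical distributed file system client):
    # data = call_with_retries(lambda: dfs_client.read('/logs/2013/01/01'))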

Prediction #2: 2013 will be the year of “enterprise 2 cloud” momentum

The modern cloud era began in 2006, when Amazon Web Services introduced compute, storage, and networking in the form of infrastructure-as-a-service (IaaS). Enterprises were skeptical and kept their distance; it was startups that jumped in feet first and built some amazing businesses. Now it is the enterprise's turn to take a brave step forward and realize the cloud's potential. This will happen largely because the focus is shifting from IaaS to platform-as-a-service (PaaS).

PaaS is becoming more popular because it removes the manual burden of provisioning IaaS services. PaaS offerings include relational-database-as-a-service, key-value-store-as-a-service (NoSQL), automated application deployment, automatic scaling, virtual networking/secure computing, and sending-e-mail-as-a-service. PaaS is growing quickly, with new offerings appearing every six months.
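
As one illustration, sending e-mail through a PaaS offering is a single API call rather than a mail server to provision and patch. Below is a minimal sketch using Amazon SES via the boto library; the region, credentials, and addresses are placeholders for the example.

    import boto.ses

    # Credentials are read from the environment or a boto config file;
    # the region and addresses below are placeholders.
    conn = boto.ses.connect_to_region('us-east-1')
    conn.send_email(
        source='support@example.com',
        subject='Welcome aboard',
        body='Thanks for signing up.',
        to_addresses=['customer@example.com'],
    )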

The main reason enterprises have been hesitant to move to the cloud is that, until recently, PaaS did not exist. PaaS removes the greatest barrier to enterprise cloud adoption: without it, businesses could not take advantage of the raw building-block components that make up IaaS. Enterprises buy solutions; startups buy tools to build solutions.

PaaS represents the original premise of the cloud: focus on core competencies and let others supply the commodity building blocks. An application developer creating the next hot mobile application should focus on the user experience and domain expertise. If application developers are distracted building an automatic deployment system on IaaS instead of "renting" the PaaS equivalent, they have wasted effort and lost focus on the core mission.

Prediction #3: Better signal-to-noise ratio filters will be applied to enterprise data

The cloud is the perfect place to solve the enterprise "big data" problem. Building on the PaaS momentum is the next wave: serverless architectures for managing big data, built on the idea that the cloud is one giant mainframe. Current cloud architectures replicate design patterns from the pre-cloud colocation era, with dedicated or semi-dedicated virtual instances supporting applications. Truly solving a big-data analysis problem requires new thinking about the way enterprises consume compute and storage resources.

This is a classic signal-to-noise problem. The enterprise is awash in "noise" from large data repositories, but within that noise are actionable "signals." Moving the data to the cloud and processing it with advanced, cloud-powered analyzers will reveal the valuable signal data. Consider e-mail traffic from customer support requests: locked within it is "customer sentiment." Every CEO should want to know how customers feel about the product and the company. The cloud, serverless architectures, and big-data algorithms can figure this out more economically than any on-premises application in history.
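
As a toy illustration of separating signal from noise, the Python sketch below scores support e-mails for sentiment so the unhappiest customers surface first. The keyword lists are invented for the example; a production system would apply real natural-language analysis across far larger data sets.

    import re

    NEGATIVE = {'broken', 'refund', 'cancel', 'frustrated', 'slow', 'bug'}
    POSITIVE = {'thanks', 'great', 'love', 'awesome', 'helpful'}

    def sentiment_score(message):
        """Crude signal extraction: a negative score flags an unhappy customer."""
        words = re.findall(r"[a-z']+", message.lower())
        return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

    emails = [
        "Thanks, the new release is great and support was helpful.",
        "The app is broken again and I am frustrated. Please cancel my account.",
    ]
    # Surface the signal: the messages that most need attention come first.
    for msg in sorted(emails, key=sentiment_score):
        print(sentiment_score(msg), msg)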

The cloud allows new concepts to be explored at very low cost. There is increasing interest in creating applications that are not server-dependent but instead run within an application virtual machine environment, melding the best of PaaS and automation with low-cost, dynamic configurations tuned for optimal cost efficiency. This is how to find the signal in all the noise.

Greg Arnette is the founder and CTO of cloud archiving company Sonian. He has been a messaging, collaboration, Internet, and networking expert for over 15 years and has founded two previous technology companies. You can connect with him @gregarnette and read his blog for thoughts on cloud computing, collaboration, and startups at gregarnette.com.
