Abstract: This is Part II in a blog series by 6fusion Co-founder and CEO John Cowan on the emerging trend of Cloud Brokerage and the impact it will have on the technology industry and markets. Be sure to check out Part I of the series here.
Part II – The Cloud Vendor and the Agnostic Intermediary
I gave a presentation a few months back in which I opened with the following statement: The future of cloud computing has little to do with technology. Sounds crazy, right? But before you send me to the virtual nuthouse, let me explain.
What I think is unclear in the emerging Cloud Broker business model is the demarcation point between the business of compute and the technical organization of compute. The era of Cloud Brokerage will see these two concepts treated as distinct threads. The technical organization of compute is what the myriad of cloud software vendors do today. They are the companies that pitch “hey, buy my software and you too can have a cloud” or “hey, run your apps on our infrastructure and you too can pay for IT like a utility.” The role of this vertical in the industry is the orchestration and management of the compute resources sitting in customer cabinets and cages across the global data center landscape. It is about, as some of the more vocal pundits and talking heads espouse, a complete paradigm shift in the way IT is delivered.
But cloud vendors, despite their impressive efforts to convince us otherwise, are not, and never will be, Cloud Brokers.
Cloud Brokers will focus on the business of compute rather than the technical organization of compute. And the business of compute has nothing to do with cloud computing or the technology driving this revolution. The business of compute is about the commoditization of compute, network and storage infrastructure.
There is a big difference.
Analysts and experts measure the market by examining the adoption rates of cloud technologies, private and public. Projections vary widely, but I think we can all agree we are already in a Total Addressable Market (TAM) in the billions and on a fast track to much higher.
That’s a lot of dough. But the real TAM for Cloud Brokerage is much, much bigger.
Cloud Brokers will play a catalyzing role in the establishment of a commodities market for computing, just like soybeans, metals or corn. One of the principal advantages of any commodity market is for businesses to hedge their risk. It is difficult to argue that such practices could not or should not extend to information technology needs (specifically computing).
In order to truly see this market one must look beyond the resources that are “in the cloud” today. Businesses in the future will hedge their total risk – the needs of the organization both internally and externally. That is – the sum total demand for compute resources. To this end, existing market sizing reports dealing with cloud are inadequate to represent the total opportunity. Based on publicly available data on server and storage shipment tracking, I believe the TAM for Cloud Brokerage to be north of $1 trillion.
As a central figure to the emergence of a commodity market for compute infrastructure, the Cloud Broker must be an entirely agnostic intermediary. The job of the agnostic intermediary will be to connect those that consume infrastructure and those that produce infrastructure. It will not be their job to influence, direct, sell or support what the infrastructure is ultimately used for. Nor will it be their job to influence what infrastructure supplier is used.
Herein lies the demarcation point between those facilitating the future business of compute and those enabling the technological organization of compute.
This may come as a revelation to some, but it won’t be players like Amazon, Rackspace and the ‘Magic Quadrant’ of telcos and co-lo providers that form the commodity market for compute. Yes, they will no doubt be suppliers to the market. But they will not be the true market makers.
Today’s leading cloud vendors will learn that you can’t be a retail cloud and an intermediary for a cloud market. The very concept would be like being an arms dealer for the war in which you take up arms. It will not work. While important actors in the emerging story, today’s cloud vendors are merely proving out that there is room for many suppliers in the market.
In Part III of this post I will take a look at some of the important technologies that will emerge to create what I call “The Market Unified.”
By the time this blog post hits the airwaves, 6fusion will have launched its second major software product in the first 45 days of 2012. For those of you that run start-up software companies, you know what kind of pace that implies. To be perfectly transparent, things are, well, nucking futs inside our little company.
The product we launched simultaneously at VMworld Partner Exchange in Las Vegas and Cloud Connect in Santa Clara is called 6fusion Cloud Resource Meter (for VMware vSphere) – more on the vSphere part later.
I wanted to take a few written words to explain the product and the product strategy. It’s easy to get lost in the noise of daily product announcements in the cloud space, but I think this release deserves a pause to consider what exactly the product does, what it means, and why we chose to launch and support the VMware ecosystem first (‘cause we could have done it with any number of products, and soon will).
In the simplest sense the Cloud Resource Meter integrates 6fusion’s core intellectual property natively within the VMware vSphere software console. 6fusion’s core intellectual property, of course, is the Workload Allocation Cube (WAC). The WAC is the 2004 brainchild of yours truly and my business partner (whom I often call the best kept secret in the cloud computing business) Delano Seymour. The WAC is a dynamic single unit of measurement that encompasses compute, network and storage utilization. We wrote the original algorithm to simplify our ability to meter and bill our private IT customers on the multi-tenant host we had created with ESX 1.0. I first wrote publicly about the algorithm’s use to ‘profile’ apps and the importance of ubiquitous utility metering three years ago. Since 2008 the WAC has powered 6fusion’s proprietary cloud federation platform, called UC6.
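The actual WAC algorithm is proprietary, but the general idea of collapsing compute, network and storage utilization into a single billable unit can be sketched in a few lines. To be clear, everything below – the weights, the reference capacities, the dimensions chosen – is invented for illustration and is not 6fusion’s formula:

```python
# Hypothetical sketch of a composite utilization metric in the spirit of
# a "single unit of measurement" like the WAC. All weights and reference
# capacities are made up for illustration; the real algorithm is proprietary.

def composite_units(cpu_ghz_hours, ram_gb_hours, storage_gb_hours,
                    net_gb_transferred, disk_iops_hours=0.0):
    """Collapse multi-dimensional resource usage into one scalar unit."""
    weights = {   # illustrative relative importance of each dimension
        "cpu": 0.30, "ram": 0.25, "storage": 0.20,
        "network": 0.15, "iops": 0.10,
    }
    reference = {  # illustrative "one unit's worth" of each resource
        "cpu": 1.0, "ram": 1.0, "storage": 10.0,
        "network": 1.0, "iops": 100.0,
    }
    usage = {
        "cpu": cpu_ghz_hours, "ram": ram_gb_hours,
        "storage": storage_gb_hours, "network": net_gb_transferred,
        "iops": disk_iops_hours,
    }
    # Normalize each dimension against its reference capacity, weight, and sum.
    return sum(weights[k] * usage[k] / reference[k] for k in weights)

# A VM that used 4 GHz-hours of CPU, 8 GB-hours of RAM, 50 GB-hours of
# storage and moved 2 GB over the network:
units = composite_units(4.0, 8.0, 50.0, 2.0)
print(round(units, 3))  # → 4.5
```

The point of any such scheme is that once every VM, in every environment, is scored with the same function, consumption becomes directly comparable across owners and providers.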
So now anybody that owns VMware vSphere 4.1 or later can download a vApp and have the power of the WAC – for free. The WAC shows the customer the granular resource consumption of every single VM under management – instantly.
The problem with current ‘chargeback’ tools and software is their complexity and lack of scope. Complex to design. Complex to implement. And even when implemented, the methodology is relevant only within the four walls of the customer that programmed the complexity.
The WAC is different because it is the first ‘universal’ algorithm. It meters the consumption and utilization of infrastructure not just within the enterprise IT space, but also the multi-tenant host market emerging as ‘cloud providers’. Why use the WAC to meter and bill your VMware infrastructure? Because the WAC is the algorithm your potential multi-tenant hosts use to price their services.
Ah, yes. The penny drops.
The WAC utilization data provides an instant ‘profile’ of your workload requirements, which can then be easily matched to an appropriate host should you decide it more effective to use “the cloud.” This provides a true apples-to-apples comparison of running your workloads in your environment or in the “cloud” and gives you the knowledge you need to optimize your IT operations.
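As a toy illustration of that apples-to-apples comparison: once your environment and candidate hosts are metered in the same unit, choosing where to run a workload reduces to simple arithmetic. The prices and names below are entirely hypothetical:

```python
# Toy comparison of running a workload in-house versus at candidate hosts,
# assuming both sides are metered in the same composite unit.
# All prices and provider names are invented for illustration.

def cheapest_option(units_per_month, price_table):
    """Return (name, monthly_cost) for the lowest-cost place to run it."""
    costs = {name: units_per_month * unit_price
             for name, unit_price in price_table.items()}
    best = min(costs, key=costs.get)
    return best, costs[best]

prices = {            # $ per composite unit, hypothetical
    "internal": 0.17,
    "host_a": 0.14,
    "host_b": 0.19,
}
name, cost = cheapest_option(1200, prices)
print(name, round(cost, 2))  # host_a is cheapest here
```

The comparison is only meaningful because the numerator (units consumed) means the same thing on both sides of the table – which is exactly what a universal metering algorithm buys you.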
So, why choose to integrate this technology with VMware vSphere out of the gate? Why not other virtualization platforms? Why not emerging open source plays like ‘OpenStack’?
The answer is really quite simple.
VMware, despite the growing amount of backlash they get for becoming the big vendor on the block, represents the single largest footprint to penetrate the Total Addressable Market (TAM) for metered infrastructure. Say what you want about VMware, but the fact remains that the overwhelming majority of businesses – of all sizes – use VMware technology to virtualize privately owned hardware assets. And for 6fusion, this is a significant portion of the infrastructure TAM that we find compelling.
Having said that, the operative phrase here is “out of the gate”. As 2012 progresses look to see Cloud Resource Meter integrated into a number of virtualization and cloud software stacks!
By John Cowan, 6fusion Co-founder and CEO
By Steven Wolford, Director of Information Security, 6fusion
During the season of politics here in the US, I would like to borrow shamelessly from topics in the political debate with a look towards the state of information security.
According to CNN (Poverty Rate Rises as Incomes Decline), the number of US citizens living below what is considered the bare essentials is on the increase. I believe we can say the same for information security programs. According to SANS, the top security controls can be boiled down to 20 Critical Controls (Top 20 Critical Controls). These are regarded as the “poverty line” for an information security program: the bare essentials needed for a program to live at a minimum standard.
Have you turned down a security control because it was too expensive?
ENISA (the European Network and Information Security Agency) has stated “the same amount of investment in security buys better protection” (Cloud Computing, Benefits, Risks, and Recommendations for Information Security). We have long understood that scale brings cost optimization. By spreading the cost of controls over a larger number of organizations, Cloud Service Providers (CSPs) are able to either deliver equivalent controls at a lower price or enhanced controls at a similar price.
Work with your CSP to understand the controls already implemented, those that are planned, and those that you require for the assets you are moving to the CSP. The different cloud models (software/platform/infrastructure as a service) will each be able to deliver a different set of controls. You should expect to bring more controls to an IaaS provider than to a SaaS provider. However, you should still expect to see cost efficiencies with IaaS.
What if the chosen CSP doesn’t offer the controls you need? Reinvest the capital expenditure (CAPEX) or operating expenditure (OPEX) savings into providing your own controls or, even better, negotiate with the CSP to get the controls installed and leveraged across all of their customers. Security is moving from “build your own” to “assemble your own” (that sounds like a blog all on its own). There is even a growing industry in Security as a Service (SecaaS), which is a cloud computing model that delivers managed security services over the Internet. Techopedia defines SecaaS as “based on the Software as a Service (SaaS) model but limited to specialized information security services.” Engaging a SecaaS provider is yet another way to help lower the cost of living at the information security poverty line.
Have you not implemented a security control because your environment is too complex?
Your business does not have to be listed on the NYSE to have skipped a security control because your existing IT felt too complicated to integrate with it, or because IT sprawl made the cost of applying the control prohibitive.
Most security frameworks today recommend taking a risk-based approach to identifying the controls that are appropriate for any given environment. In order to first identify risk you must know ALL of the components that collectively create an information system. Often the cost of implementing a proper set of controls spirals out of control when attempting to apply them to a complex or spread out system.
Moving an information system into an IaaS CSP is the perfect opportunity to identify, consolidate, and simplify an information system. Identifying all the components of an information system is potentially the most significant step towards proper control selection; you cannot protect what you do not know about. It is still not uncommon to hear about a critical business system that relies on a spreadsheet saved in a folder on the hard drive of someone’s workstation. As an example, when you plan for the security of your current monthly billing, do you in fact remember this critical component, or do you go about happily installing the latest IDS on the accounting server, congratulating yourself along the way for protecting the company’s financial systems?
Consolidating components is at the same time a risk and a benefit (what in life isn’t a dichotomy?). Personally, I see far more benefits and, with the concept of cloud brokering, there are ways to enjoy the benefits while minimizing the risks. Let’s get the scary stuff over first. The risk is that consolidation puts all your eggs in one basket, so to speak. The target becomes a higher value target because the reward of breaching it (or the cost of loss) becomes higher. Enter the cloud broker – enjoy the benefits of consolidation by information system but spread the risk by sprinkling your information systems over different CSPs.
What are the benefits that outweigh the risks? Reduced complexity to install, manage, and monitor the controls used to protect the system. There is a reason why banks put valuables into a safe – same risks identified above but even bankers know it is far easier and less costly to put them into a central location.
That leads us to simplify. By moving your information system to a CSP you are able to simplify the implementation of appropriate security controls. One of the leading causes of delay in detecting and responding to a security incident is an overly complicated control implementation. Even if controls are properly implemented in a complicated system, gathering the control information in one place can be difficult (if your environment was such that getting data in one place was easy you would probably already have the information system simplified).
Craig Balding in his cloudsecurity.org blog even lists centralized data as the number one security benefit of “The Cloud”. I think this understates the real benefits. While Craig cites reduced data leakage and monitoring benefits as the winners, I would extend that to improved knowledge of how the system as a whole works and is architected. Move the financial system into an IaaS provider and you will quickly find that critical spreadsheet on that workstation.
Have you not implemented a security control because it was too difficult?
Many modern security controls require infrastructure just as complex as the information systems they protect. Network, application, data, access, logging, and much more all require technical solutions to be implemented, updated, managed, monitored for relevant information, and then responded to when an interesting event happens. It is not surprising at all that some have had to conclude that applying all of this is just far too difficult – that one more security measure is simply too hard to absorb alongside your other business responsibilities.
CSPs can help ease that pain. Many security vendors offer solutions that take advantage of cloud architectures and make the implementation process much easier.
Take antivirus (AV) for example. Most major vendors today offer a cloud-ready solution where AV can be offered as a SecaaS or in cloud-optimized versions that let you maintain total control over the AV solution. Either way, actually implementing the AV solution can be as easy as installing the client in a base image and deploying that image with each and every server you turn on. EASY.
As we hear the political messages of the day, I encourage you to consider the “Information Security Poverty Line.” Take a look at your security posture and tolerance for risk. Are you forcing the information security program to live below the poverty line? If so, is there something that you can do about that?
I would say YES! The first step, to paraphrase James Carville, is to remember, “It’s the risk, stupid.” Stay tuned for more on that politically inspired theme.
With the massive growth and attention around cloud and utility computing today, many people are trying to wrap their heads around two primary questions:
1. What exactly is cloud and how does it relate to utility computing?
2. How can cloud computing help my organization?
We’ve come up with a short video that simply explains the trends driving cloud computing and the impact the cloud and utility computing can have on your organization.
Check out the video here and let us know what you think.