By John Hawkins, Senior Cloud Architect, 6fusion
Let’s agree that the mission of an IT organization is to provide an information processing capability that benefits the business. But how do you measure the success of that mission? What are the IT metrics that matter?
Many IT operational metrics may make sense for any given organization. One common IT metric that most organizations must deal with is IT resource consumption – i.e. which IT infrastructure is being consumed by which users in the organization. Some of the more important questions most IT managers get asked by the CFO, CIO or business line executives are:
- Who is using all this expensive IT infrastructure we bought?
- How do you know which parts of your business consume the most IT resources and which consume the least?
- How can you prove who is using what and how much that is costing our business?
Typically, one would suggest an IT metric that supports a chargeback or showback system, wherein the costs of running IT resources are somehow equitably charged to the internal consumers of IT infrastructure. The problem with most solutions that attempt chargeback or showback is:
- How do we best determine an equitable distribution of costs?
The key to solving this problem is an automated metering of IT resource consumption for physical, virtualized and cloud infrastructures – so you get a comprehensive view of IT infrastructure consumption.
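To make the equitable-distribution idea concrete, here is a minimal sketch of proportional chargeback, where a shared infrastructure bill is apportioned according to each consumer’s metered consumption. The function name and the figures below are illustrative only, not part of any 6fusion product:

```python
def allocate_costs(total_cost, consumption_by_user):
    """Split a shared infrastructure bill in proportion to
    each consumer's metered resource consumption."""
    total = sum(consumption_by_user.values())
    return {user: total_cost * used / total
            for user, used in consumption_by_user.items()}

# Example: a $12,000 monthly bill split across three business units
# whose metered consumption is 300, 100 and 600 units respectively.
shares = allocate_costs(12000, {"finance": 300,
                                "marketing": 100,
                                "engineering": 600})
# finance → 3600.0, marketing → 1200.0, engineering → 7200.0
```

The consumption figures can be expressed in any consistent unit – which is exactly why a standardized unit of measure matters so much for making the split defensible.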
Granted, anyone can manually determine the IT resource consumption of their own infrastructure with a set of IT metrics they declare to be representative. One approach is to use the six IT metrics that 6fusion collects, which are required for operating any IT system, application or workload:
- Network I/O on the LAN
- Network I/O on the Internet
- Disk I/O
The value 6fusion brings to this equitable distribution problem is a standardized algorithm for calculating a single unit of IT resource consumption from these six metrics. We call this standard unit of measure the “Workload Allocation Cube” or WAC.
Most modern machines consume a few thousand of these WACs per month, so we commonly refer to the IT metric in kiloWAC or kWAC. The idea behind the WAC is much like that of the Watt for measuring electricity consumption. Both the WAC and the Watt represent an agreed-upon calculation of underlying metrics to determine the amount of work done in the metered environment.
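The WAC algorithm itself is patent-pending and not spelled out in this post, but the shape of the idea – blending several metered resources into one agreed-upon unit – can be sketched as follows. The metric names and weights here are hypothetical placeholders, not 6fusion’s actual formula:

```python
# Hypothetical blend weights -- the real WAC algorithm is 6fusion's
# own; this only illustrates folding several metrics into one unit.
WEIGHTS = {"cpu_ghz_hrs": 1.0, "ram_gb_hrs": 0.5, "disk_io_gb": 0.2,
           "lan_io_gb": 0.1, "wan_io_gb": 0.3, "storage_gb_hrs": 0.05}

def wacs(metrics):
    """Blend a machine's metered metrics into a single consumption number."""
    return sum(WEIGHTS[k] * v for k, v in metrics.items())

def kwacs(metrics):
    """Express consumption in thousands of WACs (kWAC)."""
    return wacs(metrics) / 1000.0

# One machine's (made-up) metered totals for a month:
monthly = {"cpu_ghz_hrs": 1500, "ram_gb_hrs": 2000, "disk_io_gb": 800,
           "lan_io_gb": 400, "wan_io_gb": 120, "storage_gb_hrs": 5000}
print(round(kwacs(monthly), 3))  # prints 2.986
```

Note how the result lands at a few thousand WACs – i.e. a handful of kWAC – per month, consistent with the rule of thumb above.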
The nice thing about the WAC is that you can meter any machine in your environment, and with a tagging feature you can create logical meters for any set of machines whose total IT resource consumption you want to see. For example, all the machines used by a particular application, business unit, office, or region. Tags can be applied at any level that makes sense for a particular organization.
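The tagging idea above amounts to a simple roll-up over per-machine meter readings. A sketch of the aggregation, with illustrative field names and made-up kWAC figures:

```python
from collections import defaultdict

def consumption_by_tag(machines):
    """Roll per-machine kWAC readings up into logical meters,
    one total per tag (application, business unit, office, ...)."""
    totals = defaultdict(float)
    for m in machines:
        for tag in m["tags"]:
            totals[tag] += m["kwac"]
    return dict(totals)

fleet = [
    {"name": "web01",  "kwac": 3.0, "tags": ["crm-app", "nyc-office"]},
    {"name": "db01",   "kwac": 5.0, "tags": ["crm-app"]},
    {"name": "mail01", "kwac": 3.5, "tags": ["nyc-office"]},
]
print(consumption_by_tag(fleet))
# {'crm-app': 8.0, 'nyc-office': 6.5}
```

A machine can carry several tags at once, so the same reading can feed an application meter and an office meter simultaneously.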
This gets very interesting when you consider the ability to meter any type of machine – physical, virtual or cloud based – combined with the common metering algorithm such that a machine’s kWAC of consumption is the same on any set of infrastructure. In other words, with the common metering of the kWAC, a machine running on Company ABC’s infrastructure would consume the same number of kWACs on Company XYZ’s infrastructure or another company’s cloud offering.
- A WAC-is-a-WAC regardless of whose resources a machine is consuming.
So back to this article’s premise of “IT Metrics That Matter”: Measuring IT resource consumption with a comprehensive, consistent unit of measure is the foundation for IT metrics that matter, because it gives you a baseline to know where you are starting from in order to improve your IT infrastructure efficiency, operational costs, and value to the business.
Get access to the 6fusion platform here and see how this can work for your organization.
With the massive growth and attention around cloud and utility computing today, many people are trying to wrap their heads around two primary questions:
1. What exactly is cloud and how does it relate to utility computing?
2. How can cloud computing help my organization?
We’ve come up with a short video that simply explains the trends driving cloud computing and the impact the cloud and utility computing can have on your organization.
Check out the video here and let us know what you think.
I love this time of year because it is one of those rare occasions during the corporate and product development process where creative ideas and concepts designed to stimulate future success enter the entrepreneurial blood stream. It is that rare moment where you have the benefit of an entire year of business fresh in your mind to build upon and an entire new year ahead of you to set new standards and push the envelope of success.
For our company and for the industry, 2010 was a huge year. We completed our Series A round of venture financing, relocated the company to the coveted North Carolina State University’s Centennial Campus and tripled the size of our team. Meanwhile, the industry took meaningful steps toward maturity as mainstream private sector businesses and governments of all shapes and sizes began giving IaaS a very serious look. If 2010 was the year of formal organization, 2011 will be the year of some serious and meaningful growth. Not just for our company and our technology, but for the IaaS market as a whole.
In a post I wrote recently I did my best to explain some of the core characteristics that would be central to IaaS achieving mass adoption as the technology revolution marches forward. While I think it’s very difficult for anyone to offer accurate predictions for the year ahead in any fledgling market, there are some specific ‘themes’ that I think, as we look back a year from now, will have clearly emerged as bellwether trends in the industry.
To borrow a format from Peter King, one of my favorite sports writers, here are the six things (6 things, 6fusion, get it?) I think I think (for the cloud biz in 2011):
- Hybridization Will Prove Critical to Enterprise Adoption. I’ve been to the edge and back, and I have a few words of wisdom to share with my peers about the Enterprise cloud. Unless what you are doing bridges the gap between what exists inside the four walls of the enterprise data center and what might safely and securely exist outside of those four walls, you are just another GUI in the Red Ocean peddling the same wares we’ve seen for years. Hybridization is something enterprise buyers will use to separate the wheat from the chaff in 2011.
- Regional Clouds Unite. The arms race among regional managed hosting providers to beef up for cloud services was evident in 2010. But the silo approach to building up IaaS on a regional basis will make it difficult, if not impossible, to compete at scale – and it won’t take long to figure this out. In 2011 expect the concept of broad-based IaaS federation to become a much more prominent theme as owners of regional facilities and compute capacity partner up to create scale and increase market size in the quest to truly monetize their resources and compete with the national players.
- The Ecosystem is Bigger Than the Organism. The IaaS industry is beginning to realize that the creation and quantification of IaaS demand is much more important than the creation of supply. It’s one thing to have the capability to power or enable the creation of IaaS resources, but it is entirely another to drive revenue and margin to the cloud. The emergence of business ecosystems will be a consistent theme for the coming year because partnering is the key to success in a nascent market. In 2011 you will see more and more eyebrow-raising deals announced based on ‘synergistic’ partnerships – partnerships that drive mutual revenue and margin between companies that are bound by the common interest of leveraging, distributing and powering IaaS.
- It’s All About the Channel. Building a global business by tackling one end-user customer at a time doesn’t scale if your business is supposed to compete with the market pioneers. For almost every player, the cost of customer acquisition required to generate a serious outbound push to globalize IaaS will simply be too high. In 2011 IaaS vendors will wake up to the fact that they need help in order to scale revenues and ultimately generate the ROI they are promising shareholders. Cue the channel gold rush.
- Communities Will Emerge. I subscribe to the notion that one day every business in every vertical will consume some form of public cloud – but we are not anywhere close to this reality. Large-scale IaaS operated by a trusted third party and made available to a select group of stakeholders with common interests is a concept that has legs. Trust me on this one. The build-out of community clouds will emerge in 2011 as one of the most important concepts – if not the most important – for accelerating IaaS adoption.
- A Course Will Be Charted for an IaaS Futures Market. If you don’t subscribe to the notion that the final destination for this ride is a commodity exchange for compute, stop and take a look around. Spot markets emerged in 2010, much to the surprise of many industry pundits. But spot markets, as novel as they are, do not a true market make. The real money and the real opportunity are in futures trading. There are forces at work on this as I type away, and although you won’t actually see compute on a major exchange in 2011, do expect to see this theme creep its way into mainstream IaaS thinking.
Ok, so with the predictions for themes and threads out of the way, I’ll conclude this post with the 6 things I’ll be watching closer than my wallet at a pick-pocket’s convention as 2011 progresses:
- Shifting Big Iron: Companies like HP and IBM have yet to emerge with serious IaaS plays and if you read the tea leaves they won’t any time soon. I’ll be watching to see if any of the whales in the pool make a splash in the IaaS business.
- Processor Plays: Intel made huge moves in the cloud in 2010, and you don’t need your tarot cards out to see where they are going. Anyone know what AMD is thinking these days? I’ll be watching to see if this gentle giant makes any moves that can rival their kool-aid-drinking, all-in, pot-committed competitor.
- Government Clouds: The GSA announced a major IaaS initiative, publishing a schedule of approved vendors that agencies can purchase from. But will these IaaS vendors truly make any money this way? I’m not so sure. My personal opinion is that the money is at a different level of the Public Sector. Can’t wait to see!
- Hypervisor Competition: KVM is rocketing up the relevance chart. No doubt. I’ll be watching to see how VMware plans to keep its toehold on the hypervisor market as IaaS enablement begins to drive more and more purchasing decisions.
- Network Providers: The accelerated adoption of cloud services will put a big piece of the pie squarely in the hands of the network operators. I will be watching to see how network operators jockey to position themselves. I don’t think it is a foregone conclusion that operators will follow the lead of companies like BT and DT.
- Disclosure Watch: As more and more private sector orgs make the move to the cloud, the potential grows that something somewhere is going to go wrong. I will be keeping a watchful eye on key disclosures and cloud failures, which could dramatically stunt the industry’s pace of growth.
Don’t miss 6fusion’s first webinar of our 2011 series, “Make Your 2011 New Year’s Cloud Resolution Now”. To kick the session off, I’ll be elaborating on some of these points and drilling down into how service providers can drive new business. Come join the discussion!
Raleigh, NC – November 11, 2010 – 6fusion, a company that has developed a system to take control of third party computing resources and create a single utility to meet the needs of the IT Service channel, is the latest company to become a partner on NC State University’s Centennial Campus.
The company is occupying space in the Venture IV building on the research park and technology campus.
“We are delighted to have 6fusion on campus,” said Dennis Kekas, associate vice chancellor of the Centennial Partnership office. “With its background in cloud computing and our research in that area, we think they are an ideal partner going forward.”
6fusion has developed an algorithm that radically simplifies the metering, consumption and billing of compute resources, called the Workload Allocation Cube (WAC). The company also has developed a platform called UC6, which provides a single pane-of-glass user interface for customers to dynamically provision cloud workloads internal or external to their organization.
“We spent a considerable amount of time with the team at Centennial Campus after we completed our relocation to the Research Triangle,” said John Cowan, CEO of 6fusion. “Centennial Campus is not only an exciting, intellectually stimulating place to locate an entrepreneurial venture – it’s also a unique venue that allows us to partner on research and development facilities in a campus atmosphere that is more than just office space.”
6fusion makes iNode computing power available exclusively through IT service providers, independent software vendors and managed service providers. The company uses iNodes to build and launch ‘cloud’ based services to its user communities and customers worldwide. The company bridges the gap between supply and demand of utility computing resources with the company’s software technology called UC6. UC6 is a single console that handles all of the metering and billing of the “infrastructure” and deployment and control of customer “applications.”
In addition to the corporate relocation, 6fusion has also partnered with NC State’s Institute for Next Generation IT Systems (ITng) to develop collaborative research initiatives. ITng is also located on Centennial Campus.
“ITng is a perfect fit for 6fusion’s long term R&D program,” said 6fusion co-founder and CTO Delano Seymour.
Integration will help service providers manage their customer cloud systems more simply and easily by centralizing customer information and lowering the cost of cloud systems support
Durham, North Carolina – November 3, 2010 – 6fusion, the leading provider of utility billed Infrastructure as a Service (IaaS) for the channel, today announced participation in the ConnectWise Developer Network program. 6fusion will deliver integration between the UC6 cloud platform and the ConnectWise PSA business operating system, offering service providers an integrated workflow and user experience.
The integration between UC6 and ConnectWise PSA will focus on providing integrated workflows and reducing duplication of effort and data by allowing service providers to import customer and user accounts from ConnectWise into the UC6 platform. This will help service providers centralize their customer information, improve the process for keeping customer information up to date and reduce the duplication of effort.
“With the UC6/ConnectWise PSA integration, service providers can drive additional growth and profitability from the cloud by lowering operating costs and improving organizational scalability,” explained Rob Bissett, Vice President of Product Management for 6fusion.
Additionally, UC6 will export all workloads to ConnectWise as managed configurations, which will improve the service provider’s ability to offer their customers exceptional service as well as to include cloud workloads in ConnectWise reports.
“We are excited to be working with 6fusion to provide improved operational support for cloud-deployed workloads,” said Jeannine Edwards, Director of ConnectWise Community. “We are committed to partnering with leading vendors to drive additional value to our community.”
To learn more, visit www.6fusion.com or stop by Booth #422 at the ConnectWise IT Nation 2010 event in Orlando, Florida, November 4-6, 2010.
Contact: John Cowan, 919-917-5150
PR: IaaS Leader 6fusion Launches Comprehensive Cloud Computing Platform for Data Center Operators and IT Service Providers
- Building, controlling and maintaining cloud workloads running on 6fusion’s iNode Network or privately within the customer’s own data center
- Integration of the light weight UC6 Profiler agent, released in 2009, into the UC6 Console dashboard, giving service providers the capability to perform deep pre-migration analysis
- Capability for data center operators and customers to launch new 6fusion Infrastructure Nodes anywhere in the world from a centralized NOC
- Instantly ‘unplug’ workloads from the cloud and redirect them elsewhere
- True metered utility powered by 6fusion’s patent-pending Workload Allocation Cube algorithm
- Integrated granular charge back capability for enterprise resource segments
- A rich set of integration capabilities to allow external programs to take advantage of the highly modular design of UC6.
UC6 is a software platform that converts virtualized servers, network and storage into a billable utility and makes those utility computing resources accessible to external users. 6fusion federates independent third-party data centers, which together comprise its iNode Network. The iNode Network is used by IT Service Providers and Independent Software Vendors to deploy cloud based solutions on behalf of their customers, paying only for the compute resources consumed. 6fusion is the only 100% channel-focused IaaS enabler on the market.
UC6 can also be deployed inside a private enterprise by 6fusion Solution Partners, which 6fusion has been quietly doing for the past several months. “The ability to create a single interface for Enterprise customers to deploy workloads internally or externally onto the iNode Network is in very high demand in the cloud industry,” said 6fusion co-founder and CTO Delano Seymour, the principal architect behind UC6. “Using UC6 3.0, customers can deploy workloads to either their own private data center or one of our multi-tenant data center partners in a matter of minutes,” he added.
UC6 3.0 was also designed to be hypervisor independent, a key feature for the future of IaaS. “There is a lot of debate going on right now over the viability of virtualization vendors offering full cloud management solutions, but our customers don’t want to be locked in to one vendor,” said 6fusion co-founder and CEO John Cowan. “UC6 3.0 architecture will allow the customer to use their choice of hypervisor without compromising the richness and functionality of the cloud or getting locked in,” he said.
The new 6fusion platform also features the UC6 Profiler, which was introduced to the market a year ago. Since launching the free tool, customers and partners have been using it to analyze the potential cost of moving to the cloud before conducting any actual migration. “Profiler allows our partners to gain valuable insight into the cost performance of customer applications they are thinking about migrating to the 6fusion iNode Network,” said 6fusion Director of Partner Development Doug Steele. “With our new release, the Profiler agent can be deployed directly from the UC6 Console,” he added.
Data residency control and self-service provisioning were considered high on 6fusion’s priority list for UC6 3.0. “When we started our company customers gravitated to us because we could assure them control over where their data sits,” Steele explained. “Now, customers in one geography can ensure that some data remain local and other data can be processed in a completely different geography, without ever having to leave 6fusion’s console to accomplish the task,” he added.
UC6 3.0 usage is based on 6fusion’s patent-pending algorithm, called the Workload Allocation Cube (WAC). The WAC algorithm dynamically blends the critical compute resources required to operate practically every x86 based software application, yielding a single unit of measurement. “The Workload Allocation Cube is the most granular unit of measurement for cloud infrastructure on the market,” said Mr. Seymour. “Our customers have been using the WAC for over three years to meter cloud infrastructure because of our unique ability to simplify the cloud consumption experience,” he added.
UC6 3.0 is first being made available to existing partners and customers, followed by a general public release scheduled for later in the year. For more information about 6fusion UC6 3.0 or other 6fusion related technologies, email info(at)6fusion(dot)com or visit http://www.6fusion.com
I’ve been off the radar in recent weeks as things around 6fusion have been busy, but a few weeks ago we blogged about the Profiler application we’ve been working on. I’ve just come up for a bit of air on the project, but while I was in the thick of things we unearthed an interesting sample of real life, multi-faceted, cloud impact. I thought it was worth sharing, and my IT Director friend at the company I’m about to tell you about said it was ok to share a tidbit with the world (Special thanks, S.).
Before I get into some of the details of this post, let me give you a little snapshot of the Enterprise operation we’ve been working with:
Location: Caribbean/Latin America
Industry: Financial Services
Data Centers Operated: One internal/One co-lo
Main Challenges: The client faced infrastructure budget restrictions, which stressed application time-to-production cycles. In addition, the customer needed to reduce the cost of protecting mission-critical systems without compromising their established RPOs and RTOs. Most importantly, certain data and applications could not be operated in North America for compliance/regulatory reasons.
Like many IT leaders today, the IT Director of this company was looking to the cloud to potentially address their business requirements.
Using 6fusion’s Profiler technology, currently in controlled beta, the customer was able to determine projected cloud computing costs, application by application, across their enterprise. Here is a snapshot of their live data output:
What we find intriguing about the report capabilities we’ve enabled is that we are helping the customer cross the cloud chasm by using dollars and cents as the vessel. We believe the price to value ratio trumps even the most innovative of technologies in almost every purchasing decision. So it stands to reason that if you can’t tell a customer ‘what it costs’, you are pretty much just selling to yourself.
Cloud computing is no different.
Newly armed with valuable information about the cost of running their EXISTING production applications in active and passive states in the cloud, the IT Director could make confident business decisions that not only met his technical objectives, but also the objectives of his counterparts in the Finance Dept.
An interesting observation about integrating 6fusion into the enterprise is that our technology helps to blur the line between public and private cloud infrastructure. By turning the client’s own infrastructure into a self-contained cloud (or private cloud) using the same algorithm that powers our public cloud offering, we can effectively create a permanent economic bridge between the two environments (and like a real bridge, it can support free-flowing two-way traffic).
So here is how this individual client is using this economic bridge to drive cloud migration priorities for their organization: They identified a set of workloads for which they urgently needed infrastructure to perform vital test and dev processes. Inexpensively and safely, they can operate those workloads on cost-reduced resources located in the U.S. Next, they identified two mission-critical systems – Exchange and SQL – that they can duplicate by leveraging public cloud infrastructure located in the Caribbean/Latin America region. This is critical for the customer because things like email and databases must remain in certain jurisdictions only (excluding the U.S.). This issue transcends many enterprise cloud deployment scenarios, and the subject is getting a lot of coverage lately.
The end result for the client:
- They achieve public cloud leverage at a financial pace they can handle, both out of the gate and in the future
- They effectively doubled their data center footprint to include utility resources located in the U.S. and Caribbean/Latin America regions
- Because of the ability to cloud profile, they can make periodic future cloud migration decisions in lock step with the constraints imposed by the Finance Dept
- With a cloud profile, they know cloud costs well before they spend time and resources tapping into and testing a public cloud
- They maintained data residency integrity – a crucial show stopper for any cloud consideration in the past
- They have a public cloud infrastructure that runs both their web-oriented apps as well as their line-of-business apps, eliminating the need for cloud silos.
The future for this client, like many others we’ve begun to collaborate with in recent months, is rooted now in cloud efficiency. Here are some of the questions 6fusion and its partners will help address for IT operations:
- How can I make my current production applications scale more efficiently so that I can reduce my cloud costs prior to migration?
- How far can I push the public/private cloud integration envelope?
- Using the cloud like a pure utility, what workloads can I power down during off-peak hours to shrink my cost footprint?
- How do my applications in the cloud perform against category benchmarks from 6fusion’s ecosystem?
UC6 Profiler is in beta. If you’ve got an interesting set of business circumstances and a serious need to contain or reduce your or your customer’s IT Ops costs, give us a holler.
For the past several months the team at 6fusion has been working directly with a select group of customers using a prototype software tool called the UC6 Profiler. The UC6 Profiler is an agent we created that uses our patent-pending algorithm for measuring utility computing consumption. The UC6 Profiler meters live client applications running in their own offices or data centers, recording resource usage as though the applications were all running within 6fusion’s federated cloud infrastructure. The report output paints a clear cost picture, application by application, giving the customer an unprecedented set of data to guide and support their decision to migrate any or all applications to the cloud. We’ve provided an example of the output report here. It’s early stages yet for this, so the info is pretty raw. We’ll ‘gloss it up’ when it goes into production later in the year.
In our experience, the number one customer question about the cloud is “what does it cost?” Like others in the field, we see the ability to profile consumption and report running costs as one of the missing links to cloud computing adoption. As you can see in the sample report provided, this customer can identify that the application called “CUSTMAIL01” would cost the ‘most’ to run in the cloud. Conversely, the application called “CUSTAAC001” would cost the least.
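The sorted most-to-least-expensive view the report provides can be sketched in a few lines. The application names below echo the sample report, but the kWAC figures and the unit price are made up purely for illustration:

```python
def rank_by_projected_cost(report, price_per_kwac):
    """Sort profiled applications from most to least expensive,
    given each app's metered kWAC and a unit price."""
    priced = {app: kwac * price_per_kwac for app, kwac in report.items()}
    return sorted(priced.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative figures only -- not taken from the customer report.
profile = {"CUSTMAIL01": 9.2, "CUSTAAC001": 1.1, "CUSTWEB004": 4.7}
for app, cost in rank_by_projected_cost(profile, price_per_kwac=25.0):
    print(f"{app}: ${cost:.2f}/month")
# CUSTMAIL01: $230.00/month
# CUSTWEB004: $117.50/month
# CUSTAAC001: $27.50/month
```

Because every application is metered in the same unit, the ranking doubles as a migration priority list: start with the apps whose cloud economics look best.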
Going beyond the customer implications, the UC6 Profiler could also be the missing link for the IT Service Provider community to truly take the reins of the cloud and leverage it to build significant new revenue opportunities. But the implications don’t stop there. Profiling can play a huge role for ISVs looking to plan and price SaaS offerings. I’ll elaborate on this in another blog post. We’ll focus on moving one mountain at a time!
Here are a few other tidbits we can share with you for now:
- The Profiler agents will work with both virtualized and non-virtualized applications
- Users can profile web applications or traditional client/server applications
- There are no significant O/S limitations
- The Profiler will be a completely free download for registered 6fusion partners.
Stay tuned for more to come regarding the UC6 Profiler in the coming weeks!
As you may or may not be aware, 6fusion is a channel focused company. That means we work exclusively with IT service providers of one variety or another to make the transition from legacy service models to a service model rooted in cloud computing. Our core technology is an algorithm that creates a single unit of measurement for the computing resources you need to run practically any application. You can check out our site to find out more, as this post isn’t intended to be a product pitch. But I thought I would take a few minutes to let readers know a bit more behind the video we recently linked to our blog site (it’s the one on the top of the list) since we’ve received a few inquiries about it.
I posted the video because it celebrates one service provider’s complete transition to the world of utility or cloud computing. The video itself is an actual local TV ad they started running a while ago in their local market. It is really the culmination of a lot of collaboration, and it really underscores how they are making money from the cloud.
When we met WestTel they were your typical regional IP Services Company. Their ‘bread and butter’ was voice and data network services. And like most small ISPs, they were concerned with the commoditization of their core business lines and the projected erosion of the associated margins.
WestTel became a 6fusion Solution Partner, which gave them access to a world of utility computing infrastructure nodes. Basically, that means multiple data centers running enterprise hardware in tier III facilities on an elastic, pay on demand model in accordance with how the industry currently defines ‘cloud computing’. What made it possible for them to work with 6fusion was really that there wasn’t much risk. We asked for no software licensing fees, no hardware investments, no minimums and no long term commitments. I still remember the quizzical look on their VP of Sales & Marketing’s face when we met, as if to say “what’s the catch?!”
With limitless computing resources at their fingertips, WestTel got busy defining where they thought they could add true value to their customers’ overall experience while unlocking new vertical product growth strategies. If I were to offer any lesson to other companies looking to the cloud, I would point to WestTel as a shining example of how to do it. They resisted the temptation to “boil the ocean” with a plethora of products and services and try to become something they are not. They looked inward at what they do best and built from there. The result was the blending of their newfound capabilities with 6fusion and their strong, reliable network to create a foundation for WestTel Utility Computing Services. They determined the best value they could deliver would be to help businesses protect their data and applications using 6fusion technology and their own global networking reach. Very smart.
With a new vision on paper, they relied on 6fusion to make some key introductions to software vendors already operating on the 6fusion platform that fit the criteria they had defined for their new service line and that their customers were demanding. After working out the T’s and C’s with their vendors of choice, they took some baby steps into the market to test out their theory. This is another valuable lesson: go slow when launching new services. Don’t over-accelerate. Allow for some bake-in time so that you can catch your breath and work out any kinks. Product development and service innovation is a huge risk for providers. But with the zero-risk proposition we were able to afford them, they had nothing to lose and everything to gain.
Not every product development cycle makes it to market. In fact, most do not. I suppose this is one of the reasons why we wanted to share WestTel’s TV ad. It makes the entire 6fusion team proud to know WestTel was right about the market and what their customers wanted, and that they didn’t have to be Verizon or British Telecom to get into the market. Taking their services to the level of confident product marketing is a big step and we applaud them for it. They aren’t yet ready to take on the world’s telecom giants, but that really isn’t the point. In a market brimming with hype and hope, WestTel is a great example of how one service provider is making money from the cloud.
Cloud Computing is the next great land rush and it is happening now. All the major technology companies have their offerings. And it seems like everyone is entering the market – even the hosting companies want in on the land rush.
In theory, migration to the Cloud makes business sense; you’re enabling companies to rent computing power that would cost them too much to buy. I won’t bore you with yet another blog post on the ‘what is it’ topic. There is a great synopsis of Cloud Computing published by Mache Creeger and I recommend checking it out. In our model, we’re allowing companies to pool their resources on the supply side of Cloud Computing and leverage a much bigger, better shared infrastructure on the demand side of the equation.
Cloud Computing is about lower costs and greater use of resources. Greater flexibility, more options and overall, more computing power. It’s a shared cost. And it’s based on what you use. Or is it?
One of the areas of Cloud Computing that still needs to be addressed is the issue of pricing. Pricing the Cloud has gone beyond complex and confusing and entered the realm of ridiculous on some levels. We’ve met with countless service providers in the past year and the basic message is clear: Come back when you can give me something that doesn’t need a PhD from MIT to decipher. This message was also pretty clear at last month’s Interop Las Vegas event.
The odd thing is that everyone agrees that Cloud Computing pricing needs to be standardized, and many companies want an industry group to develop that standard. Industry groups and alliances have been throwing this topic around for a long time now – we've seen this question come up for more than two years. So why has nothing happened? As someone at a company with what I would call truly transparent pricing, I've been puzzled about this for a while.
I was recently on a conference call with a potential data center partner when I got insight into what I truly believe is the answer.
Standardized pricing and corresponding tools that allow end user customers to peer into the rack and seriously drill down into the granular cost of the Cloud are simply bad for business.
In fact, the parties on the supply side of Cloud Computing – elastic computing providers, managed hosting companies, platform-as-a-service shops, big iron manufacturers, etc. – have little incentive to strive toward pricing transparency and standardization. Why would magic quadrant hosting providers or heavily vested IaaS providers effectively level the playing field by adopting a standard pricing metric when it is their brand that is ultimately buttering their bread today? Is a company like Amazon or Google really going to adopt the same pricing standard as every other company getting into the race? Maybe, but don't hold your breath hoping to see them at the front of the line.
I think the work of Cloud standards advocates like Reuven Cohen of Enomaly has been really great for cracking the nut of Cloud interoperability. But it may be a stretch when they dream of Cloud interoperability extending beyond the technical exchange and integration of systems and data. Here is a reality check: all the big Cloud Computing providers in the market are profiting by preventing the very process of commoditization they allegedly support. And even if you aren't part of that group, pricing is an integral part of the profit picture and thus cannot be decoupled from the discussion. Just because you get together and document some sort of standard or benchmark doesn't mean you've solved the problem for the stakeholder that matters most: the customer. In fact, I think these types of standards groups may only muddy the waters further, because they don't pay enough attention to the connection with the bottom line for a Cloud operator.
Understanding the profit motivations of the Cloud providers and then dissecting the current modus operandi for pricing exposes a huge gap that I think will shape a big part of cloud development initiatives in the next few years.
Let me give you an example to prove my point:
Cloud Computing service providers seem to believe that they can and should charge for the Cloud on an hourly basis. On the surface that sounds great, because it's better than paying for a machine for the whole month, isn't it? But underneath there is a lot more to it. Think about it: if you use a server for one minute of an hour, you're charged for the whole hour. That's crazy. One sixtieth of an hour costs you the whole hour? Sure, the pricing is reduced, but what are you really getting? I think Allan Leinwand captures the broader implications of this silliness quite well in his analysis of the state of Cloud pricing. He said, "CPU hours: that's not something I go buy. I buy a blade server, and the hours are infinite, they're mine." Leinwand has a big point, and it has a direct impact on the future ability of Cloud providers to achieve mainstream relevance to the average enterprise. 6fusion's CEO and co-founder John Cowan analyzes the implications of pricing on the buying community in a separate post.
And if it were really as simple as clocking CPU hours and sending out a bill, maybe we could alleviate this pain point in the Cloud and move on. But it doesn't end there. Invariably, Cloud vendors have to "tack on" all sorts of ancillary charges and fees to make money. Everything from RAM to storage to bandwidth and even support gets thrown in as separate line items. The pricing becomes convoluted and difficult to predict. It's a huge mess, but there is no incentive to solve the problem, given there is a lot of money to be made from the confusion.
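To make the arithmetic concrete, here is a minimal sketch of the two billing models described above. All rates, fees, and function names are hypothetical, chosen purely for illustration; they do not reflect any real provider's price list.

```python
import math

# Hypothetical rates for illustration only -- not any provider's actual pricing.
HOURLY_RATE = 0.10      # per instance-hour of compute
RAM_FEE = 0.02          # per GB of RAM per billed hour, tacked on separately
BANDWIDTH_FEE = 0.12    # per GB transferred, tacked on separately

def hourly_rounded_bill(minutes_used: float) -> float:
    """Common model: any fraction of an hour is billed as a full hour."""
    return math.ceil(minutes_used / 60) * HOURLY_RATE

def metered_bill(minutes_used: float) -> float:
    """Granular model: pay only for the time actually consumed."""
    return (minutes_used / 60) * HOURLY_RATE

def total_with_ancillaries(minutes_used: float, ram_gb: float,
                           bandwidth_gb: float) -> float:
    """Hourly-rounded compute plus the ancillary line items piled on top."""
    billed_hours = math.ceil(minutes_used / 60)
    return (hourly_rounded_bill(minutes_used)
            + ram_gb * billed_hours * RAM_FEE
            + bandwidth_gb * BANDWIDTH_FEE)

# One minute of compute: the rounded model bills 60x the actual usage.
print(hourly_rounded_bill(1))   # a full hour's charge
print(metered_bill(1))          # one sixtieth of that
```

Under the rounded model, one minute of use costs exactly as much as sixty, and the ancillary fees multiply with every billed hour rather than every hour actually used, which is why the resulting invoices are so hard to predict.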
I have no problem with the supply side making money. After all, that's what a company is in business to do. What I have a problem with is the lack of transparency, and the inability to leverage these systems for anything more than the technical accomplishment of elastic computing (don't get me wrong, that is a biggie!). When Cloud providers don't give you proper insight into what you are using, and you can't make the mental jump between what you do today (e.g., buy more blades) and what the Cloud represents, the advancement of the industry suffers.
Herein lies the gap.
Service providers must deliver more insight and transparency into the Cloud, not obscure pricing just to earn extra margin for a brief time. Customers are far too smart for that to work long term. Ultimately, we believe that for the Cloud to succeed, the industry needs to help customers understand their true usage and the true value they are getting, both before and after they decide to use Cloud Computing to run critical IT systems. A granular metering and billing technology that transcends the politics of brand and vertical silos, while still letting a service provider be profitable, will go a long way toward cleaning up the mess that is Cloud pricing today.