6fusion

Blog

For Sale: Antique Cloud Instances

by Rob Bissett

Late in the post-lunch doldrums of a Friday, while I was surfing “The Twitter” looking for something to keep me awake, I tripped across this tweet from @thecloudnetwork: “For Sale: Antique Cloud Instances http://ift.tt/1vAvtHm”

I thought this was 1) hilarious for a whole lot of super nerdy reasons; and 2) a wicked example of how skewed (well, skewed is better than what I was going to say) the whole notion of cloud pricing is today. The comment about non-liquid markets really hits home.*

[*For those reading the CliffsNotes -- the author refers to trying to sell his reserved instances in the Reserved Instance Marketplace, and since there are a ton of sellers and no buyers, it is a non-liquid marketplace (non-liquid = no money flowing...)]

Some history: Companies adopted Amazon EC2 in droves, and discovered that after a while, it gets expensive. Amazon began to offer a method for those customers to lower costs in return for committing to a term. They called these Reserved Instances (RIs). They run for 1 or 3 years, and are paid up front. They represent approximately 50% price savings over the standard prices. (This is a great example of forward pricing — ie, pricing for what your costs will be, not what they are today. Something enterprises don’t do today, but should. Topic for another blog….)
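To make that math concrete, here is a minimal sketch comparing a year of on-demand usage to an all-upfront reserved instance. The hourly rate and upfront price are hypothetical round numbers chosen only to illustrate the roughly 50% savings described above, not Amazon's actual price list.

```python
# Hypothetical comparison of on-demand vs. reserved pricing for one instance.
# The rates below are illustrative round numbers, NOT Amazon's actual prices.

HOURS_PER_YEAR = 8760

def on_demand_cost(hourly_rate, hours):
    """Pay the list price for every hour the instance runs."""
    return hourly_rate * hours

def reserved_cost(upfront, discounted_hourly, hours):
    """Pay up front for the term, plus any discounted hourly charge."""
    return upfront + discounted_hourly * hours

on_demand = on_demand_cost(0.10, HOURS_PER_YEAR)        # ~$876 for the year
reserved = reserved_cost(440.00, 0.0, HOURS_PER_YEAR)   # all-upfront, 1-year term

print(f"On-demand for a year:   ${on_demand:,.2f}")
print(f"Reserved (all upfront): ${reserved:,.2f}")
print(f"Savings: {100 * (1 - reserved / on_demand):.0f}%")
```

The catch, of course, is that those savings only materialize if you actually run the committed configuration for the full term, which is exactly the problem the rest of this post is about.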

As this blog correctly points out – there are a lot of people who bought these RIs with the best of intentions, but then either changed their minds and didn’t need them, or moved on to other instance types to get more performance or whatnot. I mean come on — the cloud isn’t Enterprise IT. We don’t set stuff up and just leave it forever. Cloud is all about flexibility and change….

So what’s the problem? There are two. First, why should you have to commit to a specific instance configuration to benefit from future pricing in the cloud? If your vendor wants you to commit, why should they lock you into a non-liquid market if your business changes? If you are going to the cloud, you are probably doing so for financial reasons, but also to take advantage of economies of scale, flexibility, and the ability to adjust on the fly. So why can’t you just commit to a term and a spend amount, and retain the configuration flexibility?

The other problem with this model is lock-in. What if your business changes? What if you don’t want to use that vendor anymore? Granted, if you commit to a spend with a vendor they aren’t going to give you a refund if you quit… I mean hey — I’m a vendor, I get that. But why can’t you unload your commitment to someone else, at least to recover some percentage of your investment? I mean really, could you imagine the loss on the IT investments at MySpace when Facebook hit it big? I bet they could really have benefited from recovering even 20% of the IT investment they had sitting on the floor.

Amazon’s answer to this was the Reserved Instance Marketplace — a mostly well-intentioned way to allay those lock-in fears by telling people, “look, you can resell!”. But the reality is they haven’t marketed it, and don’t invest in driving adoption of it, for a lot of reasons — not the least of which is that they have already been paid for those instances, and if the instances don’t get re-used, Amazon can resell the capacity as spot instances.

The net result is a tremendous lack of liquidity. There are no buyers, driving prices to close to $0. Why? Well, you have to be an Amazon customer already to use it. And you have to want RIs. And, well, you probably already have some if you want them.

Did you read the Amazon terms of service? Check out 8.5(d). They actually went ahead and made it against the “law” to resell or sublicense the service. So, in fact, an account is subject to termination for trying to sell its instances. This pretty much eliminates any possible secondary market from forming. I mean, Craigslist is probably a much better vehicle for reselling reserved instances than the Reserved Instance Marketplace.

So what’s the solution? We need a real marketplace for IaaS. A place where users can go and purchase some “cloud.” A market that has a spot price for pay as you go services. A market that provides forward pricing based on a term/volume commitment, but still offers configuration flexibility. And finally, a market that will fully and openly support hedging risk by enabling a secondary resale market.

This type of market will enable organizations to make appropriate, risk managed moves into the cloud using financially sound management — and make IT behave even more like the utility it is supposed to be.

Rob Bissett is Chief Product Officer of 6fusion.

 

Is IT a Utility or Commodity?

Whitepaper by Rob Bissett

Given 6fusion’s mission to disrupt the traditional allocation/configuration-based financial model in IT, it is inevitable that we get drawn into many wide-ranging discussions regarding utilities, commodities, and the markets that evolve around them.  One discussion that keeps coming up is whether IT is a utility or a commodity.  Invariably, when we begin down this path, the conversation becomes very complex (and heated) and ends in a very unsatisfactory way, largely as a result of a lack of clear understanding of what is meant by these terms, and how they will likely apply to IT.

In this paper, we’ll dig into the meaning of utilities, the IT-as-a-Utility approach, and explore the relevance to you as an IT budget holder and decision maker.  We will then bring commodities into the thread, discussing the similarities as well as the critical distinctions between the two, and explore the evolution of the IaaS, or “Cloud” Marketplace.

What’s a ‘Utility’?

The starting point of this needs to be a deeper understanding of “utility” as it applies to IT and computing in general.  In his 2012 whitepaper “Metered IT: the path to utility computing” (commissioned by 6fusion), Dr. Paul Miller builds upon Michael Rappa’s original research from the IBM Systems Journal (2004), providing us a useful definition and starting point for utility as it applies to computing, identifying “…six characteristics common to utility services, from water and power to radio, television, and internet access:

  • Necessity. The extent to which customers depend upon the service on a daily basis
  • Reliability. The presumption that, according to Rappa, “temporary or intermittent loss of service may cause more than a trivial inconvenience”
  • Usability. Simplicity at the point of use; for example, users do not need to know how electricity powers lights at the flick of a switch
  • Utilization rates. Coping with peaks and troughs in customer demand, using for example, innovative pricing models that incentivize an even spread of demand
  • Scalability. Benefits of economies of scale, with larger providers typically realizing lower unit costs that can be passed on to customers
  • Service exclusivity. Government intervention that encourages the emergence of a monopolistic provider may be a benefit when utilities have significant setup costs or a particular requirement for scale”

Rappa also concludes that a business model for the provision of utilities is “based on metering usage and constitutes a ‘pay as you go’ approach. Unlike subscription services, metered services are based on actual usage rates.”

This definition gives us a good look at a service that is highly reliable, for which unavailability is problematic, which scales as needed, and which is paid for in a “pay as you go” model.  This is wholly consistent with what we have come to expect from computing in most modern businesses – particularly as it applies to cloud.  To close this thread out, utility is often cited as one of the key criteria used in defining “cloud”.

So utility, as it applies to cloud, refers to the model in which IT services are delivered, consumed, and billed.  The definition doesn’t apply to the types of services delivered.  The argument about IT-as-a-Utility and applying utility financial concepts to that technology has nothing to do with commoditization, differentiation, quality of service, or any of the other arguments about the types of services delivered by the cloud vendors.  The argument about commoditization is a completely different discussion.

Therefore, a cloud (or any IT infrastructure) can be a utility, without being a commodity.

Is IT a Utility?

As a result of this train of logic, is it reasonable to look at cloud computing, or any IT service offered in this way, and just assume that it is a utility?  The reasonable answer SHOULD be yes… but it isn’t.  One of the key characteristics Rappa cited is holding cloud back from being a proper utility.  Today, with IT, we are still using a subscription-based services model, not a metered model based on actual usage rates.

The traditional model in which on-demand services (cloud) are currently offered, the “subscription model” referred to in the previous paragraph, is nominally “pay as you go” and meets the utility definition, but only on the surface, and here the language becomes tricky.  The challenge is in defining what you mean by “pay as you go”.  For virtually every provider of on-demand services today, this means “pay as you contract”, or “pay as you configure”.  This is in actuality a subscription-based model.  That is, when I provision a service, say a virtual machine instance at a cloud provider, I am given the virtual machine and billed for 100% of the resources that I provisioned, regardless of how many of those resources are actually consumed.

This is somewhat analogous to connecting a server with a maximum power rating of 1000W to a power circuit, and being billed for 1000W of power, regardless of the fact that the server may only be consuming 100W at any given time.  Heck – you would pay for 1000W even if the server was powered off but still connected.  This is the point made in the quote above about metering usage, and billing based on that usage.  Being “pay as you go” simply means that billing commences when you provision and stops when you de-provision, which you can do at any time without penalty.  This DOES NOT make it a utility.
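As a rough sketch of the difference, the snippet below bills the same workload two ways: once for everything that was provisioned (“pay as you configure”) and once for what a meter actually recorded (“pay as you consume”). The rates and usage figures are hypothetical and only illustrate the gap the 1000W analogy describes.

```python
# Contrast allocation-based billing with metered billing for one month.
# The rate and usage numbers are hypothetical, for illustration only.

HOURS_IN_MONTH = 730

def allocation_bill(provisioned_vcpus, rate_per_vcpu_hour):
    """Subscription-style: pay for everything provisioned, used or not."""
    return provisioned_vcpus * rate_per_vcpu_hour * HOURS_IN_MONTH

def metered_bill(hourly_vcpu_usage, rate_per_vcpu_hour):
    """Utility-style: pay only for what the meter recorded each hour."""
    return sum(usage * rate_per_vcpu_hour for usage in hourly_vcpu_usage)

rate = 0.02                          # hypothetical $ per vCPU-hour
usage = [1.5] * HOURS_IN_MONTH       # an 8-vCPU VM averaging 1.5 vCPUs of real work

print(f"Pay as you configure: ${allocation_bill(8, rate):,.2f}")   # bills all 8 vCPUs
print(f"Pay as you consume:   ${metered_bill(usage, rate):,.2f}")  # bills the metered 1.5
```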

How do we end configuration-based economics and create IT-as-a-Utility? 

Create a standard unit of measure and meter it.

The challenge for IT organizations in approaching IT-as-a-Utility is to shift from traditional configuration-based billing models to true consumption billing models – shifting from “pay as you configure” to “pay as you consume”.  The issue holding back a larger industry shift in this direction is the definition of a standard unit of consumption.  Consider virtually any utility that you consume today: each has a standard unit of measurement that defines the utility, regardless of the vendor that you purchase it from.  This is a challenge that 6fusion has chosen to attack directly with the development of the Workload Allocation Cube (WAC™).  The goal of the WAC is to define a unit of consumption that can be used across providers, technologies, services, and locations to measure consumption of IT resources, and provide a basis on which utility billing, utility financial models, forecasting, and economics principles can be applied.

What’s a commodity?

Before we define it, the thing that stands out to me in the discussions I have with an IT audience is the perception that utility is synonymous with commodity, and that all commodities = bad.

A commodity is often defined as a good or service for which there is no qualitative differentiation.  You can think of a commodity as a class of goods for which there is no perceptible difference between providers, with competition based solely on price.  We hear this term a lot, applied to services and physical products both.  Some excellent examples of commodities lie in natural resources and food – we treat oil, gas, coal, and other products as commodities.  In reality things like tomatoes, rice, wheat, and most meat are also commodities.

There are many subtleties and textures to consider when digging into commodities – they tend to be much more complex than most consider at first glance.  The general rule of thumb that I apply to this is that if the differences don’t matter – buy the commodity product. If they do – don’t.  What does that mean?  Take tomatoes:

Tomatoes are generally all the same.  For the most part, they look the same, taste the same, and cost the same.  In most cases they are sourced locally when available, and not when not.  For most purposes, and for most people who plan to cut them up and cook them, it makes very little difference where they come from, what type they are, and who grew them.  Thus the commodity version of the product will generally yield the lowest cost and most steady supply, and as a result will be the most used (it is most applicable for most people, most of the time).

Now, depending on your business, you might have some specific interest in the kind of tomato, but not care who grew them, or where they came from.  Think of tomato sauce – you probably want to use plum tomatoes.  Since you are a scale producer, you need lots, at low cost, on a continuous basis.  Since there are a number of businesses like you, the industry subdivided the commodity “tomatoes” into a subgroup, “plum tomatoes”.  These cost slightly more than just “tomatoes” since there are fewer of them and they are more specialized, but still less than negotiating with each supplier directly.

However, if you are a farm-to-table chef, making a nice premium-priced caprese salad special, you probably aren’t interested in using hothouse-raised beefsteak tomatoes from Chile.  You would like an organically grown, locally sourced Roma tomato.  Why the difference?  The application – you are looking for something specific, and the differences between that specific product and the mainstream commodity are important, measurable, and something for which you are willing to pay extra.  This last group is not a commodity product – it is something specific to the task, for which you contract outside an organized market with a specific supplier, and pay the price associated with that service.

Why commodities aren’t ‘bad’

The critical lesson here is that the establishment of a commodity market doesn’t mean that all services or products are the same, or that there is no market for differentiated products and services.  What it means is that there is a minimum definition of a “standard” product which will meet the needs of the majority of the market, and which can be sold without differentiation.  This creates massive economies of scale, enables organized markets to drive further scale, and drives down the cost of goods sold, making those services as inexpensive for consumers as possible while providing consistent profits for the suppliers.  In many cases, a particular commodity is broken down into sub-classes.  We used tomatoes above, but this is true for things like gasoline (premium, regular, etc.), oil (West Texas Intermediate, North Sea, etc.), and others.

Producers of non-commodity products, or producers of products in markets that haven’t yet commoditized, will often paint the concept as bad – and this is understandable if you are fairly non-differentiated and facing imminent destruction of your margins.  For the market as a whole though, this shift is typically a good thing.  Commoditization drives up volumes as buyers buy more of the “standard” product.  Because there is less friction in the market, more buyers buy more, and vendors face significantly lower sales and marketing costs, which is good for suppliers.  There are three other benefits of commoditization that I want to highlight – supply chain economies of scale, parallel innovation, and supply chain innovation – which I will focus on in the next few sections.

Computing and Commoditization

Now – to move to the touchy subject – is computing a commodity?  Will cloud commoditize?  These debates are raging throughout the industry today, and, without giving my opinion away, I think it is really instructive to take some lessons from other markets and apply them here to really understand what we are talking about before we draw any conclusions.

When we talk about commoditization, what are we talking about?  This term gets bandied about quite regularly, but without clear definitions it isn’t that useful.  When we talk about the commoditization of IT, or the commoditization of cloud, I prefer to narrow this down to something more specific – the commoditization of IaaS services.  The instances that you get – computing, memory, I/O, and base storage.  I think this is important as it is much more specific and narrow in scope than just IT or Cloud.  The question then becomes: can we define a generic enough standard that people will buy it, without regard to who is providing it, or what kind of servers it is running on?

I think the answer to this question is yes – at least with respect to the hardware it is running on.  We are seeing this already, as virtually no cloud provider publicly discloses data center owners, hardware manufacturers, etc.  Now, this is where it gets much more heated – things like latency, SLAs, and other underlying definitions.  I think we can all agree that a webscale provider with no SLA provides a service that, for most people, looks quite a bit different than a high-SLA provider running on premium-performing hardware.  That being said, if webscale quality and performance is sufficient for your needs, then getting the premium enterprise product is probably a bonus, and I wouldn’t think most people would complain, assuming the price was the same – again, defining the minimum standard, not the maximum standard, is the critical component.  Further, it may be possible that these types of services can be bucketed into sub-group commodities within the larger family of “cloud” commodities, providing high-performance and webscale buckets.  The question comes down to (a) can we define the minimum standard and (b) will the market treat the providers in those buckets as if they were the same.  These, I think, are the relevant questions.

Finally – there will be many users who need something specific, be it latency, location, regulatory approval or otherwise.  Those are the specialist chefs who will be buying specific products from specific providers.  I think given this framework it is reasonable to think that a cloud commodity will likely arise, and that it will be good for the market.  I think it is also clear that there will be plenty of non-commoditized services that will be demanded by the market to address specific requirements.

Where this gets interesting will be those three topics I mentioned earlier – supply chain economies of scale, parallel innovation, and supply chain innovation.  These will have a major impact on how the cloud / IT market evolves as we move towards commoditization.

Supply Chain Economies of Scale – as the commodity providers ramp up acquisition of the building blocks of cloud computing, the suppliers of those building blocks will achieve increasing economies of scale.  This lower cost model for things like processors, boards, switches, etc will tend to benefit other constituents in the markets, be they enterprises, bespoke service providers, or others. This can’t help but benefit everyone.

Parallel Innovation – the current market leaders in cloud services got there through innovation.  While the services that they started with may trend towards commoditization, it probably isn’t reasonable to think that they will change their corporate culture from innovation to commoditization.  I also wouldn’t expect them to walk away from the scale business (certainly not in Amazon’s case, as this is what they were built for).  What I would expect is an increase in investment in parallel innovation – that is, the development of premium add-on services that are used in parallel to, or in conjunction with, the main commodity service.  We are seeing this already in the space – things like firewall, load balancer, database, desktop, and other services that run on top of the commodity (IaaS) service.  These premium services drive higher margin and return entrepreneurial profit over the commodity service, while still driving increasing demand for the main business.

Supply Chain Innovation – The final market effect that I would anticipate seeing as a result of this shift is what I refer to as supply chain innovation.  That is, as the industry moves to supplying a commodity service, the supply chain will shift to creating non-commodity tools and products that enable the commodity suppliers to produce commodity products more efficiently.  We are seeing this market effect as well.  Traditionally, the server vendors built products designed for the enterprise, with high manageability, flexibility features, etc.  The cloud-specific vendors, early on, decided not to use those servers, as they don’t require flexibility or manageability – they require high volume and low prices.  They turned to the ODM vendors to produce high-volume, low-cost systems with only the features required by the cloud vendors.  We are now seeing the major hardware providers go back to the drawing board and re-design their product lines to deliver extra value to cloud providers, enabling them to deliver more, and better, IaaS services at better cost models.  This is effectively reverse commoditization in the supply chain – using non-commodity hardware to improve the delivery of commodity cloud services.

In any case – whether or not we see true commoditization in the cloud space remains to be seen – and I am sure the arguments will continue.  What is critical to understand in this debate however is what we are talking about commoditizing, and what the impacts of those moves will be.

Hopefully I have been able to make clear that we are seeing a shift toward the utility model in the IT space now as people move to on-demand, pay-as-you-go models.  This shift will reach its zenith when “pay as you go” moves from “pay as you configure” (pay as you subscribe) to “pay as you consume”.  At that point, users will have true visibility into the cost models of their applications, and IT organizations will have the tools necessary to baseline, budget, and forecast more effectively.

Will this mean we have commoditized?  No – it doesn’t. The industry may move in that direction, but the adoption of utility models, and the commoditization of cloud are most certainly not linked.

Conclusion

This paper laid out (hopefully) a fairly compelling argument on the differences between the utility model and commoditization.  I believe that both are inevitable in computing, but that they will happen at different rates, and as a result of different market actions and drivers.

The utility movement is one that is simply good for enterprises.  We are seeing early market evidence of this within large webscale providers and enterprises.  The current challenge of managing costing, consumption, and procurement across many vendors and technologies is becoming overwhelming for large enterprises, and some leading-edge thinkers are already moving their suppliers over to consumption-based billing and contracting methodologies – both because this is simpler and leads to less wastage, and because it maps expenses directly to revenue and better enables a company to quantify its market power and negotiate discounts with suppliers.  An example of this is our recent partnership announcement with Switch SUPERNAP.  Switch sees the opportunity to help develop their C.U.B.E ecosystem by helping their enterprise customers quantify their compute needs via the WAC, and to use that data to interact with the suppliers in their SUPERNAP ecosystem.

This trend towards the utility model will only grow over time as the visionaries see the benefits and pass those along to the thought leaders in the industry.  At the same time, the commoditization effect will begin – and through commoditization, utility adoption will accelerate, as all commodities are sold as utilities.

The commoditization ship has sailed – despite the best efforts of naysayers in the industry.  We have seen two commodities exchanges (and the two biggest commodity players in the world at that) announce efforts to begin to trade IT infrastructure services as a commodity.  This will be a tremendously valuable effort for the industry.  These developing markets will help to centralize the (currently) highly fragmented spend in the space, and through that centralization will help to define the standard “classes” of infrastructure.  These class definitions will help to clear the picture for buyers, making it simpler for the average buyer to compare suppliers.  The markets will also establish price baselines.  These baselines will set the floor for pricing, and help buyers understand their market power and what they should expect to pay as a result of that market power.  Finally – commoditization will drive the major players to innovate, both to create margin on the commodity and to develop differentiated add-on services, which will benefit all players in the space while also creating a healthy market for specialty services.

These are all good things for the market.  There is benefit to both suppliers and buyers in helping organizations turn the corner from the “old” way of doing IT – large, multi-million dollar monolithic projects with lots of waste and long lead times – to much more efficient, pay-as-you-consume hybrid projects that deliver better economic performance for buyers.  IT-as-a-Utility and IT-as-a-Commodity, while independent of each other, are the industry’s inevitable future.  Organizations, especially infrastructure suppliers, should embrace these models and develop business strategies that keep pace with the agility of our now on-demand world.

Our goal?  Help enterprises (and providers) evolve IT by moving procurement and management of IT services to that final step of being a true utility and best serving the consumer.

GigaOm – Yes, IT can be sold like a barrel of oil


SUMMARY: Commodity traders help set prices for oil and wheat, allowing buyers to hedge their costs. The same thing is poised to happen in the world of cloud computing.

Big companies use commodity contracts to ensure predictable prices for oil, wheat, electricity, metal and other crucial supplies that keep their businesses going. These days, a crucial supply for many companies is cloud computing power — raising the question of whether that too can be bought and traded in the same way as oil or oranges.

A recent partnership suggests the answer is yes, and that we’re heading to a world where companies won’t just turn to Amazon Web Services or Microsoft Azure for cloud services, but to a commodities market that offers the best price, on the spot or in the future, for a range of interchangeable IT infrastructure.

The financial platforms and the raw resource already exist to support cloud as a commodity. So do the people. But the question is whether someone can bring this all together, and overcome some big obstacles that stand in the way.

Cloud computing by the bushel

Earlier this year, a Raleigh, N.C.-based cloud company called 6Fusion signed a deal with the Chicago Mercantile Exchange, the world’s biggest market for commodities and derivatives contracts. If all works out, the deal will mean that buyers and sellers of cloud computing services can do business on a spot exchange and, in a few years, trade derivatives too.

The exchange will be a place to buy hours of “WAC,” a term invented by 6Fusion that stands for Workload Allocation Cube. The idea behind the WAC is to create a standard unit of cloud computing infrastructure that can be bought and sold by the thousands.

Under 6Fusion’s current definition, a WAC hour is composed of six metrics, including ones related to compute, networking and storage, that can be sold at a single price. Here is how 6Fusion portrays a WAC:

[Diagram: 6Fusion WAC hour]
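As a purely illustrative sketch of how a composite unit like this might work, the snippet below collapses several metered resource figures into a single number using made-up weights. The metric names and weights are assumptions for illustration only; they are not 6Fusion's actual WAC formula.

```python
# Illustrative composite infrastructure unit. The metric names and weights
# below are invented for this sketch and are NOT 6Fusion's actual WAC formula.

HYPOTHETICAL_WEIGHTS = {
    "cpu_ghz_hours": 1.0,
    "ram_gb_hours": 0.5,
    "storage_gb_hours": 0.05,
    "disk_io_gb": 0.02,
    "lan_gb": 0.02,
    "wan_gb": 0.1,
}

def composite_units(measured):
    """Collapse several metered resource metrics into one billable number,
    the way a single-priced unit would collapse compute, network and storage."""
    return sum(HYPOTHETICAL_WEIGHTS[name] * value for name, value in measured.items())

# One hour of a small workload (made-up measurements).
sample = {
    "cpu_ghz_hours": 2.4,
    "ram_gb_hours": 4.0,
    "storage_gb_hours": 100.0,
    "disk_io_gb": 1.5,
    "lan_gb": 0.8,
    "wan_gb": 0.3,
}
print(f"Composite units this hour: {composite_units(sample):.2f}")
```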

According to 6Fusion spokesman Ryan Kraudel, the WAC is akin to a watt of power because it provides a standard measure of output, which in turn removes barriers to trading cloud computing as a commodity.

“The fundamental problem no one’s been able to solve till now is ‘what is the barrel or bushel’ [of cloud]? Now, there’s a basis for contracts in the future of infrastructure services,” said Kraudel.

6Fusion is not the only one proposing such an arrangement. In Europe, a company called Zimory is working with the German exchange Deutsche Boerse to sell cloud computing units.

In theory, the creation of these common metrics means companies can now use forward or futures contracts, based on WACs, to exercise more control over IT costs, which represent a growing percentage of many corporate budgets. Kraudel predicts that IT-intense enterprises like banks or universities will be among the first adopters.

What this could mean on the ground is that the IT infrastructure of a company like JP Morgan could soon consist of private cloud servers for sensitive data, supplemented by public cloud supplies purchased from an ever-changing roster of third party cloud computing providers. At the same time, such purchases of cloud computing “by the bushel” would also mean lower prices as traders, rather than vendors, start to set the price of key ingredients of IT infrastructure.

Skeptics might note that this idea of cloud computing brokers has been around for a while, but now its arrival finally appears close at hand. Kraudel says a spot exchange for bilateral contracts should be running by the end of the year, and that a derivatives market will be up and running by late 2015 or 2016. But that doesn’t mean, of course, those markets will succeed.

You can build it, but will anyone come?

The idea of WACs, and a derivatives market for IT infrastructure, is well and good in theory, but that doesn’t mean it’s actually going to happen.

6Fusion can define WACs and the Chicago Merc can provide a place to sell them, but the plan will only work if a critical mass of buyers and sellers agree they are worth trading. And that could be a challenge.

Unlike a barrel of oil or a bushel of wheat, there is no consensus on what a commodity unit of cloud computing should look like. While 6Fusion has offered a definition, not everyone will accept it, and some will challenge the choice of metrics that make up a “WAC hour.” The task of defining the “cloud bushel” is harder still since the industry is evolving rapidly, and even accepted reference points, like an M3 instance from Amazon, may soon be outdated.

If no one can agree on what to trade, in other words, there will be no trading.

The problem is daunting but not insurmountable and, as it turns out, it’s hardly a new issue in the world of commodities. According to James Mitchell, a former commodities trader at Morgan Stanley, any traded good, no matter how standard it may seem, will be subject to changing definitions.

Mitchell, whose company Cloud Options has advised 6Fusion, points out that oil comes in a variety of standards — Brent Blend, West Texas, etc — and that orange juice contracts include a variety of conditions that let traders adjust the final price based on size, seeds and so on.

The same is likely to hold true when it comes to cloud computing commodities. Contracts for “WAC hour” futures, if the market adopts them, may include adjustment mechanisms for traders to tweak at the end of the deal.

“Everyone hedges against, then trues up against how off-spec it is,” said Mitchell, speculating on what would happen if a bundle of WAC hours didn’t correspond to the exact cloud resources that a buyer had sought to obtain.

“In the truing up process, you might have a disproportionate amount of CPU. If 6Fusion does a good job, they’ll choose a middle ground that doesn’t require a correction.”
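To illustrate the kind of true-up Mitchell is describing, here is a speculative sketch that adjusts a settlement when the delivered resource mix drifts from what was contracted. The metrics, quantities and per-unit prices are invented for illustration; they are not part of any real contract specification.

```python
# Speculative sketch of a settlement "true-up" when delivery is off-spec.
# All metrics, quantities and unit prices are invented for illustration.

def true_up(contracted, delivered, unit_prices):
    """Return a price adjustment (positive means the buyer owes more)
    based on the gap between delivered and contracted quantities."""
    adjustment = 0.0
    for metric, contracted_qty in contracted.items():
        delta = delivered.get(metric, 0.0) - contracted_qty
        adjustment += delta * unit_prices[metric]
    return adjustment

contracted  = {"cpu_hours": 1000, "storage_gb_months": 500}
delivered   = {"cpu_hours": 1100, "storage_gb_months": 450}   # CPU-heavy delivery
unit_prices = {"cpu_hours": 0.03, "storage_gb_months": 0.01}  # hypothetical $/unit

print(f"Settlement adjustment: ${true_up(contracted, delivered, unit_prices):+.2f}")
```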

Mitchell added that, for now, the biggest impediment to a functioning futures market is that traders and techies are still learning to speak to each other. IT people have a good idea of what a unit of cloud computing resources looks like, but this knowledge is still being translated into standard contract language of a sort that brokers can instantly recognize and trade upon, he said.

800-pound gorillas don’t like to trade

Let’s say the IT buyers and the traders do agree on a common cloud commodity (a WAC or otherwise) and the exchange is up-and-running as 6Fusion promises it will be. We’re still only halfway there since an exchange also needs sellers.

And right now, the cloud infrastructure industry is dominated by a giant called Amazon Web Services that will likely be reluctant to offer up its wares to a commodity exchange. The reason is that commodities, by definition, are interchangeable and sold at a price lower than any one seller can dictate.

So for Amazon, which is already selling cloud infrastructure at fire-sale prices, a commodities exchange would not only depress prices further, but invite a host of other competitors to replace its branded AWS products with a generic bushel. But one way to prevent that from happening is for Amazon, and other big cloud service providers like Rackspace or Microsoft, to simply sit this out and try to ensure the commodities market is not liquid enough to be viable.

6Fusion’s Kraudel acknowledged that Amazon, which declined to comment for this story, would be reluctant to participate, and noted that the company already offers its own on-the-spot cloud pricing as well as a form of futures called “reserved instances.” Still, he thinks the market will be liquid enough anyway.

“Amazon Web Services is an 800-pound gorilla, but there is a very long tail to this market,” he said, explaining that there are many other providers capable of offering analogous cloud infrastructure, and that more will enter the market to meet what is still ever-growing demand. (It’s also possible that recent price pressure from two well-financed competitors, Google Cloud and Microsoft Azure, could nudge Amazon towards selling on an exchange.)

Finally, the history of commodities markets may once again be instructive in trying to guess the future role of the current cloud gorillas. That history, according to Mitchell, shows that incumbents may dislike the loss of pricing power that comes with commoditization, but sooner or later the traders get the upper hand.

“Exxon tries not to use wholesale price of oil, but that doesn’t dictate the price of oil. It’s traders who are long and short who set the prices, not those like Amazon who are fundamentally long.”

How the 6fusion Switch SuperNAP partnership will transform IT economics

by John Cowan, 6fusion Co-Founder & CEO

Last week, 6fusion and Switch SuperNAP announced a first-of-its-kind industry partnership, bringing unparalleled economic insight to Switch’s enterprise and cloud service provider customers.  The partnership, which integrates 6fusion’s utility infrastructure metering platform into the Switch environment, will provide customers with an unprecedented level of cost transparency, bringing IT infrastructure users and providers one step closer to 6fusion’s transformational vision of IT-as-a-Utility, and the realization of the first fully viable IaaS marketplace.

The response from the industry has been overwhelming. While nobody doubts the obvious synergies between the world’s foremost innovator in data center technology and the company that is disrupting the economics of IT, I think it’s necessary to shed some light on how this collaboration got started, what it means for customers and a little bit about what comes next.

The seeds of this relationship were indirectly planted in May of 2013 at an event called Cloud 2020, organized by Ben Kepes of Forbes and Krish Subramanian of Red Hat, and conveniently located at the site of the SuperNAP.  At this exclusive thought-leader event a lot of fascinating topics were covered.  I had the privilege of joining a panel to discuss “The Economics and Use Case of Federated Clouds.”  I don’t think the crowd on hand quite expected the fireworks this topic would ignite.  Maybe it was some post-lunch energy, or maybe it was me standing up there and proclaiming that, in fact, compute, network and storage resources could be traded like coal, oil or other commodities; and that, in my opinion, when market economics could truly be employed by buyers and sellers, the industry would see cloud adoption velocity worth getting excited about.

The debate we ignited spilled over to the blogosphere, as leading thinkers continued to make their case in the weeks following Cloud 2020 (read more background on the topic: here, here, here, and here).  Whether you thought the idea I shared that day was crazy or brilliant, it was hard to ignore the groundswell of interest we created.  To foster continued discussion, 6fusion sponsored an invitation-only round table on the front end of GigaOM Structure 2013; participants included Joe Weinman, Randy Bias, Paul Miller, Mark Thiele, Bernard Golden, Reuven Cohen, James Mitchell, and other industry luminaries.  For posterity, we recorded that session.

Shortly following the public debate about the concept of a futures market for cloud computing, 6fusion announced its first big step in that direction: 6fusion Launches Open Marketplace for IaaS.

If the Workload Allocation Cube (learn more about the fundamentals of the WAC here) is the base measure of infrastructure, like that of a real consumption utility, then the marketplace is the basis of contract standardization.  Contract standardization is a critical building block because it defines the parameters for consumption in a uniform way; it is the foundation we are laying for the eventuality of trading IaaS compute.

The Marketplace was built on the Open Market Framework (OMF), launched a few months earlier.  The OMF is important because it is how 6fusion achieves open participation for buyers and sellers.  The basic premise behind the OMF is that the underlying software code necessary to meter heterogeneous technology stacks was made open.  Anyone can build and support integration to the 6fusion platform – whether you come from a cloud, virtualization or physical operating system perspective.  You can read more about the OMF here.

One of the questions I have received since making the announcement with the SuperNAP is “why?”.  If 6fusion built all of the underlying software plumbing and the user interface to settle infrastructure contracts, why is the SuperNAP really even needed?  The answer is quite simple: the first step to building an open, financially settled market is to organize physical marketplaces.  The SuperNAP is a physical marketplace.  At an elementary level, it is the world’s largest ‘farmers market’ of IT infrastructure; a massive collection of IT infrastructure operated by buyers and sellers.

By overlaying the 6fusion Open Market Framework and platform, denominated by the WAC, onto the SuperNAP, we have the potential to deliver unprecedented value to buyers and sellers.  For buyers, 6fusion turning the SuperNAP into an organized marketplace means the creation of buyer leverage and price transparency.  Analysts don’t always see eye-to-eye on the world of cloud computing, but one point that escapes nobody is that buyers won’t just pick one execution venue for apps and workloads; they will pick multiple venues spanning internal IT, single-tenant managed infrastructure and public cloud computing.  By giving buyers a normalized demand metric in the WAC, we are equipping them to have impactful negotiations, empowering them to understand their Total Cost of Consumption (TCC) KPIs, and establishing price transparency in the marketplace so that they understand the true power of their demand.

The power of the physical marketplace is immediacy.  I come to the market.  I buy something from the market.  I consume what I buy.  Unlike the farmer’s market, however, the IT market is limited by bandwidth.  While a lot of smart people are working to solve challenges like technical interoperability, the challenge of moving large volumes of data between two disparate points is still very much a physical-distance one.  The SuperNAP marketplace solves this issue by operating the densest aggregation of network services of any data center anywhere on the planet.  What’s more, Switch views network services as “value add” rather than a source of profit extraction.  They understand that by making networks and purchasing leverage freely accessible, they drive a concentration on their core competency, which is to build and run world-class data centers really, really well.

So thanks to Switch and 6fusion, customers can quantify their IT consumption like they would any other utility — using their actual usage based on real-time metering (versus the fixed-allocation or subscription-based economic model that the rest of the industry is forcing on consumers) — and make rational, meaningful decisions about the distribution of their IT load across multiple execution venues almost instantaneously.

And therein lies the value for the infrastructure seller.

I get a kick out of the so-called experts in our industry who tell me infrastructure suppliers would *never* support the idea of a commodity exchange or the normalization of consumption metrics, for fear of diluting value propositions and thereby creating a race to the bottom on price.  My friends and I at 6fusion have written plenty about that myth, so I won’t rehash it here.  Let me just say that in 10 years of working on the supply side of the market equation, the motivation is dead simple:  Suppliers of infrastructure (physical, virtual or cloud) want two things:

  1. They want access to markets that lower their cost of business acquisition and
  2. They want to sell more of what they do, faster.

Simply put, the Switch 6fusion partnership will change the game for infrastructure suppliers by identifying new opportunities to serve their existing clients plus a raft of clients they never would have otherwise entertained.

There are some gaps I’ve intentionally created here so that I may come back to this post as the 6fusion-Switch story unfolds.  Consider this my ‘coming soon’ teaser:  The next step beyond the organized marketplace is a transaction marketplace.  Today, transactions are consummated in the cloud industry on proprietary paper.  Wouldn’t it be cool if that paper was exchangeable?

As with all big developments in the history of our industry, catalytic events mark the elevation of our industry’s evolution to new planes, new heights and new opportunities.  Switch and 6fusion catalyzing the organized market within the SuperNAP is the first commercial step toward the open market vision Delano Seymour and I documented many years ago, and that is now shared by many industry leaders as well as our partners at the CME (Chicago Mercantile Exchange) Group.  For that, I figured it was worth slowing down to share a bit about how we got here and what it means.

6fusion Brings IT Economic Insights to Switch SUPERNAP Customers in Industry-First Partnership

Switch SUPERNAP enterprise customers and cloud providers will have free access to 6fusion’s utility infrastructure metering platform

Raleigh, NC, and Las Vegas, NV – August 26, 2014 – 6fusion, the company enabling IT-as-a-Utility, today announced it has partnered with Switch SUPERNAP, the developer of the world-renowned SUPERNAP data centers. 6fusion’s technology and utility methodology will be integrated into the Switch SUPERNAP technology environment to provide SUPERNAP customers with groundbreaking cost transparency through 6fusion’s patented unit of measure, the Workload Allocation Cube (WAC).

“6fusion and Switch SUPERNAP share a common mission of creating marketplaces and ecosystems that are transforming the dynamics around supply and demand for IT infrastructure in profound ways,” said John Cowan, 6fusion CEO and Co-Founder.

6fusion and Switch SUPERNAP will initially focus on enabling enterprise customers and cloud providers with unique economic transparency on infrastructure usage and cost efficiency.  Enterprise IT organizations will use 6fusion technology to better understand their total cost of IT consumption and compare internal cost efficiency against best-of-breed public cloud operators inside SUPERNAP.

“The combination of SUPERNAP’s ultra-scale environment and 6fusion’s capability to measure and quantify IT infrastructure as a utility will deliver unmatched value to buyers and sellers in the industry,” said Jason Mendenhall, Switch SUPERNAP Executive Vice President of Cloud.

Cloud operators, on the other hand, will use 6fusion’s technology to facilitate an “apples-to-apples” transaction language with enterprise customers looking to leverage cloud resources.

Ultimately, Switch SUPERNAP and 6fusion will enable macro insight into IT infrastructure usage trends and patterns critical to business planning and operations.

The companies see the partnership forming a cornerstone in the enablement of the recently announced spot market for IaaS, for which the WAC will serve as the standard unit of measure for contracts and the 6fusion platform will track delivery of contracts between buyers and sellers.

“6fusion’s unique technology and vision for an open market combined with the depth and breadth of the Switch SUPERNAP ecosystem makes this partnership one to watch,” said William Fellows, Vice President 451 Research.

About 6fusion
6fusion enables the delivery of IT-as-a-Utility, allowing organizations to view and manage the Total Cost of Consumption (TCC) of their business services in real time to achieve a higher level of cost optimization, forecasting accuracy and business agility.

6fusion uses a patented single unit of measure of IT infrastructure called the Workload Allocation Cube that provides a common view of IT consumption, agnostic of underlying technology or vendors. 6fusion enables baselining, benchmarking and budgeting of business service consumption across execution venues, and supports dynamic cost optimization strategies that keep pace with the realities of today’s heterogeneous, on-demand world.  For more information visit www.6fusion.com 

About Switch SUPERNAP
Switch SUPERNAP is the recognized world leader in data center and internet ecosystem design, development and mission critical operations, providing unrivaled independent solutions for colocation, connectivity, cloud and collaboration ecosystems. Switch SUPERNAP represents innovation, security and reliability for more than 1,000 global clients, from sophisticated startups to Fortune 100 powerhouses.

Rob Roy, CEO and founder of Switch SUPERNAP, is the technology inventor and designer of the SUPERNAP data centers. Rob Roy’s 218 patented and patent-pending claims on data center systems, designs and related industry technologies have changed the technology landscape. For more information about SUPERNAP visit www.supernap.com.

The OpenStack and Open Market Collision Course

By John Cowan, 6fusion CEO and Co-Founder

The news that 6fusion and the CME Group were teaming up to ignite a spot exchange for cloud infrastructure back in April of this year made public nearly three years of work by the smart team at 6fusion.  Since early 2011 we’ve closely monitored key trends indicating the formation of the market.  Helping to make the emerging spot exchange successful means focusing on markets that will scale and grow quickly.  It means picking the right technology stacks to focus on.

Our answer three years ago was AWS and VMware.  AWS because of their natural leadership in the early, formative years of the IaaS market in general, and VMware because of the recognizable install base they had accumulated since turning the x86 world on its ear some 10 years ago.  As a young start-up we moved very fast.  First, back in 2012, we built native integration of our metering technology into VMware’s vSphere.  Since early 2012 we’ve been metering and analyzing thousands of VMware-based VMs.  Second, we developed integration with AWS to apply our metering standard to customer instances, as we announced a little over a year ago.

Right around the same time that we were building out support for metering VMware environments, the OpenStack movement, which began in 2010, was beginning to garner serious attention from the market.  For a company hell-bent on building a marketplace on which the CME Group could eventually launch a financial product, OpenStack fed many a fantasy.  An open source project for cloud infrastructure software being adopted by some of the biggest players in the industry?  Hey bartender, I’ll take a double shot of that Kool-Aid please!

Fast forward four short years and take stock of just how far OpenStack has come.  It has gone from a parts-and-pieces open source project to a budding commercial community, with some of the first legitimate distributions landing in the hands of enterprise customers and cloud operators.  The momentum is undeniable, and I witnessed it firsthand at the OpenStack Summit in Atlanta a few months ago.  Over 4,000 attendees and an entire show floor taken up by real vendors demonstrating real customer solutions.

It hit me in Atlanta that OpenStack was the real instantiation of the economic theory I borrowed (or bastardized!) for the cloud market years ago.  Back when we were raising our A round of venture capital in 2009, I posited that there is very real monetary opportunity in “the long tail of cloud computing.”  Of course, I was loosely referring to (or bastardizing!) Anderson’s economic theory of the long tail and the general rule that, in the age of the internet, business models that sold less of more could win.  In a merchandising sense this is because consumers, given the choice and technological capability to do so, would shop for just what they need rather than pick from the mass market.  Applied to cloud computing, it meant that, mathematically, the sum of the infrastructure and workloads that are smaller in transaction size or more vertically focused, and not necessarily running on the incumbent platform, could equal or overtake the size of the incumbent’s current market of one-size-fits-all web-scale architecture.

I am convinced that OpenStack is the first potentially viable harnessing of long tail potential in the cloud market.  The aggregate sum of market share among OpenStack powered distributions could make it a serious threat to all others vying for customer spend in the cloud (public, private or hybrid).

There’s only one catch.  To do that, OpenStack as a community must become an organized market that is a part of a bigger picture.

Enter the need for economic interchangeability.

Here is a snippet from an article I published late last year on sandhill.com.

Economic interchangeability is the ability for a buyer to equate his or her requirements with available supply in the market in real time. It captures the notion of an “apples-to-apples” comparison between buyers’ needs and suppliers’ capacity. Economic interchangeability will significantly reduce transaction friction and support increased business velocity…  

Imagine a world as I do: one in which a buyer of ANY proprietary OpenStack distribution can quantify his or her total cost of consumption (TCC), compare that to ANY OpenStack public cloud pricing index benchmarked against other non-OpenStack markets, and then negotiate a contract with a broker on the CME’s exchange, or directly with a supplier, to leverage those external cycles as necessary, openly and transparently.  Reduced friction and increased transaction velocity are important factors in creating liquidity.  In short, money goes to the robust marketplaces.  I see OpenStack essentially becoming the ultimate draw between supply and demand, putting the future of the commercial movement squarely on a collision course with an open, interchangeable marketplace.

See you at OpenStack Paris, mes amis!

 

6fusion Collaborates on Fourth Annual Future of Cloud Computing Survey

6fusion is pleased to once again be one of the cloud industry leaders collaborating on the 4th annual Future of Cloud Computing Survey, and we encourage everyone involved in cloud to participate. Whether you are a cloud user, cloud provider, cloud broker, or play another role in the cloud, you should take the survey. It will only take about 5 minutes to complete, and you can access the survey here: https://www.surveymonkey.com/s/6FusionNB

North Bridge will announce the results at GigaOM Structure, one of the leading cloud industry events that annually convenes influential technology experts to both examine and debate the future of cloud computing. North Bridge partner Michael Skok will present the results of the survey at the show on June 18 in San Francisco. If you are interested in attending, we can offer a 25% discount on tickets to the event. Ping us at info@6fusion.com to get the discount code.


IT Financial Management Week

6fusion is sponsoring IT Financial Management Week in Chicago April 28-30, 2014. If you will be at the show or in Chicago that week, please reach out to us at info@6fusion.com. We’d love to chat live about 6fusion’s impact on IT financial management for you and your team.



ZDNet – Soon, cloud capacity will be available in a spot market


In a few months, it may be possible to buy cloud infrastructure capacity on an open market, just as energy or physical commodities are now traded. The net result, hopefully, will be competitive pricing and lower risk for the burgeoning cloud-computing sector.

[Photo: Data center at CERN, courtesy of the CERN Press Office]

6fusion, a cloud ROI tools vendor, and CME Group, a global derivatives marketplace, just announced they have signed a definitive agreement to develop and market an Infrastructure as a Service (IaaS) spot exchange that will list financial products based upon 6fusion’s Workload Allocation Cube (WAC).

The WAC provides a consistent, across-the-board unit of measurement by which companies can buy and sell IaaS cycles. The Infrastructure-as-a-Service (IaaS) Exchange will offer access to the underlying components of cloud computing: processing power, storage, networks, and systems messaging. The IaaS exchange is expected to be available in beta by the second half of 2014, according to 6fusion.

This type of market could keep the pressure on cloud providers to keep prices in line. (Fierce competition has already created a price war between the big cloud players.) In addition, there may be opportunities for enterprises with large-scale IT assets. Many organizations have excess computing capacity within their own data centers, and this could open the door to assigning real economic value to those systems. Huge investments in under-utilized systems that are only used a few times a year might be recouped if excess capacity could be sold off in such a spot market.

The spot exchange will feature contracts using the WAC as the standard unit of measurement and be available for trading on an electronic platform using technology licensed from CME Group. 6fusion’s UC6 software platform will be used to track fulfillment of physically delivered contracts traded on the spot exchange. The spot exchange beta is expected to launch later this year featuring a host of infrastructure buyers, sellers and partners.

Such an exchange is in line with the way energy is now valued and priced, Reuven Cohen, chief cloud advocate at Citrix, observed in a Forbes commentary. “Data centers are the new power plants,” he says. And, just as is the case with energy resources, such exchanges help reduce the risk in investing in new capacity.  He also adds that Amazon Web Services already has a spot-pricing capability that fluctuates with supply and demand.

Network World’s Brandon Butler provides a look at how such an exchange will work:

“Blocks of cloud computing resources – for example a month’s worth of virtual machines, or a year’s worth of cloud storage – would be packaged by service providers and sold on a market. In the exchange, investors and traders could buy up these blocks and resell them to end users, or other investors, potentially turning a profit if the value of the resource increases. The market for these resources would ebb and flow, just like with any other commodity, based on supply and demand. Perhaps around the holiday shopping season, or directly after a natural disaster, these blocks of cloud resources would be more valuable, for example.”

Forbes – The Next Chapter In The Cloud Brokerage Story


The Next Chapter In The Cloud Brokerage Story – CME Tech Used To Launch An Exchange

Whenever an item becomes commonplace (some might use the word “commoditized”) we see marketplaces and brokerages rise up to make money from this commoditization. Cloud computing is an area in which brokerages have been long talked about (ever since vendor Enomaly launched its ultimately ill-fated, but arguably prescient marketplace SpotCloud many years ago). The theory behind these cloud brokers is that they allow the relationship between vendor and customer to be optimized such that better outcomes are delivered for both parties.

Or that’s the theory at least – it’s fair to say that brokerages haven’t really taken off. True, the Deutsche Boerse did launch a cloud exchange platform, the so-called Cloud Exchange AG, but for whatever reason the concept has never really taken off. Many suggest that this is because there is no common unit of measure for cloud infrastructure. Unlike electricity, for example, which comes in nicely bounded kilowatt-hour chunks, a unit of compute from AWS isn’t really comparable to a unit of compute from Google, Microsoft, Rackspace or whomever. As I stated previously:

… A marketplace for electricity, for example, can resolve all the different methods of generation (hydro, solar, nuclear etc) into a simple measure, the kilowatt-hour. IaaS on the other hand, has no such simple unit of measure. There is no “compute-hour” or “storage-block” that we all accept as standard. This is becoming ever more the case as vendors (most notably AWS) move further up the stack and deliver differentiated services on top of simple compute and storage.

For more on the debate about the validity or otherwise of cloud marketplaces, see my wrap up of a panel I moderated last year in San Francisco. Given all this angst about cloud broking, it’s always interesting to get an update from 6Fusion, a company directly tackling the problem of finding a standard economic measure of IT infrastructure. 6Fusion has the unenviable task in this early and rapidly developing space of ensuring a consistent unit of measure upon which they can enable marketplaces to be built. No mean feat.

It is therefore interesting to see that CME Group, a broad derivatives marketplace, has agreed to collaborate to create an IaaS commodity exchange which leverages 6Fusion’s Workload Allocation Cube (WAC), the standard unit of measure for IaaS that 6Fusion has developed. The exchange, due to be rolled out in beta later this year, will feature contracts using the WAC as the standard unit of measurement and is planned to be available for trading on an electronic platform using technology licensed from CME Group. In terms of actually fulfilling contracts, 6fusion’s UC6 software platform will be used to track fulfillment of physically delivered contracts traded on the spot exchange.

Every time someone announces a new initiative about cloud brokerage, the same criticisms are raised – that IaaS isn’t fungible, that it’s moving too fast, that buyers and sellers can have a direct relationship and hence an intermediary isn’t needed. All those arguments still hold. I made a comment when looking at marketplaces a year or so ago:

…the bottom line is whether or not there is both the ability for what they offer to occur and sufficient call for what these vendors are providing for them to build viable businesses. Is IaaS sufficiently fungible for marketplaces to flourish? Are there enough willing buyers and sellers, and enough margin to be made, for these financial intermediaries to survive?

While this 6Fusion announcement is interesting, my viewpoint hasn’t really changed. I’d be interested to see an IaaS marketplace really gain traction but I’m not overly confident it’ll happen any time soon.

 
