The Road to a Futures Market for IT Infrastructure Services – Part 2

June 17, 2013

By John Cowan, CEO and Co-Founder, 6fusion

In my first post on a futures market for IT infrastructure, I went to great lengths to illustrate why a futures market for cloud computing not only makes sense, but is practically inevitable. But in order to be practically inevitable it needs to be technically possible, so I am dedicating this post to answering the question of ‘how’ a market could form and exploring the key prerequisites for that reality.

If you spend any amount of time working with financial traders they will tell you there are two primary characteristics a commodity absolutely must have to be considered ‘tradeable’ on an exchange: standardization and deliverability.


Financial markets and traders define standardization as buyers and sellers ‘speaking the same language’ when it comes to the measurement or definition of a contract. We buy and sell oil by the barrel. We buy and sell electricity by the kilowatt-hour. We buy and sell wheat by the bushel. Every commodity trades on a standard unit of measurement.

Standardization is the basis of transaction velocity. Imagine a market for electricity in which every participant measured and priced the service differently. Not only would such a market be impossible to scale, but it would grind down the potential growth of suppliers.

In short, simplicity is key in commodity markets. I’ve talked about this before, but a good or service is a commodity when there is no qualitative differentiation between one unit of it and another.

Think about that definition in relation to cloud computing. It’s anything but simple and it is seemingly impossible to standardize.

Until recently, people spoke about AWS, Rackspace and maybe a handful of telcos when talking about the relevant players in the infrastructure services business.  Not anymore.  Now you can’t have a conversation without including IBM, HP, Joyent, Google and even VMware (which up until very recently was considered sacrilege!).

The emergence of so many big and powerful new entrants to the market for cloud computing validates the need for an open marketplace, if for no other reason than to give buyers a centralized place to meet their needs. @benkepes nicely illustrates this point in a recent post on brokerage and financial intermediation, but in order for that demand to be accurately matched to appropriate supply there must be a standard unit of measurement.

So how do we arrive at such a standard? There are basically two schools of thought. The first is that the leaders of the industry will get together and agree on a standard that can be adopted or adhered to for the purposes of the market. While this route to standardization is what many people like to believe will happen, it rarely ever does, because of the natural conflicts of interest involved. While it might be possible for a handful of vendors to come together, every other effort of this type has failed because, invariably, a good section of the market is alienated from the process. Cloud interoperability is a great example of a brilliant idea that never made it past the working-group phase of development.

The second school of thought is that over time one commercial option will dominate all others. There are very strong advocates in cloud computing suggesting the world should concede the standardization battle to AWS so we can all move on to other important topics. This isn’t the forum to debate whether the AWS instance format is a good or bad design, but even if AWS’s AMI allocation design for measuring infrastructure consumption were the greatest technical solution in the market, there is about a snowball’s chance in hell that companies like Google, Microsoft, IBM and others will conform to the AWS standard. Welcome to the politics of business.

In my view, standardization can only be accomplished by a technical system that does not impede any proprietary retail charging method of the provider, but that normalizes consumption in a way that allows an output reading to be matched to equivalent supply. Further, it can only be delivered as a service by a third party with no political or financial capital staked in the cloud computing business. In short, anyone that owns or operates infrastructure-as-a-service, or gets paid to sell or represent retail cloud brands, is automatically excluded from consideration.
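To make the idea concrete, here is a minimal sketch of such a normalization layer. The resource names and weighting coefficients are entirely hypothetical – in a real system they would be fixed by the neutral third party, not by any provider, and this is not any existing metering formula – but the shape of the calculation is the point: raw meter readings go in, a single standardized unit comes out, and the provider’s retail pricing is untouched.

```python
# Hypothetical resource weights for folding raw meter readings into one
# standardized consumption unit. Real coefficients would be set by the
# neutral third party or exchange, not hard-coded like this.
WEIGHTS = {
    "cpu_ghz_hours": 1.0,      # processor time consumed
    "ram_gb_hours": 0.5,       # memory allocated over time
    "storage_gb_hours": 0.05,  # disk capacity held over time
    "network_gb": 0.2,         # data transferred
}

def normalized_units(meter_readings: dict) -> float:
    """Collapse per-resource meter readings into one standardized unit.

    Each provider keeps its own retail pricing and metering; only the
    output reading is normalized, so buyers and sellers speak the same
    language without the provider changing how it charges.
    """
    return sum(WEIGHTS[resource] * amount
               for resource, amount in meter_readings.items())

# Two providers metering the same workload differently still normalize
# to comparable units:
workload = {"cpu_ghz_hours": 10, "ram_gb_hours": 20,
            "storage_gb_hours": 100, "network_gb": 5}
print(normalized_units(workload))  # 10*1.0 + 20*0.5 + 100*0.05 + 5*0.2 = 26.0
```

Because the function only reads metered output, it sits alongside – rather than replaces – whatever proprietary billing each provider runs.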

Since we are talking about the eventuality of an exchange-traded product for on-demand IT infrastructure, who would be better positioned than an actual financial exchange to decide?

If a financial exchange could settle the debate around standardization, it would be halfway to the goal of enabling a futures market for IT infrastructure. To go the full distance, an exchange would also have to answer the challenge of deliverability.


In order for a commodity contract to be considered valid by the market it must be deliverable. Not all commodities – even obvious candidates – are tradable, because of deliverability challenges. Ultimately, someone owns the bushel of corn and that corn must be delivered. The deliverability of a commodity depends on, among other things, two important factors:

  • Transportation

  • Perishability (or expiration)

For instance, corn is generally viewed as a highly deliverable commodity because of the sophisticated rail transportation system that can move large volumes of the farmer’s harvest. Blueberries, on the other hand, while transportable, perish far too quickly to be a viable traded product.

IT infrastructure service is time-based and continuous, which means that if you don’t use it, you lose it; it expires. There is no way to consume yesterday’s computing power, and the commodity has an ‘on/off’ switch. It is not something you can touch, package or put on a shelf. So is it tradeable? Absolutely. IT infrastructure services are ‘delivered’ over private and public networks, much the same way electricity is delivered over wires.

An important factor making a unique product like electricity work as a tradeable commodity is highly predictable demand. If the consumption of the physical devices and components that draw electricity were not predictable, it would be impossible to make a market, because turning ‘on’ such a service at some commit rate in the future would be complete guesswork. The demand side of the equation would not participate. This is where an analogue to the kWh for compute, storage and network becomes so vital to the process. The entire vision of building a marketplace begins with buyers being able to reliably quantify their demand in a standardized fashion.
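As a sketch of what that quantification might look like on the buy side – with an entirely illustrative consumption profile and contract size, not any real exchange’s specification – a buyer with a predictable workload can total projected usage in standardized units over a delivery month and size it in whole contracts:

```python
import math

def contracts_needed(hourly_units: list, contract_size: float = 100.0) -> int:
    """Whole futures contracts needed to cover projected consumption,
    given an hourly demand profile in standardized units."""
    return math.ceil(sum(hourly_units) / contract_size)

# Illustrative profile: a workload that draws 14 units/hour during an
# 8-hour business-day peak and 6 units/hour otherwise, over a 30-day
# delivery month. Predictability is what makes this forecast possible.
profile = [14.0 if 9 <= h % 24 < 17 else 6.0 for h in range(24 * 30)]

print(sum(profile))            # 6240.0 standardized units for the month
print(contracts_needed(profile))  # 63 contracts of 100 units each
```

The arithmetic is trivial; the prerequisite is not. Without the standardized unit, the buyer has nothing stable to total, and the forward commitment really would be guesswork.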

This brings me to my final point.

Detractors of the cloud futures concept find it difficult to rationalize how we can possibly jump to a world where an exchange offers a full-blown derivatives product. They can’t mentally “get there”, and I think I know why. Nobody expects us to wake up tomorrow morning and be able to buy and sell cloud on an exchange. The reality is that while we talk about a derivatives market for cloud computing, you can’t forget that such a market is an ‘end game’. Cloud futures is a view into a secondary market.

A secondary market for IT infrastructure services will not emerge until the incubation of a primary marketplace is complete. Once a primary market has been established and the underlying cash market has reached critical mass, a secondary market – a derivatives market – will emerge, enabling both suppliers and buyers to trade, forecast, speculate, hedge and consume future compute, storage and network resources.

A standard unit of measurement to define buy and sell contracts, reliable delivery across a network, and a major financial exchange providing the clearing process: this combination requires only one other ingredient to become a reality – the willingness of supply and demand to participate.

So is there a willingness to participate?

I, for one, have reason to believe there is.
