John Cowan was featured on the Digital Nibbles podcast today, and the discussion centered on exchange-traded IaaS. It was an interesting conversation, both because the topic is hot these days and because it brought together two of the earliest pioneers of exchange-traded compute: John Cowan, co-founder of 6fusion, and Reuven Cohen, co-founder of SpotCloud, one of the earliest attempts at creating a spot market for cloud computing.
Check out the episode here: http://www.blogtalkradio.com/digitalnibbles/2013/05/15/digital-nibbles
Oh, and some guy named Adrian Cockcroft was also on the show talking about Google Glass.
This is an article from SandHill.com: http://sandhill.com/article/6fusion-helps-companies-optimize-their-it-infrastructure-and-cloud-spend/
Editor’s note: Need help migrating to the cloud? Is your company looking to optimize its IT infrastructure spend across private data centers and public cloud providers? John Cowan, co-founder and CEO of 6fusion, explains the trends in how software and services are addressing these cloud challenges.
SandHill.com: There are increasingly new products and services that help companies migrate to the cloud with less risk and complexity. What trends has 6fusion noted in this area?
John Cowan: This space is obviously moving incredibly fast, but there are some very cool things emerging to help companies migrate to the cloud. One of the categories of technology in particular that seems to be getting a lot of attention these days is the physical movement of workloads between sets of infrastructure, whether on-premises to cloud or cloud to cloud. There are a growing number of companies focused on delivering this capability through software and services.
However, the biggest issue we see is the assumption that the customer is, in fact, going to use cloud resources. We believe a critical first step in moving to the cloud is first getting a clear understanding of what you are running in your existing infrastructure, how efficiently you are running those workloads, and how that compares to your options in the cloud.
The key to being able to answer these questions is using a common framework for evaluating IT consumption across on-premises and cloud environments. 6fusion is helping customers answer these questions with a combination of metering technology and advisory services for evaluating metering data.
SandHill.com: 2012 was the year of the cloud broker. Since we can’t really compare cloud services on an apples-to-apples basis, how is this impacting the market? Do you think cloud brokers will have a prominent role in the market two or three years from now?
John Cowan: It’s an interesting question because cloud brokers, as they’ve been categorized by most analysts, are not actually brokers; they are resellers, integrators or aggregators. There is absolutely a market need for what these companies are doing, but it’s a misnomer to call them brokers because brokers take on structural and financial risk in a marketplace.
There are true cloud brokers emerging that are playing this role, companies like Strategic Blue, for example. We see a very near future where IT infrastructure becomes a tradable, fungible commodity in global marketplaces where cloud brokers are a key part of those marketplaces, providing buyers and sellers a critical path to mitigating economic risks associated with highly unpredictable needs.
SandHill.com: What are the cloud needs and concerns of existing or potential customers that 6fusion has heard over the past year that the industry needs to address?
John Cowan: One of the biggest challenges for cloud customers today is the ability to accurately compare consumption and usage across different sets of infrastructure, including public clouds and private infrastructure. Unfortunately, public clouds have so far emerged as walled gardens, with little to no ability to compare on an apples-to-apples basis.
Just take a quick look at the price lists for some of the top cloud providers. It takes a ridiculous amount of effort to get a clear comparison across IaaS services, let alone to compare any of those to what you are running internally today. This is a big reason 6fusion is laser focused on solving this problem by standardizing the economic measurement of IT infrastructure, regardless of platform, technology or vendor.
SandHill.com: Is 6fusion helping companies resolve interoperability issues? If so, please explain how 6fusion’s capabilities differ from competitors.
John Cowan: 6fusion helps companies optimize their IT infrastructure spend across all the infrastructure they use, in private data centers and with public cloud providers. With a patented algorithm that provides a single unit of measure, which you can think of as the equivalent of the watt in electric utility consumption, we are giving organizations the ability to answer three primary questions:
- Is your IT infrastructure footprint cost efficient?
- How does your infrastructure profile and efficiency compare against a broader market?
- How can the power of your utilization data lead to perpetual cost efficiency for your infrastructure?
These are important questions to be able to answer with real data, not just estimates or projections, and they provide the foundation for real improvement.
6fusion’s value is in the ability to provide a comprehensive view into your IT infrastructure profile and, most importantly, the ability to act on what the data tells you.
SandHill.com: What do you predict will be the biggest change in the cloud market or provider capabilities in the next two years?
John Cowan: As I mentioned earlier, this industry will see IaaS traded on global commodity exchanges in the near future. We saw this was where the market was going almost 10 years ago and have structured our business and technology platform to make this happen. In fact, in our original investor pitch in 2009 I said, “Cloud will reach its maximum potential as a new paradigm for IT delivery when it can be treated, used and traded like a commodity utility.”
This is going to fundamentally change the delivery of IT because it solves vendor lock-in and reduces risk for infrastructure buyers, delivers global demand to infrastructure suppliers, and introduces transparency to a market and process whose opacity has been holding the industry back.
SandHill.com: The word “commodity” in IT services often connotes services or value at the low end of the spectrum, which doesn’t seem to jibe with the greater value outcomes that companies are achieving in the cloud.
John Cowan: I’ve mentioned the word commodity several times in this conversation and it’s important for SandHill readers to understand that commodity should not be a scary concept for their business. A commodity is simply a good or service with no qualitative differentiation. It doesn’t mean a race to the bottom or lack of competition; in fact, it’s just the opposite. Commoditization is the basis of all mass markets, which unlocks huge amounts of value once we get past the FUD (fear, uncertainty, doubt) that the old-guard infrastructure companies are leaning on.
Our mission is to bring about a world where enterprise customers can apply the same economic principles to IT that they adhere to for other commodities. I’ve written about this in my blog, and you can get more detail on this topic at 6fusion.com.
6fusion is a collaborator in the 2013 Future of Cloud survey hosted by North Bridge Venture Partners, 451 Research and GigaOM.
Click here to take the survey and share your opinions on the future of cloud computing.
John Cowan is co-founder and CEO of 6fusion and co-inventor of 6fusion’s WAC algorithm. He is regarded as the company’s business model visionary. In addition to his day-to-day management responsibilities at 6fusion, John is responsible for the overall strategic vision and commercial direction of the company. A 12-year veteran of business and product development in IT and telecommunications, he successfully created new business during the period of telecommunications deregulation and developed and launched new technology products and services globally. Follow John on Twitter @cownet or @6fusion.
Kathleen Goolsby is managing editor of SandHill.com.
By Mark Riedeman, 6fusion Director of Software Development
It’s certainly a lot easier to spend money on IT Infrastructure than it used to be. Back in the “old days” (yes, I’m old), I remember having to make a pretty good case that a new server or servers would be necessary in order to properly tackle some development project. That was usually followed by a series of questions:
“Can we solve it differently without buying hardware?”
“Does it have to be such an expensive server?”
“Why do you need that configuration?”
Eventually the sad reality settled in that the project was going to be run off of someone’s old used desktop stuck under a developer’s desk. It wasn’t exactly the “good old days” and a lot of great ideas got shelved because those who get excited about great ideas in software rarely get excited about budgetary justifications and procurement cycles.
Shocking, I know.
In today’s IaaS world, new server infrastructure for projects is easily built with just a few clicks and without the long procurement and justification process. It’s a dream come true for a lot of us old-school developers who really don’t miss having to cost-justify an extra gig of memory or an additional disk drive. The power and awesomeness of this new world though does come at a price. And that price is usually in the form of a bill at the end of the month saying: “Surprise! Guess how much you spent last month?”
While no one wants to slow down a great idea in favor of a lengthy procurement process, the principles of return on investment and cost justifications are still as annoyingly valid today as they were in the old days. It’s really a simple question that has stayed the same even though the variables have changed: “Is this project a good use of our money?” To know that, you still have to know what the benefits are (I’ll leave that to the business folks), and how much it’s going to cost you.
In the cloud world, it’s now the “how much it’s going to cost you” part that can get rather challenging because there are so many more options than just paying for servers up front.
As a result, planning for a project’s infrastructure cost is no longer just about figuring out how big the server should be or how much memory, storage, or processing power you need. In today’s physical / virtual / cloud world, you theoretically have access to any infrastructure configuration in any location in the world. The key is figuring out how much it’s going to cost on the different platforms or clouds you can run it on, and that can get pretty complicated, since it’s rarely an apples-to-apples comparison.
If you’re a developer (or anyone else outside the accounting department, for that matter), this is probably the TL;DR moment when your eyes start glazing over and you find yourself back in the land of budgetary justifications you thought you’d escaped. Well, luckily, there is help out there…
6fusion’s technology provides a single unit of measure (called the WAC) to standardize the measurement of IT infrastructure usage by metering a project’s usage regardless of the infrastructure being used – in-house virtualization, cloud, or even the machine under the developer’s desk. And you can cost-compare across the entire 6fusion marketplace or any infrastructure that can be metered with our technology because everyone is using the same unit of measure, so it’s a true apples-to-apples comparison.
That means you can meter the application in WACs and then figure out the most cost-effective place to deploy it without having to do complicated cross-comparisons of every deployment option. And by continuously metering your applications and infrastructure, you can make sure you’re always running your applications wherever you get the most bang for the buck. Hopefully for you that’s not the server sitting under your developer’s desk.
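To make that concrete, here is a minimal sketch of the kind of comparison a common unit enables. The venues, per-unit rates, and consumption figures below are invented purely for illustration; 6fusion’s platform does the actual metering and pricing, not a script like this.

```python
# Hypothetical illustration: once every deployment option's price is
# expressed in the same unit of consumption, ranking the options becomes
# a one-liner instead of a spreadsheet exercise.

# Assumed per-unit hourly rates for three deployment options (made up).
rates_per_unit_hour = {
    "in-house virtualization": 0.042,
    "public cloud A": 0.051,
    "public cloud B": 0.047,
}

units_consumed_per_hour = 120  # metered consumption of the application
hours_per_month = 720          # nominal 30-day month

# Cheapest venue first, all in directly comparable dollars per month.
for venue, rate in sorted(rates_per_unit_hour.items(), key=lambda kv: kv[1]):
    monthly_cost = rate * units_consumed_per_hour * hours_per_month
    print(f"{venue}: ${monthly_cost:,.2f}/month")
```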
As 6fusion’s Director of Software Development, Mark Riedeman has the herculean task of herding 6fusion cats, er, managing 6fusion’s ass-kicking development team.
Deloitte published a great article in the WSJ last week called “Elevating the Business of IT,” which highlights the importance of applying economic principles to IT operations. 6fusion is a strong supporter of this approach, as we’ve discussed many times in this blog.
It’s worth highlighting a few parts of the article that are particularly relevant:
“In the years ahead, however, IT will likely be judged not only by its support for the business, but also by its ability to improve operational efficiency in IT itself.”
We couldn’t agree more. IT operational efficiency, particularly related to IT infrastructure, is a huge challenge and something we see every day in our customer base and market. IT is finally gaining recognition as a critical component of business strategy (it’s about time!), which has turned attention to applying some of the same business performance metrics to IT that have long been used to measure manufacturing, supply chain, and other organizational operations.
“That means embracing tools and systems that can capture, report, and manage their full portfolio of projects, vendors, and resources—across planning, implementation, and ongoing operations.”
Tools like ERP systems have been in place for years to streamline operations across the business. Now companies are using similar tools and processes to streamline IT operations. The article notes three approaches and methodologies that are helping IT organizations down this path:
IT Service Management (ITSM)
Project Portfolio Management (PPM)
IT Business Performance Management (ITBPM)
ITBPM is an area of interest for 6fusion because, as the article notes, “ITBPM makes it possible to measure IT capacity more precisely and manage it more effectively.” I would point out that a key success factor that must be incorporated into ITBPM processes is an industry standard for measuring IT capacity, particularly for IT infrastructure, and for measuring usage across different user groups: business units, regional groups, operational departments, etc. This enables an organization to make true apples-to-apples comparisons when looking at consumption of IT infrastructure across private data centers as well as public cloud services. You can’t improve what you can’t measure, and this is a critical starting point for an ITBPM initiative.
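As a toy illustration of that kind of cross-group rollup, assuming a hypothetical common unit and made-up metering records (this is not 6fusion’s actual data model):

```python
from collections import defaultdict

# Hypothetical metering records: (business unit, environment, units consumed).
records = [
    ("finance", "private data center", 4200),
    ("finance", "public cloud", 1100),
    ("marketing", "public cloud", 2600),
    ("engineering", "private data center", 8900),
    ("engineering", "public cloud", 3400),
]

usage = defaultdict(int)
for unit, environment, consumed in records:
    usage[(unit, environment)] += consumed

# Because every record is in the same unit, totals across environments
# and business units are directly comparable.
for (unit, environment), total in sorted(usage.items()):
    print(f"{unit:12s} {environment:20s} {total:6d}")
```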
More on that in a future post.
By Douglas Steele, COO, 6fusion
This is the third in a series of posts on cloud brokers, their role in IT infrastructure markets, and their impact on IT infrastructure as a tradable commodity. Here are links to Part I and Part II of the series.
As I mentioned in my last post, there are tremendous advantages to be gained if only we could measure and compare IT infrastructure consumption with a consistent and easy-to-apply methodology.
It’s important to recall the definition of a commodity: “a good or service without qualitative differentiation.” It’s also important to recognize that commodity doesn’t necessarily mean that all products and services are the same. It simply means the way a product or service is measured is the same.
There is a big difference.
Service quality, product delivery, and reliability are what differentiate products within a commodity market and what make those markets volatile. Standardized measurement is what makes markets fungible. You need both factors in order to have a legitimate commodity traded as such.
Before we come back to the matter of IT infrastructure, let’s consider some of the factors that make electricity a tradable commodity:
The electricity market is a utility because demand is universal. You can go to any geographic region of the industrialized world and the basic problem solved by electricity is common.
However, usage patterns and purposes are highly diverse. End users have countless ways to use electricity, and the amounts they use cover a broad spectrum, from an office complex consuming millions of kilowatt-hours to a small appliance using a tiny fraction of that.
Electricity generators are a heterogeneous, independent group of market participants. Supply comes from producers large and small, with a variety of capabilities and motivations for generating it.
There are numerous methods of generating electricity, from coal, natural gas, and hydro to wind, solar, and more, all of which combine to make up the total supply of electricity available in the market.
These factors alone make trading electricity enormously complicated, and yet it’s done every day. What makes it possible? Many things, but two factors are fundamentally important to any market:
Common unit of measure
Standardized use of the common unit of measure
Common Unit of Measure
As we all know, electricity is measured in kilowatt-hours, a unit that captures the two key dimensions of electricity measurement: the rate of consumption and the cumulative time over which it is measured. Wikipedia has a good explanation of how these work together:
“In terms of electricity, an electrical load (e.g. a lamp, toaster, electric motor, etc.) has a rated “size” in kW. This is its running power level, which equates to the instantaneous rate at which energy must be generated and consumed to run the device. How much energy is consumed at that rate depends on how long you run the device. The unit of energy for residential electrical billing, kilowatt-hours, integrates changing power levels in use at the residence over the past billing period (nominally 720 hours for a 30-day month), thus showing cumulative electrical energy use for the month.”
This is an important concept, because no matter what is consuming the electricity (devices, buildings, lights, etc), we can measure it universally and all agree on how much it has consumed over a specific period of time.
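As a quick worked example of that integration of power over time (the appliance and the numbers are made up):

```python
# A 1.2 kW appliance run half an hour a day over a 30-day billing period:
power_kw = 1.2             # rated "size" of the load, in kW
hours_used = 0.5 * 30      # cumulative running time over the period
energy_kwh = power_kw * hours_used
print(f"{energy_kwh} kWh")  # 18.0 kWh, regardless of what the device is
```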
In most cases standards become standards in one of two ways:
1. Formal approval by some governing body – generally less effective and sustainable as a rule.
2. De facto standardization achieved through market adoption – generally more effective and sustainable because users tend to adopt things that work.
The watt became the standard for electricity by de facto adoption and standardization over the course of nearly 100 years before being formally adopted by the International System of Units in 1960. We see a similar de facto standardization pattern emerging with compute consumption, although I don’t expect widespread adoption to take anywhere near 100 years. The de facto standard that is rapidly emerging today for compute consumption is based on the inputs necessary to successfully run an application or workload, which are:
Computing processor cycles – measured in megahertz (MHz)
Memory to store temporary work – measured in megabytes (MB)
Storage to store permanent work – measured in gigabytes (GB)
Networking to communicate with its own and other devices – including disk input/output (I/O), LAN I/O, and WAN I/O, measured in bits or bytes per second
You may notice there are elements of computing missing from this list, such as IOPS, latency, wait queues, and more. Ask yourself whether these metrics are required to operate the workload, or whether they are really a measure of its performance. I would argue they are performance metrics, which are important for many purposes but not for measuring consumption. What we really care about here is identifying the core items required to run the most common compute workloads today, and measuring them.
Continuing the electricity analogy, because computing is consumed over specific time periods just as electricity is, we need to quantify these metrics on an hourly basis as a universal benchmark. We can then combine the consumption metrics and the timeframes into a single unit of measure to make it easy to understand and represent. It may seem simple, but this unlocks the ability to do true apples-to-apples comparisons regardless of hardware, software, deployment model, location, industry, timeframes and much, much more.
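A minimal sketch of what one hour’s consumption sample might look like under this model; the field names and values are hypothetical, not 6fusion’s actual schema:

```python
from dataclasses import dataclass

@dataclass
class HourlySample:
    """One hour of metered consumption for a single workload (hypothetical schema)."""
    cpu_mhz: float       # processor cycles consumed, in MHz
    ram_mb: float        # memory in use, in MB
    storage_gb: float    # storage in use, in GB
    disk_io_mbps: float  # disk I/O, in megabits per second
    lan_io_mbps: float   # LAN I/O, in megabits per second
    wan_io_mbps: float   # WAN I/O, in megabits per second

# The same record shape applies whether the sample came from a hypervisor,
# a public cloud API, or the server under a developer's desk.
sample = HourlySample(cpu_mhz=1800, ram_mb=2048, storage_gb=40,
                      disk_io_mbps=12.5, lan_io_mbps=3.0, wan_io_mbps=0.8)
```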
Applying this model to compute consumption also lends itself to identifying different types of workload usage patterns (i.e., storage-intensive, memory/CPU-intensive, etc.), the aggregate of which rolls up into an organization’s “compute DNA,” which can then be matched with the appropriate infrastructure and the suppliers offering the best cost to run it. There are some exciting things going on around the identification and use of different compute types as it relates to IT infrastructure measurement, which I’ll save for my next blog post.
6fusion’s founders John Cowan and Delano Seymour foresaw all of this almost ten years ago when they created the Workload Allocation Cube (WAC). You can read about the WAC in detail, but in summary it is a mathematical calculation accounting for the six resources an application needs to consume in variable amounts to do its job. The beauty of the WAC is how those consumption variables are blended by a specific algorithm into a single representative unit value, giving buyers and suppliers of IT infrastructure a normalized basis for conducting business transactions.
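The actual WAC algorithm and its coefficients are patented and proprietary, so the weights below are pure invention; this sketch only shows the shape of the idea of blending six consumption dimensions into one value:

```python
# Illustrative only: collapse one hour's six consumption metrics into a
# single scalar using invented weights. This is not the real WAC formula.
sample = {   # one hour of metered consumption (hypothetical values)
    "cpu_mhz": 1800, "ram_mb": 2048, "storage_gb": 40,
    "disk_io_mbps": 12.5, "lan_io_mbps": 3.0, "wan_io_mbps": 0.8,
}
weights = {  # invented coefficients, chosen purely for illustration
    "cpu_mhz": 0.00035, "ram_mb": 0.00025, "storage_gb": 0.01,
    "disk_io_mbps": 0.02, "lan_io_mbps": 0.02, "wan_io_mbps": 0.02,
}

unit_value = sum(sample[k] * weights[k] for k in weights)
print(f"{unit_value:.2f} units for the hour")  # one comparable number
```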
The WAC, as it turns out, makes IT infrastructure entirely fungible. With that important door unlocked, the path to a true open market is clear and present.
6fusion is happy to be a part of the 3rd annual Future of Cloud Computing Survey, and we encourage everyone involved in cloud to participate. Whether you are a cloud user, cloud provider, cloud broker, or play another role in the cloud, you should take the survey. It will only take about 8 minutes to complete, and you can access it here: https://s.zoomerang.com/s/2013NorthBridgeFutureofCloud
This year, North Bridge will announce the results at GigaOM Structure, a cloud industry event that annually convenes influential technology experts to both examine and debate the future of cloud computing. North Bridge partners Jonathan Heiliger and Paul Santinelli will moderate highly anticipated panel discussions while at the show June 19-20 in San Francisco.
A new Singapore partnership marks a change in strategy for Raleigh-based 6fusion.
The cloud software company has reached a partnership deal to deliver its IT infrastructure services to the Asia Pacific, teaming up with Singapore-based CloudFX. Financial terms were not disclosed.
Rob Bissett, vice president of product management, took a break from OpenStack Summit in Portland, Ore. Wednesday to chat about the news.
“So, obviously, we’re based in the RTP area,” he says. “The company has been around for a number of years, and we’ve been primarily focused, historically, on the North American marketplace.”
But that’s historically.
“What we’re finding though, obviously, it’s a cloud-connected world,” he says.
And the demand for cloud services extends beyond mainland America. To keep up, 6fusion has already deployed a European partner. Asia Pacific is the next logical step – Singapore, in particular.
“It’s a really innovative country,” Bissett says. “The government there is highly invested in being leading-edge, so we were sort of targeting Singapore as the base of an Asia-Pacific strategy for us.”
IDC puts the Infrastructure-as-a-Service (IaaS) market at $6.2 billion in 2013, growing to a $13.5 billion market in 2016. It’s growth 6fusion plans to capitalize on, and it plans on wasting no time.
“Ultimately, what we’re trying to do is build a significant business and demonstrate our scalability,” he says.
6fusion, incorporated in 2008, relocated to the Triangle in August of 2010 following a round led by Durham-based Intersouth Partners. The company currently has more than 30 employees, the majority of whom are based in Raleigh. 6fusion plans to hire about 10 people this calendar year, particularly in software development.
Rob Bissett is helping expand 6fusion’s international footprint.
Lauren Ohnesorge covers technology, biotechnology and Durham County.
We are excited to officially announce the expansion of 6fusion operations into Asia Pacific with our partner CloudFX. We have worked together in the region for some time now, and the combination of 6fusion’s technology with CloudFX’s consulting and delivery has proven to be powerful.
Here’s more detail on the announcement from the WSJ if you are interested.
I had a great conversation recently with a 6fusion customer, Leo Soto of SotoNets, a cloud service provider, about his decision-making process in selecting an IaaS solution for their cloud backup service. Here are a few notes from that discussion:
What were the primary business problems or opportunities that caused you to look for an IaaS solution?
When we were ready to enter the cloud backup service space, we needed an infrastructure-as-a-service (IaaS) offering that would provide security, easy deployment of workloads, and a true utility model for resource usage. Cloud backup is a resource-intensive activity that requires a safe and secure data center environment.
Who did you evaluate for this solution?
What business problems are you solving?
What has been the impact to your business solving those business problems?
There have been several reports recently about the public cloud market, showing AWS clearly in the lead but with numerous other very large competitors in the mix. A Synergy Research Group report recently showed AWS with approximately 35% market share in 4Q12, with IBM next in line at 5%.
Barb Darrow at GigaOm had a good article today on the state of the public cloud market, speculating that Google Compute Engine is the next public cloud to watch.
Interestingly, Google didn’t come up at all in 6fusion’s recent survey data on 2013 cloud usage, but with so much shadow IT usage going on and all the competitors in public cloud today, it’s not much of a surprise.
The other interesting aspect of watching this market play out is that it highlights the need for metering infrastructure consumption across environments, whether you have workloads running in several different public clouds or need to meter your internal IT infrastructure. Paul Miller of GigaOm wrote a great white paper on that topic; check it out.
While the discussions around public cloud market share are interesting, they represent a fraction of the total addressable market for the IT infrastructure industry, as 6fusion Co-Founder and CEO John Cowan outlined in a blog series last year. You will see some very interesting activity soon around the unification of these markets and exchange-traded compute. Stay tuned!