6fusion announces Advisory Group for IT Economic Benchmarking

6fusion, the company delivering IT financial intelligence for the Enterprise market, is pleased to announce its new Advisory Group for IT Economic Benchmarking. The Advisory Group, launched at the TBM 2015 Conference in Chicago, is chartered with creating standard benchmarks for IT Economics that will enable critical transparency and market insight among Enterprise organizations adopting a utility approach to IT Financial Intelligence.

6fusion’s patented standard unit of measure, the Workload Allocation Cube (WAC), will serve as the foundational measurement unit for benchmarking and consumption economics for Advisory Group member organizations.

The Advisory Group will be co-chaired by Richard Donaldson, Director of Infrastructure Management & Operations at eBay, a key proponent of quantifying IT Economic Efficiency and a leading voice for industry standards and benchmarks that help Enterprise IT groups justify investment and demonstrate their value.

“Workloads are diverse but when expressed in a ‘common currency’ like the WAC we gain apples-to-apples comparability,” said Mr. Donaldson. “We are excited to work with the community to establish standardized benchmarks to enable us to compare ourselves against an important community like TBM.”

The Advisory Group will be composed of member companies and IT economics experts committed to moving beyond static IT Financial Management to the real-time dynamics of IT Financial Intelligence. Advisory Group membership fosters opportunities for members to connect with peers, work collectively to set the new global standard for IT efficiency benchmarking, and understand their organization’s financial picture in the context of a connected utility marketplace.

This initiative will be co-led by Rob Bissett, Chief Product Strategy Officer, 6fusion, and Richard Donaldson, Director of Infrastructure Management & Operations, eBay.

To learn more about the Advisory Group, or to be considered for charter membership, go to: http://www.6fusion.com/benchmarking.

For press inquiries, please contact jgraham@6fusion.com.

About 6fusion:

6fusion creates IT Financial Intelligence and enables fact-based IT Infrastructure decision making, using the power of standardization to measure IT as a utility. With 6fusion’s patented Workload Allocation Cube (WAC), Enterprise IT can get real-time metering, reporting, and forecasting through the 6fusion online platform.

The WAC provides a common measurement of IT consumption across the underlying technologies or vendors used to deliver services. For the first time, it is possible to make apples-to-apples comparisons of consumption across vendors, technologies, and the industry as a whole. Like the kilowatt-hour or MPG, the WAC creates the standard currency the industry needs to define, measure, and benchmark IT Economic Efficiency, and to move beyond static IT Financial Management to dynamic IT Financial Intelligence.

To learn more, visit: http://www.6fusion.com


451 Research perspective: 6fusion & Apptio partnership

6fusion partners with Apptio and addresses opportunities beyond cloud exchange

Analyst: Owen Rogers

Much of our coverage of 6fusion over the past year has focused on its plans for a cloud exchange with partners UCX and CME Group, but the company sees shorter-term opportunities in using its Workload Allocation Cube (WAC) unit of consumption to assess enterprise expenditure performance pre- and post-cloud migration. It recently announced a partnership with Apptio, with the WAC being another data source integrated with Apptio’s broader IT financial management offerings….

For full report go to: https://451research.com/report-short?entityId=85755

SWOT Analysis

6fusion has unique IP in the form of the WAC. Partnerships with the likes of CME Group and Apptio show its credentials. There is no OpenStack support, which is surprising considering that we hear such negative feedback on the open source project’s own meter, Ceilometer. This is perhaps an untapped opportunity, and 6fusion should be careful not to let it pass by.

Comparing best execution venues is a challenge, given their differing performance characteristics, pricing methods, and costs. Enterprises want to understand and optimize their consumption, and the WAC provides a suitable way of doing this. The WAC is still a proprietary standard, however, and it is a long way from being as commonplace as the kWh. With projects like OpenStack reinvigorating open source, end users increasingly favor openness. Are enterprises ready to bet on the WAC?

Taking the Complexity out of Cloud Infrastructure Decision Making

The Challenge:

Shifting from existing on-premises infrastructure models to cloud-based models is all the rage in the Enterprise space today. The first generation of Enterprise buyers – the innovators – simply picked some apps, moved them, and then course-corrected based on experience and costs. As we move past those first incubator customers to more mainstream adoption, we are seeing a maturation in how the process is handled. These follow-on enterprises saw the challenges and surprises that the first-flight migrations entailed, and want to take a more methodical approach – to reduce stress, complexity, and, most importantly, pricing surprises.

6fusion is working with a multinational conglomerate that wants to standardize services across the entire organization, including a corporate IT organization and nine business units with their own IT departments. The office of the CIO is tasked with cost-effectively moving applications to the cloud and then tracking those savings over time. It is also responsible for calculating fair cost allocation and chargebacks for the shared infrastructure across all of the business units.

The Solution:

Determine an organizational baseline to understand the current cost state and optimize for the future cost state, using a single unit of IT economic measure: the Workload Allocation Cube (WAC).

By translating the company’s infrastructure into a single unit of measure for an apples-to-apples comparison, consumption and cost data can be normalized across all of the business units. This provides a baseline of where an organization stands and where it can improve. The WAC offers an open, impartial, and consistent view of IT infrastructure resource consumption, and its output yields a single representative unit value of actual consumption. This is particularly important in hybrid or cloud environments, where differing economic models are used and consumption can be highly variable.
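As a toy illustration of the normalization idea – not 6fusion’s patented WAC algorithm, whose actual dimensions and weights are not reproduced here – a single comparable unit can be derived by weighting each resource dimension of a workload and summing:

```python
# Hypothetical weights per resource dimension; placeholders for illustration only.
WEIGHTS = {
    "cpu_ghz_hours": 1.0,
    "ram_gb_hours": 0.5,
    "storage_gb_hours": 0.01,
    "network_gb": 0.2,
}

def normalized_units(usage: dict) -> float:
    """Collapse a multi-dimensional usage sample into one comparable number."""
    return sum(WEIGHTS[k] * v for k, v in usage.items())

# Two business units on different platforms become directly comparable:
unit_a = normalized_units(
    {"cpu_ghz_hours": 100, "ram_gb_hours": 200, "storage_gb_hours": 5000, "network_gb": 50}
)
unit_b = normalized_units(
    {"cpu_ghz_hours": 180, "ram_gb_hours": 90, "storage_gb_hours": 1000, "network_gb": 20}
)
```

Whatever the real weighting scheme, the point is that once every platform’s raw usage is reduced to the same unit, baselines and improvements can be compared directly across business units.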

With a mandate to move more than a thousand applications to the cloud in 2015, one of the organization’s immediate needs is to understand which of the applications in its data centers and on-premises environments make the most sense to shift to a public cloud. There are financial and technical challenges to these shifts. An economic analysis must be completed first to prioritize the applications that yield the greatest savings. Then a technical analysis is made from a security, regulatory, compatibility, and performance perspective.

Once the organization determines which applications and workloads to migrate, it will then have the ability to compare expected savings against the savings actually achieved, continually monitoring consumption.

In most situations, companies can expect to see 20-40% cost savings from moving to the cloud.

Project Expectations:

– Visibility and cost transparency across business units

– Insight into which applications should be moved and when

– Tracking and managing application costs with a single unit of measure inside and outside the organization

– The ability to make ongoing cost comparisons and analyze actual savings versus expectations

To find out more about how 6fusion can help your organization effectively manage infrastructure shifts to the cloud, visit www.6fusion.com or follow us on Twitter @6fusion.


The Utility Economics of IT: A response to Bernard Golden & Mark Thiele

By Rob Bissett, Chief Product Officer, 6fusion

Recently, Bernard Golden wrote a very interesting blog post (What Economists Can Teach us About Cloud Computing). Mark Thiele then wrote a well-reasoned response (Cloud Adoption Trends: A Response to What Economists Can Teach Us by Bernard Golden). Both presented compelling, well-crafted arguments about how they see cloud adoption proceeding, drawing on Nobel laureate economic work done on other commodity markets. One generally favors public suppliers as economically advantaged (Bernard Golden), and one takes a slightly counter view (Mark Thiele).

As a technologist with a background in economics, it’s debates like these that really get me excited about the future of our industry. We are finally aligning the dynamic growth of the technology space with how the business operates, in a way that goes beyond technology and begins to treat the underlying infrastructure as a utility – the raw material input that businesses consume to create “products,” the apps that drive the business. This is the well-documented path of tech maturity, where emerging technology evolves toward commoditization and utility markets. In keeping with this pattern, the cloud debate has fundamentally shifted, and now we can discuss the merits of the models without vendor stripes or speeds and feeds.

As Bernard outlines in his article, there is a significant body of work on consumption economics – much of it Nobel caliber. The two pieces he calls out are significant for utility goods and hugely relevant to the cloud space. This body of work deals with how markets, individuals, and businesses consume utilities – what I call utility economics. It continues to amaze me that, after roughly 15 years of virtualization and cloud, these topics are only now really being discussed openly. Nevertheless, finally seeing the industry rally around this cause is very rewarding and edifying.

Why does it make sense? Simple: as I addressed in a blog several months ago, infrastructure services really are a utility – a utility input to the business. There are volumes of research on how to optimize, leverage, and hedge utilities to drive efficiency and economies of scale – strategies ranging from leveraging spot markets for burst capacity, to long-term forwards, hedges, and capital acquisition to manage volatility and risk and provide long-term price stability. Such services are now becoming available in the IT industry (see the AWS Spot Market, Reserved Instances, and the work being done by UCX, the CME Group co-sponsored cloud exchange). These same strategies enable organizations to move from cost accounting for internal IT to management accounting models for internal IT – by providing real consumption data to better allocate input costs to revenue lines, optimize consumption, predict future needs, balance risk (internal vs. external allocation decisions), and create industry and market comparisons. In what other sector do companies NOT know:

1) How much do I use, and what do I pay for it on a per-unit basis?

2) What does everyone else pay for substantially similar services?

3) What would it cost if… (I bought new kit, invested in improved processes, outsourced…)

These are fundamental questions that can ONLY be answered through the application of utility economics to IT infrastructure. That can only happen through a shift in the industry – buyers and suppliers moving from legacy proprietary models and methods to tried-and-true economic models: consumption metering standards, market data aggregation, and bringing the finance and business planning processes into IT.

I won’t get into the debate that Bernard started about public providers inherently having advantages over internal providers, both for agency reasons and for reasons of scale – that’s a separate debate, and I think we should have it on a stage somewhere with a live audience (maybe Cloud Expo this fall!). I will, however, state that without a shift to utility-style economic planning and management of IT infrastructure, no one will be able to prove anything either way – much to the detriment of the entire ecosystem.

Thanks guys for broadening the debate, and I look forward to carrying it forward!


6fusion hires former HP Executive to lead Enterprise Accounts

Jacob Woods is the former Chief Cloud Technologist for HP Canada’s Helion Cloud Group. He recently joined 6fusion as Director of Enterprise Accounts and sat down with 6fusion’s Director of Culture to discuss his decision to leave his national leadership role at HP in favor of working for a cloud startup.

Q. Jacob, the first thing that people want to know is why you chose to leave an influential leadership position at HP to join a startup. Can you tell us about that?

Ha, the million-dollar question. When you’ve reached the executive ranks at an organization like HP, with all the trappings that go with it, it’s not an easy decision.

Previous employers aside, I have collaborated as a partner with 6fusion and the core team for several years, so really this was less about leaving HP and more about coming home to a company that I’ve passionately believed in since its inception.

Q. What attracted you to 6fusion?  Do you remember your initial reaction to the 6fusion mission of enabling IT-as-a-Utility?

I do. I was working with a Gartner MQ-leading cloud backup and recovery company and developed the strategic joint GTM with 6fusion to host our software. My initial excitement for this new, first true utility billing for infrastructure really resonated with our clients.

I think the better question is, “what’s kept the attraction years later?” And that’s a company that has continued to disrupt itself before the market could. That type of attitude is refreshing in a space (tech) where multi-million-dollar marketing budgets are at risk of masking reality and breeding complacency.

Q. What will the state of cloud be in the next 5 years? Will a ‘hybrid’ solution be the best practice?

Hybrid cloud will become the status quo, viewed as a best practice that enables organizations to most effectively manage their risk. That said, the transition to get there is going to be hard going. Right now, hybrid is viewed in concept as a great idea, but more often than not there are organizational, non-technical challenges to implementing it, and change is slow in coming. And while a lot of companies, including my previous organization, do a great job with consulting and advising on cloud readiness, application modernization, rationalization, security, and regulatory constraints for cloud migration, the single biggest void in the market is: how do I decide what to move to the cloud, and what’s the best place to move an application to reduce my TCO (or, more accurately, my Total Cost of Consumption, TCC)? It’s that question, and the inability of the massive tech institutions to answer it today, that will ultimately drive a hybrid cloud strategy.

Q. What do you think are the biggest challenges that CTOs in large Enterprises are facing today?

Having had the benefit of connecting with countless CTOs in my career – whether in the Public Sector, Financial Services, Retail, or Oil & Gas (utilities) – every CTO seemed to face the same challenges: “Where do I start with cloud?” “How do I develop my hybrid strategy?”

Once they get past application dependencies, security, licensing, and so on, the biggest challenge they face is, “Where am I going to save the most, or have the biggest impact on my TCC?”

Q. What technology has you the most excited?

While I still think it is two to three years away from being as pervasive as the marketing would suggest it already is, OpenStack – and the enterprises doubling down with their hardened enterprise distributions of it – is going to be a major thorn in the side of institutions like VMware and Oracle.

Aside from that, companies like Apptio (and others) are doing great things in the Enterprise, disrupting territory typically owned by the very large tech firms.

Q. What professional accomplishment are you most proud of?

I’ll let you know when I retire!

Q. When you aren’t revolutionizing cloud computing, what would we find you doing?

Kayaking, tennis, boating, golf, cooking, wine.

About Jacob Woods:

Jacob Woods currently leads 6fusion’s global Enterprise Sales efforts. 6fusion is the pioneer in standardizing the economic measurement of IT infrastructure and cloud services, and providing IT Economic Transparency to the global market.

Prior to serving as 6fusion’s Director of Enterprise Accounts, Jacob led HP Canada’s Helion Cloud Group, as their Chief Technologist. Jacob’s focus was to influence, shape and drive HP’s Helion Cloud Strategy in the Canadian market through technology leadership and innovation in the cloud portfolio. Jacob also led Enterprise Cloud Services Sales Strategy across both Public and Private sectors for Canada.

Jacob’s work experience also includes successful tenures with e-ternity, Asigra, and Modular Data Protection Services. Throughout all his previous roles, including HP, Jacob has been an advocate for 6fusion, working to drive adoption of 6fusion’s global standard for economic measurement and cloud metering technology.

Jacob brings approximately 10 years of experience in cloud consulting & strategy services, disaster recovery and business continuity. He has a strong track record of connecting strategic thinking with technology execution to achieve operational excellence and bottom line results.

Follow him on Twitter @outofthewoodsj



Case study: How a thriving Data Services company built utility metering on containers

By Tim Kuykendall, 6fusion Director of Product Development


6fusion is pioneering the IT-as-a-Utility movement industry-wide. Tackling one of the industry’s most complex challenges means building the data collection and analytics tools to support an entire market. To do this, we rely on a sophisticated workflow that collects platform-specific raw data and, upon transmission to the 6fusion console, centrally translates it into a standard unit of measurement called the Workload Allocation Cube (WAC) for reporting and analysis.

The Challenge:

Given the complexities and volume of data 6fusion meters, it is vital that we stay in touch with the leading cloud technology trends to ensure that we provide metering utilities for as many platforms as possible, with as little development and management overhead as possible. Driven by increasing demand from our Fortune 500 clients, we needed to quickly implement a new meter to support the latest versions of VMware, to better support larger-scale infrastructures, to rely solely on our newly released REST API, and to achieve better modularity of services and isolation of individual processes within our software. This case study focuses on one specific example: our DevOps team rebuilding our flagship meter, the UC6 Meter for VMware.

The Solution:

6fusion quickly elected to utilize Docker technology for the VMware metering project. The team felt that Docker containers stood out as the perfect vehicle for encapsulating individual services in an easy-to-deploy and easy-to-upgrade unit. There were several benefits we discovered in working with Docker:

More control:

Environments vary widely from organization to organization. Operating systems, configurations, and architectures will all depend on the policies, practices, and preferences of those managing the infrastructure. This fact makes it quite terrifying to know that your product will be run in an environment over which you have very little control – by someone who isn’t you! Containers afford us the ability to define exactly the configuration of the platform running within each container. This provides much more confidence that the appliance will run as expected.

Ease of configuration and upgrade:

By starting with a bare-bones instance of CoreOS and adding only the components necessary to run a particular service, we end up with a final product that is much more streamlined. We can also guarantee that once a meter is installed in a customer’s system, it will have all of the services and settings it needs without extensive reconfiguration of the infrastructure. And when an upgrade of a component is necessary, that individual service can be replaced with a newer container without an end-to-end update of the entire appliance.

Faster development:

Certain behaviors of a meter are consistent, regardless of the platform being metered. For example, once a set of machine readings is collected from a hypervisor or OS, it is pushed to the same REST API in the same JSON format. By encapsulating that behavior into its own container it can be reused when developing future meters, avoiding the need to recreate the same code for all subsequent projects. This allows us to focus research and development effort on the details and nuances of the new platform to be metered, and to turn out new products much more quickly.
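That reusable submission behavior might look roughly like the sketch below. The endpoint URL, payload fields, and function name are hypothetical stand-ins, not 6fusion’s published API schema; the point is that every meter reduces its readings to one common JSON shape before pushing them.

```python
import json
from datetime import datetime, timezone

# Hypothetical endpoint template -- 6fusion's real API paths are not reproduced here.
API_ENDPOINT = "https://console.example.com/api/v1/machines/{machine_id}/readings"

def build_payload(machine_id: str, readings: list) -> str:
    """Serialize a batch of per-machine readings into the common JSON shape,
    regardless of which hypervisor or OS produced them."""
    body = {
        "machine_id": machine_id,
        "submitted_at": datetime.now(timezone.utc).isoformat(),
        "readings": readings,  # e.g. [{"metric": "cpu_mhz", "value": 812}]
    }
    return json.dumps(body)

# The same payload builder is reused by every meter; only the collection side differs.
payload = build_payload("vm-42", [{"metric": "cpu_mhz", "value": 812}])
```

Because this behavior lives in its own container, a new meter only has to implement platform-specific collection and can hand its readings to the same submission service unchanged.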

Streamlined testing:

A containerized approach to development means that as individual components reach code-completion, they can be handed off to QA for testing immediately, without waiting for the entire appliance to reach completion. Therefore, any defects found are much easier to identify and isolate to a particular service, allowing a developer to find and address the issue much more quickly. With smaller slices of functionality, the feedback loop between QA and development becomes much shorter.

Better scalability:

When a meter’s performance hits a limit, the bottleneck can be narrowed down to an individual process or processes, which can then be scaled via additional containers of that type rather than by spinning up multiple instances of the entire application. This flexibility allows us to scale only the needed components without adding unnecessary load.

Separation of concerns:

The architecture of a 6fusion meter is broken down into the following logical components:

  • Inventory Collectors – These services connect to the hypervisor or operating system and identify the inventory of workloads currently running on that infrastructure.

  • Metrics Collector – This service connects to the hypervisor or operating system at a 5-minute interval and pulls the individual metrics for the workloads identified by the inventory collectors.

  • Local Data Store – A MongoDB instance is used as a data store to temporarily hold information that is shared among all other services.

  • UC6 Connector – The UC6 connector facilitates communication with the 6fusion console, handling authentication against the 6fusion API and submission of machine metrics to the appropriate endpoints.

  • Missing Readings Collector – If communication with the hypervisor or operating system is impaired and a data point fails to update in the local data store, this component identifies the gap and re-schedules the collection of that data, ensuring the consistency and completeness of the information submitted to the 6fusion console.
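The Missing Readings Collector’s gap detection can be sketched as follows, assuming the fixed 5-minute schedule described above and an illustrative set-based view of the local data store:

```python
INTERVAL = 300  # seconds between scheduled readings (the 5-minute interval)

def missing_timestamps(start: int, end: int, stored: set) -> list:
    """Return every expected reading timestamp in [start, end] that is
    absent from the local data store, so it can be re-queued for collection."""
    expected = range(start, end + 1, INTERVAL)
    return [ts for ts in expected if ts not in stored]

# One reading (t=600) failed to arrive; the collector re-schedules just that gap:
gaps = missing_timestamps(0, 1200, stored={0, 300, 900, 1200})
# gaps == [600]
```

The real component works against MongoDB rather than an in-memory set, but the logic is the same: compare the expected schedule against what actually landed, and re-collect the difference.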

[Figure: UC6 Meter Design – VMware]

This breakdown of the architecture and identification of the logical components made it perfectly clear how containerization would align: each of these processes is encapsulated into its own Docker image, and each image can be instantiated independently of the others.

Docker Hub:

As a development department, we live and breathe version control. The workflow of cloning a repository, writing code against it locally, and then submitting changes back to the central source is second nature to any developer today. In the same way that GitHub facilitates the storing and sharing of code repositories, Docker Hub provides a platform on which developers can publish container images for use by others to address similar needs.

In Summary:

Metering microservices on containers has provided 6fusion the control, flexibility, and scale that we require to better support larger-scale infrastructures and the expectations of our Fortune 500 customers. It has also allowed us to achieve better modularity of services and isolation of individual processes within our software.

6fusion is looking for top development talent that has also drunk the ‘container Kool-Aid’. To learn more, follow us on Twitter @6fusion or email us at info@6fusion.com.

About 6fusion

6fusion is standardizing the economic measurement of IT infrastructure and cloud services, and providing IT economic transparency to the global market.  With 6fusion’s UC6 Platform, organizations can view and manage the Total Cost of Consumption (TCC) of their business services in real time and achieve a higher level of cost optimization, forecasting accuracy and business agility.

6fusion uses a patented single unit of measure of IT infrastructure called the Workload Allocation Cube that provides a common view of IT consumption, agnostic of underlying technology or vendors. 6fusion enables baselining, benchmarking and budgeting of business service consumption across execution venues, and supports dynamic cost optimization strategies that keep pace with the realities of today’s heterogeneous, on-demand world.  For more information visit www.6fusion.com


Beyond the cathedral & the bazaar: OpenShift Commons initiative

Join us at Red Hat Summit, Boston for this live event, June 24th

Diane Mueller – Director, Community Development, Red Hat
Delano Seymour – CTO and Co-Founder, 6fusion
Jonathan Arrance – CTO and Co-Founder, TransCirrus
The success of OpenShift Commons in building out a peer-to-peer network and virtual ‘commons’ space for sharing best practices about OpenShift is introducing a new model for community collaboration. This model can help facilitate more effective, transparent, and useful relationships among participants in different projects.
Today’s reality is one of interdependence and collaboration across open source technologies. When innovative minds working on different open source projects collaborate, the technology wins.
In this session, you’ll learn:
  • How OpenShift was built through collaboration across open source communities, including OpenStack, Heat, Solum, Docker, Google/Kubernetes, Puppet, .Net and others.
  • Why we developed OpenShift Commons, an inclusive ecosystem of contributors, users, and service providers working together to develop the next generation of Platform-as-a-Service.
  • Lessons and best practices for collaborating across open source communities.

Metering Microservices on OpenShift

by Delano Seymour, 6fusion Co-Founder and Chief Technology Officer

As posted to OpenShift community blog: https://blog.openshift.com/metering-microservices-on-openshift/

Containers have become an extremely popular technology and for good reason.  They provide software owners a consistent way to package their software and dependencies, while infrastructure operators benefit from a standard way to deploy and run them. Many developers are using containers to build services in a microservice architectural style, which divides a single application into a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API.

With the upcoming release of Red Hat’s OpenShift 3, infrastructure owners have a platform that can be used to orchestrate and manage microservices at scale. This shift, and the new technologies behind it, give developers very powerful tools to take advantage of what the cloud provides – simpler provisioning, scale, flexibility, and agility (public, private, or hybrid). But containers, with their dynamic usage patterns and deployment platforms, present a problem for the industry: how do software owners track the usage of their services for licensing and billing purposes?

Physical-host-based licensing, per-core licensing, and per-instance licensing don’t make sense in a container world – application owners and developers will be able to completely abstract their software from the underlying hardware and operating systems, and they can scale up and down by the tens of thousands in seconds. Solving these issues will be core to the success of containers and PaaS in the enterprise, and doing so will require a radical departure from traditional models to one that inherently applies utility theory.

One path to solving these issues is to “meter” consumption. Metering fills this gap by tracking the usage of the three major resources consumed by software services – compute, network, and storage – on a per-container basis, and then providing a single metric that can be used for reporting consumption. Metering also provides a means to benchmark, compare, and optimize software costs and, by extension, profit. Further, done right, it can provide visibility into consumption patterns and highlight the main contributors to operating costs.

Before containers, software services were wrapped in a virtual machine, and usage data could only be collected at the machine level. Containers enable more granular metering by providing a wrapper around processes and enabling the collection of usage data at the container level.

Using the meter data collected at the container level, meters can then interrogate the inventory from OpenShift and match the usage data with pods and services, thereby providing microservice metering. Given that each container can be uniquely defined, and each container runs only a single service, you can precisely quantify overall consumption AND consumption by each particular microservice, regardless of the number of container instances of that microservice.
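A minimal sketch of that rollup, with hypothetical container IDs and service names standing in for the inventory OpenShift would actually report:

```python
from collections import defaultdict

# Inventory join: container id -> owning microservice (as discovered from OpenShift).
inventory = {"c1": "auth-svc", "c2": "auth-svc", "c3": "billing-svc"}

# Per-container consumption, already reduced to a single unit of measure.
readings = [("c1", 4.0), ("c2", 6.0), ("c3", 9.5)]

def consumption_by_service(inventory: dict, readings: list) -> dict:
    """Sum per-container units under the microservice each container belongs to."""
    totals = defaultdict(float)
    for container_id, units in readings:
        totals[inventory[container_id]] += units
    return dict(totals)
```

However many container instances a microservice scales out to, its two, ten, or ten thousand containers all roll up under one service total, which is exactly what makes per-unit-of-consumption billing workable.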

By building out a suitable metering architecture, a user can meter consumption of resources on a per-container basis, allowing for unlimited agility and flexibility, and simply bill users on a per-unit-of-consumption basis. This model puts no flexibility limits on the user while still ensuring economic alignment between user and supplier. The user can aggregate the costs into apps to provide cost transparency for each app. Taking this one step further, ISVs can use that information to bill for their software simply by providing a per-unit cost, thus using the metering system to drive commercial billing. The utility is again disrupting the economics of IT. What 6fusion is positing here is that the ability to provide granular metering will have a direct and profound impact on business and pricing models. We want software owners to create services that not only run efficiently, but can be monetized in a simple and convenient manner.

To learn more about Metering Microservices on OpenShift, check out my Red Hat briefing with OpenShift Director of Community Development, Diane Mueller:


Or join us for the 6fusion presentation at Red Hat Summit, Boston: http://www.redhat.com/summit/


About Delano Seymour, 6fusion Co-Founder and Chief Technology Officer

Delano is a founder of 6fusion and co-inventor of 6fusion’s WAC algorithm. He is the principal architect of 6fusion’s UC6 software platform. As Chief Technology Officer, Delano is 6fusion’s technical visionary and responsible for all aspects of the company’s technology development. A gifted engineer and software developer, Delano has served as a Senior IT resource within both public and private sectors. Also a co-founder of a Bermuda Managed Services Provider, he has served as that company’s President and Principal Consultant during its most active growth periods.


