Interview: Bernd Harzog, CEO and Founder of OpsDataStore

I recently caught up with Bernd Harzog, CEO and Founder of OpsDataStore, to discuss why data-driven IT operations are a prevailing theme in big data management. Bernd is responsible for the company's strategy, execution, and financing activities. He founded OpsDataStore because every customer he spoke to still had horrible service quality and capacity utilization problems despite massive investments in purchased or home-grown tools. The core strategic principle underlying OpsDataStore is the belief that the pace and diversity of innovation is so high that no single vendor can possibly keep up, and that a best-of-breed ecosystem anchored by a common high-speed big data back end is therefore the only viable solution to modern management problems.

Daniel – Managing Editor, insideBIGDATA

insideBIGDATA: What is data-driven IT operations?

Bernd Harzog: Data-Driven businesses use big data to improve top line results (revenue) while reducing business costs, thereby driving improvements in market share, revenue growth and profitability. Data-Driven IT Operations means that IT uses its data to improve the top line and bottom line of day-to-day IT operations. The top line of IT is performance, stability, and scalability of the critical services delivered in software to the business, as well as agility in enhancing those services in response to market conditions. The bottom line of IT is the cost of delivering those services. Data-Driven IT Operations allows IT to drive improvements in both simultaneously.

insideBIGDATA: Why is it important for Enterprise IT to embrace and implement data-driven IT operations?

Bernd Harzog: IT is under several conflicting mandates that are impossible to simultaneously achieve without embracing Data-Driven IT Operations:

  • Be agile and responsive in supporting the delivery of new services to the market and in continuously enhancing existing ones.
  • Support the diversity in the application stack driven by developers’ need to rapidly bring new software-based services to market and continually enhance existing ones.
  • Virtualize everything in the infrastructure (compute, networking, storage) in order to drive agility and efficiency.
  • Accomplish all of the above while cutting costs.

Without employing Data-Driven IT Operations to make Data-Driven IT Decisions, it is not possible to achieve all of the objectives above concurrently.

insideBIGDATA: What is the data of IT?

Bernd Harzog: IT’s data consists of the following elements:

  • The end-to-end performance (response time and latency) of every transaction, application, and layer of software and hardware infrastructure that supports these transactions and applications
  • The end-to-end throughput (work done per unit of time) of every transaction, application, and layer of software and hardware infrastructure that supports these transactions and applications
  • The contention for resources (the depth of the queues) across the entire software and hardware stack that supports these transactions and applications
  • The resource utilization of all of the virtual and physical resources involved in supporting these transactions and applications
  • The errors across the entire software and hardware stack supporting these transactions and applications
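The five categories above can be thought of as one record type collected per component of the stack. As a minimal sketch, the field names below are illustrative assumptions, not OpsDataStore's actual schema:

```python
from dataclasses import dataclass, asdict


@dataclass
class OpsMetricSample:
    """One observation of the 'data of IT' for one component.

    Field names are illustrative assumptions, not a vendor schema.
    """
    component: str           # transaction, application, or infrastructure layer
    timestamp: float         # epoch seconds when the sample was taken
    response_time_ms: float  # end-to-end performance (response time / latency)
    throughput_per_s: float  # work done per unit of time
    queue_depth: int         # contention for the resource
    utilization_pct: float   # virtual/physical resource utilization
    error_count: int         # errors observed in the interval


sample = OpsMetricSample(
    component="web-tier/vm-42",
    timestamp=1_700_000_000.0,
    response_time_ms=12.5,
    throughput_per_s=840.0,
    queue_depth=3,
    utilization_pct=67.0,
    error_count=0,
)
print(asdict(sample)["queue_depth"])  # → 3
```

Keeping all five dimensions in one record is what makes later cross-tool correlation possible, rather than leaving each dimension siloed in a separate tool.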

insideBIGDATA: What do you see as the biggest obstacle to achieving actionable insights from operational data?

Bernd Harzog: Today the “data of IT” is locked in the proprietary databases of N tool vendors at each enterprise. Each tool has an owner, who tends to understand the data of their tool, but not the data of the rest of the tools. So the first and biggest obstacle is for someone in the enterprise to rise above the fragmented “tool centric” view of monitoring and capacity data, and start to look at how combinations of this data can benefit the enterprise.

insideBIGDATA: What is the process for implementing Data-Driven IT Operations?

Bernd Harzog: Crawl, walk, run. Start by identifying the most broadly applicable data sets in your enterprise. For enterprises that use VMware vSphere, the data in vCenter about the operation of the virtual and physical resources is an excellent place to start. Then look for the most valuable sets of data in IT. For most enterprises, this is the application and transaction performance data collected by tools like AppDynamics, Dynatrace, ExtraHop, and New Relic. Finally, look for the most valuable gaps in the data to fill. For most organizations, this is data about the behavior of the physical storage arrays and the physical and virtual network.

insideBIGDATA: What common mistakes do you see being made as organizations try to implement this strategy?

Bernd Harzog: There are three common mistakes we regularly see:

  • For organizations that already have a Hadoop-based business data lake, it is tempting to try to reuse that infrastructure for the IT data. This does not work because business data lakes built around Hadoop are built for batch processing, whereas an IT operations data store needs to operate in real time, with data constantly arriving and being made immediately useful in analyzed form.
  • Some organizations try to build their own big data lake for IT Operations. The problem is that this puts the enterprise in the position of having to maintain connectors to the N tools in the enterprise, and also puts the burden on the enterprise to figure out what is related to what before the data is written into the big data back end.
  • Sometimes people focus on just getting the data into a big data back end and forget about making it consumable in whatever commercial tools are used to consume it. There is a real art to making live big data consumable by commercial query and dashboarding tools like Tableau, Qlik, SAS, R, Splunk, and Excel.
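One common way to make such data consumable is to flatten it into a tidy tabular form that tools like those above can ingest directly. This minimal sketch, in which the column names are assumptions, writes metric records to CSV using only the Python standard library:

```python
import csv
import io

# A few metric samples as flat dicts; keys are illustrative assumptions.
samples = [
    {"component": "web-tier", "response_time_ms": 12.5, "error_count": 0},
    {"component": "db-tier", "response_time_ms": 48.1, "error_count": 2},
]

# Emit one row per sample with a header — the flat, tidy shape that
# dashboarding and query tools expect as input.
buf = io.StringIO()
writer = csv.DictWriter(
    buf, fieldnames=["component", "response_time_ms", "error_count"]
)
writer.writeheader()
writer.writerows(samples)
print(buf.getvalue())
```

In practice the hard part is doing this continuously over live, high-volume data rather than over a static extract, but the target shape for the consuming tools is the same.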

insideBIGDATA: What are the benefits of Data-Driven IT Operations?

Bernd Harzog: The benefits are considerable and multi-faceted:

  • The quality of service (response time, throughput, availability) of critical online services can be improved through end-to-end visibility and root cause analysis. This directly drives top line revenue for revenue generating applications and enhances the reputation of IT for all applications
  • Costs can be dramatically cut by using the data to improve utilization levels in the physical and virtual server estate
  • Access to and analysis of IT data can be turned into a self-service exercise in which the consumer of the data builds their own dashboards and reports in the tools of their choice. This is only possible if you use Data-Driven IT Operations to free the data from the tools that produce it. Doing so frees IT from responding to endless requests for one-off chunks of data and gives IT’s constituents faster access to better results.
