
Validated Data Supports Accurate Decision Making and Rapid ROI

In this special guest feature, Zahl Limbuwala, Co-founder & Executive Director at Romonet, explores how data center managers can verify the accuracy of their data, strengthen the company's commitment to improving energy efficiency and track ROI on critical infrastructure investments. As co-founder of Romonet, Zahl is deeply passionate about the data center and IT industries. Educated as an engineer in analogue and digital electronics, Zahl spent his early career at Microsoft, Cisco and one of the City of London's first B2B Internet service providers. He was the founding chairman of the BCS Data Centre Specialist Group and a consultant to the EU Code of Conduct for Data Centres. He is a regular keynote speaker at industry events around the world and holds board advisory positions with a number of other European and US-based technology companies.

For many teams, analytics tools have become critical to success. However, businesses often overlook the fact that these solutions are entirely dependent on the quality of the data fed into them. If data is not validated as accurate, an analytics tool cannot deliver trustworthy, useful results, which can create serious issues for the business. This may be a bold statement, but all too often companies fail to realize their business objectives when deploying an analytics solution because of the misconception that analytics can somehow magically solve the problems of poor data.

This applies to all areas within a business, including the data center.

Warehouse or strategic asset?

Given that data centers are the trusted repository for many organizations’ critical information, you would expect them to be first in line to reap the benefits of analytics. In reality, all too often they are overlooked. This issue is becoming increasingly urgent. While companies are running sophisticated analytics programs and interrogating large data sets to extract business value and increase corporate performance, the same principles are not always applied to the physical environment in which the data sets are being housed.

The demands on global data center operations to demonstrate their financial control, operational performance and environmental credentials are growing, especially as CEOs and CIOs look to outsource more and more capacity to Cloud providers. For these stakeholders, accurate analytics enable demonstrable ROI and financial predictability of these large assets.

But, and this is the crucial point, analytics tools only deliver accurate results when the input data is cleansed and validated as correct.

Determining data accuracy

There are many factors that can affect data accuracy. Sensor networks are efficient at collecting data from hundreds of thousands, sometimes millions, of data points across multiple sites. However, they are not fool-proof and on occasion transmit inaccurate data, or no data at all. Sensors also need regular re-calibration, a task that is often overlooked.

So how can the C-suite ensure that the information on which they base mission-critical decisions is reliable and of the best quality? They need confidence that the analytics tools and systems the business uses can clean and validate the raw data before processing and analyzing it.
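To make the idea concrete, here is a minimal sketch of the kind of pre-analysis validation described above. The field names, temperature range and sample readings are purely illustrative assumptions, not any specific vendor's method:

```python
# Illustrative sketch: screening raw data center sensor readings before
# they are fed into analytics. Field names and thresholds are assumptions.

def validate_readings(readings, low=0.0, high=60.0):
    """Split raw temperature readings into valid and suspect sets.

    readings: list of dicts like {"sensor": "rack-01", "temp_c": 24.1}.
    A reading is suspect if the value is missing (sensor sent no data)
    or falls outside the plausible range [low, high] degrees Celsius,
    which may indicate a failed or drifting sensor needing recalibration.
    """
    valid, suspect = [], []
    for r in readings:
        temp = r.get("temp_c")
        if temp is None or not (low <= temp <= high):
            suspect.append(r)  # exclude from analytics; flag for review
        else:
            valid.append(r)
    return valid, suspect

raw = [
    {"sensor": "rack-01", "temp_c": 23.5},
    {"sensor": "rack-02", "temp_c": None},   # transmitted no data
    {"sensor": "rack-03", "temp_c": 180.0},  # implausible spike
]
valid, suspect = validate_readings(raw)
print(len(valid), len(suspect))  # 1 valid reading, 2 flagged
```

Only the validated subset would then flow into the analytics pipeline, while the suspect readings trigger maintenance or recalibration work.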

Specialist certification processes are available and solutions exist that accurately analyze data through a combination of proven methodologies, Artificial Intelligence and Machine Learning. These processes are an insurance policy that senior executives can rely on, not just to verify the accuracy of information for internal purposes, but also to justify vital environmental and CSR claims and to identify energy inefficiencies.

Getting it right from the start

Accurate, validated data can also have a major impact even before a data center is built or selected. It can be used to predict the most suitable design for a new facility, providing insight on the potential performance of the site and if a specific build will meet its business objectives.

The growing adoption of hybrid data center models (combining virtual or private clouds alongside traditional hosted facilities, colocation, SaaS (Software-as-a-Service) and IaaS (Infrastructure-as-a-Service) applications) demands the ability to accurately compare and contrast performance across different approaches. The necessary changes, particularly in relation to IT infrastructure, can be complex and expensive, and weighing up different investment options must be supported by accurate data and analysis.

Once a facility is up and running, operating expenses can be reduced by continually analyzing metered data against predictive models and validating performance against an accurate baseline. This level of data analysis is vital not just for those with analytics responsibility, for business leaders charged with crucial corporate and financial decisions, or for proving ROI, but for data center managers who must accurately predict performance under fluctuating workloads and climate conditions.
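The metered-versus-model comparison above can be sketched very simply. The energy figures and tolerance here are hypothetical assumptions chosen only to show the shape of the check:

```python
# Illustrative sketch: flagging days where metered energy use deviates
# from a predictive-model baseline. Values and tolerance are assumptions.

def flag_deviations(metered_kwh, predicted_kwh, tolerance=0.05):
    """Return indices where actual usage deviates from the model
    baseline by more than `tolerance` (as a fraction of the prediction)."""
    flagged = []
    for i, (actual, predicted) in enumerate(zip(metered_kwh, predicted_kwh)):
        if predicted and abs(actual - predicted) / predicted > tolerance:
            flagged.append(i)
    return flagged

metered   = [1020, 980, 1150, 1005]   # kWh per day, from meters
predicted = [1000, 1000, 1000, 1000]  # model baseline
print(flag_deviations(metered, predicted))  # [2] -> day 2 is 15% over baseline
```

A flagged deviation would prompt investigation: a genuine inefficiency, a change in workload, or a sensor or meter fault that the validation step should catch.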

Put simply, data center managers must have a clear operational understanding of capacity, energy usage and operational costs, but equally the ability to foresee and avert any equipment degradation and failure risks to meet SLAs and maintain peak performance.

Granular visibility delivers far-reaching benefits

The reliance on analytics and data is set to become even greater as organizations extend their use of IoT technologies and increasingly move to flexible hybrid estates. The benefits of new and even 'bigger' data, collected by machines, will have to be balanced by reassurances that it has been checked and validated before being analyzed. This will require smarter machine learning tools and a different approach from that currently taken by many AI platforms. Decisions made with data from analytics tools need to be auditable and fully explainable to satisfy board members, investors and other stakeholders.

Focusing on the quality of these massive data sets will stand companies in good stead as they move forward with projects in the data center and beyond.

