Starting Small with Big Data

In this special guest feature, Tony Fisher of Progress Software makes the case for starting small with your big data initiative: define and manage your processes for success on manageable amounts of data, then add to them incrementally. Tony Fisher is Vice President of Data Collaboration and Integration at Progress Software, a global software company (PRGS) that simplifies the development, deployment and management of business applications. In this role, Tony is responsible for the Progress products that give organizations the ability to access and manage data. Prior to joining Progress, he was President and CEO of DataFlux Corporation. He earned a degree in Computer Science and Mathematics from Duke University.

I have a neighbor who has a fantastic workshop and a wide array of top-of-the-line tools. But his workshop is always a disaster: tools scattered everywhere, debris left over from past jobs, and no sense of order. He had an inspired solution to this problem: build a bigger workshop. Now he has a big workshop that is always a disaster. The same is true for big data. If you had poor data habits before big data, you will have poor data habits after big data, only bigger. Some things you might want to consider:

Big data is only going to get bigger. Today, big data is considered a differentiator; soon, it will be considered a commodity. The more data you have to drive decisions, the more accurate those decisions will be. And there will be more data: social data, machine data, IoT data, consumer data, and so on. In that data lie the secrets to better customer service, increased revenue and optimized operations. So you need to start now to get your data in order. It sounds counterintuitive in a big data conversation, but start small. Define and manage your processes for success on manageable amounts of data, then add to them incrementally.

Find the right talent. There is no point in amassing large amounts of data if you don't have the right people to analyze it in the context of your business. When you find these people, do what you need to do to hang on to them, because they are in high demand and short supply. According to McKinsey & Company, there will be a shortage of around 150,000 big data analytics experts by 2018.

Practice good data governance. Poor data quality will lead to poor data analytics, and haphazardly amassing all your data without providing context will lead to poor decisions. Putting all the data into a big data store doesn't make it fit for purpose for your business. Today, we spend a lot of time and energy validating, enriching and joining data sources so the data is contextually accurate for the business need. Moving toward big data technology doesn't mean you can stop doing this. Big data doesn't aggregate metadata, just data. So what did the data mean when it was created? Who has access to it? How do you track and enforce access without this context? How do you know what the data represents? If your big data is going to be useful, it needs to be orderly and accessible, unlike my neighbor's workshop. A small sketch of what this looks like in practice follows.
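To make "validate, enrich and join" concrete on a manageable amount of data, here is a minimal sketch assuming Python with pandas. The sources, column names and values are hypothetical, invented for illustration rather than drawn from any specific Progress product or big data platform; the point is simply that each record keeps metadata about where it came from as it is cleaned and combined.

# Minimal sketch of the governance steps described above: validate,
# enrich, and join two small data sources while keeping track of where
# each record came from. All names and values here are hypothetical.
import pandas as pd

# Two hypothetical sources, each tagged with metadata about its origin.
customers = pd.DataFrame(
    {"customer_id": [1, 2, 3],
     "email": [" Ann@Example.com ", None, "cy@example.com"]}
)
customers["_source"] = "crm_export"

orders = pd.DataFrame(
    {"customer_id": [1, 1, 3], "order_total": [120.0, 35.5, 80.0]}
)
orders["_source"] = "order_system"

# Validate: drop records missing the fields needed to give them context.
customers = customers.dropna(subset=["customer_id", "email"])
orders = orders.dropna(subset=["customer_id", "order_total"])

# Enrich: normalize email addresses so records match across sources.
customers["email"] = customers["email"].str.strip().str.lower()

# Join: combine the sources so the data is contextually accurate for the
# business need, carrying the source metadata along with the values.
combined = orders.merge(
    customers, on="customer_id", how="left", suffixes=("_order", "_customer")
)
print(combined)

The same pattern scales up: prove out the validation, enrichment and join rules on a small, well-understood slice of data, then apply them incrementally as new sources are added.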

 
