IBM: The Optimal Storage Platform for Big Data
The Integration of Big Data into Business Activities

There is a lot of hype around the potential of big data, and that hype stems from very real enthusiasm for its potential applications. Organizations hope to achieve new innovations in products and services, with big data and analytics driving more concrete insights about their customers and their own business operations. They want to move from a world where decisions are made by the “highest ranked person in the room” to one where they are made by the “most accurate understanding of the situation.” For many, achieving this objective requires both cultural changes and technology changes.
Organizations are investing in these changes. ESG’s 2015 IT Spending Intentions Survey explored investment priorities across all of IT and found that the combined field of big data, business intelligence, and analytics was one of the fastest growing segments. Contrary to pessimists who claim the hype isn’t sustainable, not only did enterprise and mid-market respondents indicate they were continuing to focus on analytics, but the percentage of companies increasing spending was accelerating compared with the previous year.

This spending is being distributed across a range of data-oriented technologies, including traditional relational databases and data warehouses; newer data platforms like Hadoop and NoSQL databases; advanced analytics; and intuitive visualization and reporting tools. Much of this growing investment is going to the foundational infrastructure required to support these diverse analytics approaches. Interestingly, just as there is much debate about the many choices for software, there is a lack of consensus on how to build the ideal underlying storage environment. Today, there is a wide spectrum of options in architectural models and components, including commodity versus purpose-built, on-premises versus cloud, open versus proprietary, and dedicated versus shared. Each of these choices can have a significantly different impact on the overall capabilities of the composite solution.
The sheer scope of big data is driving increasingly demanding requirements. At many customers, big data is now approaching the extremes of traditional high-performance computing (HPC), a space that used to be the domain of only well-funded advanced research labs and government data centers. Many of the lessons learned in those extreme environments carry over into the broader big data world, and large enterprises are discovering that they too need to perform analytics at massive scale to achieve their goals.