Data Virtualization Goes Mainstream


In this special guest feature, Suresh Chandrasekaran of Denodo Technologies examines the rise in popularity of data virtualization and the important trends contributing to its growth. Suresh Chandrasekaran is Senior Vice President, North America at Denodo Technologies, and is responsible for global strategic marketing and growth.

As data virtualization matures into an established technology, several trends are contributing to its growth. The main business driver is clear: business agility in the face of complexity and massive volumes of available data. Businesses that can leverage data as it arrives, not only for analytics but also for quick insight into change, are the ones that will thrive and succeed. Aided by several complementary technology trends and easier access to world-class tools, data virtualization is becoming an essential capability and a part of the modern data architecture at any organization looking to gain more from its data assets.

At its core, data virtualization is a powerful technology that provides abstraction and decoupling between physical systems and locations on one side and logical data needs on the other. As a result, it exposes disparate data as abstracted, virtual data services that make it easy for business users to discover, explore and consume canonical data assets for business intelligence, analytics, single-view applications and other use cases. Data virtualization includes tools for semantic integration of data ranging from structured to highly unstructured, and enables intelligent caching, including in-memory caching and selective persistence, to balance source and application performance.
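To make the abstraction idea concrete, here is a minimal, purely illustrative Python sketch: two heterogeneous sources (a SQL table and a flat-file feed) are exposed through one canonical, cached "virtual view". All names and the caching strategy are assumptions for illustration, not any vendor's API.

```python
# Illustrative sketch of data virtualization's core idea: one logical
# view decoupled from two physical sources, with selective caching.
import csv
import io
import sqlite3
from functools import lru_cache

# Source 1: an operational SQL database (simulated in-memory).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?)",
               [(1, "Acme"), (2, "Globex")])

# Source 2: a flat-file extract from another system (hypothetical feed).
CSV_FEED = "id,region\n1,EMEA\n2,AMER\n"

@lru_cache(maxsize=1)  # selective caching: consumers hit the cache, not the sources
def customer_view():
    """Canonical virtual view joining both sources on customer id."""
    regions = {int(row["id"]): row["region"]
               for row in csv.DictReader(io.StringIO(CSV_FEED))}
    rows = db.execute("SELECT id, name FROM customers").fetchall()
    return [{"id": i, "name": n, "region": regions.get(i)} for i, n in rows]
```

Consumers call `customer_view()` without knowing where the data physically lives; swapping a source (say, the CSV feed for a REST endpoint) would not change the view's contract.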

This year, the most common trends that are driving organizations to adopt complete data virtualization platforms to meet their ongoing data needs include:

  • Separation of Logical/Physical Drives Disruptive Cooperation: Disruptive cooperation is the idea that solutions should challenge the way things are done while continuing to leverage the existing heterogeneous IT infrastructure. This appeals to CFOs and CIOs managing a myriad of current and future platforms, on-premises and in the cloud. Data virtualization separates logical from physical integration so that hybrid techniques (real-time, cache and batch) can leverage advances in real-time push-down query optimization, in-memory caching and selective ETL, reducing the wasteful replication of traditional ETL.
  • Analytical and Operational Silos Re-Unite in Software Solutions: The primary goal of new business intelligence and analytics initiatives (including big data and predictive analytics) is to make that intelligence actionable through systems of engagement with customers, partners, employees, operators and others. And because the cycle of business moves so fast, the traditional lines between informational and operational IT platforms have started to blur. You see this at all levels of the stack, from in-memory data stores that can analyze operational data on the fly, to analytics embedded in application UIs. Likewise, at the middle tier, data virtualization provides unified canonical virtual data services that can support analytical, operational, cloud and data management use cases.
  • Future Capability of a Unified Data Layer with Current ROI: CIOs are seeking solutions that not only address current, well-defined problems with good ROI, but also create new capabilities to tackle unforeseen ones. Data virtualization does exactly this: it starts with a few common virtual data services built to support a particular project, say agile business intelligence and reporting. Later, those same canonical business views can be re-exposed as RESTful data services to enable a future cloud or mobile application. Organizations are recognizing the value of reusing a common data layer and are moving architecturally in that direction, while ensuring they deliver tactical ROI on the initial projects first.
  • Turning Data into “Information as an Asset” Moves to the C-Suite: Today a nexus of forces is driving convergence among top executives as they seek to maximize the value and use of information for the business, from the strategic to the utterly tactical. The CEO and COO want to exploit new opportunities, reduce costs, improve customer service and value, and reduce risk. The CFO is tired of paying for data assets and not seeing them used to their fullest potential; CFOs are now measuring usage and pushing for a demonstrable return on data asset investments. Finally, CIOs have found themselves on the defensive, largely because they have been unable to keep up with the business, and are seeking to become strategic again by delivering ahead of the business’ needs.
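The reuse pattern described in the list above, where a canonical view built for one BI project is later re-exposed as a RESTful data service for a new consumer, can be sketched as a tiny WSGI application. The view contents and the `/customers` route are hypothetical examples, not a specific product's interface.

```python
# Illustrative sketch: re-exposing an existing canonical view as a
# RESTful JSON service, without changing the view itself.
import json

def customer_view():
    """Stand-in for a canonical business view built for an earlier BI project."""
    return [{"id": 1, "name": "Acme", "region": "EMEA"}]

def app(environ, start_response):
    """Minimal WSGI app serving the same view at /customers as JSON."""
    if environ.get("PATH_INFO") == "/customers":
        body = json.dumps(customer_view()).encode("utf-8")
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

Because the REST layer only wraps the view, the BI consumers and the new cloud or mobile consumers share one definition of the data, which is the "unified data layer" payoff the article describes.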

 
