Why Your Database Must Be Operational and Transactional


In this special guest feature, Sinan Baskan, Solutions Director, CTO Solutions at MarkLogic, discusses why your database must be operational and transactional, but probably isn’t. Sinan is the Solutions Director for Financial Services in the CTO Solutions Group based in New York City. In this role, he currently focuses on business development, alliances and go-to-market strategy, with over 20 years’ experience in capital markets and banking. Previously, Sinan served as the Head of Global Capital Markets at SAP, Head of Risk IT, Americas at HSBC Corporate and Investment Bank, as well as Director in the Risk Analytics Practice at KPMG. He holds Master of Science degrees in Mechanical Engineering and Systems Engineering from Lehigh University and Columbia University, respectively, as well as an MBA from Columbia University Graduate School of Business.

Now that social media and Web 2.0 technologies are ubiquitous, we are generating information orders of magnitude greater in volume, and faster, than ever before, and much of it sits outside the transaction processing systems most organizations rely on. More and more of this information is proving to be quite relevant to the bottom line. The examples below span retail and finance, although there are plenty more in healthcare, government, media, and other sectors:

  • A retail company would like to track fashion trends on social media in order to fine-tune design and to optimize supplier delivery schedules and marketing campaigns by generation, geography and season. In fact, H&M’s claim to fame is that it has mastered precisely this kind of harvesting of behavioral patterns from social media.
  • A consumer bank would like to track credit card transactions for travel and recreational spending patterns, while an investment advisor would like to correlate investment choices to clients’ key life events, such as a wedding or college enrollment. Though this information can be collected over time through other means, a NoSQL platform acting as an aggregator and discovery point captures both transactional and operational information continuously from the data streams (see the sketch after this list). Customizing and offering a better service package to customers improves per-account profitability at a lower cost.
  • A prime brokerage would like to integrate operational and transactional information so that sentiment analysis from social media and newsfeeds may drive trading models and strategies for their clients. The same degree of integration between trading transactions and trader communities (i.e. internal online chat programs) might help with fraud detection and “bad actor” surveillance.
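To make the aggregation idea concrete, here is a minimal Python sketch of a schema-agnostic document store acting as an aggregator and discovery point. The in-memory store, the field names and the tag scheme are all hypothetical stand-ins; a real NoSQL platform would supply its own ingestion APIs, indexing and search.

```python
# Minimal, illustrative sketch of a document store used as an aggregator and
# discovery point for mixed transactional and operational records.
# The in-memory "store" and all field names are hypothetical stand-ins for a
# schema-agnostic NoSQL database.
from collections import defaultdict

store = []  # documents are kept "as-is", with no shared schema

def ingest(doc, source, tags):
    """Capture a record off a data stream, preserving its original shape."""
    store.append({"body": doc, "source": source, "tags": set(tags)})

# Transactional stream: card payments
ingest({"card": "4532-XXXX", "amount": 850.0, "merchant_category": "airlines"},
       source="cards", tags={"customer:123", "travel"})

# Operational stream: CRM note about a key life event
ingest({"note": "Client mentioned daughter starting college in the fall"},
       source="crm", tags={"customer:123", "life-event:college"})

# Discovery: group everything known about a customer across both streams
by_customer = defaultdict(list)
for doc in store:
    for tag in doc["tags"]:
        if tag.startswith("customer:"):
            by_customer[tag].append((doc["source"], doc["body"]))

print(by_customer["customer:123"])
```

The point of the sketch is that records from different streams keep their original shape, while lightweight tags added at ingestion are enough to pull together everything known about a single customer.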

None of this information would have been captured through the traditional channels. Now that the new data is available and relatively cost-effective to harvest, the imperative is to determine how best to incorporate it into the corporate information architecture and leverage it to improve business performance.

Full integration of all data is necessary in order to, for instance, manage the heterogeneity of inbound data, monitor both operational and financial metrics, and attribute P&L results by category in an accurate and timely manner. Integration should also enable discovery, aggregation and analytics against a holistic view of the universe of transactions, consumer behavior, economic indicators and capital budgeting.

In this new environment, data management platforms that are fundamentally different in design and features, generally categorized as NoSQL (not-only-SQL) databases, are proving up to the task.

These new databases can ingest both transaction flows and unstructured data “as-is,” use a built-in search engine, and deliver rapid discovery capability. In addition, integrating data from disparate sources into an analytical context in a timely manner improves transparency and business agility. Transaction compatibility, in the ACID sense, maintains the transaction as a unit of work and facilitates auditing and reconstruction of the pedigree of transactional data. Bi-temporal features further enhance data governance and the monitoring of time-dependent events. Tagging data at ingestion makes it possible to define relationships and correlations across different data streams. None of this would be possible with relational databases, SQL and traditional ETL-based data movement.
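As an illustration of the bi-temporal point, the following Python sketch keeps both valid time (when a fact was true in the business) and system time (when the database learned of it) for each record version. The trade record and field names are hypothetical; real bi-temporal platforms expose this capability natively, but the reconstruction logic is the same in spirit.

```python
# Minimal sketch of bi-temporal record keeping: each version carries a valid
# time and a system time, so earlier states of knowledge can be reconstructed.
# All records and field names here are invented for illustration.
from datetime import datetime

versions = [
    # A trade is booked on Jan 5 with a notional that is later corrected.
    {"trade_id": "T-1", "notional": 1_000_000,
     "valid_from": datetime(2024, 1, 5), "system_from": datetime(2024, 1, 5)},
    {"trade_id": "T-1", "notional": 1_200_000,   # correction entered Jan 9
     "valid_from": datetime(2024, 1, 5), "system_from": datetime(2024, 1, 9)},
]

def as_of(trade_id, valid_time, system_time):
    """Reconstruct what the database believed at system_time about valid_time."""
    candidates = [v for v in versions
                  if v["trade_id"] == trade_id
                  and v["valid_from"] <= valid_time
                  and v["system_from"] <= system_time]
    return max(candidates, key=lambda v: v["system_from"], default=None)

# Audit view: what did we believe on Jan 6 versus Jan 10?
print(as_of("T-1", datetime(2024, 1, 5), datetime(2024, 1, 6))["notional"])   # 1000000
print(as_of("T-1", datetime(2024, 1, 5), datetime(2024, 1, 10))["notional"])  # 1200000
```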

As data is captured in a schema-agnostic, “as-is” database, the use of semantics and defined ontologies enables a different kind of analytics, one that exploits the discovery of patterns and relationships that would not be possible with purely mathematical/statistical methods or OLAP/MOLAP solutions.
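The following toy Python sketch illustrates the idea, assuming facts are stored as subject-predicate-object triples alongside a small, hand-written ontology; the entities and class names are invented. A production system would use RDF, SPARQL and a reasoner rather than this simple inference loop, but the principle is the same: the class hierarchy lets a query find “customers linked to a life event” even though no ingested record uses that term.

```python
# Minimal sketch of semantics-driven discovery over subject-predicate-object
# triples plus a small, hand-written ontology. All data here is invented.
triples = {
    # ontology: class hierarchy
    ("CollegeEnrollment", "subclass_of", "LifeEvent"),
    ("Wedding", "subclass_of", "LifeEvent"),
    # instance data drawn from ingested documents
    ("event42", "type", "CollegeEnrollment"),
    ("event42", "involves", "customer123"),
    ("customer123", "holds", "brokerage_account_9"),
}

def types_of(entity):
    """Infer all classes of an entity, following subclass_of transitively."""
    found = {o for s, p, o in triples if s == entity and p == "type"}
    frontier = set(found)
    while frontier:
        parents = {o for s, p, o in triples
                   if p == "subclass_of" and s in frontier and o not in found}
        found |= parents
        frontier = parents
    return found

# Discovery: customers linked to any life event, even though no single
# record says "LifeEvent" explicitly.
for s, p, o in triples:
    if p == "involves" and "LifeEvent" in types_of(s):
        print(o, "is linked to life event", s)
```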

Mature, enterprise-scale data center capabilities should also include scalability on commodity servers, tiered storage, encryption of data, a tamper-proof platform with high availability, and designed-for-the-cloud features. This range of deployment options delivers added flexibility to the customer.

Integration of operational data flows and transactions is achievable with state-of-the-art NoSQL platforms, and there is a real business imperative to reduce and, to the extent feasible, eliminate data silos and incompatible data technologies.

 
