Big Data problems cannot be solved by in-memory computing alone, says Teradata CTO Stephen Brobst.
Speaking to attendees at the company’s analytics event in Melbourne last month, Brobst said: “Anybody who talks about putting all the data in memory and big data in the same sentence has no idea what they’re talking about. From an architectural point of view, accessing data from memory is of course faster than accessing it from a disk drive, but the cost of putting data in memory versus traditional storage is an order of magnitude larger. One of the things you need to look at is how to strike the right balance of performance versus cost.”
Brobst described the experience of T-Mobile, which used Big Data to predict customer churn. T-Mobile applied geospatial analytics, using latitude and longitude, to analyze performance on the carrier’s network. The company looked at the ratio of dropped calls in various locations to predict potential customer losses, then acted by adding new towers to improve service in those locations.
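The dropped-call analysis can be pictured as a simple location aggregation. The sketch below is purely illustrative, not T-Mobile’s actual pipeline: it buckets call records into coarse latitude/longitude grid cells and flags cells whose dropped-call ratio exceeds a threshold. The field names (`lat`, `lon`, `dropped`), the grid size, and the 5% threshold are all assumptions.

```python
# Illustrative sketch only: flag locations with a high dropped-call ratio.
# Field names, cell size, and threshold are assumptions, not T-Mobile's values.
from collections import defaultdict

def dropped_call_hotspots(calls, cell_size=0.1, threshold=0.05):
    """calls: iterable of dicts with 'lat', 'lon', and 'dropped' (bool)."""
    totals = defaultdict(int)
    drops = defaultdict(int)
    for call in calls:
        # Bucket latitude/longitude into a coarse grid cell.
        cell = (round(call["lat"] / cell_size),
                round(call["lon"] / cell_size))
        totals[cell] += 1
        if call["dropped"]:
            drops[cell] += 1
    # Keep only cells whose dropped-call ratio exceeds the threshold.
    return {cell: drops[cell] / totals[cell]
            for cell in totals
            if drops[cell] / totals[cell] > threshold}

calls = [
    {"lat": -37.81, "lon": 144.96, "dropped": True},
    {"lat": -37.81, "lon": 144.96, "dropped": True},
    {"lat": -37.81, "lon": 144.96, "dropped": False},
    {"lat": -33.87, "lon": 151.21, "dropped": False},
]
hotspots = dropped_call_hotspots(calls)
```

With this toy data, only the first location is flagged (two dropped calls out of three); such hotspots would then be candidates for new towers.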
To obtain the needed data, T-Mobile had to analyze three billion call records. Records from a transactional database were processed with the SQL-MapReduce toolkit on the Aster Data data-discovery platform, alongside call center interactions with customers that had been captured as unstructured text fields.
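The discovery step over those unstructured call center notes can be sketched as a simple keyword scan. This is a hedged illustration, not the SQL-MapReduce pipeline the article describes: the record layout, the `flag_at_risk` helper, and the churn-signal phrases are all hypothetical.

```python
# Illustrative sketch only: scan free-text call center notes for phrases
# that suggest churn risk. Keywords and record layout are assumptions.
CHURN_SIGNALS = ("cancel", "dropped call", "switch carrier", "poor coverage")

def flag_at_risk(records):
    """records: iterable of (customer_id, free_text_note) tuples."""
    at_risk = set()
    for customer_id, note in records:
        text = note.lower()
        # Flag the customer if any churn-signal phrase appears in the note.
        if any(signal in text for signal in CHURN_SIGNALS):
            at_risk.add(customer_id)
    return at_risk

records = [
    ("c1", "Customer complained about poor coverage at home"),
    ("c2", "Billing question resolved"),
]
flagged = flag_at_risk(records)  # flags only "c1"
```

In practice such text mining would be expressed as SQL-MapReduce functions running inside the Aster Data platform rather than as standalone application code.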