“Above the Trend Line: machine learning industry rumor central” is a recurring feature of insideBIGDATA. In this column, we present a variety of short, time-critical news items such as people movements, funding news, financial results, industry alignments, rumors and general scuttlebutt floating around the big data, data science and machine learning industries, including behind-the-scenes anecdotes and curious buzz. Our intent is to provide our readers with a one-stop source of late-breaking news to help keep you abreast of this fast-paced ecosystem. We’re working hard on your behalf with our extensive vendor network to give you all the latest happenings. Heard of something yourself? Tell us! Just e-mail me at: daniel
There were plenty of new products, services and solutions announced in the past week, starting with digital experience and Artificial Intelligence (AI) company Conductrics announcing the third major release of its universal optimization platform built expressly for marketers, developers and IT professionals. The new platform, which shares the company’s name, is a cloud-based adaptive testing and decision engine that discovers and applies digital intelligence in real time to deliver the best possible digital experience. The Conductrics 3.0 cloud platform improves personalization and conversion rate optimization (CRO) in part through advancements in machine learning. Additionally, the new platform significantly eases the programmatic requirements of machine learning, giving non-data scientists the automation, intelligence and intuitive interface to produce and implement accurate predictive models within their applications … Hazelcast, a leading open source in-memory data grid (IMDG) with hundreds of thousands of installed clusters and over 16 million server starts per month, and Striim, provider of an end-to-end, real-time data integration and streaming analytics platform, announced the launch of Hazelcast Striim Hot Cache. The integration enables real-time, push-based propagation of changes from the database to the cache. For organizations that manage high volumes of data, Hazelcast Striim Hot Cache ensures continuous synchronization between the cache and its underlying database, providing consistency with the system of record. With Hazelcast Striim Hot Cache, companies can reduce the latency of propagating data from a backend database into the Hazelcast cache to milliseconds.
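The push-based Hot Cache pattern can be sketched in miniature. The following is a generic illustration of applying change-data-capture (CDC) events to an in-memory cache; it is not the Hazelcast or Striim API, and all names in it are hypothetical:

```python
# Minimal sketch of push-based cache synchronization (hypothetical names,
# not the Hazelcast/Striim API): database change events are applied to an
# in-memory cache so reads stay consistent with the system of record.

class HotCache:
    def __init__(self):
        self._store = {}

    def apply_change(self, event):
        """Apply a single change-data-capture event to the cache."""
        op, key = event["op"], event["key"]
        if op in ("INSERT", "UPDATE"):
            self._store[key] = event.get("value")   # refresh entry in place
        elif op == "DELETE":
            self._store.pop(key, None)              # evict deleted rows

    def get(self, key):
        return self._store.get(key)

# Simulated stream of CDC events pushed from the database log
events = [
    {"op": "INSERT", "key": "order:1", "value": {"total": 42}},
    {"op": "UPDATE", "key": "order:1", "value": {"total": 45}},
    {"op": "DELETE", "key": "order:1"},
]

cache = HotCache()
for ev in events:
    cache.apply_change(ev)
```

Because changes are pushed as they occur rather than pulled on a schedule, the cache lags the database only by the propagation delay of each event, which is the source of the millisecond-level latency claim above.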
This gives organizations the flexibility to run multiple applications off a single database, keeping Hazelcast cache refreshes up to date while adhering to low-latency SLAs … Antivia, an industry-leading business intelligence (BI) software company and subsidiary of insightsoftware.com, announced that the latest version of its modern business intelligence platform, DecisionPoint version 4 R11, is generally available and now includes native integration with the Salesforce® platform. This new version of DecisionPoint provides access to more data sources, increases app designer productivity and features a significant internal re-architecture of the calculation engine to strengthen and streamline its core foundation. DecisionPoint is a rapid application creation tool that lets non-programmers turn data into beautiful information applications and interactive dashboards that can be shared with anyone on a web browser, tablet or smartphone, giving business people fingertip access to the information they need to make informed decisions. A highlight of R11 is a native cloud connector that provides access to Salesforce® and Sage Live data from both DecisionPoint Designer and DecisionPoint Server … Tableau shared its intent to enter the data preparation market with a future product offering that is separate and distinct from Tableau Desktop. As a vendor that has been delivering a data prep platform longer than any other, and to more than 40,000 organizations globally, Datawatch is uniquely positioned to offer some advice to Tableau. Datawatch CMO Dan Potter has a unique take:
“First, understand that data preparation is essential for both analytic and operational use cases. It’s not just about delivering data to Tableau BI users. The real value comes when you can prepare data once and deliver it to a wide variety of users. And to do this, you’ll need the cooperation of your key competitors (like Qlik, IBM and Microsoft) to support output to their native BI file formats. You’ll also have to partner with other vendors, including predictive analytics tools, data warehouse vendors, ETL vendors and more. Second, don’t make the cost of data prep a barrier to deploying throughout the organization. Respondents in the last Gartner BI Magic Quadrant survey expressed dissatisfaction with Tableau’s license cost, earning you the dubious distinction of having the second-highest license cost cited as limiting broader BI deployment. Users need the flexibility to license data preparation based on the different use cases, and pricing that matches the value delivered. Third, don’t shy away from dark data. The vast majority of enterprise data is not structured and is locked in a variety of formats. Often it is multi-structured, in formats like PDF reports, web pages, log files, XML and more. Unlocking and blending this data is often essential for providing more complete insights and better predictive models. Finally, welcome to the party, Tableau. Your entry will help further educate the market that there are robust data preparation tools available that can significantly reduce the time and cost to deliver insights throughout the organization.”
In the financial results category, insideBIGDATA learned that Cray Inc. (Nasdaq:CRAY) announced financial results for the third quarter ended September 30, 2016. Revenue for the third quarter of 2016 was $77.5 million, which compares with $191.4 million in the third quarter of 2015. Net loss for the third quarter of 2016 was $23.0 million, or $0.58 per diluted share, compared to net income of $10.9 million, or $0.27 per diluted share in the third quarter of 2015. Non-GAAP net loss was $19.5 million, or $0.49 per diluted share for the third quarter of 2016, compared to non-GAAP net income of $19.5 million, or $0.48 per diluted share for the same period of 2015. Overall gross profit margin on a GAAP and non-GAAP basis for the third quarter of 2016 was 30% and 31%, respectively. For the third quarter of 2015, GAAP and non-GAAP gross profit margin was 34% and 35%, respectively. We also heard commentary about NVIDIA’s latest results:
“NVIDIA had a stellar quarter and the company is hitting on all cylinders. GPUs and Tegra were up a combined 54% YoY, driven by huge increases in gaming, datacenter and automotive with steady progress from pro graphics,” said Patrick Moorhead of Moor Insights & Strategy. “It’s hard to spot a hole right now in their lineup, and as self-driving cars and machine learning become more popular, I believe NVIDIA will just keep improving.”
The big data ecosystem also saw some significant customer wins, starting with supercomputer leader Cray Inc. (Nasdaq: CRAY) announcing that the Department of Defense (DoD) High Performance Computing Modernization Program (HPCMP) has awarded the Company a $26 million contract for a Cray® XC40™ supercomputer and three Cray Sonexion® storage systems. The Cray systems will be located at the U.S. Army Engineer Research and Development Center DoD Supercomputing Resource Center (ERDC DSRC) in Vicksburg, Mississippi. As the research organization of the U.S. Army Corps of Engineers, ERDC conducts R&D in support of the soldier, military installations, and civil works projects, as well as for other federal agencies, state and municipal authorities, and with U.S. industry through innovative work agreements. ERDC will use its Cray XC40 supercomputer and Cray Sonexion storage systems in support of its mission to develop innovative solutions for a safer, better world … MapR Technologies, Inc., provider of the Converged Data Platform, announced that Fishbowl, a leading customer engagement platform provider for the restaurant industry, is using the MapR Converged Data Platform with Apache Drill as the foundation of its multi-tenant software as a service (SaaS) solution for analytic data storage, transformation and interactive end-user analytics interfaces. Fishbowl empowers restaurants to engage in a more intelligent, relevant and effective manner by enabling them to understand their guests’ desires, preferences and needs. More than 70,000 restaurant locations rely on Fishbowl’s platform, experts and strategic partnerships to improve brand preference and amplify same-store sales. With mounting requirements for scalability and the need to support growing varieties and volumes of data along with daily data refreshes, Fishbowl needed a solution to achieve the desired performance levels with zero downtime.
Ingesting and aggregating data from multiple disparate sources, Fishbowl wanted to provide a comprehensive view of restaurant guests to its clients at a lower total cost of ownership, ensuring affordable scalability … MarkLogic Corporation, a leading operational and transactional Enterprise NoSQL database provider, revealed that ABN AMRO Bank N.V., the international bank based in Amsterdam, The Netherlands, has chosen the MarkLogic® database for its Trade Store. ABN AMRO selected MarkLogic because of its ability to integrate complex trade data across asset classes and several trading systems quickly and cost-effectively. ABN AMRO is using MarkLogic to bring vast amounts of unstructured and structured trade data into one central operational trade data store. With a consistent, transparent record of every order and trade event, ABN AMRO is able to comply with internal and external reporting requirements in a fast and flexible manner, now as well as in the future. ABN AMRO needed a database that could provide a fully integrated, consolidated view of data as a single source of truth for reporting purposes, as well as alerts when behavior falls outside normal parameters. MarkLogic also fit the bill because its fast development cycle could support the bank’s timeline. MarkLogic’s bitemporal capability will allow ABN AMRO to minimize risk through “tech time travel”—time-stamping and rewinding trades.
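Bitemporal record-keeping of the sort described stamps each version of a record with the time the system recorded it, so a query can rewind to what was known at any past moment. A generic sketch of the system-time half of the idea (this is not MarkLogic code; all names are hypothetical):

```python
# Generic illustration of system-time versioning (not MarkLogic code):
# each version of a trade carries a system-time stamp, so queries can
# "rewind" to the state of the store as of any past moment.

class BitemporalStore:
    def __init__(self):
        self._versions = {}   # trade_id -> list of (system_time, record)

    def put(self, trade_id, record, system_time):
        self._versions.setdefault(trade_id, []).append((system_time, record))

    def as_of(self, trade_id, system_time):
        """Return the latest version recorded at or before system_time."""
        candidates = [r for t, r in self._versions.get(trade_id, [])
                      if t <= system_time]
        return candidates[-1] if candidates else None

store = BitemporalStore()
store.put("T1", {"qty": 100, "price": 9.5}, system_time=1)
store.put("T1", {"qty": 100, "price": 9.7}, system_time=5)  # amended trade

# "Tech time travel": the store's view at time 3 differs from time 6
assert store.as_of("T1", 3) == {"qty": 100, "price": 9.5}
assert store.as_of("T1", 6) == {"qty": 100, "price": 9.7}
```

A full bitemporal store would track a second axis, valid time (when the trade was actually effective), alongside the system-time axis sketched here, allowing regulators to ask both "what happened" and "what did the bank know, and when."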
In funding news, we learned that Treasure Data, a leading cloud platform to make all data connected, current, and easily accessible, unveiled the close of a $25 million Series C funding round, led by SBI (formerly known as SoftBank Investment) and INCJ (Innovation Network Corporation of Japan) with additional investment from existing investors Scale Venture Partners, Sierra Ventures, AME Cloud Ventures, Dentsu, IT-Farm, Bill Tai and others, bringing the company’s total funding to $54 million. Treasure Data raised this round primarily to support the market introduction of the first Live Data Management platform. With the power to instantly collect and unify data from varied sources across the enterprise, along with intuitive, self-service analytics, Live Data Management opens greater access to real-time data insights to more users throughout an organization.
Partnerships and industry alignments continued marching forward, starting with SnapLogic announcing its new partnership with Snowflake Computing to simplify and accelerate data integration and analytics in the cloud. The partnership will include technology integration and joint go-to-market activities, and aims to help organizations harness all of their data in order to gain new insights, make better decisions and advance business outcomes … KPMG announced that it has joined the Industry Affiliates Program at the Data Science Institute (DSI) at Columbia University. The Institute develops technology to unlock the power of global data to help solve some of society’s most challenging problems, while educating the next generation of data scientists. KPMG data scientists will team with some of the world’s leading faculty and students in data science at DSI to develop innovative solutions for client challenges and promote interdisciplinary research that advances both theory and application … Focused on ensuring business leaders have timely access to mission-critical data to make informed decisions, Rosslyn Data Technologies and Dun & Bradstreet announced that they will partner to provide procurement professionals with self-service insights when, where, and how they need them. The joint offering, available now, uses the RAPid cloud-based platform self-service model to integrate global business data – including line-item transactional details and category classifications into customizable spend analytics – to help procurement professionals better identify and manage ever-changing business risks and market opportunities.
The solution aims to help organizations find greater savings faster and gain deeper insights into their supply base … Impexium, a global provider of membership management technology, announced that it will be offering Zoomdata, developers of the world’s fastest visual analytics platform for big data, to its customer base of local, state, and national associations, non-profits, professional societies and member-based organizations. Zoomdata’s analytics suite will be offered as an App in Impexium’s association management system (AMS).
Black Friday/Cyber Monday is upon us again, and a couple of big data vendors shared their views on how retailers and shoppers can make the most of it:
“As the busiest days of retail – online and in-store – approach, retailers are doing their best to predict what will fly off the shelves and what might not. This requires an analysis of historical sales data paired with real-time information about the sales of thousands of separate stock keeping units (SKUs) across thousands of stores. The data is large and very dynamic. Shoppers aren’t the only people who are busy during the holidays. Data scientists are hard at work on things like dynamic pricing and product promotion. The buying patterns of shoppers will affect the content you see online and how it’s priced. Careful shoppers can play the game and get good deals, but for many, the work by data scientists could hand retailers a win by driving impulse buys and using low-margin items as bait to drive additional high-margin purchases.” — Mike Upchurch, Founder and Chief Operating Officer, Fuzzy Logix.
“While the big focus every year around Black Friday and Cyber Monday has been on how companies can keep their websites and check-out counters up and running with the influx of shoppers, there is huge potential that many retailers struggle to take advantage of: understanding and analyzing the massive amount of data generated from customer purchases and habits. Often this goes unrealized because disparate systems and the diversity of data make it difficult to bring shopping and purchasing information together – to say nothing of incorporating unstructured data (such as that from social media sites) to gauge how shoppers are feeling and responding. Retailers that are able to unify their data and leverage cognitive search and analytics to gain insight will have a definite advantage in hitting their revenue targets – by understanding their customers and tailoring their pricing, messaging, and campaigns for the remainder of the holiday shopping season.” — Jeff Evernham, Director of Consulting, North America at Sinequa.
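The dynamic pricing Upchurch mentions boils down to comparing recent sales velocity against a historical baseline. A toy illustration (the rule and all numbers here are hypothetical, purely to make the idea concrete): nudge a SKU's price up when it sells faster than its baseline, down when it lags, never dropping below a margin floor above cost.

```python
# Toy dynamic-pricing rule (hypothetical, for illustration only):
# adjust price in proportion to how far recent sales velocity deviates
# from the historical baseline, with a minimum-margin floor over cost.

def dynamic_price(base_price, cost, historical_rate, recent_rate,
                  sensitivity=0.1, min_margin=0.05):
    ratio = recent_rate / historical_rate if historical_rate else 1.0
    adjusted = base_price * (1 + sensitivity * (ratio - 1))
    floor = cost * (1 + min_margin)          # never sell below cost + 5%
    return round(max(adjusted, floor), 2)

# A hot SKU selling at twice its baseline gets a modest markup...
assert dynamic_price(20.0, 12.0, historical_rate=10, recent_rate=20) == 22.0
# ...while a slow mover is discounted, but never below the margin floor.
assert dynamic_price(20.0, 12.0, historical_rate=10, recent_rate=1) == 18.2
```

Real retail pricing engines model far more (elasticity, competitor prices, inventory position), but the core feedback loop of real-time signal against historical baseline is the same.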
And finally, our Vendor of the Week is Qlik®, a leader in visual analytics, which announced a strategic partnership with the United Nations to bring the power of data analytics to global humanitarian efforts, improving their efficiency and efficacy. The United Nations, through the Office of Information and Communications Technology (OICT), is leveraging Qlik’s visual analytics platform to create applications that advance United Nations missions by aggregating and presenting information in an easy-to-use way that provides valuable insights for prompt action. The partnership comes as part of Qlik’s Corporate Social Responsibility “Change Our World” program, which provides software and services to make the good work of humanitarian organizations better by making sense of complex information.
Sign up for the free insideBIGDATA newsletter.