Hazelcast, a leading open source in-memory data grid (IMDG) with hundreds of thousands of installed clusters and over 17 million server starts per month, today launched Hazelcast Jet – a distributed processing engine for big data streams. With Hazelcast’s IMDG providing storage functionality, Hazelcast Jet is a new Apache 2 licensed open source project that performs parallel execution to enable data-intensive applications to operate in near real-time.
Software AG released its top six predictions for Application Integration in 2017. According to David Overos, director of product marketing for integration at Software AG: “Integration is no longer your IT department’s problem; it is everyone’s problem.”
Syncsort, a global leader in Big Iron to Big Data solutions, announced the results from its annual “State of the Mainframe” survey, which underscore the rising importance of mainframe data as a critical component of enterprise-wide strategies that leverage modern data architectures for Big Data analytics.
In this contributed article, Smita Adhikary, Managing Consultant at Big Data Analytics Hires, provides a whirlwind overview of machine learning technology and why it’s important to increasing the value of enterprise data assets.
Sumo Logic Delivers Multi-Tenant SaaS Security Analytics Solution with Integrated Threat Intelligence
Sumo Logic, a leading cloud-native, machine data analytics service, announced the availability of the industry’s first multi-tenant SaaS security analytics solution with integrated threat intelligence. This, coupled with new security apps for monitoring and compliance and a milestone certification for PCI DSS 3.2, demonstrates Sumo Logic’s strong momentum and commitment to providing leading-edge security analytics capabilities and compliance standards to customers.
In this special guest feature, Carla Leibowitz, Head of Strategy and Marketing at Arterys, discusses how deep learning tools can aid physicians in determining a patient’s condition more quickly and accurately and what promise this holds for personalized care.
In this contributed article, Bobbie Kilberg, President and CEO of the Northern Virginia Technology Council (NVTC), discusses the misconception that tech careers are only for technologists, and which areas job seekers should be focusing on. More and more businesses and governments are capturing, analyzing and interpreting huge amounts of data to boost organizational performance, promote new discovery and understanding, enhance decision making, and tackle public policy and societal challenges.
IBM announced IBM Machine Learning, the first cognitive platform for continuously creating, training and deploying a high volume of analytic models in the private cloud at the source of vast corporate data stores. Even using the most advanced techniques, data scientists – in shortest supply among today’s IT skills* – might spend days or weeks […]
Researchers use text mining tools to extract and interpret facts, assertions, and relationships from vast amounts of published information, accelerating the research process. However, despite the many benefits of text mining, researchers face a number of obstacles before they even get a chance to run queries against the larger body of literature. Read on as Michael Iarrobino, Product Manager at Copyright Clearance Center, explains the key challenges for commercial text miners.
This is the first entry in an insideBIGDATA series that explores the intelligent use of big data on an industrial scale. This series, compiled in a complete Guide, also covers the changing data landscape and realizing a scalable data lake, as well as offerings from HPE for big data analytics. The first entry is focused on the recent exponential growth of data.