The insideBIGDATA IMPACT 50 List for Q3 2023

The team here at insideBIGDATA is deeply committed to keeping a pulse on the big data ecosystem of companies from around the globe. We’re in close contact with the movers and shakers making waves in the technology areas of big data, data science, machine learning, AI and deep learning. Our inbox is filled each day with new announcements, commentaries, and insights about what’s driving the success of our industry, so we’re in a unique position to publish our quarterly IMPACT 50 List.

Video Highlights: Modernize your IBM Mainframe & Netezza With Databricks Lakehouse

In the video presentation below, learn from experts how to architect modern data pipelines to consolidate data from multiple IBM data sources into Databricks Lakehouse, using the state-of-the-art replication technique—Change Data Capture (CDC).
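The presentation covers the vendors' own tooling, but the core idea of Change Data Capture is easy to sketch: read a change log from the source and replay each insert, update, or delete against the target, tracking a checkpoint so replication can resume where it left off. The sketch below is a minimal illustration using SQLite, with invented table and column names; it is not Databricks' or IBM's actual pipeline.

```python
import sqlite3

# Hypothetical source with a change log: each row records an
# operation (INSERT/UPDATE/DELETE) plus the affected row's data.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE change_log (seq INTEGER PRIMARY KEY, op TEXT, id INTEGER, name TEXT)")
src.executemany("INSERT INTO change_log (op, id, name) VALUES (?, ?, ?)", [
    ("INSERT", 1, "alice"),
    ("INSERT", 2, "bob"),
    ("UPDATE", 2, "bobby"),
    ("DELETE", 1, None),
])

# Target table being kept in sync.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

last_seq = 0  # replication checkpoint: resume after the last applied change
for seq, op, row_id, name in src.execute(
        "SELECT seq, op, id, name FROM change_log WHERE seq > ? ORDER BY seq",
        (last_seq,)):
    if op == "DELETE":
        tgt.execute("DELETE FROM customers WHERE id = ?", (row_id,))
    else:
        # INSERT and UPDATE both become an upsert on the target.
        tgt.execute(
            "INSERT INTO customers (id, name) VALUES (?, ?) "
            "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
            (row_id, name))
    last_seq = seq

print(tgt.execute("SELECT id, name FROM customers ORDER BY id").fetchall())
# → [(2, 'bobby')]
```

Replaying the four logged changes leaves only row 2 (inserted, then renamed), since row 1 was inserted and later deleted.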

AI for Legalese

Have you ever signed a lengthy legal contract you didn’t fully read? Or have you ever read a contract you didn’t fully understand? Contract review is a time-consuming and labor-intensive process for everyone concerned — including contract attorneys. Help is on the way. IBM researchers are exploring ways for AI to make tedious tasks like contract review easier, faster, and more accurate.

Accelerating Training for AI Deep Learning Networks with “Chunking”

At the International Conference on Learning Representations on May 6, IBM Research will share a deeper look at how chunk-based accumulation can speed the training of deep learning networks used for artificial intelligence (AI).

Movies, Neural Networks Boost AI Language Skills

When we discuss artificial intelligence (AI), how are machines learning? What kinds of projects feed into greater understanding? For our friends over at IBM, one surprising answer is movies. To build smarter AI systems, IBM researchers are using movie plots and neural networks to explore new ways of enhancing the language understanding capabilities of AI models.

Advancements in Dynamic and Efficient Deep Learning Systems

We’re seeing much hype in the marketplace about the potential of AI, especially with respect to computer vision systems and their ability to accelerate the development of everything from self-driving cars to autonomous robots. To create more dynamic and efficient deep learning systems that don’t compromise accuracy, IBM Research is exploring novel computer vision techniques from both a hardware and software angle.

The Top 3 Industries AI Will Disrupt in 2018

In this contributed article, Neil Sahota, an IBM Master Inventor and World Wide Business Development Leader in the IBM Watson Group, discusses the top 3 industries AI will disrupt in 2018. As AI continues to mature, there are many new and exciting ways companies across all industries can implement the technology. But Neil believes that there are three specific industries — legal, hospitality, and real estate — that will see the most impactful change through AI in the year ahead.

Operationalizing Data Science

In the video presentation below, Joel Horwitz, Vice President, Partnerships, Digital Business Group for IBM Analytics, discusses what it means to “operationalize data science” – in essence, hardening the operations behind running data science platforms.

IBM Introduces New Software to Ease Adoption of AI, Machine Learning and Deep Learning

IBM announced new software to deliver faster time to insight for high performance data analytics (HPDA) workloads, such as Spark, TensorFlow and Caffe, for AI, machine learning and deep learning. Based on the same software that will be deployed for the Department of Energy’s CORAL Supercomputer Project at both Oak Ridge and Lawrence Livermore, IBM will enable new solutions for any enterprise running HPDA workloads.

IBM Expands Watson Data Platform to Help Unleash AI for Professionals

IBM (NYSE: IBM) announced new offerings to its Watson Data Platform, including data cataloging and data refining, which make it easier for developers and data scientists to analyze and prepare enterprise data for AI applications, regardless of its structure or where it resides. By improving data visibility and security, users can now easily connect and share data across public and private cloud environments.