
Tecton Announces Line-Up for First Annual Machine Learning Data Engineering Conference – apply()

Tecton, the enterprise feature store company, announced the line-up for apply(), a virtual conference on data engineering for applied machine learning (ML) that it is hosting April 21 – 22. apply() is a practitioner-focused community event for data and ML teams to discuss the practical data engineering challenges faced when building ML for the real world.

At John Deere, ‘Hard Iron Meets Artificial Intelligence’

Intel and John Deere developed an integrated, end-to-end system of hardware and software that can generate insights in real time, at levels beyond human capability. Using a neural network-based inference engine, the solution logs defects in real time and automatically stops the welding process.

Video Highlights: Unleashing DataOps Keynote

In this keynote presentation from the DataOps Unleashed virtual conference, innovator Kunal Agarwal, CEO of Unravel Data, describes how companies large and small are using DataOps to make their technology stacks hum, get more done at a lower cost, and improve both customer experience and the bottom line.

How Big Data Helps Us Understand Denial of Service (DoS) Attacks

In this special guest feature, Dr. James Stanger, CompTIA Chief Technology Evangelist, highlights how big data can provide insight into DDoS attacks and equip companies with the tools they need to combat this threat effectively. Another important tool for mitigating DDoS attacks is the use of multiple, redundant systems and cloud-based data scrubbing platforms that can filter out DDoS traffic. However, hackers have businesses beat when it comes to the early adoption of big data methodologies.

New to AI Adoption? Don’t Let Data be Your Achilles Heel

In this contributed article, Jeff White, founder and chief executive officer of Gravy Analytics, discusses the realities of big data: no data source is perfect, and despite your best efforts, issues with new technologies like machine learning and AI are bound to occur. By understanding how their underlying data is collected, cleaned, verified, and assembled, organizations can derive maximum value while optimizing internal resources, improving the customer experience, and avoiding costly mistakes along the way.

The Fate of Feature Engineering: No Longer Necessary, or Much Easier?

In this contributed article, editorial consultant Jelani Harper explains that features are the definitive data traits enabling machine learning models to accurately issue predictions and prescriptions. In this respect, they're the foundation of the statistical branch of AI. However, the effort, time, and resources required to engineer those features may be rendered obsolete by simply learning them with graph embeddings, so data scientists are no longer reliant on hard-to-find labeled training data.

The Big Data Dilemma

The Big Data Dilemma made a huge splash at the last Fundata Film Festival – one of the best-kept secrets of both the film and data industries – and earned an Official Selection designation for 2021. Shattering the ‘data-driven’ hype and revealing the coin-flipping truth, The Big Data Dilemma brings together global data anti-vangelists to tell it as it is, or at least as they believe it is.

Is there a More Environmentally Friendly Way to Train Artificial Intelligence?

In this special guest feature, Omri Geller, Co-founder and CEO at Run:AI and an accomplished data scientist, takes a timely and interesting look at one of the most pressing issues facing the computing industry: how machine learning is developed. In order for machine learning (and deep learning) models to accurately make decisions and predictions, they need to be “trained.”

Gaining the Enterprise Edge in AI Products

In this contributed article, Taggart Bonham, Product Manager of Global AI at F5 Networks, discusses GPT-3, the text-generating AI model OpenAI released last June. As seen in the deluge of Twitter demos, GPT-3 works so well that people have generated text-based DevOps pipelines, complex SQL queries, Figma designs, and even code. In the article, Taggart explains how enterprises need to prepare for the AI economy by standardizing their data collection processes across their organizations, as OpenAI did for GPT-3, so the data can then be properly leveraged.

“Above the Trend Line” – Your Industry Rumor Central for 3/30/2021

Above the Trend Line: your industry rumor central is a recurring feature of insideBIGDATA. In this column, we present a variety of short, time-critical news items grouped by category such as M&A activity, people movements, funding news, financial results, industry alignments, customer wins, rumors, and general scuttlebutt floating around the big data, data science, and machine learning industries, including behind-the-scenes anecdotes and curious buzz.