Video Highlights: Why Does Observability Matter?

Why does observability matter? Isn’t observability just a fancier word for monitoring? Observability has become a buzzword in the big data space. It’s thrown around so often that it can be easy to forget what it really means. In this video presentation, our friends over at Pepperdata provide some important insights into this technology that’s growing in popularity.

How to Optimize the Modern Data Stack with Enterprise Data Observability

In this sponsored post, our friends over at Acceldata examine how, in their effort to overcome various challenges and optimize for data success, organizations across all stages of the data journey are turning to data observability, which gives them a continuous, comprehensive, and multidimensional view into all enterprise data activity. It’s a critical aspect of optimizing the modern data stack, as we’ll see.

Data Quality Should Keep You Up at Night (But There’s an Antidote to Data-Induced Insomnia)

In this sponsored post, our friends over at Acceldata examine how integrating data observability into your business operations creates the environment and feedback loop needed to improve data quality, at scale, on an ongoing basis. It will also help your enterprise get the most out of the data quality best practices your data team adopts, and it will probably let you get a peaceful night’s sleep as well.

What Is Data Reliability Engineering?

In this contributed article, Kyle Kirwan, CEO and co-founder of Bigeye, discusses Data Reliability Engineering (DRE), the work done to keep data pipelines delivering fresh and high-quality input data to the users and applications that depend on them. The goal of DRE is to allow for iteration on data infrastructure, the logical data model, etc. as quickly as possible, while—and this is the key part—still guaranteeing that the data is usable for the applications that depend on it.
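To make “guaranteeing that the data is usable” a bit more concrete, here is a minimal, hypothetical Python sketch (not from the article) of the kind of freshness and null-rate checks a data reliability practice might automate; the table name, thresholds, and sample numbers are illustrative assumptions.

from datetime import datetime, timedelta, timezone

# Hypothetical metadata for an "orders" table; in practice this would come
# from the warehouse's information schema or a pipeline metadata store.
ORDERS_LAST_LOADED_AT = datetime(2022, 6, 1, 12, 0, tzinfo=timezone.utc)

def is_fresh(last_loaded_at: datetime, max_lag: timedelta) -> bool:
    """True if the table was loaded recently enough to meet its freshness target."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_lag

def null_rate_ok(null_count: int, row_count: int, max_null_rate: float) -> bool:
    """True if the share of NULLs in a column stays within the agreed threshold."""
    return row_count > 0 and (null_count / row_count) <= max_null_rate

# Run the checks before downstream applications consume the data.
checks = {
    "orders is fresh (<= 6h lag)": is_fresh(ORDERS_LAST_LOADED_AT, timedelta(hours=6)),
    "orders.customer_id null rate <= 5%": null_rate_ok(120, 10_000, 0.05),
}
for name, passed in checks.items():
    print(f"{'PASS' if passed else 'FAIL'}: {name}")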

Looking Ahead | Observability Data Management Modernization

In this contributed article, Karen Pieper, VP of engineering at Era Software, discusses how organizations today use real-time data streams to keep up with evolving business requirements. Setting up data pipelines is easy. Handling the errors at each stage of the pipeline and not losing data is hard.
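As a rough illustration of why per-stage error handling is the hard part, here is a small, hypothetical Python sketch (not from the article) of a pipeline stage that routes failing records to a dead-letter list for later replay rather than silently dropping them; the function names and sample records are assumptions for the example.

import json
from typing import Callable, Iterable

def run_stage(records: Iterable[dict], transform: Callable[[dict], dict],
              dead_letter: list) -> list:
    """Apply one pipeline stage, sending failed records to a dead-letter list
    instead of dropping them, so no data is silently lost."""
    out = []
    for record in records:
        try:
            out.append(transform(record))
        except Exception as exc:
            # Preserve the original record and the error so it can be replayed later.
            dead_letter.append({"record": record, "error": str(exc)})
    return out

# Example usage with a hypothetical JSON-parsing stage.
raw = [{"payload": '{"amount": 10}'}, {"payload": "not-json"}]
dead_letter: list = []
parsed = run_stage(raw, lambda r: json.loads(r["payload"]), dead_letter)
print(f"processed={len(parsed)} dead_lettered={len(dead_letter)}")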

Enterprise Strategy Group (ESG) Cloud Observability Survey, Sponsored by Yotascale, Reveals Enterprises are Challenged to Keep Track of Cloud Costs and Need Better Visibility into Cloud Spend

Yotascale, a leader in dynamic cloud cost management, announced the results of a sponsored ESG Observability Survey of IT, DevOps, and AppDev professionals responsible for evaluating, purchasing, managing, and building application infrastructure. Of the 357 professionals surveyed, 64% agree that the adoption of one or more public cloud providers has made observability significantly more difficult; this rises to 74% in the technology industry. The survey results highlight a strong need to gain better visibility into cloud costs while reducing the burden on the DevOps and IT teams responsible for those costs.

Observability: What Does the Future Hold?

In this special guest feature, Abel Gonzalez, Director of Product Marketing at Sumo Logic, lays out where observability is going for the enterprise, explains where we’ve been, and makes the case for why it matters. At the end of the day, it’s critical to connect observability back to the end goal of the business: serving its customers, community, and shareholders. Because that’s really what it’s all about.

How Governing Observability Data is Critical to ESG Success

In this contributed article, Nick Heudecker, Senior Director of Market Strategy at Cribl, discusses how observability data comprises the logs, events, metrics, and traces that make things like security, performance management, and monitoring possible. While often overlooked, governing these data sources is critical in today’s enterprises. The current state of observability data management is, at best, fragmented and ad hoc. By adopting an observability pipeline as a key component in your observability infrastructure, you can centralize your governance efforts while remaining agile in the face of constant change.