The Modern Problem of Collecting Network Data


Big data isn’t just about a collection of pre-existing information, and it isn’t about throwing digital content into a database to sort through later. It also involves deploying systems that collect new assets: extracting stats, numbers, user patterns and details from a variety of sources. And not just the sources you already know about; it also means adapting the platform to cover and monitor new channels. Finally, it includes taking that data and making it practical through analytics and research.

By all rights, big data refers to the entire field or industry. It is the act of understanding a particular platform and everything that goes into its operation. But you may be wondering: Why is this an important distinction to make? Because it highlights one of the most challenging aspects of the technology — adapting it for use in other areas, such as network management.

How Big Data Changes Network Infrastructure

It’s no secret that advertisers rely heavily on big data to make decisions. Marketing professionals must understand their audience and customer base, and to do that, they need a wealth of detailed information. It makes sense. It also makes sense that other industries are catching on and finding new ways to implement the technology. Retailers, for example, are using data to deliver personalized, targeted product recommendations to shoppers. But these industries are making changes to implement and adapt to big data, not the other way around. With networking and network management, it’s the network itself that does the changing, not the data or the related systems.

How so?

As a direct result of the emergence of big data, IT infrastructure and network properties are changing. Where the hardware previously only needed to support a conventional traffic load, it now also needs to support monitoring and the storage of collected data. In other words, the entire infrastructure takes on more concerns, work and requirements when big data enters the playing field.

An Enterprise Management Associates study made it clear that nearly half (45 percent) of enterprises involved in big data projects experienced a boost in network traffic as a direct result of collecting and storing data. The study also revealed that this shift is forcing network managers to reconsider their infrastructure planning and design processes. In other words, network managers and IT infrastructure professionals must work to meet the requirements of a market change as everything moves to the cloud. Not only do they have to consider all the conventional traffic and usage scenarios, but they also have to factor in the elements of a deployed big data system.

Can the network handle all the incoming traffic? Can it handle heavier workloads and more active processes? Network managers must factor in everything and anything, making the entire process much more involved and demanding.
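To make those questions concrete, here is a minimal, hypothetical sketch of the kind of back-of-the-envelope check a network manager might run before adding collection and monitoring traffic to an existing link. The link capacity, utilization and collection figures are illustrative assumptions, not measurements or benchmarks.

```python
# Back-of-the-envelope capacity check: will an existing link absorb the extra
# traffic generated by a big data collection and monitoring deployment?
# All figures below are hypothetical placeholders, not measurements.

def headroom_check(link_capacity_mbps: float,
                   peak_utilization: float,
                   added_collection_mbps: float,
                   safety_margin: float = 0.8) -> bool:
    """Return True if peak traffic plus the new collection load stays under
    the safety margin (e.g. 80 percent) of the link's capacity."""
    current_peak_mbps = link_capacity_mbps * peak_utilization
    projected_peak_mbps = current_peak_mbps + added_collection_mbps
    return projected_peak_mbps <= link_capacity_mbps * safety_margin


# Example: a 10 Gbps uplink already peaking at 60 percent utilization,
# with an estimated 1.5 Gbps of new telemetry and data-collection traffic.
if __name__ == "__main__":
    ok = headroom_check(link_capacity_mbps=10_000,
                        peak_utilization=0.60,
                        added_collection_mbps=1_500)
    print("Fits within planned headroom" if ok else "Plan an upgrade or offload")
```

A simple margin check like this is only a starting point, but it shows why planning now has to account for the big data system itself, not just conventional user traffic.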

OK, we get it. So, the professionals have to change and do more work, but what about the field of IT infrastructure and networking itself?

Big Data Network Requirements

Big data places more demand on capacity management, network operations, performance and, more importantly, security. The latter is a major concern, especially given the current state of the cybersecurity climate; we don’t need to look far to find recent examples of high-profile attacks. It’s not the added emphasis on these areas that makes things challenging, however. It’s the fact that you must be well-versed in all of them to even consider working with the technology.

Here’s a simpler way to look at it. Marketers can view and analyze their data without having to understand the system behind the collection process; they have an IT team for that. This freedom allows them to take the collected data and results and put them into action faster and more efficiently.

When it comes to networking, IT infrastructure and big data combined, anyone working with the collected information or the system as a whole must be familiar with the entire process. For instance, consider data that shows a serious network failure under heavy loads. Right away, it tells developers, or anyone, really, that something is wrong. But to get to the bottom of that problem, you also need to understand the active system, how it’s being used and how that affects performance. If a system goes down and you don’t have a team that truly understands how to fix the problem, you’re in big trouble. A network failure or significant period of downtime costs nearly $8,000 per minute, and it’s been suggested that as much as 56 percent of businesses in North America don’t have recovery plans in the event of a failure.
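Those downtime figures are easier to appreciate with a quick worked example. The sketch below simply multiplies the roughly $8,000-per-minute cost cited above by a few hypothetical outage lengths; the durations are assumptions chosen purely for illustration.

```python
# Rough downtime cost estimate using the ~$8,000-per-minute figure cited above.
# The outage durations are hypothetical values chosen for illustration.

COST_PER_MINUTE_USD = 8_000  # approximate cost of network downtime per minute


def downtime_cost(outage_minutes: float) -> float:
    """Estimated cost of an outage of the given length, in US dollars."""
    return outage_minutes * COST_PER_MINUTE_USD


if __name__ == "__main__":
    for minutes in (5, 30, 120):  # short blip, half-hour outage, two-hour outage
        print(f"{minutes:>4} minutes of downtime: about ${downtime_cost(minutes):,.0f}")
```

Even a half-hour outage runs well into six figures at that rate, which is why a team that truly understands the system, and a recovery plan, matter so much.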

It’s complicated, and that’s why many companies seek out professionals who understand the full scope of the work they’re involved with. Therein lies the problem of collecting network data and operating these systems: An experienced professional must handle everything related to the network and IT infrastructure in question. Many brands outsource this work because they don’t have the resources to devote to it, nor the time to train an experienced, reliable team.

Contributed by: Kayla Matthews, a technology writer and blogger covering big data topics for websites like Productivity Bytes, CloudTweaks, SandHill and VMblog.