Can High Energy Physics Change How We Do Big Data?

Dr. Frank Wuerthwein

Over at the Memories of a Product Manager blog, Dr. Frank Wuerthwein discusses his upcoming talk at the ISC Big Data’13 conference in Heidelberg.

Wuerthwein teaches at the University of California, San Diego, and is currently “developing, deploying, and now operating a worldwide distributed computing system for high throughput computing with large data volumes” for the Large Hadron Collider at CERN. Today, “large” data volumes are measured in Petabytes. By 2020, he expects this to grow to Exabytes.

“I want maximally broad exposure, so that I have the maximum number of avenues through which people can engage with me. If I talk only about Dynamic Data Centers, I could miss out on ten other conversations worth having. For example, we have this dichotomy between structured and unstructured data. At one extreme you have Oracle-like structured data; at the other, completely unstructured data, where you don’t even know what to look for until you search for a specific purpose. I want to position my hundreds of Petabytes of data from particle physics on this continuum; I don’t see it as an either/or. There is a lot of grey in between.”

Read the Full Story.
