Conway: To Meet Big Data Markets, HPC Architectures Need to Balance Out


Over at Scientific Computing, IDC's Steve Conway writes that the rise of high-performance data analysis (HPDA), IDC's term for Big Data workloads running on HPC systems, will require unprecedented memory and I/O capabilities. To address this market opportunity, vendors will need to return to their roots and architect more balanced supercomputing systems. The question is: Have we seen the end of machines optimized for LINPACK?

Storage and data movement can no longer remain secondary considerations. The goal of using available budgets to maximize peak and LINPACK flops ("machoflops") will need to give way over time to a sharper focus on user requirements for sustained performance and time-to-solution. Already, one leading site, NCSA, has declined to submit High-Performance LINPACK numbers for the Top500 rankings. The Top500 list will remain valuable for tracking the census of large systems and the trends affecting them over time, but the shift away from strong compute-centrism will make it even more important to develop more balanced benchmarks for HPC buyers and users.
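The standard roofline model makes Conway's point concrete: attainable performance is the lesser of a machine's peak compute rate and its memory bandwidth times a workload's arithmetic intensity (flops per byte moved). The minimal Python sketch below uses hypothetical machine numbers (PEAK_GFLOPS, PEAK_GBS) and illustrative workload intensities, purely to show the shape of the argument, not to describe any real system.

```python
# Roofline-style sketch: attainable performance is capped by
# min(peak compute, memory bandwidth x arithmetic intensity).
# All numbers below are hypothetical, for illustration only.

PEAK_GFLOPS = 1000.0   # assumed peak compute rate (GFLOP/s)
PEAK_GBS = 100.0       # assumed sustained memory bandwidth (GB/s)

def attainable_gflops(arithmetic_intensity):
    """Roofline bound: flops per byte of memory traffic sets the ceiling."""
    return min(PEAK_GFLOPS, PEAK_GBS * arithmetic_intensity)

# Dense linear algebra (LINPACK-like) reuses data heavily: high intensity.
# Data-analysis kernels (scans, joins, graph traversal) often stream data
# with little reuse: low intensity, so bandwidth, not flops, is the limit.
for name, intensity in [("LINPACK-like (50 flops/byte)", 50.0),
                        ("HPDA-like (0.25 flops/byte)", 0.25)]:
    gf = attainable_gflops(intensity)
    print(f"{name}: bounded at {gf:.1f} GFLOP/s "
          f"({100 * gf / PEAK_GFLOPS:.1f}% of peak)")
```

On these assumed numbers, the LINPACK-like kernel reaches 100% of peak while the data-intensive kernel is capped at 2.5%, which is why a flops-maximizing design can look fast on the Top500 yet starve HPDA workloads that depend on memory and I/O balance.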

Read the Full Story.
