Government Use of Data Analytics: Case Studies

The insideBIGDATA Guide to Data Analytics in Government provides an in-depth overview of the use of data analytics technology in the public sector. It focuses on how data analytics is being used in government settings, with a number of high-profile use case examples; how the Internet of Things is taking a firm hold in helping government agencies collect and find insights in a broadening range of data sources; how government-sponsored healthcare and life sciences are expanding; and how cybersecurity and data analytics together are helping to secure government applications.

To illustrate how rapidly government agencies are adopting the big data technology stack, this section considers a number of use cases that have benefited from these tools. These project profiles also show how big data is steadily merging with traditional HPC architectures. In each case, significant amounts of data are collected and analyzed in pursuit of more streamlined government.

Dubai’s Smart City Initiative

Dell EMC has provided an enterprise hybrid cloud platform to Smart Dubai Government, the technology arm of Dubai's smart city initiative. The agency deployed the platform to establish an IT-as-a-Service infrastructure that will aid the delivery of new services to individuals and organizations. The platform is designed to provide self-service and automation functions, deeper visibility into the performance and capacity usage of infrastructure resources, and a unified backup and recovery system.

The hybrid cloud platform is based on VMware's vCloud Suite along with Dell EMC's VPLEX and Vblock converged infrastructure and ViPR Controller software-defined storage automation. It was built to combine the scalability, speed, and agility of public cloud platforms with the control and security of private systems.

San Diego Supercomputer Center

The San Diego Supercomputer Center (SDSC) was established as one of the nation's first supercomputer centers under a cooperative agreement funded by the National Science Foundation (NSF) in collaboration with UC San Diego. The center opened its doors on November 14, 1985. Today, SDSC provides big data expertise and services to the national research community.

In 2013, SDSC was awarded a $12 million NSF grant to deploy the Comet supercomputer, built with Dell and Intel, as a platform for big data analytics. Powered by the Intel® Xeon® Processor E5-2680 v3, Comet runs parallel software frameworks such as Hadoop and Spark to support research in precision health and medicine, as well as wildfire analysis and clustering of weather patterns, integrating satellite and sensor data to produce fire models in real time.
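
To give a flavor of the clustering workloads mentioned above, the following is a minimal PySpark sketch that groups weather observations into candidate regimes with k-means. The input file and column names are hypothetical; the actual SDSC pipelines are not described in this guide.

    # Minimal sketch, assuming PySpark with MLlib; file and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.clustering import KMeans

    spark = SparkSession.builder.appName("weather-pattern-clustering").getOrCreate()

    # One row per station-hour of sensor readings (hypothetical schema).
    obs = spark.read.csv("weather_observations.csv", header=True, inferSchema=True)

    # Assemble the numeric readings into the feature vector k-means expects.
    features = VectorAssembler(
        inputCols=["temperature", "humidity", "wind_speed", "pressure"],
        outputCol="features",
    ).transform(obs)

    # Cluster the observations into k candidate weather regimes.
    model = KMeans(k=8, seed=42, featuresCol="features").fit(features)
    clustered = model.transform(features)  # adds a 'prediction' column
    clustered.groupBy("prediction").count().show()

    spark.stop()

The same job scales from a workstation to a cluster like Comet by changing the Spark master configuration, which is much of Spark's appeal for this kind of exploratory science.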

National Center for Supercomputing Applications

Established in 1986 as one of the original sites of the National Science Foundation's Supercomputer Centers Program, the National Center for Supercomputing Applications (NCSA) is supported by the state of Illinois, the University of Illinois, the National Science Foundation, and grants from other federal agencies. The NCSA provides computing, data, networking, and visualization resources and services that help scientists, engineers, and scholars at the University of Illinois at Urbana-Champaign and across the country. The organization manages several supercomputing resources, including the iForge HPC cluster based on Dell EMC and Intel technologies. One particularly compelling scientific research project housed in the NCSA building is the Dark Energy Survey (DES), a survey of the southern sky aimed at understanding the accelerating expansion of the universe. The project runs on the iForge cluster and ingests about 1.5 terabytes of data daily.
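
To put that ingest figure in perspective, a quick back-of-the-envelope calculation (in Python, purely for illustration) shows what 1.5 terabytes per day implies for sustained bandwidth and storage growth:

    # Back-of-the-envelope check of the quoted DES ingest rate.
    daily_bytes = 1.5e12              # 1.5 TB/day
    seconds_per_day = 24 * 3600

    sustained_mb_s = daily_bytes / seconds_per_day / 1e6
    yearly_tb = 1.5 * 365

    print(f"sustained ingest: ~{sustained_mb_s:.0f} MB/s")  # ~17 MB/s
    print(f"annual growth:    ~{yearly_tb:.0f} TB/year")    # ~548 TB/year

A steady ~17 MB/s is modest for an HPC interconnect, but accumulating over half a petabyte a year is what drives the data management challenge.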

Tulane University

As part of its rebuilding efforts after Hurricane Katrina, Tulane University partnered with Dell EMC and Intel to build a new HPC cluster to enable the analysis of large sets of scientific data. The cluster powers data analytics in support of scientific research in the life sciences and other fields. For example, the school has numerous oncology research projects that involve statistical analysis of large data sets. Tulane also has researchers studying nanotechnology (the manipulation of matter at the molecular level), work that likewise involves large amounts of data.

Tulane worked with Dell EMC to design a new HPC cluster, dubbed Cypress, consisting of 124 Xeon-based PowerEdge C8220X server nodes connected through the high-density, low-latency Z9500 switch, providing a theoretical peak performance of more than 350 teraflops. Dell EMC also leveraged its relationship with Intel, which in turn worked with Cloudera, the leading Hadoop distribution vendor, allowing Tulane to run Hadoop-based data analytics in an HPC environment.
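
To see where a figure like 350 teraflops comes from, theoretical peak is simply the node count times per-node flops. The per-node configuration below is an assumption for illustration only (dual 10-core 2.8 GHz Xeons plus two roughly 1.2-teraflop coprocessors per node); the guide itself states only the node count and the aggregate figure:

    # Rough peak-performance arithmetic; the per-node configuration is assumed.
    nodes = 124
    cpu_flops = 2 * 10 * 2.8e9 * 8     # sockets * cores * clock * DP flops/cycle
    accel_flops = 2 * 1.208e12         # two ~1.2 TFLOPS coprocessors (assumed)

    peak_tflops = nodes * (cpu_flops + accel_flops) / 1e12
    print(f"theoretical peak: ~{peak_tflops:.0f} TFLOPS")  # ~355 TFLOPS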

Cypress enables Tulane to conduct new scientific research in fields such as epigenetics (the study of the mechanisms that regulate gene activity), cytometry (the measurement of particular subsets of cells within human tissue), primate research, sports-related concussion research, and the mapping of the human brain.

Translational Genomics Research Institute

To advance health through genomic sequencing and personalized medicine, the Translational Genomics Research Institute (TGen) required a robust, scalable high-performance computing environment complemented by powerful data analytics tools for its Dell EMC Cloudera Apache Hadoop platform, accelerated by Intel. For example, a genome map is required before a personalized treatment for neuroblastoma, a cancer of the nerve cells, can be designed for the patient, but conventional genome mapping takes up to six months. Dell EMC helped TGen reduce that time from six months to less than four hours, enabling a biopsy-to-treatment cycle of less than 21 days. TGen is now widely known for comprehensive genome analysis and for the collaboration its efficient, innovative use of technology makes possible, supporting researchers and clinicians in delivering on the promise of personalized medicine for better patient outcomes.
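
The scale of that improvement is worth spelling out. Taking the quoted figures at face value:

    # Speedup implied by the quoted figures (six months vs. under four hours).
    baseline_hours = 6 * 30 * 24   # ~six months, approximated as 180 days
    accelerated_hours = 4

    print(f"speedup: ~{baseline_hours / accelerated_hours:.0f}x")  # ~1080x

A roughly thousand-fold reduction is what moves genome mapping from a research bottleneck into a clinically useful timeframe.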

Summary

Data analytics for government is a rapidly evolving field, offering exciting opportunities that, when explored and applied, can help fragile states uncover powerful and effective methods for optimizing governance. It is also clear that government agencies have an appetite for embracing big data technologies to help transform the public service experience for citizens and employees. Government leaders need to focus on delivering value and adopting these emerging technologies while creating the kind of internal conditions that will inspire employees to embrace change.

If you prefer, the complete insideBIGDATA Guide to Data Analytics in Government is available for download in PDF from the insideBIGDATA White Paper Library, courtesy of Dell EMC.
