The Future of Federal Agency Data Access, Integration, Sharing and Security


In this special guest feature, Gavin Robertson, CTO and SVP of WhamTech, discusses the future of data federation and protection technology within federal agencies. These new technologies could have a profound positive impact on the effectiveness and efficiency of future government programs, with significant benefits including cost savings, accelerated implementations, improved and new capabilities, better-leveraged existing IT investments, improved data security, flexible future options and increased stakeholder satisfaction. After earning a BSc (Hons) in Chemical and Process Engineering and an MEng in Petroleum Engineering, both at Heriot-Watt University, Edinburgh, Scotland, Gavin spent more than 15 years in the domestic U.S. and international oil and gas industry, often working on data-related projects. After a period as a consultant to WhamTech, Gavin joined as CTO and Senior VP in 1999, responsible for product design and development, and technical sales and marketing.

In today’s digital world, large volumes of data have amassed in individual agencies, creating silos and posing a critical problem: how to effectively and efficiently manage and leverage this data for day-to-day operations, provide high-performance access to high-quality data, maximize insight to improve operations, and drive and optimize strategies. Compounding the problem, there is a need to access and exchange data within and across federal, state and local agencies at multiple levels, and to integrate data from non-government, commercial and public sources.

There are three well-tried conventional approaches to sharing and exchanging data within and across government agencies: 1) copying data from multiple sources to centralized data warehouses, 2) federating queries across multiple data sources, and 3) using unstructured text search. However, each of these approaches has proven unsuccessful, both from specific technical points of view and from more general cultural and resource points of view. The latter is because (a) many data source owners are reluctant to send, or are prohibited from sending, data or certain types of data to federal agencies, (b) the onus has always been on the data source owners to provide clean, high-quality data without necessarily having the resources and tools to do so, and (c) distrust arises because “sharing” data and information is seen as a one-way arrangement: federal agencies will not share back with state and local agencies and, in many cases, take credit for the data and information provided. Each of these approaches leaves government agencies continually searching for better solutions to this long-standing and increasingly serious problem.

There are new, innovative technologies available that combine the best and overcome the worst of the three conventional approaches mentioned above. These technologies can be highly flexible, with a hybrid approach that addresses fundamental data management processes upfront: data discovery, profiling, security, quality, standardization, aggregation, calculations and joins, master data management, relationship mapping/graph database, data source monitoring, event processing and more. The result is a virtual data management layer that is transparent to both applications and data sources, and that complements and leverages existing IT infrastructure, systems, tools and applications.
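As a rough illustration of what addressing data management fundamentals “upfront” can mean in practice, the sketch below uses Python with hypothetical field names and rules (not any vendor’s actual API): incoming source records are profiled, a field is standardized and quality issues are flagged before the data would be exposed through such a virtual layer.

```python
# A minimal sketch (hypothetical field names and rules) of an upfront
# data-management step: profile incoming source records, standardize
# values and flag quality issues before exposing the data.
from dataclasses import dataclass


@dataclass
class QualityReport:
    total: int = 0
    missing_ssn: int = 0
    standardized_states: int = 0


# Illustrative standardization rule: map free-text state names to codes.
STATE_CODES = {"texas": "TX", "tx": "TX", "virginia": "VA", "va": "VA"}


def profile_and_standardize(records):
    """Return cleaned records plus a simple profiling report."""
    report = QualityReport()
    cleaned = []
    for rec in records:
        report.total += 1
        out = dict(rec)
        # Standardize state values to two-letter codes.
        state = str(out.get("state", "")).strip().lower()
        if state in STATE_CODES:
            out["state"] = STATE_CODES[state]
            report.standardized_states += 1
        # Flag missing identifiers rather than silently dropping the record.
        if not out.get("ssn"):
            report.missing_ssn += 1
            out["quality_flags"] = ["missing_ssn"]
        cleaned.append(out)
    return cleaned, report


if __name__ == "__main__":
    sample = [
        {"name": "J. Doe", "state": "texas", "ssn": ""},
        {"name": "A. Roe", "state": "VA", "ssn": "123-45-6789"},
    ]
    rows, rpt = profile_and_standardize(sample)
    print(rows)
    print(rpt)
```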

Future technologies like these can leave data in sources across the various federal, state and local agencies, build and maintain adapters with indexes that reside behind an agency’s firewall, and execute queries on those indexes from within the agency and, with permissions, from other agencies, without interfering with or slowing down local data sources. Only results data is read from the sources. These technologies also offer options to use either copied raw data or indexed/cleaned data as a source, whether stored locally behind an agency’s firewall or centrally, for example in a data center. Depending on the configuration, they could be seen as a virtual data warehouse or an advanced virtual or physical data lake.
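The sketch below (Python, with hypothetical classes rather than any vendor’s API) illustrates this query pattern under those assumptions: each agency keeps an adapter and a simple index alongside its own data, a federated query runs against the indexes with a permission check, and only the matching result rows are read back from the sources.

```python
# A minimal sketch of federated queries over locally held indexes:
# the raw source stays behind each agency's firewall, and only the
# matching result rows leave it.
class AgencyAdapter:
    """Adapter and index live with the agency; the source is never copied out."""

    def __init__(self, name, source_rows):
        self._name = name
        self._source = source_rows          # stays behind the firewall
        # Build a simple inverted index on last name (illustrative only).
        self._index = {}
        for row_id, row in enumerate(source_rows):
            self._index.setdefault(row["last_name"].lower(), []).append(row_id)

    def query(self, last_name, caller_permissions):
        # Enforce permissions locally before touching the source.
        if "read" not in caller_permissions:
            return []
        row_ids = self._index.get(last_name.lower(), [])
        # Only the matching result rows are read from the source.
        return [{"agency": self._name, **self._source[i]} for i in row_ids]


def federated_query(adapters, last_name, caller_permissions):
    """Fan the query out to each agency's adapter and merge the results."""
    results = []
    for adapter in adapters:
        results.extend(adapter.query(last_name, caller_permissions))
    return results


if __name__ == "__main__":
    federal = AgencyAdapter("federal", [{"last_name": "Doe", "case": "F-17"}])
    state = AgencyAdapter("state", [{"last_name": "Doe", "license": "TX-99"}])
    print(federated_query([federal, state], "doe", {"read"}))
```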

Several key features of these future technologies stand out: seamless and automatic integration of, and updates to, all-important master data, providing virtual single-person and other entity views of all data. Standards are used throughout, including standard data views such as the National Information Exchange Model (NIEM) and healthcare’s HL7 and FHIR REST APIs; standard query language (SQL) and other query languages, some with conversion; and standard drivers and web/data services, supporting almost any standard application, including operational, CRM, BPM, reporting, BI and analytics. Additionally, they can support true interoperability with write-back to data sources, a virtual data warehouse, event processing and new workflows that allow smartphone apps to run against legacy data sources.
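As a rough illustration of the “virtual single person view” idea, the sketch below uses Python’s sqlite3 module purely as a stand-in for a standard SQL driver: records held in separate source tables are exposed through a single SQL view keyed on a hypothetical master person identifier, so applications query the view rather than the individual sources.

```python
# A minimal sketch of a single-person view exposed over standard SQL.
# sqlite3 is used only as a stand-in for a standard driver; the table
# names and the master_id key are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE agency_a_persons (master_id TEXT, name TEXT, dob TEXT);
    CREATE TABLE agency_b_benefits (master_id TEXT, program TEXT);
    INSERT INTO agency_a_persons VALUES ('P-001', 'Jane Doe', '1980-04-02');
    INSERT INTO agency_b_benefits VALUES ('P-001', 'SNAP');

    -- The "single person view": applications query this, not the sources.
    CREATE VIEW person_view AS
    SELECT p.master_id, p.name, p.dob, b.program
    FROM agency_a_persons p
    LEFT JOIN agency_b_benefits b ON b.master_id = p.master_id;
""")

for row in conn.execute(
    "SELECT * FROM person_view WHERE master_id = ?", ("P-001",)
):
    print(row)
```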

Improving operations/interoperability, analytics and data security are the three main IT objectives of almost any organization, including government agencies. Leaving data in sources, tending toward real-time and addressing data management fundamentals upfront, while still enabling high-performance data warehouse capabilities, goes a long way toward realizing these objectives. Many past efforts to address data access, integration and sharing have failed to live up to expectations; solutions are needed that address technical, legal and cultural hurdles. Data analysts and scientists are both scarce and expensive, and they are not spending their time in the most effective or efficient way. Studies show that they spend around 80 percent of their time preparing data when they would be better utilized analyzing that data, which continues to grow. The annual Digital Universe study by IDC predicts that the amount of data globally will grow 10-fold by the year 2020. Data analysts and scientists need to be freed from managing these stockpiles of data so that they can focus their efforts on evaluating it and gaining valuable insights for improved outcomes.

Policies for managing and sharing information are vastly inconsistent between federal agencies and inhibit innovation, especially since a majority of the data has to remain secure. However, one of the most important aspects of each organization’s duties is to share data with other agencies, meaning data security will play an increasingly important role in managing, sharing and storing government data. Sixty-nine percent of senior executives reported they are already re-engineering their cybersecurity efforts, according to an annual Forbes survey from 2017.

Providing integrated and secure access to data that is updated in near real-time is the future of government agency technology solutions. A data management layer that is transparent to applications and data sources, keeping data readily accessible and secure, is the future. Effectively addressing fundamental data management upfront, allowing existing processes to be improved, enabling new workflows and end-user apps to be developed on legacy systems, and letting data analysts and scientists focus on analyzing data instead of searching for and preparing it, is the future.

Several government contractors are becoming familiar with and implementing these new technologies, as they could have a profound positive impact on the effectiveness and efficiency of future government programs. The benefits will be significant, including cost savings, accelerated implementations, improved and new capabilities, better-leveraged existing IT investments, improved data security, flexible future options and increased stakeholder satisfaction.

 
