How the Federal Government Can Get Ahead of the Impending Data Revolution

The federal government -- and its BI community -- needs to go real-time. The time to prepare is now.

By Marc Demarest, CEO and Principal, Noumenal, Inc.

Editor's Note: Marc Demarest will be presenting a session on The Streaming Data Revolution for BI Design and Implementation at the TDWI 2011 Government Summit on April 5, 2011 in Crystal City, Virginia.

How data is collected, analyzed, and used is changing rapidly as real-time, sensor-based monitoring applications grow in popularity. In response, the federal business intelligence (BI) community must prepare for and embrace that change if the government is to remain fully effective in its regulatory, oversight, law enforcement, and public safety roles.

Sensors and monitoring applications -- whether placed on a street corner, on a car, inside the electric grid, or inside a dialysis machine -- produce streaming data that is live, continuously changing, structurally invariant, and voluminous.

The military, homeland security, financial, manufacturing, and energy sectors increasingly rely on streaming data; collection devices are appearing in the health care, transportation, and public safety sectors at astounding rates.

"Smart" will soon be a staple of the daily lexicon. Smart buildings are outfitted with sensors to gauge energy usage and carbon emissions, and smart hospitals track medication and supply inventory.

For government, the challenge in adapting to this new reality will be to "unlearn" much of what it has been taught about BI. Basic skills still translate: BI professionals will still have to design and implement data warehouses and marts and integrate client-side visualization environments.

However, if BI professionals apply long-established design and implementation methods unchanged, they risk spectacular failure, because streaming data differs from traditional BI data in three key ways:

-- Extraction, transformation, and load (ETL) is no longer needed. Streaming data arrives continuously, so there are no "updates," only inserts. Sources are widely distributed (there could be tens of thousands of them), and the target schema is usually simple.

-- Traditional business intelligence tools are not used. With streaming data, analytics must be performed programmatically, in or close to the database, rather than through ad hoc query and reporting tools. The primary mission of streaming data analysis is to spot anomalies and outliers for further investigation (then perform a more iterative analysis to determine the root cause of the abnormality, what it means, and what actions are needed as a result).

-- Data visualization must be dynamic. The sliding time windows and multiple data dimensions required for effective sensor-based BI aren't effectively illustrated by the static pie charts and bar charts found in spreadsheets. Heat-map and cat-whisker diagrams that change with incoming data are needed.
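The first two points above can be sketched in a few lines of code. The following is a minimal illustration, not a production stream processor: readings are inserted (never updated) into an in-memory sliding window, and each new value is checked against the recent window to flag outliers for further investigation. The class name, window size, and z-score threshold are all illustrative assumptions, not part of any particular product.

```python
from collections import deque
import math

class SlidingWindowDetector:
    """Flag readings that deviate sharply from a sliding window of recent values.

    Illustrative sketch only: real streaming platforms do this in or near
    the database, across thousands of distributed sources.
    """

    def __init__(self, window_size=100, threshold=3.0, min_samples=10):
        self.window = deque(maxlen=window_size)  # oldest readings fall out automatically
        self.threshold = threshold               # z-score cutoff for "anomaly"
        self.min_samples = min_samples           # don't judge until the window warms up

    def insert(self, value):
        """Append-only ingest: each reading is an insert, never an update.

        Returns True if the reading is an outlier relative to the window.
        """
        anomaly = False
        if len(self.window) >= self.min_samples:
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.threshold:
                anomaly = True
        self.window.append(value)
        return anomaly
```

Feeding the detector a stream of readings hovering around 10 and then a sudden spike of 100 flags the spike while leaving ordinary fluctuation alone -- the "spot the outlier, then investigate" pattern the second bullet describes.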

The government BI community still has time to adapt to this new reality, as the technology industry works to create more powerful database management systems and networking infrastructure with the processing capacity, flexibility, and scalability required to effectively analyze massive amounts of data in real time.

BI professionals should begin educating themselves now: visit colleagues focused on applications for the military, energy management, and other agency programs that are ahead of the curve in using sensors and streaming data -- or even consider switching jobs into one of those environments.

As is often the case, there likely will be cultural resistance to this change. However, if BI professionals view streaming data as an exotic data type that must be dealt with only now and then, they risk being drowned in what the National Science Foundation has characterized as a coming "tsunami" of sensor-based data.

Worse, failure to respond effectively to the new technical challenges posed by streaming data could render federal government agencies ineffective and even irrelevant in fulfilling their growing functions to regulate and oversee markets and ensure the well-being of their constituents.

Every area of our lives is preparing to go real-time in its day-to-day functions. The federal government -- and its BI community -- needs to go real-time too. The time to prepare is now.

Marc Demarest is CEO and principal of Noumenal, Inc., a principals-only private intellectual capital consulting firm specializing in information technology, biotechnology, nanotechnology, and clean technology. You can contact the author at
