Hearing of the Technology Subcommittee of the House Science, Space, and Technology Committee - Next Generation Computing and Big Data Analytics

Hearing

Date: April 24, 2013
Location: Washington, DC

Good Morning. Today we are examining an issue that we hear a lot about. "Big
Data" is a popular new term that can mean a lot of different things.

The scientific community was generating and using Big Data before the term existed. In fact, in 1991 this Committee authored the High Performance Computing Act, which organized federal agency research, development, and training efforts in support of advanced computing.

Individual researchers have always faced difficult decisions about their data: what to keep, what to toss, what to verify with additional experiments. As computing power has increased, so has the luxury of storing more data. Today, managing this data allows for better-informed experiments, more exact metrics, and perhaps significantly longer doctoral theses. Harnessing computing power to process more scientific data is transforming laboratories across the country.

At the same time, the ability to analyze large amounts of data across multiple networked platforms is also transforming the private sector. Through Big Data applications, businesses have not only revealed previously hidden efficiency improvements in their internal operations, but also uncovered entirely new lines of business built around data that was previously inaccessible due to its size and complexity.

Today's hearing will examine the hype around Big Data. Is the United States the most innovative nation in Big Data? Is our regulatory system creating any burdens on businesses? Could public-private partnerships with the federal agencies be improved to allow for more data innovations?

I thank our witnesses for their participation today and look forward to hearing their testimony.
