Daisie Register · August 2, 2016 · 2 min read

Big Data Evolves and Gains Momentum in Federal Government

“Big Data” is not a term that’s very popular with some of the federal government’s program managers working on initiatives bearing that label. Or so they said at the Advanced Technology Academic Research Center’s semi-annual Big Data Summit, held at the end of June in Washington, DC.

“I actually hate the term ‘big data,’” said Martin Trevino, Ph.D., senior strategist and organizational architect, National Security Agency. “It’s so genericized, so overused, [it’s] devaluing everything we do.”

Trevino suggested that “total data” might be a better term, since the outputs and outcomes being sought are frequently produced by combining large data sets with smaller, more esoteric ones. “That’s when you get excellent, or unexpected, results,” he said.

Mark Krzysko, deputy director for enterprise information at the Defense Department, oversees one big data initiative – the Defense Acquisition Visibility Environment – also known as DAVE.

“We talk about it as a capability,” Krzysko said. “We support four major processes with DAVE – statutory reporting to Congress, support budget discussions, make decisions on acquisition programs, and track program status.”

He said DAVE brings the concept of big data into a very narrow lane that Pentagon executives care about – acquisition – and makes information available to those who have the authority to use and act on it.

Jeff Chen, chief data scientist for the Department of Commerce, said the Department has set up incubators internally, where employees can try out ideas for combining data to generate new information. “Right off the bat we got a few dozen applications,” he said.

In a second panel, Dan Morgan, chief data officer for the Department of Transportation, also noted that “big data” means more than the data itself. “We cobble together 40 or more different components to make a big data platform,” he said.

In DoT’s “innovation sandbox,” Morgan described a four-to-six-week project gathering data from sensors embedded in pavement and combining it with data from the National Oceanic and Atmospheric Administration (NOAA) to help figure out how to design better pavement.
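The project Morgan describes boils down to a time-aligned join between two data streams. Here is a minimal sketch in Python with pandas, assuming hypothetical pavement-sensor readings and NOAA weather observations keyed by site and timestamp; every column name and value below is illustrative, not DoT’s or NOAA’s actual schema:

```python
import pandas as pd

# Hypothetical pavement sensor readings: one row per site per timestamp
sensors = pd.DataFrame({
    "timestamp": pd.to_datetime(["2016-06-01 00:05", "2016-06-01 01:10"]),
    "site_id": ["I-70_mm112", "I-70_mm112"],
    "pavement_temp_f": [88.2, 84.7],
    "strain_microns": [412, 398],
})

# Hypothetical NOAA observations from the weather station nearest each site
weather = pd.DataFrame({
    "timestamp": pd.to_datetime(["2016-06-01 00:00", "2016-06-01 01:00"]),
    "site_id": ["I-70_mm112", "I-70_mm112"],
    "air_temp_f": [81.0, 78.5],
    "precip_in": [0.0, 0.12],
})

# merge_asof pairs each sensor reading with the most recent weather
# observation at or before it, tolerating up to an hour of clock skew.
combined = pd.merge_asof(
    sensors.sort_values("timestamp"),
    weather.sort_values("timestamp"),
    on="timestamp",
    by="site_id",
    tolerance=pd.Timedelta("1h"),
    direction="backward",
)
print(combined)
```

A nearest-match join like merge_asof, rather than an exact-key join, is the usual compromise when fusing sensor feeds that report on different clocks.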

A higher-profile DoT project was the department’s Smart City Challenge, which solicited grant proposals for how smart technology could transform the nation’s cities. There were 78 initial applications, and seven finalists were chosen to submit more detailed proposals. In June, Morgan said, Columbus, Ohio, won the $40 million grant competition.

Kris Rowley, GSA’s chief data officer, told the audience there actually is something called a “Big Data Cabinet,” led by D.J. Patil, the Chief Data Scientist of the U.S. (and deputy CTO for data policy in the White House Office of Science and Technology Policy).

The cabinet’s membership is varied, Rowley said. “There are CDOs, also data analysts, data scientists … The discussions right now are pretty broad – everyone has a slightly different angle right now.”

This points to another misconception about big data – the role of “data scientist” itself is still loosely defined. Bob Landreth, program manager for DISA’s Big Data Platform, said there isn’t exactly a skills gap or a tools gap, but that the knowledge needed to be a “data scientist” cuts across a variety of fields. “It’s not necessarily one particular skillset that’ll get us through this,” he said.

But one thing is clear: managing data and using it in meaningful ways remains a priority for federal agencies.
