News

Intelligence community requires efficient data storage tools

Dave Raffo

FAIRFAX, Va. -- A data center director in the intelligence community told colleagues at a Storage Decisions conference this week that they need to embrace new efficient data storage management technologies such as data deduplication, tiered storage and data archiving to manage exabytes of data with shrinking budgets. He also called on storage vendors to keep new tools coming.

David Jones, director of data center services for the National Geospatial-Intelligence Agency (NGA), delivered the keynote Tuesday at the Storage Decisions National System for Geospatial Intelligence (NSG) Summit 2011 for data storage professionals in the intelligence community.

In his speech, “Managing Requirements via Technology Implementation under Budget Constraints,” Jones said storage systems and technologies can help the intelligence community meet the challenges it faces from unprecedented data growth in the face of budget cuts.

“Information technology is a key enabler to the NGA’s mission, and storage may be the most important component,” he said. “We face a daunting increase in the amount of data we have to store and process to make the intelligence products we do today at the NGA.”

Jones admitted that talking about keeping up with storage demands during budget cuts was “on the surface a scary story,” but he said new data storage technologies can help, and he hopes storage vendors will come up with more advances.

The intelligence community’s data is rapidly approaching the exabyte level, he said. It has to keep critical data for 30 years, and its information-gathering sensors generate a great deal of data that must be tagged and frequently searched. Like other U.S. government agencies, the NGA is expected to reduce the footprint and power requirements of its storage and data center.

To manage and make the most of the data, the NGA needs to deduplicate it to slow growth, tier it so that high-value data stays on high-performing storage while the rest moves to cheaper tiers, and use metadata tagging to determine what is worth retaining, he said.
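Jones did not describe how the NGA implements that policy, but its general shape -- demote data as it ages, keep flagged high-value objects on fast storage, and drive retention decisions off metadata -- can be sketched briefly. The tag names and thresholds below (a mission_critical flag, a 30-day hot window, 365-day years for the retention math) are illustrative assumptions, not anything the agency disclosed.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical metadata tags; the NGA's actual tagging scheme was not described.
@dataclass
class DataObject:
    name: str
    created: datetime
    last_accessed: datetime
    retention_years: int     # how long the data must be kept (e.g., 30 for critical data)
    mission_critical: bool   # flags high-value intelligence products

def choose_tier(obj: DataObject, now: datetime) -> str:
    """Pick a storage tier from the metadata tags above."""
    age = now - obj.last_accessed
    if obj.mission_critical or age < timedelta(days=30):
        return "performance"   # fast, expensive storage for hot or high-value data
    if age < timedelta(days=365):
        return "capacity"      # denser, cheaper disk for warm data
    return "archive"           # tape or object archive for cold data

def can_expire(obj: DataObject, now: datetime) -> bool:
    """True once the retention period (approximated as 365-day years) has elapsed."""
    return now - obj.created > timedelta(days=365 * obj.retention_years)

if __name__ == "__main__":
    now = datetime.now()
    imagery = DataObject("imagery-2024-07.tif", now - timedelta(days=90),
                         now - timedelta(days=3), 30, True)
    log = DataObject("sensor-log-1990.dat", now - timedelta(days=365 * 31),
                     now - timedelta(days=900), 30, False)
    for obj in (imagery, log):
        print(obj.name, choose_tier(obj, now),
              "expired" if can_expire(obj, now) else "retained")
```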

“We’re going to be transforming the structure of our data,” Jones said. “We’ll look at how data is discovered and accessed across cloud environments, how we’ll be able to share data across the community, how capacities are managed virtually, and we’ll look at global namespace. Bringing storage bits down, and reducing size and power is another priority.”

Cloud storage might help, he said, but it needs to be more clearly defined.

“There is a particular type of cloud that’s called fog, and fog tends to obscure what you’re really seeing,” he said. “So we have to be careful about the way we describe the cloud, and really understand the features it provides, so we can make decisions that are appropriate to our needs.”

Another speaker at the event, consulting analyst Arun Taneja of Taneja Group, highlighted storage technologies that could help the NGA and said more are on the way.

“Innovation in storage is alive and well, and still kicking,” Taneja said in his session on the most useful new storage array technologies.

As “cool technologies,” Taneja listed thin provisioning, space-efficient snapshots, RAID advances such as erasure coding, volume-level and automated storage tiering, and primary storage deduplication and compression.
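For a sense of what one of those technologies, primary storage deduplication, actually does, the sketch below splits a byte stream into fixed-size blocks and stores each unique block only once, keyed by its SHA-256 hash; a "recipe" of hashes is enough to rebuild the original stream. Shipping arrays typically use variable-length chunking and perform this inline in the storage controller, so the 4 KB fixed blocks and in-memory dictionary here are simplifying assumptions.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size; real arrays often chunk variably

def dedupe(data):
    """Split data into fixed-size blocks and keep one copy of each unique block.

    Returns a block store (hash -> block) and a recipe (ordered list of hashes)
    from which the original stream can be reassembled.
    """
    store = {}
    recipe = []
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # store the block only the first time it appears
        recipe.append(digest)
    return store, recipe

def rehydrate(store, recipe):
    """Reassemble the original stream from the recipe."""
    return b"".join(store[digest] for digest in recipe)

if __name__ == "__main__":
    # Highly repetitive data dedupes well; unique data does not.
    data = (b"A" * BLOCK_SIZE) * 100 + (b"B" * BLOCK_SIZE) * 100
    store, recipe = dedupe(data)
    assert rehydrate(store, recipe) == data
    print(f"logical: {len(data)} bytes, "
          f"stored: {sum(len(b) for b in store.values())} bytes")
```

Run against the repetitive sample data, the store holds only two unique blocks for 200 blocks of logical data, which is the kind of capacity saving Taneja points to for primary storage.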

But Jon Toigo, a consultant at Toigo Partners International, drew attention to technologies that he doesn’t consider so cool. In his “What’s killing virtualization?” session, Toigo pointed to research from Gartner saying that less than 20% of servers are virtualized, and he blamed the way storage interacts with virtual servers for stalling virtualization projects.

“I sound like an old codger, but I just don’t believe it when a vendor tells me their silver bullet technology is going to solve all the problems,” he said. “If it sounds too good to be true, it usually is.”

Toigo maintains that virtualization causes as many problems as it solves, if not more.

“Server virtualization underscores problems you’ve always had with your storage,” he said. “We’ve always had a hassle around capacity management, performance management and data protection management. Today, we need instant capacity scaling. We need to dynamically provision capacity to an application when it needs it. The whole idea of virtualization as a means for rapid application deployment makes perfect sense, but not if the storage isn’t there to support it.”

Toigo said the ultimate answer may be more -- or at least smarter -- virtualization, in the form of virtualized networks and virtual desktop infrastructure (VDI).

“Virtualization is inevitable,” he said. “There’s a smart way to do this, and a not-so-smart way to do this. We’re at the beginning of this process, not at the end.”

