
Access. As companies get better at understanding the potential of big data analysis, the need to compare different data sets will bring more people into the data-sharing loop. In the quest to create business value, firms are looking for more ways to cross-reference data objects from various platforms. Storage infrastructures that include global file systems can help address this need, as they allow multiple users on multiple hosts to access files from many different back-end storage systems in multiple locations.
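To make the idea concrete, here is a minimal, hypothetical Python sketch of a global namespace that routes logical paths to whichever back-end store holds the data. The Backend and GlobalNamespace classes, mount points and file paths are all invented for illustration and don't represent any particular product's API.

```python
# Hypothetical sketch: one logical namespace over multiple back-end
# stores, so hosts in any location resolve the same path the same way.

class Backend:
    """A single back-end storage system (e.g., a filer or object store)."""
    def __init__(self, name, location):
        self.name = name
        self.location = location
        self.files = {}          # path -> bytes

    def read(self, path):
        return self.files[path]


class GlobalNamespace:
    """Maps logical path prefixes to the backend that holds the data."""
    def __init__(self):
        self.routes = {}         # logical prefix -> Backend

    def mount(self, prefix, backend):
        self.routes[prefix] = backend

    def read(self, logical_path):
        for prefix, backend in self.routes.items():
            if logical_path.startswith(prefix):
                return backend.read(logical_path[len(prefix):])
        raise FileNotFoundError(logical_path)


# Two hosts in different sites share one view of the data.
nyc = Backend("filer-nyc", "New York")
lon = Backend("objstore-lon", "London")
nyc.files["/trades/2012-04.csv"] = b"..."
lon.files["/claims/2012-04.csv"] = b"..."

ns = GlobalNamespace()
ns.mount("/finance", nyc)
ns.mount("/medical", lon)
print(ns.read("/finance/trades/2012-04.csv"))  # served from New York
print(ns.read("/medical/claims/2012-04.csv"))  # served from London
```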

Security. Financial data, medical information and government intelligence carry their own security standards and requirements. While these standards may be no different from what IT managers already accommodate, big data analytics may require cross-referencing data sets that have never been co-mingled before, and that can create new security considerations.
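As a rough illustration of the co-mingling concern, the Python sketch below applies a simple "strictest label wins" rule when two data sets are joined. The classification labels, their ordering and the clearance check are assumptions made for the example, not any specific regulatory scheme.

```python
# Hedged sketch: a label-based check before joining data sets that were
# never co-mingled. Labels and ordering are illustrative only.

CLASSIFICATION_ORDER = ["public", "internal", "financial", "medical", "intelligence"]

def combined_classification(*labels):
    """A joined data set inherits the strictest label of its inputs."""
    return max(labels, key=CLASSIFICATION_ORDER.index)

def can_access(user_clearance, data_label):
    """A user may read data at or below their clearance level."""
    return (CLASSIFICATION_ORDER.index(user_clearance)
            >= CLASSIFICATION_ORDER.index(data_label))

# An analyst cleared for financial data asks to join financial and medical sets.
joined = combined_classification("financial", "medical")
print(joined)                            # "medical" -- the stricter label wins
print(can_access("financial", joined))   # False: the join raised the bar
```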

Cost. “Big” can also mean expensive. And at the scale many organizations are operating their big data environments, cost containment will be an imperative. This means more efficiency “within the box,” as well as less expensive components. Storage deduplication has already entered the primary storage market and, depending on the data types involved, could bring some value for big data storage systems. The ability to reduce capacity consumption on the back end, even by a few percentage points, can provide a significant return on investment as data sets grow. Thin provisioning, snapshots and clones may also provide some efficiencies depending on the data types involved.
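The capacity arithmetic behind that return is easy to sketch. The Python example below implements naive fixed-block deduplication with SHA-256 fingerprints; production systems use variable-size chunking and far more robust metadata, so treat the block size and sample data as illustrative only.

```python
# Minimal fixed-block deduplication sketch showing where back-end
# capacity savings come from. BLOCK_SIZE and the data are illustrative.

import hashlib

BLOCK_SIZE = 4096

def dedupe(data: bytes):
    """Store each unique block once; return the block store plus the
    recipe (list of hashes) needed to reassemble the original data."""
    store, recipe = {}, []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # duplicate blocks are not stored again
        recipe.append(digest)
    return store, recipe

# Highly repetitive data dedupes well; random data barely shrinks.
data = b"A" * BLOCK_SIZE * 100 + b"B" * BLOCK_SIZE * 100
store, recipe = dedupe(data)
logical = len(data)
physical = sum(len(b) for b in store.values())
print(f"logical {logical} bytes -> physical {physical} bytes "
      f"({physical / logical:.1%} of raw capacity)")
```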

Many big data storage systems will include an archive component, especially for those organizations dealing with historical trending or long-term retention requirements. Tape is still the most economical storage medium from a capacity/dollar standpoint, and archive systems that support multiterabyte cartridges are becoming the de facto standard in many of these environments.
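A back-of-the-envelope comparison shows why tape holds that capacity/dollar position. Every number in the sketch below is a placeholder assumption, not a quoted price; the point is the shape of the math, not the exact ratio.

```python
# Back-of-the-envelope capacity/dollar comparison for an archive tier.
# All figures are placeholder assumptions chosen for illustration;
# swap in current prices before drawing any conclusions.

import math

archive_tb = 500                 # assumed size of the archive tier, in TB

tape_cartridge_tb = 1.5          # assumed multiterabyte cartridge capacity
tape_cartridge_cost = 40.0       # assumed cost per cartridge, in dollars
disk_cost_per_tb = 100.0         # assumed cost per TB of spinning disk

cartridges = math.ceil(archive_tb / tape_cartridge_tb)
tape_media_cost = cartridges * tape_cartridge_cost
disk_cost = archive_tb * disk_cost_per_tb

print(f"tape: {cartridges} cartridges, ${tape_media_cost:,.0f} in media")
print(f"disk: ${disk_cost:,.0f}")
print(f"media cost ratio: disk is {disk_cost / tape_media_cost:.1f}x tape")
```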

What may have the biggest impact on cost containment is the use of commodity hardware. It’s clear that big data infrastructures won’t be able to rely on the big iron that enterprises have traditionally turned to. Many of the first and largest big data users have developed their own “white-box” systems that leverage a commodity-oriented, cost-saving strategy. But more storage products are now coming out as software that can be installed on existing systems or common, off-the-shelf hardware. In addition, many of these software vendors are selling their technologies as commodity appliances or partnering with hardware manufacturers to produce similar offerings.

This was first published in April 2012
