External storage might make sense for Hadoop
This article is part of the February 2014 issue (Vol. 12, No. 12) of Storage magazine, which names the 2013 data storage products of the year.
Using Hadoop to drive big data analytics doesn't necessarily mean building clusters of distributed storage -- good old external storage might be a better choice.

The original architectural design for Hadoop made use of relatively cheap commodity servers and their local storage in a scale-out fashion. Hadoop's original goal was to enable cost-effective exploitation of data that was previously not viable to process. We've all heard about big data volume, variety, velocity and a dozen other "v" words used to describe these previously hard-to-handle data sets. Given such a broad definition, most businesses can point to some kind of big data they'd like to exploit.

Big data is growing bigger every day, and storage vendors with their relatively expensive SAN and network-attached storage (NAS) systems are starting to work their way into the big data party. They can't simply leave all that data to server vendors filling boxes with commodity disk drives. Even though Hadoop adoption is still in its early stages, the competition and confusing marketing noise are ratcheting up.
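The trade-off described above shows up directly in Hadoop's configuration. HDFS's `dfs.replication` property (set in `hdfs-site.xml`) defaults to three copies precisely because the original scale-out design assumes failure-prone commodity local disks; a shop fronting Hadoop with an external array that supplies its own RAID or replication might reduce it rather than pay for redundancy twice. The property name and default are standard HDFS; whether a lower replication factor is actually safe depends entirely on the external array's protection, so treat this as a sketch, not a recommendation:

```xml
<!-- hdfs-site.xml: sketch of the replication trade-off -->
<configuration>
  <!-- Classic scale-out design: three copies spread across
       commodity servers' local disks to survive node and
       disk failures (the HDFS default). -->
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>

  <!-- With an external SAN/NAS providing RAID underneath, some
       shops lower HDFS-level replication instead of storing
       redundant copies twice. Assumption: the array's own
       protection is judged sufficient for the workload.
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  -->
</configuration>
```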
Best data storage products 2013: Products of the Year
by Andrew Burton, Todd Erickson, Sonia Lelii, Ellen O'Brien, Dave Raffo, Carol Sliwa, Sarah Wilson
This "Sweet 16" roster of storage products represents the leading technical innovation of the past year.
Big file storage scales for large data applications
by Eric Slack
There are two sides to the big data story: analytics using vast numbers of small files, and dealing with storage for really big files.
10 mistakes to avoid in your disaster recovery planning process
by Jon William Toigo
Don't make your DR planning process harder than it needs to be by trying to do too much or cutting corners. Careful planning is key to a successful recovery.
New storage architectures slowly making inroads
by Rich Castagna
Our latest survey charts the storage architecture alternatives readers are using in their storage shops.
Data storage industry: Buy, buy or bye-bye?
by Rich Castagna
Cloud closures, flash-in-the-pan solid-state vendors … storage might seem a little more dangerous these days, but it just might be innovation at work.
Is a helium drive just a lot of hot air?
by Jon William Toigo
Filling drives with helium doesn't advance the art of hard disk design; it just makes it possible to stuff more old tech into a new package.
Virtualize servers for better data protection
by Jason Buffington
There aren't many reasons not to virtualize your servers, but there are plenty of compelling data protection reasons to virtualize them all.
External storage might make sense for Hadoop
by Mike Matchett
Using Hadoop to drive big data analytics doesn't necessarily mean building clusters of distributed storage; a good old array might be a better choice.