News

Symantec tackles Hadoop storage, 'big data' analytics

Sonia Lelii

Symantec Corp. today announced an Apache Hadoop add-on capability for its Veritas Cluster File System to help run "big data" analytics on storage area networks instead of scale-out, commodity servers using local storage.

Symantec has written a Hadoop Connector for the Hortonworks Data Platform that resides on top of Veritas CFS, which in turn sits on a SAN. The goal is to give data used in Hadoop analytics such enterprise features as high availability, snapshots, deduplication and compression.

Typically, Hadoop storage consists of distributed, scale-out processing nodes because the Hadoop Distributed File System (HDFS) aggregates the local storage on each node into one larger file system. The Symantec Hadoop Connector is a software layer that sits between the cluster file system and the Hortonworks Hadoop stack so Hadoop can run on networked storage instead of direct-attached storage. This enables a SAN to serve as Hadoop storage.
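Symantec has not published the connector's configuration details, but the general pattern of pointing Hadoop at a shared POSIX file system instead of HDFS can be sketched in Hadoop's core-site.xml. The mount point /mnt/cfs below is a hypothetical example, not Symantec's actual setting:

    <!-- core-site.xml: hypothetical sketch of pointing Hadoop at a shared
         POSIX file system instead of HDFS. Every node must mount the
         cluster file system at the same path (here, the made-up /mnt/cfs). -->
    <configuration>
      <property>
        <name>fs.default.name</name>   <!-- fs.defaultFS in Hadoop 2.x -->
        <value>file:///</value>        <!-- local/POSIX interface, not hdfs:// -->
      </property>
    </configuration>

Jobs would then read and write paths under the shared mount, for example: hadoop jar wordcount.jar /mnt/cfs/input /mnt/cfs/output.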


"Why build an all-new server [infrastructure] when you can use a perfectly good SAN?" said Dan Lamorena, director of product marketing for Symantec's Storage and Availability Management Group. "We say, 'Let data reside where it is and run analytics there.' Why create a new DAS environment?"

Symantec defines big data as large customer records that require heavy analytics rather than large files used for media, entertainment and genomics. The Hadoop connector can be downloaded for free by CFS customers, Lamorena said.

Much of the data running on Veritas CFS is stored on a SAN, and it's the type of data customers want to use for analysis, said Mike Matchett, senior analyst and consultant at Hopkinton, Mass.-based Taneja Group Inc.

"HDFS is designed to work over DAS," Matchett said. "But HDFS doesn't protect data very well. It's difficult to back up. You can't take snapshots off it, and it's difficult to replicate over a WAN. Hadoop usually has no high availability and it's hard to access data from HDFS." The Symantec connector means CFS customers "can still run the Hadoop cluster and instead of using HDFS on each node, you point Hadoop to the Veritas Cluster File System running on a SAN," he said.

Matchett said there may be a performance tradeoff when running Hadoop on a SAN rather than on distributed local storage. Data performance on Veritas CFS may be better or worse depending on the algorithm used. "Some algorithms improve performance when run over local storage," he said.

