Hadoop technology has been discussed hand in hand with big data for some time now, but many IT professionals still don't know the full extent of what the technology can do or how to put it to use.
The open source Hadoop framework is based on the MapReduce programming model that Google published, and it can process very large data sets at a granular level. It offers analytics at a cost and speed that some analysts say can't be achieved any other way. Essential to Hadoop's effectiveness is the Hadoop Distributed File System (HDFS), which enables parallel processing by splitting data into blocks and spreading them across the nodes of a cluster, and which provides fault tolerance by replicating those blocks.
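To make the programming model concrete, here is a minimal sketch of the classic word-count job, closely following the standard Hadoop MapReduce tutorial example. Map tasks run in parallel near the HDFS blocks that hold their input, and reducers aggregate the per-word counts; the input and output paths are supplied on the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: each call receives one line of input and emits (word, 1) pairs.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sums the counts for each word across all map tasks.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // combine locally before the shuffle
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // must not already exist
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The job is typically packaged into a JAR and launched with the hadoop jar command; note that the output directory must not already exist when the job starts.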
However, HDFS is the source of one of the main complaints users have about Hadoop technology: expanded capacity requirements, because HDFS stores three copies of each data block by default so that data remains available if a DataNode fails or is taken offline. A separate weakness is the NameNode, which tracks where every block lives and directs replication; in early versions of Hadoop, it is a single point of failure. Other complaints point to the complexity of the Java-based framework.
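The replication factor is not fixed at three; it is a per-file setting with a cluster-wide default (dfs.replication). As a minimal sketch using the HDFS Java API -- the file path and the factor of 5 here are hypothetical -- an application can inspect and change it:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationDemo {
  public static void main(String[] args) throws Exception {
    // Picks up core-site.xml/hdfs-site.xml from the classpath;
    // dfs.replication defaults to 3 in a stock installation.
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // Raise the replication factor of one (hypothetical) file to 5,
    // e.g. for a hot data set that many jobs read concurrently.
    Path hot = new Path("/data/hot/events.log");
    fs.setReplication(hot, (short) 5);

    // Report the factor actually recorded for the file.
    short rep = fs.getFileStatus(hot).getReplication();
    System.out.println(hot + " is stored with " + rep + " replicas");
  }
}
```

Raising the factor trades capacity for read bandwidth and resilience; lowering it does the reverse.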
Despite the hurdles with Hadoop technology, analysts and users say the benefits are worth it. To help you determine that for yourself, this guide will walk you through the basics of what Hadoop technology can achieve, lay out the main concerns about the technology, and outline how it works with storage and the cloud.
Understanding the basics of Hadoop technology
Hadoop technology is not a single entity -- it is a collection of open source components, such as HDFS and MapReduce. While the Hadoop software itself is free, some vendors also offer their own Hadoop distributions with support and maintenance add-ons. To find out how the components work and what they can do, take a look at the links below.
John Webster describes how recent changes to HDFS and the NameNode can help to improve Hadoop technology. Continue Reading
Dealing with Hadoop pain points
Despite Hadoop's popularity, criticisms of it range from the specialized skill set it requires to the single points of failure in a Hadoop cluster. In the following links, you'll find explanations of these and other Hadoop issues, and learn how to confront them.
Described as cutting-edge, hot, niche and hard to use, Hadoop, like all celebrities, has its shining moments and dismal displays. Continue Reading
The enterprise can achieve inexpensive big data analytics with the Apache Hadoop framework, but only with qualified data scientists and appropriate applications. Continue Reading
Analyst John Webster details issues with Hadoop technology and what users can expect from Hadoop Version 2.0. Continue Reading
Research reveals businesses' relative lack of big data maturity and shows hurdles in both Hadoop technology and analytics techniques they need to overcome. Continue Reading
Understanding Hadoop technology and storage
Because HDFS keeps three copies of each data block by default, storage in a Hadoop cluster must accommodate roughly three times the logical data volume, along with a very large number of files. Traditional storage systems may not always suit that architecture. The links in this section explain how Hadoop clusters and HDFS work with various storage systems, including network-attached storage (NAS), SANs and object storage.
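To see the replication overhead on a live cluster, the HDFS ContentSummary API reports both the logical size of a directory and the physical space its replicas consume. A minimal sketch, assuming a hypothetical /data/warehouse directory:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.ContentSummary;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CapacityCheck {
  public static void main(String[] args) throws Exception {
    FileSystem fs = FileSystem.get(new Configuration());

    // ContentSummary distinguishes logical bytes from replicated bytes.
    ContentSummary cs = fs.getContentSummary(new Path("/data/warehouse"));
    long logical = cs.getLength();          // bytes as the application sees them
    long physical = cs.getSpaceConsumed();  // bytes on disk, including replicas

    // With the default replication factor of 3, physical is ~3x logical.
    System.out.printf("logical: %d bytes, on-disk: %d bytes (%.1fx)%n",
        logical, physical, (double) physical / logical);
  }
}
```

In practice, capacity planning should also leave headroom beyond the replicated data for intermediate MapReduce output.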
Storage expert John Webster discusses three ways to use shared storage with Hadoop technology in this Ask the Expert answer. Continue Reading
Various software vendors have begun offering connectors designed to help users bridge the gap between Hadoop clusters and relational databases. Continue Reading
Brien Posey explains how Hadoop clusters can be extremely beneficial for handling large amounts of unstructured data -- but they aren't ideal for all environments. Continue Reading
John Webster discusses how vendors of Hadoop technology are faring when faced with the challenge of meeting enterprise big data demands. Continue Reading
How Hadoop technology works with the cloud
Hadoop can be useful for analytics on data kept in cloud storage because of its parallel-processing capability: since Hadoop spreads processing across many servers, large amounts of data stored in the cloud can be searched and analyzed at high speed. From the links in this section, you'll learn how using Hadoop in the cloud works and how cloud storage can help address some common Hadoop problems.
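One common pattern is to point the same FileSystem API that addresses HDFS at a cloud object store through a connector such as s3a. A minimal sketch -- the bucket name is hypothetical, and it assumes the hadoop-aws module and S3 credentials are configured:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CloudListing {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // The s3a connector (hadoop-aws) maps the FileSystem API onto S3;
    // credentials come from the usual provider chain or fs.s3a.* settings.
    FileSystem s3 = FileSystem.get(URI.create("s3a://example-bucket/"), conf);

    // List objects exactly as you would list an HDFS directory, so the same
    // MapReduce job can take a cloud path as its input.
    for (FileStatus status : s3.listStatus(new Path("s3a://example-bucket/logs/"))) {
      System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
    }
  }
}
```

Because the cloud path behaves like any other FileSystem path, a job such as the word-count sketch above can read its input directly from object storage.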
Hadoop technology enables distributed big data processing across servers that can improve application performance and offer redundancy. Continue Reading
Learn how private cloud storage providers can help solve common Hadoop problems relating to availability, capacity and migration. Continue Reading
Experts discuss Hadoop technology
Now that you have a better understanding of how Hadoop technology works with big data, watch the videos below to get experts' takes on how well Hadoop works and the best ways to use it.
Of all the ways to handle the storage requirements of big data analytics, Hadoop technology is receiving the most attention. Find out why.
Learn why you should be on the lookout for issues with NameNode and HDFS when using Hadoop technology for big data storage.
The chief technology officer at Sears explains how to reduce data latency troubles by turning to the Hadoop framework and data science analytics.
In a video interview, TechTarget's Wayne Eckerson discusses the benefits and challenges of deploying Hadoop-based systems in big data environments.
Hadoop storage systems have traditionally called for embedded direct-attached storage (DAS) as the hardware layer beneath the Hadoop MapReduce framework. But alternatives exist.