Big data by the numbers

A recent ESG survey highlights the growing importance of big data analytics and its increasing relevance to the storage market.

While the term “big data” has been expanded to mean many things to many people, Enterprise Strategy Group (ESG) defines it as “data sets that exceed the boundaries and sizes of normal processing capabilities, forcing you to take a non-traditional approach.”

It’s clear that application and processor advancements are creating demands on data storage infrastructures that exceed normal processing capabilities. ESG recently conducted research into big data processing trends to see just how far storage boundaries are being pushed, and we came back with some interesting findings.

Data analytics takes on new importance

First off, big data analytics is becoming increasingly important for both IT and the business side. In a 2011 survey, we asked large midmarket (500 to 999 employees) and enterprise (1,000 employees and up) IT decision makers familiar with their organization’s current database environment about the importance of big data analytics -- 6% said it was their most important IT priority and 45% said it was among their top five. In 2012, when we asked the same question (albeit of a slightly different demographic that covered more of the midmarket and included firms with as few as 100 employees), the share of firms that consider enhancing data analytics their top IT priority tripled to 18%, while 45% again placed it in their top five.

We added a spin to our survey in 2012 by asking about the importance of enhancing big data processing and analytics relative to all business priorities. That moved the needle quite a bit: 28% rated it a top business priority and 38% put it in their top five.

Next, the data is big. Our 2011 research found more than 50% of respondents processed at least 500 GB of data, on average, as part of a typical data analytics exercise. In 2012, we found the largest data set on which organizations conduct data analytics is, on average, 10 TB.

Processing and analytics are also becoming more of a real-time exercise. In 2011, we asked about the frequency of updates in general and discovered that 15% of those surveyed update in real time and 38% within a day. We wanted to see the biggest of big data challenges in 2012, so we asked users how often they update their largest data set -- 22% said they update in real time, while 45% update in near real time (within a day).

Finally, there’s very little tolerance for downtime. In our 2012 research, 53% of respondents could tolerate less than three hours of downtime before their organization would experience significant revenue loss or another adverse business impact, and 6% of that group couldn’t tolerate any downtime at all. Only 14% indicated they could withstand downtime exceeding 24 hours.

Storage implications

While this is just a sampling of the data ESG has gathered on big data processing and analytics trends, I called out these points specifically because they have a ripple effect on data storage infrastructures. The data sets are big and growing, especially once you count the copies required for additional analysis and data protection. So IT will be looking for scalable storage systems that can efficiently meet demands without requiring an army to manage them. At the same time, big data analytics -- and information delivered in near real time -- is becoming a bigger priority for business sponsors, and IT is paying attention to the underlying infrastructures.

Indeed, ESG research indicates that a large number of IT infrastructure decision makers with knowledge of their data processing and analytics requirements will pay a premium for a storage system with high availability. Even though these environments are price sensitive, more users are willing to pay a premium for high availability than for tiered storage, solid-state or efficiency technologies like data reduction and thin provisioning.

Looking at storage in a vacuum is never a good idea. In today’s new world of big data, it’s important to understand the forward-looking trends that shape tomorrow’s infrastructure requirements, lest the infrastructure become obsolete before its time.

BIO: Terri McClure is a senior storage analyst at Enterprise Strategy Group, Milford, Mass.

This was first published in July 2012
