Feature

'Big data' survey on storage and analytics shows rise in implementation

What you will learn: With application and processor advancements producing a “fire hose” of data for many organizations, “big data” analytics has become a hot topic for a wide range of technology professionals, including chief technology officers, chief information officers, data analysts and storage administrators. Enterprise Strategy Group (ESG) recently conducted research into big data processing trends, and Terri McClure, a senior storage analyst at the firm, digs into the big data survey results. Find out what respondents had to say on topics such as the prioritization of big data operations in the IT center, the average size of their big data sets, the necessity of analyzing data in real time and their company's downtime tolerance.

Big data analytics is becoming increasingly important for both IT and the business side. In a 2011 survey, we asked large midmarket (500 to 999 employees) and enterprise (1,000 employees and up) IT decision makers familiar with their organization’s current database environment about the importance of big data analytics -- 6% said it was their most important IT priority and 45% said it was among their top five. In 2012, when we asked the same question (albeit of a slightly different demographic that covered more of the midmarket and included firms with as few as 100 employees), we saw the number of firms that consider enhancing data analytics their top IT priority triple to 18%, while 45% placed it in their top five.

We added a spin to our big data survey in 2012 by asking about the importance of enhancing big data processing and analytics exercises relative to all business priorities. That moved the needle quite a bit: 28% rated it a top business priority and 38% put it in their top five.


Next, the data is big. Our 2011 research found that more than 50% of respondents processed at least 500 GB of data, on average, as part of a typical data analytics exercise. In 2012, we found that the largest data set on which organizations conduct analytics averages 10 TB.

Processing and analytics are also becoming more of a real-time exercise. In 2011, we asked about the frequency of updates in general and discovered that 15% of those surveyed update in real time and 38% within a day. We wanted to see the "biggest" of the big data challenges in 2012, so we asked users how often they updated their largest data set: 22% said they update in real time, while 45% update in near real time (within a day).

Finally, there’s very little tolerance for downtime. In our 2012 research, 53% could tolerate less than three hours of downtime before their organization would experience significant revenue loss or another adverse business impact; of that group, 6% couldn’t tolerate any downtime at all. Only 14% of respondents indicated they could withstand downtime exceeding 24 hours.

Storage implications

While this is just a sampling of the data ESG has gathered on big data processing and analytics trends, I called out these points specifically because they have a ripple effect on data storage infrastructures. The data sets are big and growing, especially when you start counting the copies required for additional analysis and data protection. So IT will be looking for scalable storage systems that can efficiently meet demand without requiring an army to manage them. At the same time, big data analytics -- and information delivered in near real time -- is becoming a bigger priority for business sponsors, and IT is paying attention to the underlying infrastructures.

Indeed, ESG research indicates that a large number of IT infrastructure decision makers with knowledge of their data processing and analytics requirements will pay a premium for a storage system with high availability. Even though the research indicates price sensitivity in these environments, more users are willing to pay a premium for high availability than for tiered storage, solid-state storage or efficiency technologies such as data reduction and thin provisioning.

Looking at storage in a vacuum is never a good idea. In today’s new world of big data, it’s important to understand the forward-looking trends that shape tomorrow’s infrastructure requirements, lest the infrastructure become obsolete before its time.

BIO: Terri McClure is a senior storage analyst at Enterprise Strategy Group, Milford, Mass.

This article originally appeared in Storage magazine.

 


This was first published in August 2012
