
MySpace tackles extraordinary data storage requirements

Jo Maitland, Senior Executive Editor

Social networking site MySpace, owned by News Corp., has 130 million subscribers, is adding 250,000 new users per day and currently hosts more than 127 million profile pages. That's an incredible amount of data that has to be stored and instantly available, 24/7.

Getting an up-to-the-minute figure on exactly how much data storage the company has is tough, as MySpace is constantly unpacking new servers. It's in the multipetabyte range and is spread across several different storage architectures. On a recent tour of one of the firm's largest data centers in Los Angeles, vice president of technology Jim Benedetto pointed to a stack of flattened cardboard boxes as large as a truck. "That's just the systems we unpacked this week … and we can't even power them all up, there's no more power in Los Angeles," he said. MySpace went through eight tons of cardboard in the past year unpacking servers.


The company uses a homegrown distributed file system that runs across 1,000 Hewlett-Packard Co. (HP) servers to store the majority of its small files, including MP3s and video clips -- 3 billion images in total. Eight dedicated senior developers and engineers keep this monster up and running night and day.

"Eventually, we realized that nobody understood our problem more than us, so it was easier to build a system ourselves," Benedetto said, of the company's proprietary distributed file system. He noted MySpace has had "every storage vendor in our data center." The latest to get the boot is likely to be Network Appliance Inc. (NetApp), which currently has four of its older filers at the site. "We're undecided on the future of these," Benedetto said.

MySpace's biggest headache is access patterns. "We live or die by how fast we can serve our content at peak time, that's about 3 million users online, changing their data at the same time," he said. "Really, all our data can be accessed at all times." And today's data storage systems simply can't handle this, he said. At the back end of its application, MySpace has a "massive caching tier" that helps ease input/output (I/O). "Our workload is extremely write intensive, which uses disk I/O significantly." In some cases, MySpace has 146 GB and 73 GB drives with only 5 GB to 10 GB of data on them, because otherwise the disk maxes out at the I/O level. The industry lingo for this is "disk thrashing."
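
To see why the drives end up nearly empty, here is a rough back-of-the-envelope sketch in Python. It assumes roughly 150 random I/Os per second per drive (the figure Benedetto cites later) plus an illustrative hot data set and peak I/O rate; the numbers are for illustration only, not MySpace's actual sizing model.

    # Rough sizing sketch: why an I/O-bound workload leaves disks nearly empty.
    # Assumptions (not MySpace figures): ~150 random I/Os per second per drive,
    # an illustrative hot data set and an illustrative peak I/O rate.

    DRIVE_CAPACITY_GB = 73     # per-drive capacity
    DRIVE_IOPS = 150           # sustainable random I/Os per second per spindle
    hot_data_gb = 4_000        # illustrative hot working set
    target_iops = 120_000      # illustrative peak aggregate I/O rate

    drives_for_capacity = -(-hot_data_gb // DRIVE_CAPACITY_GB)   # ceiling division
    drives_for_iops = -(-target_iops // DRIVE_IOPS)

    drives_needed = max(drives_for_capacity, drives_for_iops)
    gb_per_drive = hot_data_gb / drives_needed

    print(f"Drives needed for capacity alone: {drives_for_capacity}")
    print(f"Drives needed to sustain {target_iops:,} I/Os per second: {drives_for_iops}")
    print(f"Data actually placed on each 73 GB drive: {gb_per_drive:.1f} GB")
    # The I/O term dominates, so each drive carries only a few gigabytes --
    # the 5 GB to 10 GB on a 73 GB drive pattern described above.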

At one time, MySpace did use traditional modular storage from a well-known industry supplier [Benedetto declined to name the vendor], but his administrators spent all day long moving disks around. "Suddenly, a bunch of users would be hot over here, then over here, and we had to restripe multiple storage volumes together into metaLUNs to expand existing volumes … We were doing this all day long," he said.

MySpace invited three vendors (all household names) into its shop for three months to show how they would address the problem. None were able to do that. Then the company discovered 3PARdata Inc., the only vendor to come close to alleviating its disk thrashing issue with an architecture that lets the application share all the disks, allowing MySpace to get consistent I/O throughout the array.
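
A minimal sketch of that shared-disk idea, assuming a volume is carved into fixed-size chunks placed round-robin across every spindle in the array; the disk count, chunk size and function below are illustrative assumptions, not 3PAR's actual implementation.

    # Illustrative wide striping: every volume's chunks are spread round-robin
    # across all spindles, so no single disk group becomes a hot spot.
    # Disk count and chunk size are assumptions, not 3PAR's actual design.

    NUM_DISKS = 240     # spindles in the array (illustrative)
    CHUNK_MB = 256      # allocation unit size (illustrative)

    def layout_volume(volume_gb, start_disk=0):
        """Return the disk holding each chunk of the volume."""
        chunks = (volume_gb * 1024) // CHUNK_MB
        return [(start_disk + i) % NUM_DISKS for i in range(chunks)]

    placement = layout_volume(volume_gb=500)
    print(f"500 GB volume -> {len(placement)} chunks on {len(set(placement))} of {NUM_DISKS} disks")
    # Because each volume touches every spindle, a burst of "hot" users lands on
    # the whole array rather than one disk group, giving the consistent I/O
    # Benedetto describes.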

Two MySpace administrators manage six 3PAR arrays, each with 60 terabytes (TB) of usable space. The arrays are using about 15 TB per frame for production data and the rest for snap backups. This data is the text content from its users' profile pages, as well as internal databases. MySpace has partitioned its users into segments of 1 million users each. For every 1 million new users, it adds one server and one LUN from 3PAR, which stripes across all the disks.
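
A hedged sketch of that segmenting scheme, assuming users map to a segment (and therefore a server and a LUN) by integer-dividing their user ID by one million; the mapping function is an assumption for illustration, not MySpace's code.

    # Illustrative range partitioning: 1 million users per segment, with one
    # database server and one 3PAR LUN per segment. The mapping is an assumed
    # scheme for illustration only.

    USERS_PER_SEGMENT = 1_000_000

    def segment_for_user(user_id: int) -> int:
        """Map a user ID to its segment (server + LUN) number."""
        return user_id // USERS_PER_SEGMENT

    for uid in (42, 999_999, 1_000_000, 129_500_321):
        print(f"user {uid:>11,} -> segment {segment_for_user(uid)}")
    # Growth is handled by standing up segment N+1 (a new server plus a new LUN
    # striped across all the disks) once segment N fills with 1 million users.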

Benedetto said he could get around each disk's I/O maxing out by buying more, smaller disks (say, 36 GB) instead of fewer, larger disks (146 GB or 73 GB). For example, to get to 60 TB, MySpace could use about 850 of the 73 GB drives or about 1,700 of the 36 GB drives. If each drive can handle 150 I/Os per second, the 36 GB drive system would deliver much better I/O performance simply because there are so many more spindles available. "We could do this, but we choose not to because we like having the additional space available for all of our snaps," he said.
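
The arithmetic behind that trade-off, worked through with the figures given here (60 TB of usable space, 73 GB versus 36 GB drives, roughly 150 I/Os per second per drive); the drive counts come out close to the rounded numbers in the text.

    # Spindle math for 60 TB built from 73 GB drives versus 36 GB drives,
    # at ~150 I/Os per second per drive (figures from the article).

    USABLE_TB = 60
    IOPS_PER_DRIVE = 150

    for drive_gb in (73, 36):
        drives = (USABLE_TB * 1024) // drive_gb        # drives needed for 60 TB
        aggregate_iops = drives * IOPS_PER_DRIVE
        print(f"{drive_gb:>3} GB drives: ~{drives:,} spindles, ~{aggregate_iops:,} I/Os per second")
    # Roughly 840 x 73 GB drives versus roughly 1,700 x 36 GB drives: about twice
    # the spindles, so about twice the I/O headroom, at the cost of the spare
    # capacity MySpace keeps for snapshots.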

On that note, MySpace saves about 100 TB a day. The 3PAR arrays back up to each other and then to Copan Systems Inc.'s disk-based backup product. Eventually, the data is transferred to tape. About 500 TB of data has been migrated to Copan, and within a few months Benedetto expects MySpace to be off tape completely as it slowly cycles the tapes out of use. "If we ever had to retrieve from tape, we'd never get the data back fast enough. Copan is a good replacement," he said.
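
A small sketch of the restore-path preference that quote implies, assuming a restore pulls from the fastest tier that still holds a copy; the tier names and lookup function are illustrative assumptions, not MySpace's actual tooling.

    # Illustrative restore path across the tiers described above: prefer the
    # peer 3PAR copy, then Copan's disk-based backup, and fall back to tape
    # only while tape remains in the rotation.

    RESTORE_ORDER = ["peer 3PAR array", "Copan disk backup", "tape library"]

    def pick_restore_source(available: set) -> str:
        """Return the fastest tier that still holds the requested data."""
        for tier in RESTORE_ORDER:
            if tier in available:
                return tier
        raise LookupError("no backup copy found on any tier")

    # A recent day's data still sits on the peer array; an older day's only on Copan.
    print(pick_restore_source({"peer 3PAR array", "Copan disk backup"}))
    print(pick_restore_source({"Copan disk backup", "tape library"}))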

Eventually, MySpace plans to replicate its Copan systems to a bunker facility and to get more aggressive about disaster recovery. It's currently building an active-active-active failover configuration across three data centers, which is expected to go live in six months. This should help with the power crunch issue MySpace faces in Los Angeles. This summer the company was hit by a massive power outage for nearly 12 hours when a heat wave crippled California's power supply. Active, mirrored sites at different locations will alleviate this problem.
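
One hypothetical way such an active-active-active setup could route traffic, assuming user requests hash across whichever sites are currently healthy; the site names and hashing scheme are assumptions for illustration, not MySpace's design.

    # Hypothetical three-site active-active-active routing: requests spread
    # across every healthy data center, and a site that loses power (as Los
    # Angeles did) is simply dropped from the pool.

    from hashlib import md5

    SITES = ["la", "site-2", "site-3"]   # illustrative site names

    def route(user_id: int, healthy: set) -> str:
        """Pick a serving site for a user from whichever sites are up."""
        candidates = sorted(healthy)
        if not candidates:
            raise RuntimeError("all sites down")
        bucket = int(md5(str(user_id).encode()).hexdigest(), 16)
        return candidates[bucket % len(candidates)]

    print(route(12345, healthy=set(SITES)))             # normal operation
    print(route(12345, healthy={"site-2", "site-3"}))   # LA power outage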
