| Higher TV and movie resolution standards are radically pushing up storage requirements for producing and saving digital media.
For users in the film, broadcast and music businesses, there's a saying that could apply to storage: "Anything that is bigger, better and faster is what I need." At least that's the view of Jason Navarro, IT lead at Image Engine Design Inc., a visual effects post-production studio in Vancouver, British Columbia.
Navarro, like others in the digital content business, is looking at the increasing terabytes (and even petabytes) of data generated in the creation, editing, archiving and distribution of digital content. Most of this data needs to be accessed quickly so organizations can meet deadlines imposed by movie premieres and television broadcasts; demand is also rising as corporate HR and marketing departments increasingly develop multimedia training videos and product promotions.
Digital content requires not only more capacity, but also a storage system that can grow as a user's needs expand. And that's just what Navarro required: a system that could scale performance and capacity as Image Engine's work changed so the company could take on television projects in addition to its film visual effects.
"We were using a 3TB NetApp FAS940 [filer from NetApp Inc.], but found it wasn't going to be efficient enough," says Navarro. "With the transition to high-definition television [HDTV] from standard definition [SDTV], we needed five to six times the storage if we were going to pursue high-end film and television work."
Image Engine picked up the visual effects work for The Incredible Hulk, which was shot with 2K (2,048 horizontal pixel resolution) cameras and, when finished, could require more than 5TB of storage capacity. In transitioning from SDTV to HDTV, Navarro installed a system for Image Engine that consists of a render farm comprising 120 servers and a four-node 26TB NetApp FAS3050 running the NetApp Data ONTAP GX operating system. As many as 200 clients connect to the NetApp cluster and render farm servers, which work together to perform computation-intensive 3D rendering.
"We were looking for a high-performance, high-availability system and decided that clustered storage was the way to go," says Navarro. "In this business, high availability is essential. If we can't get the work done in time, we'll lose money and miss deadlines."
Navarro stores high-demand applications on Fibre Channel (FC) disk and television files on SATA disk. Finished projects are archived on tape. If Image Engine requires more performance from the system, more controllers can be added; if more capacity is required, more disks can be added.
Digital media, with its visual and audio effects, requires high bandwidth. Traditionally, artists creating these digital files relied on their workstations' DAS to store the images they created. In the '90s, a storage administrator wheeled storage systems from edit station to edit station as needed, sometimes delaying the production of a feature while artists waited for access to the storage system. The advent of SANs and NAS clusters in the late '90s gave artists a quick and easy way to share and access storage.
Digital media is growing rapidly, largely because of the transition from SDTV to HDTV and the use of higher resolution cameras. According to Coughlin Associates, an Atascadero, CA-based research firm, the storage required for content creation and distribution will grow from 750,000TB today to approximately 2,250,000TB in 2012, which is more than 2 exabytes. (An exabyte is equal to 1 quintillion bytes. This year, 1 exabyte of hard drive storage would cost more than $200 million.)
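The scale of those projections is easy to check with decimal (SI) units, where 1 exabyte equals 1,000,000TB. Note that the ~$0.20/GB drive price below is an assumption back-solved from the article's "$200 million" figure, not a number Coughlin Associates supplied:

```python
# Sanity check on the Coughlin Associates projection (decimal units).
TB_PER_EB = 1_000_000     # 1 EB = 1,000,000 TB (SI)

today_tb = 750_000        # storage for content creation/distribution today
proj_2012_tb = 2_250_000  # projected for 2012

print(f"Today: {today_tb / TB_PER_EB:.2f} EB")   # 0.75 EB
print(f"2012:  {proj_2012_tb / TB_PER_EB:.2f} EB")  # 2.25 EB

# Assumed 2008-era drive price of $0.20/GB (our assumption, see lead-in).
price_per_gb = 0.20
cost_1_eb = 1e9 * price_per_gb   # 1 EB = 1e9 GB
print(f"1 EB of raw disk: ${cost_1_eb / 1e6:.0f} million")
```

At that assumed price, 1 exabyte works out to exactly $200 million, matching the article's figure.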
"With digital media production, the user's machine needs the best possible storage performance, especially if you're dealing with high-definition media," says Ted Richardson, director of product management at Studio Network Solutions in St. Louis, which specializes in storage technology for the media and entertainment industry.
"With media, where you need the most storage horsepower, the highest bandwidth and lowest latency is at the front end when an artist is pulling data off cameras or recordings and doing that first round of editing," says Richardson. "Artists need to deal with the raw footage that hasn't been compressed or turned into production yet."
Post-production and archiving needs
"We started using the Archion [Inc.'s] Alliance product on the animated film Beowulf [released in 2007]," says Friedman. "We put in the Archion halfway through the production of Beowulf because [it was] newer technology and we were getting some drive failures on an older system." And because the company was entering the world of high-definition animation, a system that could "handle the bigger files seamlessly" was also needed, says Friedman.
"With the Archion system, six to seven users typically share an 8TB array," he says. "Scene by scene, they drop in high-resolution animation and that takes a lot of bandwidth."
Friedman says Non-Linear Solutions upgraded to a 4Gb/sec QLogic switch, adding that "if we had stayed in offline resolution throughout the production of Beowulf, we might have been OK. But once we started pumping high-resolution images through [the system, coupled] with the demands of six to seven users, it just made sense to upgrade our storage," he notes. "It's too taxing to the system when three editors, a visual effects editor and four assistants all use the system; you want to keep the throughput as optimized as possible."
Tom M. Coughlin, president of Coughlin Associates, estimates that even the lower-resolution uncompressed 2K video format (2,048 x 1,556 pixels) requires 12MB of storage per frame and 5TB for a movie. At the newest 4K resolution (4,096 x 3,112 pixels), a movie would take 22TB and demand 3.1GB/sec of bandwidth; at 6K (6,144 x 4,668 pixels), a movie would take 49TB at 7GB/sec (see "Explosive growth in video content," below).
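Coughlin's 12MB-per-frame figure can be reproduced with a short sketch. The assumptions below are ours, not the article's: roughly 10-bit RGB color, i.e. 30 bits (3.75 bytes) per pixel, and 24 frames/sec playback. The single-stream bandwidths this yields are lower than the article's figures, which presumably account for multi-stream or faster-than-real-time workflows:

```python
# Back-of-envelope sizing for uncompressed digital film frames.
# Assumptions (not from the article): 10-bit RGB = 3.75 bytes/pixel,
# 24 frames/sec playback.

BYTES_PER_PIXEL = 3.75   # 10-bit RGB, assumed
FPS = 24                 # assumed frame rate

def frame_mb(width, height, bpp=BYTES_PER_PIXEL):
    """Uncompressed frame size in decimal megabytes."""
    return width * height * bpp / 1e6

def stream_gb_per_sec(width, height, fps=FPS):
    """Sustained bandwidth to move one uncompressed stream in real time."""
    return frame_mb(width, height) * fps / 1e3

for name, (w, h) in {"2K": (2048, 1556),
                     "4K": (4096, 3112),
                     "6K": (6144, 4668)}.items():
    print(f"{name}: {frame_mb(w, h):6.1f} MB/frame, "
          f"{stream_gb_per_sec(w, h):.2f} GB/sec per stream")
```

The 2K case lands at roughly 12MB per frame, matching Coughlin's estimate; dividing his 5TB-per-movie figure by that frame size implies on the order of 400,000 frames of footage, far more than a two-hour cut at 24fps, which suggests the totals cover raw footage and intermediates rather than the finished film alone.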
3ality Digital LLC has a different need for high-performance, scalable storage. The Burbank, CA-based company produces stereoscopic 3D content using a 250TB Isilon Systems Inc. IQ cluster. Its in-house post-production team, consisting of more than 90 users, edits raw imagery stored on its Isilon cluster.
"We had a storage area network before installing the Isilon cluster," says Howard Postley, COO/CTO at 3ality. "It was problematic and couldn't scale to the level we needed. We needed a system capable of storing petabytes, not simply terabytes, of information."
In its stereoscopic 3D activities, 3ality produces as much as 10TB of data each day, which it archives to a Quantum Corp. tape robot with LTO-4 media. The studio recently produced the U2 3D film, which premiered at the Sundance Film Festival.
In addition to handling higher resolution digital formats, broadcasters need to repackage content into other forms--breaking news stories, videos of news events and Web content--which increases storage capacity needs. According to Jim Casabella, director of advanced technology at ABC Owned Television Stations Group in Burbank, CA, an average-sized television station has approximately 30 to 40 non-linear edit systems and 100 desktop edit clients. If that same television station broadcasts about 200 news stories a day, the capacity necessary for those news stories will be 30TB in SDTV and 120TB in HDTV (see "The post-production process," below).
Once production is done, the finished project must be archived, which adds to the need for more storage capacity, albeit of a different type. Casabella has spent the last three years digitizing and preserving the organization's legacy archives using an EMC Corp. Documentum system.
"We share media files among 450 business units," says Casabella. "In our legacy archive, we have over 30 years' worth of content on tape, which has to be digitized over the next 10 years or be lost forever. ABC Owned Television Stations have about 7 million news stories to archive requiring about 2PB of storage. Disney [The Walt Disney Co., which owns the ABC Television Group] itself has around 6 million hours of video tape, requiring about 300PB of storage to archive."
"Our primary service is Blu-ray disc authoring--a disc image is the end product--which we deliver to a disc replicator. Our services include HD video encoding, audio encoding in advanced formats, integrating menu elements, subtitles and navigation aspects of the disk," says Jordan. "The encoding process requires high-throughput and high-performance capabilities since it's high-definition video for Blu-ray."
Within the 300TB Isilon cluster, Jordan has created two different pools of storage. "We use a central pool of storage for the entire authoring process where all the elements--image files, encoded video and audio--come together," says Jordan. "A second cluster of storage is dedicated to video encoding, which has high capacity and throughput demands. The HD video encoding process runs on a bank of servers for each title. The file output from video encoding is 30GB on average, and is eventually migrated to the authoring cluster in our workflow."
Adds Jordan: "We start with high-definition video source files, which can be hundreds of GBs for feature-length content, and then encode to one of the supported Blu-ray video codecs, creating a high-quality file for the disc."
"We will see the end of analog [magnetic tape] and, on February 17, 2009, a transition to HD," he says. "Stations have already started to convert to digital from analog media." Over the next three to five years, says Casabella, most local production facilities will convert to HD. He estimates that the transition to HD will cost approximately $2 million to $10 million per station. And those estimates don't include the price of extra storage capacity.