Luckily, all the colorful distractions didn't put a damper on the conversations about storage. Here are a few tidbits we heard floating around the halls:
Quantum Corp. briefed us early on its new disk backup and replication appliances, the DXi3500 and DXi5500. These devices will eventually replace the company's older DX3000 and DX5000 boxes. Both systems include block-level deduplication software from Rocksoft, a file system from Advanced Digital Information Corp. (ADIC), and compression, asynchronous replication and remote diagnostics technology from Quantum. Assuming typical data mixes and standard backup methods, the company claimed these appliances can offer up to 216 terabytes (TB) of disk-based retention capacity, using an average 20-to-1 deduplication ratio -- enough capacity to retain months of backups on disk for an 11 TB primary data set.
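The arithmetic behind claims like Quantum's is straightforward: effective retention is roughly physical capacity multiplied by the dedupe ratio. A minimal sketch, using the article's figures (216 TB effective, 20:1, 11 TB primary) and our own assumed backup schedule of four weekly fulls per month -- not Quantum's published math:

```python
# Hypothetical back-of-the-envelope model, not Quantum's sizing methodology.
def effective_retention_tb(physical_tb: float, dedupe_ratio: float) -> float:
    """Logical data retained on disk, assuming a uniform dedupe ratio."""
    return physical_tb * dedupe_ratio

def months_of_backups(effective_tb: float, primary_tb: float,
                      fulls_per_month: int = 4) -> float:
    """Rough months of retention, assuming weekly full backups of the primary set."""
    return effective_tb / (primary_tb * fulls_per_month)

# 216 TB effective at 20:1 implies roughly 10.8 TB of physical disk behind it.
physical = 216 / 20
print(effective_retention_tb(physical, 20))   # 216.0 TB logical
print(months_of_backups(216, 11))             # ~4.9 months for an 11 TB primary set
```

At roughly 4.9 months of weekly fulls, the "months of backups on disk" claim is plausible under these assumptions, though real retention depends heavily on change rates and backup policy.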
Blue chips talk backup
Representatives from Qualcomm Inc., Clark County government (the county where Vegas is located), MGM Mirage and Yahoo Inc. discussed their trials and tribulations with backup. Most of the big companies had few complaints about the products or technologies they had in place. The challenges they encounter nowadays, the blue chips agreed, have more to do with human processes than with technological glitches.
"One of the biggest challenges of backup is that it finds every problem in your environment," said Paul Ferrarro, storage manager with Qualcomm, who estimated his company's backups at around 1 petabyte (PB) a month. "We're always going to the storage architects with bad news."
"We're constantly in firefighting mode," agreed Clark County's senior systems engineer Rich Taylor. He said the constant struggle to keep on top of day-to-day issues made it difficult to come up with long-range plans: "When the alligators are all coming out of the swamp, it's tough to think about finding the drain plug."
Marcellus Tabor, manager of storage and data protection for Yahoo, said the problem stemmed largely from a lack of good monitoring tools. "None of the backup monitoring tools out there had everything we needed," Tabor said. "It's really difficult to scale into the thousands of devices -- there aren't a lot of products that are able to give you the whole package at that point." Tabor said he and his staff had written their own application to manage and monitor the backup environment.
After the session, an attendee from another mammoth enterprise, David Ping, data center storage team lead, information systems and technology services for Pacific Gas and Electric Company (PG&E), said part of his staff meets regularly as a tactical advisory board, which so far has kept mostly on top of backup issues. But another deluge of data is coming when the company installs Smart Meters, which report information electronically back to the company's data center rather than requiring a human reader to take down the information manually.
"The biggest challenge isn't really finding primary storage for all that data -- just about any vendor can put as much disk on your floor as you could ask for," he said. "But storage architects don't often factor in backup when they're designing something -- they're concerned with IOPS, memory and disk space -- not how it's going to get backed up."
The company is planning to add more Tivoli Storage Manager (TSM) servers, but Ping wondered aloud if that "is just a Band-Aid.
"We began 2006 worrying about the tactical issues," Ping went on. "But as we approach 2007, we're beginning to wonder how we can take a step back and address long-term needs."
Backups are not archives -- but users want one process
In one session on encryption and archiving, attendees peppered speaker W. Curtis Preston, vice president of data protection services for GlassHouse Technologies Inc., who gave his "backups are not archives" talk, with questions about archiving "plug-ins" that would fit into their current backup environments. Preston named both Index Engines Inc.'s TE-200 Tape Engine, which keeps track of data in tape-based archives, and Symantec Corp.'s Veritas Enterprise Vault, which can share information with NetBackup. But users still talked of one tool that would both back up and archive from the same data stream -- why deal with the same data twice? Excellent question; the market awaits the vendors' answer.
Dedupe hashing doubts are academic, analyst says
Preston can always be counted on to hold forth on something interesting at the "Ask the Experts" session during the show floor expo and reception, and this conference was no exception.
Preston, previewing an upcoming feature in Storage magazine, said it's time for debates and worries about "collisions" while deduping with the SHA-1 algorithm to end.
"I'm not a numbers guy," he said. "I had to sit down and work this out, but I figured out that the likelihood of a collision with a 180-bit SHA-1 hash is one in two to the180th power.
"Meanwhile the chance of writing a bad block to disk, without even knowing it, is one in two to the 50th power … And two to the180th power is a bigger number than all the blocks in the known computing universe."
Next time you're at a Storage Decisions show, buy Curtis a drink. You'll be well prepared for your next cocktail party.
iSCSI -- all or nothing?
Based on a poll of about 100 attendees at the Emerging Technology Showcase session on Wednesday afternoon, the use of iSCSI looks at this point like an all-or-nothing phenomenon. While 47% of the respondents said they weren't using iSCSI at all, the largest group of those who were using it (22% of all respondents) said they were running it for their mission-critical applications.