Up and coming
The following technologies are heating up, but are not quite ready for enterprise storage shops.
Emerging technology is risky. The vendors are often startups and the technology isn't battle tested. But if it will solve your storage problem, it could be worth the risk. "We were a little nervous" about buying a storage encryption appliance from Neoscale Systems Inc., admits Kevin Granhold, director of server and desktop services at the University of Texas Health Sciences Center in Houston. The vendor was small and the technology was new.
But Granhold's group believed the benefits outweighed the risks. The Health Sciences organization stores large amounts of patient and research data. As a result, it faced several data privacy issues relating to HIPAA and various internal policies. After looking at the usual access and privacy-control products, it chose Neoscale's encryption appliance. "It was cheaper and easier to encrypt it all than to deal with different policies and procedures for the various data," Granhold says.
The emerging technologies discussed here include continuous data protection (CDP), intelligent storage switches, storage encryption, network-attached storage (NAS) accelerators and storage virtualization. Several other storage technologies are equally interesting, but are currently in earlier phases of the adoption cycle. These include grid storage, serial-attached SCSI (SAS) and InfiniBand (see "Up and coming" on this page).
Continuous data protection
CDP--also referred to as continuous backup--describes products that track and save information about changes to stored data. (See "Nonstop data protection.") In the event of a failure or data corruption, a CDP system can return the data to any point in the past, usually just before the failure or corruption occurred. CDP vendors include Mendocino Software, Revivio Inc. and XOsoft Inc. The field is becoming crowded, and while the larger vendors haven't weighed in yet, they're starting to make noises.
"We are big advocates of CDP. We think it is the future of data protection," says Nancy Hurley, senior analyst at Enterprise Strategy Group. The reason: Despite decades of work on the problem, "backup is still the No. 1 headache," she says. CDP promises to eliminate that headache once and for all. In a survey of buying intentions by TheInfoPro, roughly half of the 90 managers who responded either had purchased CDP technology or planned to do so by 2006.
Each vendor does CDP a bit differently, but the basic approach is to record every change to the stored data. You're essentially making a journal or index of the changes that are time-stamped when they're entered. When you need to restore data, you just restore, or rewind, to the point in time you want to recover.
CDP differs from point-in-time snapshots in three ways. First, you don't have to halt activity as you do with snapshots. Second, you store only the changes rather than repeated copies of the data, so you don't need as much disk capacity. Third, you can restore to any point in the past, unlike snapshots, which are taken at specific intervals.
"Right now, companies are doing multiple hot mirrors and snapshots on their most expensive disk arrays to try to get the same result. Why bother when you can replicate block I/O actions to a free-flowing journal and go back to any point in time?" asks Jon Toigo of Toigo Partners International.
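The journal-and-rewind scheme described above can be sketched in a few lines. This is an illustrative model only, not any vendor's implementation; the class and method names are invented for the example.

```python
import time

class ChangeJournal:
    """Toy CDP journal: every write is recorded with a timestamp so
    the data can be 'rewound' to any instant before a corruption."""

    def __init__(self):
        self.entries = []  # append-only list of (timestamp, block_id, data)

    def write(self, block_id, data, ts=None):
        self.entries.append((time.time() if ts is None else ts, block_id, data))

    def restore(self, point_in_time):
        # Replay journaled writes up to the chosen instant; writes made
        # after that point (e.g., the corruption) are simply never applied.
        volume = {}
        for ts, block_id, data in self.entries:
            if ts > point_in_time:
                break
            volume[block_id] = data
        return volume

j = ChangeJournal()
j.write("blk0", "good data", ts=100.0)
j.write("blk0", "corrupted!", ts=200.0)
assert j.restore(150.0)["blk0"] == "good data"  # rewind past the corruption
```

The journal grows with every change, which is why scalability in data volume is the open question for these products.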
Compulinx Managed Services Inc., White Plains, NY, came to a similar conclusion. The company operates as a managed service provider with 70TB of storage spread across what Terrence Chalk, CEO of Compulinx, describes as four class-A data centers. Customer data is replicated among the data center storage area networks (SANs) for disaster-recovery purposes.
To ensure high availability, Compulinx made XOsoft's CDP capabilities part of its service offering. Using itself as the test subject, Compulinx ran XOsoft's Data Rewinder on a subset of its 250 servers at multiple facilities. On two occasions, service went down. "We were able to very quickly get everything back up using Data Rewinder," says Chalk. Since then, the company has been using CDP as a standard service for its high-availability customers. "It became clear that we don't need tape. With XOsoft, we can recover the data faster," he adds. The company has been without tape backup for more than six months.
How hot: heating up fast.
Risk factor: medium, because the technology may fail to scale, either in the volume of data it can handle or the performance it delivers.
Buy or pass: If you need the ability to recover current data fast, go for it.
Intelligent storage switches
"Will intelligence in the fabric be important? Yes, but don't expect to see much for 18 to 24 months," says ESG's Hurley.
At this point, however, "There has been a ton of hype," says Arun Taneja, founder of the Taneja Group. "The only one I see who's really delivering it is Maxxan." (See "Next-gen switches".) Hurley would add Maranti and several larger switch vendors to the list, such as Cisco Systems Inc. and Computer Network Technology Corp., that have announced initiatives or have just started shipping products.
According to TheInfoPro buying trends survey, 43% of respondents plan to deploy intelligent switches by 2006.
Intelligent switches are capable of running software services that now run on the host or are built into the firmware of the disk array, such as replication, mirroring or volume management. The advantage to putting that kind of functionality on the switch is that the switch can see all the hosts and disks, regardless of the vendor. This allows an organization to consolidate services that it buys from multiple vendors, saving money on software licenses and simplifying storage management.
How hot: warming up.
Risk factor: still highly proprietary; intelligent switch ports cost roughly two-and-a-half times as much as ordinary Fibre Channel ports.
Buy or pass: Unless the cost of services from multiple vendors is unbearable, wait.
Storage encryption
The case for storage encryption began building as storage became increasingly networked. Multiple hosts could access a shared pool of disk capacity, which increased data vulnerability. SANs, which rely on complicated zoning and LUN masking for security, aren't considered secure. Encryption ensures that even if both host and SAN security is breached, the data will remain unreadable to unauthorized parties.
Storage encryption typically is delivered as an appliance that encrypts and decrypts data as it moves in and out of storage. All the encryption capabilities are handled by the appliance, which requires little in the way of management. The challenge is to encrypt quickly enough not to impede storage performance, while remaining transparent to both the application and the storage.
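The in-band, transparent model can be illustrated with a toy sketch. Everything here is invented for illustration: a dictionary stands in for the disk array, and a throwaway XOR keystream stands in for the AES hardware a real appliance would use. Do not treat the cipher as secure.

```python
import os

class EncryptingAppliance:
    """Toy model of an in-band encryption appliance: data is encrypted
    on the way to storage and decrypted on the way back, invisibly to
    the host application."""

    def __init__(self, backing_store):
        self.key = os.urandom(32)   # key lives in the appliance, not the array
        self.store = backing_store  # dict standing in for the disk array

    def _xor(self, data):
        # Placeholder cipher; a real appliance would use AES in hardware.
        return bytes(b ^ self.key[i % len(self.key)] for i, b in enumerate(data))

    def write(self, lun, data):
        self.store[lun] = self._xor(data)   # only ciphertext is at rest

    def read(self, lun):
        return self._xor(self.store[lun])   # plaintext returned to the host

disk = {}
appliance = EncryptingAppliance(disk)
appliance.write("lun0", b"patient record")
assert disk["lun0"] != b"patient record"          # unreadable on the array
assert appliance.read("lun0") == b"patient record"
```

The point of the sketch is the data path: the host never sees the key or the ciphertext, which is why the approach requires so little management.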
Although the products are pretty good, says Hurley, "storage encryption is a tough sell." The problem is that enterprises are already spending a ton of money on security and don't want to buy yet another security product. But regulatory compliance is driving interest in encryption, as was the case at the University of Texas, which is pushing encryption products into the data center. Leading vendors include Decru Inc., Neoscale Systems Inc. and Vormetric Inc.
How hot: slow simmer.
Risk factor: hard to build yet another security business case.
Buy or pass: Unless auditors or regulators are putting pressure on you, wait.
NAS accelerators
NAS wide area accelerators are gaining traction on Wall Street, where they are undergoing extensive testing, reports Taneja. These products are appliances that speed up the TCP/IP connection between remote NAS storage and the main data center. The University of New Hampshire has established an interoperability lab for storage, and NAS accelerators from Actona Technologies and Tacit Knowledge Systems Inc. have achieved impressive results there, Taneja adds.
NAS accelerators work by effectively taking TCP/IP out of the WAN picture as far as storage goes. They terminate the TCP/IP connection locally and insert their own protocol, effectively removing TCP/IP's error checking and latency. Additionally, the appliance may use techniques such as difference recognition, local read/write, read-ahead capabilities, data streaming and compression to speed the transmission, all of which makes the remote NAS storage perform as if it were on the local LAN. The savings in reduced WAN costs and remote storage management can be considerable.
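Two of the techniques named above, difference recognition and compression, can be sketched simply. This is a hypothetical model, not any vendor's protocol; real appliances also replace TCP with their own transport, which is not modeled here.

```python
import hashlib
import zlib

def accelerate(chunks, seen):
    """Toy WAN accelerator: chunks the far side has already received are
    sent as short hash references (difference recognition); new chunks
    are compressed before hitting the wire."""
    wire = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).digest()
        if digest in seen:
            wire.append(("ref", digest))              # 32 bytes instead of the chunk
        else:
            seen.add(digest)
            wire.append(("data", zlib.compress(chunk)))
    return wire

seen = set()
first = accelerate([b"A" * 4096, b"B" * 4096], seen)   # cold cache: all data
second = accelerate([b"A" * 4096, b"C" * 4096], seen)  # repeated chunk becomes a ref
assert second[0][0] == "ref"
assert second[1][0] == "data"
```

On file workloads with heavy repetition, most traffic collapses into references, which is where the LAN-like performance over the WAN comes from.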
In addition to Tacit and Actona, many vendors are working on approaches to WAN acceleration. Vendors include DiskSites Inc., Expand Networks, NetCelera, NetScaler Inc., Network Executive Software, Orbital Data Corp., Packeteer Inc., Peribit Networks and Riverbed Technology. Some of these companies are just out of stealth mode, and are searching for their next venture capital funding, says Taneja. Be forewarned.
How hot: warming up.
Risk factor: Many different approaches and immature companies may make choosing chancy.
Buy or pass: If you maintain file storage at many remote sites, seriously research these products and technologies.
Storage virtualization
The amount of vaporware surrounding storage virtualization is enough to turn anyone into a cynic. DataCore Software Corp., FalconStor Software Inc. and Hewlett-Packard Co. have offered working storage virtualization for some time. But overall, the market has been long on claims and promises, and short on actual products from major vendors. Until now.
Storage virtualization consists of a software layer that masks differences in a heterogeneous storage environment through a common interface, a common namespace and common management. It uses filesets to combine storage from different devices into common pools and virtualizes block storage.
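At the block level, the pooling idea amounts to mapping one logical address space across capacity contributed by dissimilar arrays. The sketch below is illustrative only; the class and device names are invented for the example.

```python
class VirtualPool:
    """Toy block virtualization layer: logical block addresses (LBAs)
    are translated to (device, local offset) pairs, so hosts see one
    pool regardless of which vendor's array holds the blocks."""

    def __init__(self):
        self.extents = []  # (start_lba, size, device)

    def add_device(self, device, size):
        # New capacity is appended to the end of the logical address space.
        start = sum(s for _, s, _ in self.extents)
        self.extents.append((start, size, device))

    def locate(self, lba):
        # Translate a pool-wide LBA to the owning device and its offset.
        for start, size, device in self.extents:
            if start <= lba < start + size:
                return device, lba - start
        raise ValueError("LBA out of range")

pool = VirtualPool()
pool.add_device("vendorA-array", 1000)  # hypothetical heterogeneous arrays
pool.add_device("vendorB-array", 500)
assert pool.locate(1200) == ("vendorB-array", 200)
```

Because the mapping lives in the virtualization layer, capacity from either array can be grown or migrated without the host's addressing changing, which is where the management payback comes from.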
"IBM's SAN Volume Controller [SVC] is a good, solid product," says Toigo. The SVC and the SAN File System are part of IBM's Virtualization Engine Suite for Storage. "Everyone has been waiting for this for years, at least since IBM started talking about StorageTank maybe four years ago," says Mike Karp, an analyst at Enterprise Management Associates in Boulder, CO. Unlike those who have tried to introduce storage virtualization products previously, IBM seems to understand that virtualization really isn't a product at all, but a capability that enables benefits such as lower TCO and streamlined management, Karp explains. When virtualization is applied to storage, the business payback can be significant.
How hot: was simmering, now suddenly very hot.
Risk factor: extremely difficult to do, especially in a highly heterogeneous environment.
Buy or pass: Let the early adopters bleed a bit as they smooth out rough edges.
Buy now, pay later?
You can buy products for each of these technologies, but the products are proprietary and many of the vendors are small--a combination that raises the risk level considerably. Although the products promise big returns on investment by offering solutions to nagging problems, you need to weigh the risks against the benefits. The real question: How much of a guinea pig do you want to be?