LAS VEGAS -- Fasten your seatbelts for the latest storage growth projections. IBM predicts that the 3.2 million exabytes of information that currently exist on earth will reach 43 million exabytes by the year 2005. Along with this exponential growth in capacity comes an equally thorny issue for today's enterprise storage administrator: figuring out how to manage it all.
These are just a few of the insights shared Monday by Barry Rudolph, vice president of storage systems at IBM and keynote speaker at this year's IBM Storage and Storage Networking Symposium in Las Vegas, Nev.
"Things are moving at an unbelievable pace," said Rudolph. "We are doubling our storage capacity every year." An exabyte is equal to one thousand petabytes, or one million terabytes.
Rudolph managed to raise both excitement and anxiety over these staggering figures. "Most of you are probably scared to death," he told the audience of approximately 500. "We are sitting on the edge of a very exciting industry," he said. "The most important element in businesses in the coming years will be in storing and managing data."
Rudolph went on to say that two-thirds of today's data currently resides in isolated environments on the desktop, but that he expects centralized data and home data to grow exponentially in the coming years.
Laying out IBM's storage vision, Rudolph attempted to answer the basic question of how IT professionals could exploit this data and manage it within their organizations. He outlined five fundamental industry initiatives that he said would promote universal data access and management: storage virtualization, IP storage, SAN management software, SAN file systems and policy-based automation.
Rudolph said storage virtualization would simplify operations, reduce human labor and give storage professionals the ability to change the physical storage environment without having to change things on the applications level.
He touted IP storage (NAS and iSCSI) as faster and less complex to manage.
Rudolph said SAN management software would enable better scaling and increased applications availability, while SAN file systems would provide a common infrastructure for better management, lower cost of ownership, and rapid deployment.
Finally, Rudolph said policy-based automation would lower storage complexity and automate operations for better scaling and improved applications availability.
He also stressed the critical role of interoperability testing across the globe.
"All of this will make little or no sense without interoperability," he said. "There is simply no way we can manage data in fully proprietary environments."
Rudolph's advice to IT storage professionals? Take the opportunity to learn, influence, and lead the way.
"Hang on. This is going to be an exciting ride over the next few years," he said.