
University rocks with clustered storage

Integration between Dell, EMC and channel partners helped the university implement a new 834-node supercomputing cluster.

The State University of New York at Buffalo's Center for Computational Research (CCR) -- a behemoth that includes multiple high-powered server clusters connected to a massive Hewlett-Packard Co. SAN with 40 terabytes (TB) of usable capacity -- is so large that even the code names for its machines have taken on a life of their own.

According to its director, Dr. Russ Miller, the "CCR" acronym reminded administrators of the '70s rock band Creedence Clearwater Revival, also known as CCR. Thus, a system of naming machines after Rock and Roll Hall of Fame inductees was devised. The building's security system is nicknamed "The Doors." Miller's computer is named "Fogerty," after Creedence Clearwater Revival's lead singer, John Fogerty.

"We like to have a little fun where we can," Miller said of the naming scheme. But the purchase of a new 834-node server cluster for the university was a very serious undertaking.


According to Miller, earlier this year "most of our machines were starting to show their age. The queue lengths were getting very long, to the point where there was starting to be some user frustration."

CCR decided to address these issues by adding a new cluster that emphasized balance and redundancy. The center also sought a single point of contact for the complex system when soliciting bids.

"We were looking at the system as a system – a full package, not with a company giving us the right computers but not the right interconnects and storage," Miller said. "We made it clear in our request for proposal that we were looking for a balanced system that could handle a very wide and varied computing audience."

That audience, according to Miller, includes everything from data mining projects for local food stores to hurricane simulations for study in the earth sciences department. Many research applications are so large, Miller said, that their calculations have to be moved from a PC or laptop node into secondary storage just to be processed.

A bid from Dell Inc., Miller said, "pulled together not just Dell servers and all of the Dell management strategy, but also pulled in two kinds of interconnect fabric. One is Gigabit Ethernet from a company called Force10 and the other is from a company called Myricom Inc., which is a very, very high-end connectivity between the nodes."

Besides the connection fabric and Dell's PowerEdge SC1425, 1850 and 2850 servers, the new system includes three 10 TB EMC Corp. CLARiiON disk arrays fronted by 24 I/O nodes running clustered file system (CFS) software from Ibrix Inc., which was recently certified by EMC and is currently the only CFS on the EMC support matrix.

Miller said the collaboration between competitors was "beautiful. I figure if there was any sniping going on, I'd know about it because they've all been working right outside my office. But everybody worked well together. They were wonderful about getting the right vendor here making the right delivery at the right time."

Drawbacks to the new cluster have been minimal, according to Miller. "Our staff had to invest some time in working with the integration team to tune the network connection between the new cluster and the HP SAN in order to achieve maximum performance," he said, a process he called "business as usual."

"We expect to have to tune the system over the next three months. It will happen. It's inevitable as you bring users on," he explained.

In the future, Miller predicted more vendors would offer packaged systems like Dell's cluster, and improve on its architecture. "Right now we have heterogeneous systems within individual supercomputer sites," Miller said. "Our users have to figure out either how to get from one to the other or use our grid. Eventually, individual vendors will have offerings where within a single system they will have combined many different aspects of computing, storage networking and visualization."

For the immediate future, it's back to rock 'n' roll for CCR. The new system's handiwork will be demonstrated on MTV Friday nights this summer with the debut of music videos by bands Franz Ferdinand, Blink-182 and the Beastie Boys, among others, combining 3D video game environments with the band members and their songs.

As for the obligatory nickname, the new cluster has been christened "U2" after one of the most recent Rock and Roll Hall of Fame inductees. The two front-end servers are nicknamed Edge and Bono. Two administration nodes, which monitor the system and push out updates, are known as Larry and Adam.

"The bass player and drummer, of course," Miller explained, "being the backbone of any good band."
