I need to mesh 250 servers (2 Gig) and 130 storage devices (1 Gig) so that each server has any-to-any connectivity and full 2 Gig throughput. I can create two logical 1 Gig devices per storage device and use the two ports on the storage device together to provide 2 Gig throughput. How do I mesh it all together?
There are a few things about this question that need clarification. Does it mean that any of the 250 servers can communicate with any of the 130 storage devices? I think that's what is being asked. Or does it also include server-to-server connections? There is also a question here about trunking sessions over two links to achieve higher bandwidth.
First off, a single storage I/O connection today cannot be divided up over two separate links. The order of I/Os must be preserved for integrity purposes, and there is no way to do that with 100% accuracy when two separate links are used. A single path will be chosen, and there will be no deviation from it for the length of the session.
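To illustrate the point, here is a minimal sketch (my own illustration, not any vendor's implementation) of flow-based path selection: every frame of a given session hashes to the same link, which preserves I/O order but caps a single session's bandwidth at one link's speed.

```python
import hashlib

def pick_link(src_port: str, dst_port: str, num_links: int) -> int:
    """Deterministically map a (source, destination) session to one link."""
    key = f"{src_port}->{dst_port}".encode()
    digest = hashlib.md5(key).hexdigest()
    return int(digest, 16) % num_links

# Every frame of the session server42 -> array7 lands on the same link,
# so frames can never be reordered by taking different paths:
chosen = {pick_link("server42", "array7", 2) for _ in range(1000)}
print(len(chosen))  # 1 -- never load-balanced frame-by-frame
```

The names `server42` and `array7` are hypothetical; the design choice shown is that order preservation comes from pinning the whole session to one path, which is exactly why two 1 Gig links do not add up to 2 Gig for a single session.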
Second, if you are talking about meshing 250 servers and 130 storage ports, you would probably need 32-port switches. This is a very large mesh, and from my limited understanding of mesh topologies it is not something to be taken lightly.
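Some back-of-envelope arithmetic (my own, not from the original question) shows why the mesh is daunting: 250 server ports plus 130 storage ports is 380 edge ports, and in a full mesh of N switches each switch burns N-1 ports on inter-switch links (ISLs), leaving only 32-(N-1) ports for edge devices.

```python
def edge_capacity(n_switches: int, ports_per_switch: int = 32) -> int:
    # In a full mesh, each switch spends (n_switches - 1) ports on
    # single ISLs to every other switch; the rest are edge ports.
    return n_switches * (ports_per_switch - (n_switches - 1))

best = max(edge_capacity(n) for n in range(2, 33))
print(best)  # 272 -- peak edge capacity, reached at 16 switches
```

Under this (admittedly simplified, single-ISL) assumption, a full mesh of 32-port switches tops out at 272 edge ports no matter how many switches you add, which is well short of the 380 ports needed.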
Third, the number of storage devices appears too small relative to the number of servers. Why 130 storage devices for 250 servers?
Fourth, if every server can access every storage device, does that mean you want them all to access the same data on the same volumes, or to access data on different volumes? If the latter is true, do you need to create 250 partitions on each storage device? That would be virtually impossible to manage. If you want them to share data on this storage, you would likely go crazy trying to implement a lock manager across 250 servers. I don't know for sure, but it probably can't be done.
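The scale of that management problem is easy to quantify (my arithmetic, assuming one partition per server on each storage device, as described above):

```python
servers = 250
storage_devices = 130

# One partition per server on every storage device:
partitions = servers * storage_devices
print(partitions)  # 32500 partitions to create, map, and secure
```

That is 32,500 separate partitions to provision and keep track of, which is why I call it virtually impossible to manage.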
Maybe I missed something in your request, but I'd be afraid of taking on an impossible task.
Editor's note: Do you agree with this expert's response? If you have more to share, post it in our Storage Networking discussion forum at http://searchstorage.discussions.techtarget.com/WebX?50@@.ee83ce4 or e-mail us directly at firstname.lastname@example.org.
Related Q&A from Marc Farley