Published: 01 Dec 2014
Reference architectures have always been important, but in this era of software-defined everything, they play a much bigger role for IT and systems integrators. The first step, though, is to define the term reference architecture. I checked Wikipedia for a definition and pulled out a couple of key passages that explain its meaning:
- "A reference architecture in the field of software architecture or enterprise architecture provides a template solution for an architecture for a particular domain. It also provides a common vocabulary with which to discuss implementations, often with the aim to stress commonality."
- "Adopting a reference architecture within an organization accelerates delivery through the re-use of an effective solution and provides a basis for governance to ensure the consistency and applicability of technology use within an organization."
Let me put these excerpts in perspective by using a few examples.
Before the software-defined storage (SDS) era, if you bought a VNX storage array from EMC, all the software and hardware came from EMC. If you bought replication, thin provisioning and snapshot software, you were assured it would all work together. EMC was dealing with a contained set of products and controlled all of them. But EMC still provided a set of guidelines to ensure you got the best experience from the implementation, and it likely offered several reference architectures describing what to do and what to avoid, and how to configure servers and network switches to reach a certain level of performance for a given application, such as SAP. Even when EMC controlled all aspects of storage, there was still a need for reference architectures.
Software-defined storage aided by reference architectures
Now let's take an example of a classic SDS product and see how a reference architecture becomes even more important. DataCore SANsymphony was probably the industry's first example of software-defined storage. SANsymphony logically sat in front of a wide variety of arrays from different vendors, and was designed to maximize utilization and bring uniformity to the mish-mash of functionality built into each array. Some arrays might have been overused while others were underutilized; functionality varied from array to array, and even when two arrays offered the same function, they implemented it differently. SANsymphony corralled the disparate hardware and provided a common way of delivering storage services. DataCore probably limited support to a set of products defined in a hardware compatibility list. In addition, the firm likely provided a set of guidelines based on its own experience and the experiences of its customers, and it possibly also supplied reference architectures for specific application areas. These reference architectures -- templates, guidelines and best practices spanning multiple vendors' hardware -- were even more critical than those supplied by the array vendors themselves.
Hyper-converged system users need guidance, too
Even in the case of hyper-converged appliances -- where compute, storage, networking, server virtualization, data protection, WAN optimization, data deduplication and other technologies are all built into a single node -- there's still a need for a reference architecture. Convergence and hyper-convergence are designed to make infrastructure deployment and day-to-day management easier. But vendors of those products provide reference architectures for a variety of applications and deployment sizes so users can reap the benefits of convergence quickly. Of course, a key feature of hyper-convergence is flexibility, so if performance isn't adequate, you can add another node. But initial design still matters, and many of those design issues can be headed off by having the right reference architecture for a given application or mix of applications.
Eventually, I believe hyper-converged vendors will develop specific models for targeted workloads of a particular size and users will simply pick the right model without having to worry about reference architectures. But until we get there, reference architectures matter.
Tried, tested and true
Reference architectures are templates of what works well together for specific use cases; they inject the experience of developers and users so new users don't stray down blind alleys. Reference architectures encompass best practices, cite dependencies, warn you if certain combinations are problematic and accelerate delivery of results from an IT infrastructure.
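To make the idea concrete, here is a minimal sketch of a reference architecture captured as data rather than a document. Everything in it is hypothetical -- the architecture name, the feature names and the rules are invented for illustration, not drawn from any vendor's actual templates -- but it shows the three things the paragraph above describes: required ingredients, dependencies between them, and combinations known to be problematic.

```python
# A toy reference architecture: supported features, dependencies and
# known-bad combinations, plus a check of a proposed deployment against it.
# All names here are invented examples, not a real vendor template.

REFERENCE_ARCHITECTURE = {
    "name": "hypothetical-sds-oltp",  # assumed example name
    "required": {"replication", "thin_provisioning"},
    # snapshots only work when thin provisioning is also deployed
    "depends_on": {"snapshots": {"thin_provisioning"}},
    # deduplication plus synchronous replication is a known-bad pairing
    "problematic": [frozenset({"dedupe", "sync_replication"})],
}


def validate(features, ra=REFERENCE_ARCHITECTURE):
    """Return a list of warnings for a proposed feature set."""
    warnings = []
    for feat in sorted(ra["required"] - features):
        warnings.append(f"missing required feature: {feat}")
    for feat, deps in ra["depends_on"].items():
        if feat in features and not deps <= features:
            missing = ", ".join(sorted(deps - features))
            warnings.append(f"{feat} requires: {missing}")
    for combo in ra["problematic"]:
        if combo <= features:
            warnings.append(
                "problematic combination: " + ", ".join(sorted(combo))
            )
    return warnings
```

A build that follows the template passes cleanly, while one that skips thin provisioning but deploys snapshots gets both a missing-feature and a dependency warning -- exactly the kind of blind alley a written reference architecture steers new users away from.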
Reference architectures have always been important to IT. But with software-defined everything, the number of potential interactions among components grows dramatically, and the need for a reference architecture increases accordingly.
About the author:
Arun Taneja is founder and president at Taneja Group, an analyst and consulting group focused on storage and storage-centric server technologies.