Benchmarking Storage Systems, Part 3
This is the final installment in our three-part series on benchmarking. Part 1 examined each of the components that might be included in a typical benchmark, while Part 2 looked at developing representations of your workload as well as the pros and cons of using your applications and real data in the benchmark as opposed to developing emulations of both.
In the previous two storage systems benchmarking articles we covered:
- Types of hardware and software to benchmark
- Types of benchmarks
- Application characterization
- Vendor issues
This leaves several steps that still need to be covered in order to complete the benchmarking process:
- Internal agreement on scoring
- Writing the specification for the benchmark and the vendor proposal
- Analysis of vendor responses
Each of these areas needs to be addressed, agreed upon by all involved, and scheduled as part of the whole procurement process. A formal procurement process, though significant work on your side, is also significant work for the vendors and should be made as painless as possible for all.
In addition, from what I have seen, a formal, open, and fair procurement process tends to get the customer a better price than calling up a vendor and asking, "How much would 10 TB of Fibre Channel RAID cost me?" The most important benefit, though, is that by going through a formal benchmark process, everyone from the accounting department to the system administrator knows what the organizational and operational requirements are, as they will have been specifically defined as part of the procurement process.
The process of scoring a benchmark is just plain hard. You have different groups wanting different things that all need to fit within the budget, and vendors do not necessarily make it easy to determine what has been bid at what price, what the cost of maintenance will be, and over how many years that will span. Add to that the services from the vendor, and understanding the many cost factors can be hard work.
From what I have seen over the years it is critical that the scoring methodology be created and agreed upon before the benchmark is released. This saves a great deal of internal infighting and ensures that favorite vendors from each of the departments as well as not-so-favorite or unfamiliar vendors all remain on a level playing field.
One thing I would strongly suggest to reduce the complexity of scoring is defining a maintenance price that all vendors will bid to. More often than not, vendors price maintenance so differently that scoring the actual price becomes an exercise in futility. If you tell the vendors that the maintenance cost shall be, say, 8% of the purchase price and have them bid the initial price based on that 8%, you will significantly reduce the complexity of scoring the bid. You may also want to add an inflationary factor into the maintenance cost, depending on the lifecycle of your procurement.
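To make this concrete, here is a minimal sketch of how fixing the maintenance rate collapses each bid into a single comparable lifecycle number. The bid amounts, the 8% rate, and the 3% inflation factor are all hypothetical illustrations, not figures from any real procurement.

```python
# Sketch: lifecycle cost when every vendor bids maintenance at a fixed
# percentage (here 8%) of the purchase price, with an optional annual
# inflation factor applied to each subsequent year of maintenance.
# All numbers below are hypothetical.

def lifecycle_cost(purchase_price, years, maint_rate=0.08, inflation=0.03):
    """Purchase price plus `years` of maintenance, where maintenance is
    maint_rate of the purchase price, inflated each year."""
    total = purchase_price
    annual = purchase_price * maint_rate
    for year in range(years):
        total += annual * (1 + inflation) ** year
    return total

# Two hypothetical bids become directly comparable on one number:
bid_a = lifecycle_cost(500_000, years=5)
bid_b = lifecycle_cost(460_000, years=5)
```

Because every vendor carries the same maintenance rate, the comparison reduces to the purchase price and the lifecycle length you chose, which is exactly the simplification the fixed-percentage approach buys you.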
Before you start, everyone involved in the process has to agree on what is important. This means that the people that are spending the money have to agree on what they have in the budget, the people who maintain the equipment have to provide what they have in their O&M (operations and maintenance) budget, and the group responsible for defining the performance requirements has to provide the specific performance requirements.
So how do you score all of this? Here are some scoring examples from some real benchmarks I have seen and participated in:
- Price alone – We need this much RAID-X storage, with the lowest price winning
- Price and performance and reliability requirements – We need this much RAID-X storage, you must meet this performance criteria and these requirements, and the reliability must meet these requirements
- Price, reliability and performance scaling – We need this much RAID-X storage and you must meet a minimum requirement, but vendors are encouraged to submit bids with better than the minimum performance. The reliability must meet these requirements
- Performance is the dominating factor – A few bids require performance near the edge of what is currently commercially available. For these bids, price is a consideration but performance dominates, as the customer must be willing to pay for such a demanding requirement. Reliability is also defined in this type of procurement
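The middle cases above combine price, performance, and reliability into a single score. A minimal sketch of one way to do this, normalizing each bid against the best-in-field value for each category, follows; the weights, vendor names, and figures are all hypothetical, and a real scoring model would be agreed upon internally before release.

```python
# Sketch: a weighted scoring model agreed on before the benchmark is
# released. Lowest price and highest performance/reliability each earn
# full marks for their category. All weights and figures hypothetical.

def score_bids(bids, weights):
    """Return a 0-100 score per bid, normalized against the best bid
    in each category and combined by the agreed weights."""
    best_price = min(b["price"] for b in bids.values())
    best_perf = max(b["performance"] for b in bids.values())
    best_rel = max(b["reliability"] for b in bids.values())
    scores = {}
    for name, b in bids.items():
        s = (weights["price"] * best_price / b["price"]
             + weights["performance"] * b["performance"] / best_perf
             + weights["reliability"] * b["reliability"] / best_rel)
        scores[name] = round(100 * s, 1)
    return scores

bids = {
    "vendor_a": {"price": 500_000, "performance": 900, "reliability": 0.999},
    "vendor_b": {"price": 450_000, "performance": 700, "reliability": 0.995},
}
weights = {"price": 0.5, "performance": 0.3, "reliability": 0.2}
scores = score_bids(bids, weights)
```

Shifting the weights is exactly what moves a procurement along the spectrum from price-centric to performance-centric, which is why the weights must be settled before any vendor sees the benchmark.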
The Real Price
One additional area to consider is the real price. The real cost of storage hardware is not just the purchase price and maintenance price, but also includes the cost to operate the system. This cost includes:
- Training your personnel
- In some cases, the cost of maintaining spare parts
- In some large sites, the cost of having vendor personnel on-site and the space that is required
All of these must be considered in the actual pricing model section of the scoring model that you need to develop.
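The cost components above can be folded into a single "real price" figure per bid. The sketch below is one hypothetical way to structure that part of the pricing model; every parameter name and figure is illustrative.

```python
# Sketch: the "real price" of a bid is purchase plus maintenance plus
# operating costs (training, spares, on-site vendor personnel) over the
# system's lifecycle. All parameters and figures are hypothetical.

def real_price(purchase, maintenance_per_year, years,
               training=0, spares=0, onsite_per_year=0):
    """Total cost of ownership over `years`, including operating costs."""
    return (purchase
            + maintenance_per_year * years
            + training
            + spares
            + onsite_per_year * years)

# A bid that looks cheaper on purchase price alone may not be cheaper
# once recurring on-site and spares costs are included:
total = real_price(500_000, 40_000, years=5,
                   training=20_000, spares=10_000, onsite_per_year=50_000)
```

The point is not the arithmetic but the discipline: every recurring cost gets a named line in the scoring model rather than surfacing as a surprise after the award.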
As you can see, the different types of scoring run the gamut from price-centric to performance-centric. This is why it is so critical for everyone in an organization doing a procurement to agree on the scoring model before the procurement goes out. If this is not done, the procurement often drags on while the various departments fight political battles over whose requirements matter most.
One thing that should go without saying but is sometimes lost in the fray is the importance of not allowing the vendors to get their hands on the scoring models. If they do get the models – even without the pricing models – you will end up with everyone providing the same price/performance ratios. In over 23 years of being involved with benchmarks, I have never seen a case where it is in the buyer’s best interest to provide the scoring model. When this does happen, vendors tend, and rightly so, to optimize for the scoring model with their architecture rather than providing the best bid with the hardware they can propose.
Writing the Specification
This will likely be the most difficult part of any procurement process. For large, complex procurements, hundreds of pages might be written. These specifications are generally divided into six parts. Sometimes these are separated into two different documents: one technical document for the benchmark and one for the remaining information. The six general parts of the specification are as follows:
- How to respond to the bid, ask questions, and the bid requirements
- Background on the procurement and site(s)
- Description of what hardware and software you require
- Description of the overall environment as well as environmental factors such as power and cooling
- Cost proposal
- Benchmark description and the rules
How to Respond
This part of the document specifies issues such as:
- The number of pages a response is allowed – Vendors will and can inundate you with paper. It is always a good idea in large procurements to set limits
- The font size and page settings for the response – Believe me, this is quite important. I have written responses with small margins and 8-point type, and I have even heard of one response in 6-point type with eighth-of-an-inch margins
- How questions are asked and to whom they are directed – You do not want vendors calling everyone in the organization to ask questions. Also, in some cases, you’ll want to document the questions and provide them to all of the vendors.
- When the benchmark and proposal are due – To the hour, minute, and second, and, of course, where they are due
If you clearly document these requirements, the number of questions will be reduced and the review of the responses will be simplified.
Background on the Procurement and Site(s)
It’s useful to clearly state what you are trying to do and why you are trying to do it. This is especially important for vendors that do not know much about your site and your business requirements. Clearly stating this type of information puts new vendors on a more even playing field with any vendors that do know your environment. This is especially important if you are doing a purchase for multiple sites in different locations.
In this section it’s important to clearly state the maintenance expectations. If you have a site in Key West, Florida, for example, and expect a 4-hour maximum response time, and a vendor cannot meet the requirement, they should know this before spending any time working on the benchmark.
Description of What Hardware and Software You Require
The benchmark instructions must include a description of the hardware and software required to run the benchmark. This may include items such as:
- C/C++ compiler to run the benchmark code
- SQL tools
- File Systems/Volume Managers/HSM (Hierarchical Storage Management)
- Storage definitions
- Fibre Channel switches
It’s only fair to the vendors to provide a specific checklist of software that they will need as part of the benchmark.
Software is often purchased during storage procurements to manage, monitor, and fully utilize the new system. This may include some of the same items used to run the benchmark, but usually includes additional items such as SANs and systems management and monitoring software.
Description of Overall Environment and Environmentals
You should provide a detailed description of the site(s) and the environmentals. This should include:
- Information on available floor space
- Information on power and cooling
- Current floor loading limitations
- Shipping address(es)
As Clint Eastwood said in Magnum Force: “A man’s got to know his limitations,” and vendors need to understand the environment and facilities they’ll be working with.
Cost Proposal
The cost proposal should detail all of the requirements for pricing the system, including:
- Hardware costs
- Software costs
- Hardware and software maintenance and upgrade policy
- Professional services
This section often provides instructions that define the requirements for services and asks for a single price to simplify the information from each vendor.
Benchmark Description and the Rules
This is by far the most difficult section to write. The complexity increases with the number of benchmark applications that you run: the more applications there are, the more complex running them and checking the results becomes.
You need to write the rules in such a way that the vendors clearly understand what is expected of them. The rules should be written such that every vendor is running your benchmarks in exactly the same way.
You want to benchmark the systems, not the benchmarkers. For example, when benchmarking a RAID you must specify the number of HBAs (host bus adapters) and the hosts to which the RAID will be attached. Without doing this you might find that a vendor uses more HBAs and attaches to a host with a faster PCI or PCI-X bus. Leave nothing to chance: specify as much detail as possible.
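One way to leave nothing to chance is to express the rules as structured data that can be checked mechanically against each vendor's reported configuration. The sketch below is a hypothetical fragment of such a rules section; every field name and value is illustrative, not from any real benchmark.

```python
# Sketch: a machine-checkable fragment of the benchmark rules, so every
# vendor runs the test on the same configuration. Field names and
# values are hypothetical examples.

RAID_BENCHMARK_RULES = {
    "host_bus": "PCI-X 133 MHz",   # fix the bus so no vendor gains an edge
    "hba_count": 2,                # exactly two HBAs
    "hosts_attached": 1,           # attached to a single host
    "raid_level": "RAID-5",
    "runs_required": 3,            # report the median of three runs
}

def check_submission(submission, rules=RAID_BENCHMARK_RULES):
    """Return the names of any rule fields where a vendor's reported
    configuration deviates from the published rules."""
    return [key for key, value in rules.items()
            if submission.get(key) != value]

# A compliant submission returns no deviations; one that quietly doubles
# the HBA count is flagged:
ok = check_submission(dict(RAID_BENCHMARK_RULES))
bad = check_submission({**RAID_BENCHMARK_RULES, "hba_count": 4})
```

Even if the rules are published only as prose, writing them at this level of precision makes every vendor's run directly comparable.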
This series on benchmarking provides a good overview of the benchmark process from both the vendor’s and the purchaser’s points of view. As I’ve illustrated, there are many obvious and not-so-obvious aspects to the benchmark process that need to be considered, from the politics inside the purchaser’s organization to the technical issues of how to develop the benchmark.
On the vendor side of the fence, they are trying to figure out what you really want and how much you can pay for it, and most importantly which combination of hardware and software products will meet the requirements for the benchmark at the best cost. Notice I say meet the requirements and not necessarily win the benchmark, because this becomes the price/performance tradeoff. Procurements generally come down to cost and performance, and from the vendor side, they are trying to determine what you can afford to pay for the hardware, software, and services you have specified.
It’s a game that is played every day, with the purchaser trying to get the best price that meets the requirements, and the vendor trying to win the business with the highest margins. It’s also an extremely complex game, so if you’re new to it, it’s a good idea to find someone to help you with the rules.
See All Articles by Columnist Henry Newman