It’s probably safe to say that most seasoned IT pros understand the importance of benchmarking the performance of mission-critical systems. Benchmarking is the only reliable technique for quantifying system performance and for detecting system bottlenecks. However, you simply can’t rely on the performance metrics provided by hardware vendors, because the conditions under which those numbers were produced rarely match your own environment.
Hardware vendors establish performance metrics for their products in carefully configured lab environments. Real-world production environments are usually configured far differently from these highly optimized lab setups, so customers tend to see different levels of performance than what the vendor has documented.
In some ways these differences are similar to those encountered when buying a new car. In America, car dealerships are required to provide fuel economy information for the new vehicles that they sell. However, a vehicle’s real world fuel economy is unlikely to match the number printed on the window sticker.
Continuing with the vehicle analogy, at least some drivers are probably curious as to whether the gas mileage they are getting from their new vehicle is good or bad. Of course, comparing the vehicle’s actual fuel economy to the window sticker might not be a realistic comparison. Instead, a curious driver might turn to an Internet discussion forum to see what kind of gas mileage other owners of similar vehicles are getting. In other words, vehicle owners can use social networking to see how their own experiences compare to those of others.
This same basic principle can also be applied to IT benchmarking. In the past, IT benchmarking has focused heavily on the comparison of historical data. IT pros have long been advised to benchmark the performance of newly implemented systems immediately. That benchmarking data can then be used as a performance baseline. Any time the organization makes a configuration change or performs a hardware upgrade, the system is benchmarked again and the numbers are compared against the original baseline values, and against more recent performance data, in an effort to evaluate the impact of the change.
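The baseline comparison described above amounts to a simple calculation: measure current values, compute the relative change against the baseline, and flag anything that has drifted too far. Here is a minimal sketch of that idea; the metric names, values, and 10% tolerance are hypothetical, not taken from any particular monitoring tool:

```python
# Minimal sketch of baseline comparison: flag any metric whose
# relative change from the original baseline exceeds a tolerance.
# Metric names, values, and the 10% tolerance are hypothetical.

def compare_to_baseline(baseline, current, tolerance=0.10):
    """Return metrics whose relative change exceeds the tolerance."""
    regressions = {}
    for metric, base_value in baseline.items():
        change = (current[metric] - base_value) / base_value
        if abs(change) > tolerance:
            regressions[metric] = round(change, 3)
    return regressions

baseline = {"read_iops": 5000, "write_iops": 3000, "avg_latency_ms": 4.0}
current = {"read_iops": 4200, "write_iops": 2950, "avg_latency_ms": 6.5}

print(compare_to_baseline(baseline, current))
# read IOPS dropped 16% and latency rose 62.5%, so both are flagged;
# write IOPS changed by under 2%, so it passes
```

The key point is that the comparison is purely relative: every judgment is anchored to the machine's own starting numbers.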
The problem with this approach is that it is based on the assumption that the baseline values were taken with the system functioning properly. Imagine the implications of establishing a performance baseline on a system with a faulty storage controller that was causing a lot of retry operations for disk reads. The system might be performing poorly, but the sub-par performance might not be detected for quite some time, because performance data is only being compared to the machine’s own historical data. Hence, there isn’t really a way of telling whether the recorded level of read IOPS is good or bad. Instead, the performance data will only reflect the change in read IOPS levels over time.
Social networking can help with this sort of situation because it allows you to compare your performance to that of other IT pros who are operating similar hardware in real world environments. By leveraging the power of social networks you are no longer stuck comparing performance data only to your own historical data and to potentially unrealistic benchmarks from the manufacturer.
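That peer comparison can be as simple as ranking your measurement against the figures other operators report. The sketch below computes a percentile rank; the peer IOPS numbers are hypothetical, standing in for values shared by other IT pros running similar hardware:

```python
# Minimal sketch of peer comparison: rank a locally measured read IOPS
# figure against numbers reported by operators of similar hardware.
# The peer values below are hypothetical placeholders.
from bisect import bisect_left

def percentile_rank(value, peer_values):
    """Fraction of peer measurements that fall below the given value."""
    ordered = sorted(peer_values)
    return bisect_left(ordered, value) / len(ordered)

peer_read_iops = [4800, 5100, 5300, 5600, 5900, 6200, 6400]
our_read_iops = 4200

rank = percentile_rank(our_read_iops, peer_read_iops)
print(f"Our system outperforms {rank:.0%} of peer systems")
```

A system sitting at the bottom of the peer distribution, like the one above, signals a problem that comparison against its own history could never reveal.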