I have written about data center infrastructure management (DCIM) in past blogs:
- Tools Needed to Manage Data Centers
- The New Data Center Infrastructure Management Segment
- Chatting with Sherman Ikemoto of Future Facilities
As most people in the data center market know, both facilities and IT folks consider monitoring one of the most important elements in operating data centers. Smaller companies were the first to provide monitoring and reporting functions. Although this is not an exhaustive list, I had a chance to talk to some of these vendors and to write about the meetings.
I understand their services and their usefulness. Some provide sensor hardware and software, while others provide only software. They all monitor, aggregate, and report several parameters relevant to data center operations, such as temperature, humidity, and power consumption. Some deal only with facilities equipment, and others handle data coming from both facilities and IT equipment. There are no standards by which to measure the data: no standard frequency of measurement, data formats, or protocols. Each vendor has its own set of customers, and those customers seem happy with the solutions they purchased.
Then there are Power Assure, Romonet, and Future Facilities. Power Assure does monitor, but that is not all it does: it also optimizes power use at your data center. Romonet focuses on capacity planning. Future Facilities provides an electronic model of a data center that you can experiment with before implementing your design physically. These three cannot be classified simply as monitoring and reporting vendors, yet their functions, in addition to monitoring and reporting, are important to operating data centers. To describe this new segment, a new term has been introduced: DCIM.
Clearly, DCIM should contain several categories of tools, including those for monitoring and reporting, capacity planning, and simulation. As I said before, this segment is in its infancy; there are no standards and little actual-use information. Those who battle day-to-day operational problems may well be confused about which tools to select. Should they buy one tool at a time or a suite of tools? But wait: there is no suite of tools yet, although Future Facilities (for example) has begun to partner with other DCIM vendors to share data.
If we were to develop a suite of tools, or a framework or platform for DCIM tools, what would the requirements be? It would help if there were some information from actual use by someone other than the vendors. But because DCIM tools are at such an early stage, very little of that information exists.
Because the needs of operators can be quite different from one data center to another, we will have a good assortment of panelists from different environments:
- Chuck Rego, Chief Architect, High Density Data Centers at Intel Corporation
- Pam Brigham, Director, Global Technology at Equinix
- Phil Reese, Research Computing Strategist at Stanford University
Chuck develops Intel’s DCIM tools for their own and partner use and uses commercial ones as well, while Pam at Equinix relies on homegrown tools. Phil at Stanford is starting to use a commercial tool. I will ask them which problems they perceive as the most important to solve at their data centers and why they chose their solutions, whether homegrown or commercial. Are they happy with the tools they are using? If not, what is missing, and what additional work is needed to make the tools effective? Conversely, did applying their DCIM tools bring any benefits they did not expect?
If you are interested in the answers to these questions, join me and the panelists at the panel and other sessions at the conference.