This is a continuation from Part 1.
Interfaces required for multiple domains
I think their decision to remain a software infrastructure company is smart. In this way, they can apply their systems to many market segments where operations are involved. When operations are performed, some kinds of data are generated, and oftentimes those data should be collected, stored, and analyzed to tune and improve operations and business processes. In order to dive into new domains, they need to keep adding new interfaces, as well as adding to and revising those in areas they already cover. Dave Roberts told me that they now have close to 500 interfaces.
Coming from the IT segment, I see people tending to converge on a handful of well-defined standards and, therefore, interfaces. When I first set foot in the data center market, I was very surprised to find out that there were many interfaces on the facilities side. Although BACnet is becoming the protocol of choice for data center facilities, several other protocols, such as Modbus and LonWorks, are still in use. An IT guy like me tends to think we can force facilities to consolidate all the protocols into the single standard of IP. I now know it does not work that way. I got involved in NIST’s Smart Grid Interoperability Panel, which was organized to come up with a set of standards that allow the smart grid to function without conflicting technologies and protocols. The power industry has been around longer than IT, and there are many standards by IEEE, IEC, and others. The power industry has been conducting business to keep the lights on for more than 100 years, and they will certainly not listen to IT about consolidating everything onto IT technologies and protocols.
How to translate domain-specific requirements for software developers
OSIsoft maintains that their core PI system is generic and does not change when they apply PI to different vertical markets. When they pick a new domain, they add new interfaces specifically required for that domain. So every time they step into a new domain, they need to worry about yet more interfaces to maintain. This seems daunting, but it is the only practical way to have a generic system to apply to many areas, such as the power industry, oil and gas, and building management segments.
For each vertical domain there is a dedicated industry management team that includes experts in that field who can communicate natively with customers. The experts reach agreement on requirements, then translate those requirements into a specification for software development teams and ecosystem partners to work on.
How to enter a conservative industry like the power industry
The pace of change in IT is very fast. New technologies come and go quickly, sometimes within months, if not days. In contrast, utility companies are very conservative and do not replace their technologies and equipment for many years, until new technologies or equipment are proven to work solidly. I was curious to find out how a software company like OSIsoft could penetrate the conservative power industry. In the 1990s, OSIsoft partnered with Westinghouse and also with ABB. Through their introductions to utilities, they started to work with utility players and expanded their presence in the utilities market. Although there are a lot of similarities, each utility has specific needs, which triggers customization. But OSIsoft does not provide customization services; customization is done by the utilities themselves or by system integrators. Nearly all of their revenue, 97%, comes from software maintenance; the remaining 3% comes from basic services such as installation. So the highly configurable nature of their product is important.
Sharing data among multiple entities
In general, if two entities work together, it is most beneficial to share data between them. For example, consider the power grid in California. California ISO (CAISO), which reliably balances power supply and demand on the transmission grid, does not maintain the transmission lines. The lines are maintained by PG&E, the local utility in my region, which is also responsible for the distribution grid. Power imbalances can be caused by operational or equipment problems. Therefore, it is very useful if CAISO shares data with PG&E so that they can work together to solve the problem. For this, OSIsoft has released a new feature called PI Cloud Connect, which allows highly granular data to be shared with specific access controls in a cloud setting. In this way, any number of organizations can share time-series data, each with specific access privileges. Yes, this is a good application of ICT.
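To make the idea of granular, per-tag sharing concrete, here is a minimal sketch in Python. This is not the PI Cloud Connect API; the tag names, organizations, and data structure are invented purely to illustrate per-organization access privileges on shared time-series tags:

```python
# Conceptual sketch of granular time-series sharing with access control.
# NOT the PI Cloud Connect API: tag names, organizations, and structure
# are hypothetical, chosen only to illustrate the idea.

shared_tags = {
    "transmission.line42.load_mw": {"PG&E", "CAISO"},      # both may read
    "transmission.line42.maintenance_log": {"PG&E"},       # owner only
}

def readable_by(org):
    """Return the tags the given organization is allowed to read."""
    return sorted(tag for tag, orgs in shared_tags.items() if org in orgs)

print(readable_by("CAISO"))  # sees the load tag, not the maintenance log
```

The point of the sketch is that access is granted per tag, not per database, so two organizations can cooperate on shared measurements while keeping internal data private.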
Once data are captured and stored, they are analyzed to derive useful information to improve operations and business processes. Analytics can be done at many levels, from something as simple as out-of-bounds value checks all the way up to prediction. Here OSIsoft does not develop its own analytics packages but makes sure that others' packages plug into the PI system seamlessly. I am currently looking into analytics in more detail. Because analytics is a very broad term covering many angles, most presentations and white papers on products do not discuss it in detail. That is frustrating, to say the least.
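The simplest level of analytics mentioned above, an out-of-bounds check, can be sketched in a few lines. The tag values and limits below are hypothetical, not taken from any real PI deployment:

```python
# Minimal sketch of out-of-bounds analytics over time-series readings.
# Timestamps, values, and limits are hypothetical.

def out_of_bounds(readings, low, high):
    """Return (timestamp, value) pairs that fall outside [low, high]."""
    return [(ts, v) for ts, v in readings if not (low <= v <= high)]

readings = [(0, 58.2), (1, 61.7), (2, 49.9), (3, 60.1)]
alarms = out_of_bounds(readings, low=50.0, high=62.0)
print(alarms)  # only the reading at timestamp 2 is out of bounds
```

Everything beyond this, trend analysis, health scoring, prediction, builds on the same captured time series but with increasingly sophisticated models.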
What is an example of analytics in the utilities business?
Analytics example 1: equipment preventive maintenance
Do you see boxes of different colors and shapes on utility poles around you? One of those boxes is a transformer, which steps down high voltage to a lower voltage before power gets to your home. Most transformers are electromagnetic devices and degrade physically as time goes by. If a transformer malfunctions or fails, power to your home will be interrupted. It would be nice to know when to repair or replace it before it fails. An analytics package can monitor its health, compare it against the historical trend, and provide an early warning.
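One simple way to compare a new reading against the historical trend is a deviation test: flag the reading if it strays too far from the historical mean. Real preventive-maintenance analytics are far richer; the temperature values and the three-sigma threshold below are illustrative assumptions only:

```python
# Illustrative early-warning check for equipment health monitoring.
# Real packages use richer models; values and threshold are hypothetical.
import statistics

def early_warning(history, current, k=3.0):
    """Flag a reading deviating more than k standard deviations
    from the mean of the historical readings."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(current - mean) > k * stdev

# Hypothetical transformer temperature history in degrees C.
history = [65.0, 66.2, 64.8, 65.5, 66.0, 65.1]
print(early_warning(history, 65.8))  # within the normal band
print(early_warning(history, 72.5))  # well outside it: raise a warning
```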
Analytics example 2: wind power generation
Another example is in wind power generation. Wind is hard to predict: it is blowing one moment but not the next. It is vital to balance the demand and supply of power every second, and if we cannot predict the power generated by wind, balancing becomes more difficult. So it is very important to predict when the wind blows and when it stops. Predictive analytics is used widely in weather forecasting, and wind prediction is part of it. First, a prediction model is developed from historical data; then the model is fine-tuned and modified as more data are collected.
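The "build a model, then refine it as new data arrive" loop can be shown with the simplest possible forecaster, exponential smoothing. A real wind model uses weather physics and far richer inputs; the wind-speed numbers and the smoothing factor here are hypothetical, and the code only illustrates the update loop:

```python
# Toy one-step-ahead wind forecast via exponential smoothing.
# Stands in for real predictive models only to show the refine-on-new-data loop.

def ema_forecast(observations, alpha=0.3):
    """Fit on the first observation, then fine-tune the forecast
    as each new measurement arrives."""
    forecast = observations[0]
    for obs in observations[1:]:
        forecast = alpha * obs + (1 - alpha) * forecast
    return forecast

wind_mps = [5.1, 6.3, 5.8, 7.0, 6.5]  # hypothetical wind speeds, m/s
print(round(ema_forecast(wind_mps), 2))  # forecast for the next interval
```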
Analytics example 3: smart charging for EVs
Currently, in California, power demand increases as the day goes on, hits a peak in the early afternoon, and drops to its lowest point during the night. An electric vehicle (EV) like the Nissan Leaf or Chevy Volt is known to draw about the same amount of power as a typical household. If EVs are charged when power demand is at its peak, we run short of power to satisfy demand. But during the night we usually have plenty of power available, so it is suitable to charge EVs at night at home, which is what a typical EV owner does now. As more public charging stations pop up, and faster but more power-hungry charging technologies proliferate, charging may happen during peak time. That would disturb the power balance and could lead to outages. For this reason, smart charging needs to be developed and deployed. This type of analytics would dynamically allow charging to start only when supply satisfies demand.
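The core decision in smart charging, "start only when supply satisfies demand," reduces to a headroom check. The grid figures, EV load, and reserve margin below are all hypothetical, and a real system would use forecasts and pricing signals rather than a single comparison:

```python
# Toy smart-charging gate: permit a session only when forecast supply
# exceeds forecast demand plus a reserve margin and the EV's own load.
# All numbers are hypothetical.

def allow_charging(supply_mw, demand_mw, ev_load_mw, margin_mw=500.0):
    """Return True if the grid has headroom for this charging session."""
    headroom = supply_mw - demand_mw - margin_mw
    return headroom >= ev_load_mw

print(allow_charging(30_000, 29_800, 50))  # afternoon peak: deferred
print(allow_charging(30_000, 22_000, 50))  # overnight surplus: permitted
```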
Different utilities could use an analytics package developed by one utility, but OSIsoft does not share a particular user's analytics algorithms with others. OSIsoft has its user communities, and those who belong to them might share such algorithms through the community. The T&D User Group community has existed for 20 years, and its members tend to share information when there is no competition among them.
Analytics example 4: more renewable energy sources for power generation in California
California has adopted a renewables portfolio standard, known as RPS, which specifies the minimum percentage of renewable energy sources, like solar and wind, in power generation. California plans to obtain 33% of all its power from renewable energy sources by 2020. Although not all renewable energy sources are as volatile as wind power, a lot of unknowns will be thrown into the power grid. Constant power-supply predictions based on ever-changing weather (the wind may or may not blow at any given minute, and solar output drops when clouds set in) will be vital to keep the power grid stable at all times.
Applying PI to more demanding domains
The smart grid effort is about making the power grid smarter. But our physical infrastructure consists of more than just the power grid; we also have, for example, gas, water, waste, transportation, government services, street lights, and traffic systems. Dave is working on the next topic beyond the power grid: the smart city. According to Dave, a smart city is defined differently by different people, but US cities like Austin, Seattle, New York, and Chicago already have smart city projects underway. OSIsoft is involved in some of them, and a public announcement is coming shortly.
Collecting, aggregating, storing, and linking all sorts of data from different sources would provide tremendous intelligence to a city. A utility at the conference reported that they collect 100,000 data points per second. If we implement a system for a smart city, the number of data points could grow by two to three orders of magnitude. That means millions of data points per second would bombard the PI system. Even though the PI system was created to cope with large amounts of data of many kinds, at some point they may have to alter their architecture and technologies to process such a massive volume. That makes me interested in talking to their technology visionary. Stay tuned for that in a coming blog.
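For a rough sense of scale, we can spell out the arithmetic. Starting from the 100,000 points per second reported by one utility, a city-wide system growing by two to three orders of magnitude lands in the tens to hundreds of millions of points per second:

```python
# Back-of-the-envelope scaling from the single-utility rate in the text.
base_rate = 100_000  # data points per second reported by one utility

for orders in (2, 3):
    city_rate = base_rate * 10 ** orders
    print(f"{orders} orders of magnitude -> {city_rate:,} points/s")
```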