Data analytics and technology are driving the changing operational practices of utilities.
Given the large scale and long term nature of typical energy sector projects, the pace of transformation that is occurring is remarkable.
Renewables are expanding at a fast pace – faster than originally envisaged. Their costs are dropping rapidly to the point of reaching parity with other options, as are those of storage, particularly lithium-ion battery storage. Similarly, rooftop solar costs are also dropping, improving their attractiveness to residential and other building owners.
What does this bode for 2017? Both certainty and uncertainty. On the one hand, these trends can be expected to continue and even accelerate. On the other, the unexpected is likely as the impacts of these newly installed technologies are felt and as new technologies and business models emerge.
Besides the technologies themselves, there is one factor that is making all these developments possible – data and the associated analytics.
“There are multiple sources making power, and managing them from efficiency and maintenance perspectives while also maintaining grid reliability has brought the need for accurate real-time information,” says David Thomason, Industry Principal Global Power Generation at OSIsoft.
With growing numbers of decentralized resources and devices being implemented in the grid, those needs are increasing exponentially, adds Thomason, who, with 34 years’ experience in the power industry, has become an active advocate of using advanced analytics and technologies to enhance business value.
“We see technology as the glue tying the diverse set of assets and the data together to make one set of assets working seamlessly together.”
As an illustrative initiative, he cites Arizona Public Service’s rooftop PV programme. APS owns the panels and as such is able to manage how those resources feed onto the distribution grid, gaining insights on supply and demand that would be difficult to obtain if the panels were owned by individual householders.
“We believe this is a unique approach and will be watching to see how successful it is.”
Thomason points to the advances in analytics practices that are converging OT and IT and that are driving today’s industry trends.
“Analytics is taking place in multiple layers. For example, there’s a move for analytics at the device edge. Another level is to bring that data back to the system to combine with other devices in that area and then do real-time streaming analytics against those. Or the data can be brought out into big data systems and integrated to other business applications.”
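Thomason’s layered model can be sketched in code. The example below is purely illustrative – the function names, the 3-sigma threshold and the sample values are assumptions, not anything from OSIsoft – but it shows the idea of a statistical check running at the device edge and a simple aggregation step combining devices in an area.

```python
from statistics import mean, stdev

# Hypothetical edge-level check: flag a reading that deviates sharply
# from the device's own recent history, before it is sent upstream.
def edge_anomaly(history, reading, threshold=3.0):
    if len(history) < 2:
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return reading != mu
    return abs(reading - mu) / sigma > threshold

# Hypothetical system-level step: combine the latest readings from
# several devices in the same area into one aggregate for real-time use.
def area_aggregate(latest_by_device):
    values = list(latest_by_device.values())
    return {"count": len(values), "mean": mean(values),
            "min": min(values), "max": max(values)}

history = [50.1, 49.8, 50.2, 50.0, 49.9]   # e.g. frequency readings in Hz
print(edge_anomaly(history, 50.1))          # an ordinary reading
print(edge_anomaly(history, 57.0))          # an outlier, caught at the edge
print(area_aggregate({"dev1": 50.0, "dev2": 50.3, "dev3": 49.7}))
```

In a real deployment the aggregated streams would then feed big data systems and other business applications, as Thomason describes.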
Resultant trends include the move away from static, periodic equipment condition assessment to dynamic, real-time online condition monitoring. Disparate data systems are giving way to a single source combining all plant and enterprise data. And instead of limited staff resources restricting analytics to their assigned facilities, experts are being leveraged throughout the company across multiple sites and assets.
There is also a move from multiple projects and point solutions with delayed value realization to a common data infrastructure that supports continuous improvement across many areas. High-cost reactive maintenance is being replaced with proactive, predictive maintenance. And aggregating information after the fact to assess adverse events is giving way to real-time situational awareness, market predictability and planned response.
As an example of how these trends are being driven, Thomason cites OSIsoft’s PI System for collection, analysis, visualization and sharing of large sets of time series data.
“If one wants to do a backcast of how commercial dispatch performed compared with how it was expected to perform, the new tools that integrate the real-time data and analytics results from the PI System and merge them with other systems let you bring in those forward gas positions and other data,” he says. “You can now do that level of analytics with a wider context and a deeper analytics view, and at much higher frequency. This is a major leap in how companies can use their own information to make better decisions.”
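The core of such a backcast is simple to illustrate. The sketch below compares a hypothetical expected dispatch schedule with the output actually delivered; the numbers and metrics are invented for illustration, and a real analysis would pull both series from the PI System and other business systems rather than hard-code them.

```python
# Hypothetical hourly series (MW): the dispatch schedule that was
# expected versus what the plant actually delivered, as archived data.
expected = [120, 150, 180, 200, 180, 150]
actual = [118, 149, 172, 195, 183, 151]

def backcast(expected, actual):
    """Per-interval deviation plus simple summary error metrics."""
    deviations = [a - e for e, a in zip(expected, actual)]
    mae = sum(abs(d) for d in deviations) / len(deviations)   # mean absolute error
    bias = sum(deviations) / len(deviations)                  # systematic over/under-delivery
    return {"deviations": deviations, "mae": mae, "bias": bias}

result = backcast(expected, actual)
print(result["deviations"])   # the interval-by-interval miss
print(result["mae"], result["bias"])
```

A negative bias here would signal persistent under-delivery against the schedule, exactly the kind of pattern such a backcast is meant to surface.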
With the prospect of ever more data in the future, how selective does one need to be and how much of it needs to be stored and for how long?
“I advocate getting all the data one possibly can,” says Thomason. “If one wants to be in this new world with decentralised generation and sensors and meters on the grid, one needs all the information one can get on those assets.”
How long the data should be kept, he advises, depends on the asset and, in some cases, on regulatory requirements. For critical assets, for example, data should be retained for at least the life of the asset and in some cases forever.
“The costs of data storage are reasonable and for example, with the PI System time series data compression algorithm, years of data can be effectively stored,” he says. “We have a customer who has 20 years’ worth of data on an asset they can do trend analysis on. The data collection system is the ‘black box’ for utilities.”
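OSIsoft documents the PI System’s time-series compression as a swinging-door technique, which keeps only the points needed to reconstruct the signal within a stated deviation. The sketch below is a minimal, illustrative version of that idea – it assumes strictly increasing timestamps and a positive deviation, and is not OSIsoft’s implementation.

```python
def swinging_door(points, dev):
    """Compress a list of (timestamp, value) points so that a straight line
    between each pair of kept points stays within +/- dev of every dropped
    point. Assumes strictly increasing timestamps and dev > 0."""
    if len(points) <= 2:
        return list(points)
    archived = [points[0]]
    anchor = points[0]                 # last archived point
    slope_max = float("inf")           # the "doors": allowable slope band
    slope_min = float("-inf")
    held = None                        # newest point not yet archived
    for t, v in points[1:]:
        dt = t - anchor[0]
        slope_max = min(slope_max, (v + dev - anchor[1]) / dt)
        slope_min = max(slope_min, (v - dev - anchor[1]) / dt)
        if slope_min > slope_max:      # the doors have swung shut
            archived.append(held)      # archive the previous point
            anchor = held              # and restart the band from it
            dt = t - anchor[0]
            slope_max = (v + dev - anchor[1]) / dt
            slope_min = (v - dev - anchor[1]) / dt
        held = (t, v)
    archived.append(held)              # always keep the latest point
    return archived

# A flat run followed by a step: the flat run compresses away,
# while the step itself survives.
raw = [(0, 0.0), (1, 0.0), (2, 0.0), (3, 5.0), (4, 5.0)]
print(swinging_door(raw, dev=0.1))
```

Long stretches of near-linear behaviour collapse to their endpoints, which is why years of sensor data can be stored so economically.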
So how should utilities be gearing up for this data-rich future? Thomason says that some utilities are “on the bow-wave of these analytics practices,” while others are following. That, he says, is “ok”, as “the digital transformation is a journey, which each need to do at their own pace.” The important thing is that utilities understand that information is one of the most important assets in their business, and that they need to exploit it.
“What I find exciting is that it is so much easier to do now than in the past as the tools are available and they work.”
He also notes that these capabilities are able to bridge the ageing workforce challenge. “With all the assets and their information in a structured context, knowledge may be more easily captured and transferred to new employees.”
This is being further aided by emerging tools such as augmented reality, which OSIsoft and a partner are demonstrating with the PI System, allowing asset information and data to be “floated” over an image of the asset.
“The analogy I like to use is that of the hometown car mechanic versus the race car mechanic. The hometown mechanic is able to fix your car, whereas the race team studies things like G-forces and vibrations – i.e. data – because they simply can’t let the car break down given the amount invested in it. It is similar to what a utility is going through – the risk of failing is too great.”