Smart Grid Analytics - Powered by Integration & Distributed Intelligence

Real value can be achieved by exploiting a range of advanced smart grid analytics using a proven, scalable, flexible platform.
Published: Tue 06 Jan 2015

Data is creating opportunities for new, business-changing insights and value in utilities. Franz Winterauer, head of analytics EMEA at OMNETRIC Group (a Siemens-led joint venture between Siemens AG and Accenture) and a co-speaker in Engerati’s webinar, Next Generation Smart Grid Analytics – Powered by Integration & Distributed Intelligence, expands on the subject.

Factors affecting the grid

Winterauer explains that the grid is very different today because there are more “degrees of freedom” in the system. He lists five factors which are having a significant effect on the grid today:

1. Flexibility – prosumers and the need to integrate PVs, EVs and smart homes

2. Stress – at the distribution level, increasing minimum and maximum peaks, power quality issues and microgrids

3. Volatility – at the transmission level, grid balancing at the “backbone” is more volatile

4. Competition – at the generation level, new types of competitors, high price pressure and volatility

5. Demand – in supply and retail, consumers demand new services and business models

Turning the death spiral into opportunities

Winterauer explains that there is a great deal of change taking place in the energy sector and some players are unable to see the wood for the trees. The above factors, he says, can lead to increased threats for businesses. A “home grown” death spiral could be created if these firms do not adapt.

“Utilities should equip the business with new capabilities and focus on the methods they have mastered in order to create opportunities from these threats. It is about becoming better equipped to evolve your business models and optimise your infrastructure,” says Winterauer.

He lists five steps to managing complexity and opportunity so that five new capabilities can be developed:

1. Integrate

2. Operate

3. Automate

4. Optimise

5. Explore

“On the foundation of greater integration of information and operational technologies, utilities can adopt any of the five aforementioned steps to better manage their grid and business opportunities,” explains Winterauer. He points out that utilities should choose their own starting point when devising strategies, but to get the most value out of analytics they should focus on exploration, integration and optimisation.

Exploration

Through exploration, the OT department can start by learning from other industries such as the telcos, understanding what the IT department does, and trying out new technologies and methods in “safe mode.” Webinar co-presenter David Socha, international utilities practice lead at Teradata (which has teamed up with Siemens and OMNETRIC Group to further develop IT integration and big data analytics), explains how discovery analytics on the Teradata Aster platform identified a strong voltage asymmetry on one particular feeder. “This was a very simple example of real life discovery analytics making a real difference to the distribution business.”
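
To make the idea of feeder-level discovery analytics more tangible, the sketch below shows one way such a voltage asymmetry might be surfaced from per-phase smart meter readings. It is a minimal illustration only: the file name, column names and 2% threshold are assumptions, and the work described above was done on the Teradata Aster platform, not with this script.

```python
# Illustrative sketch only: a simplified voltage-asymmetry check over smart
# meter data, assuming a hypothetical CSV with columns
# feeder_id, timestamp, v_phase_a, v_phase_b, v_phase_c.
import pandas as pd

def voltage_unbalance(readings: pd.DataFrame) -> pd.Series:
    """Per-row unbalance: largest deviation from the mean phase voltage, as a fraction."""
    phases = readings[["v_phase_a", "v_phase_b", "v_phase_c"]]
    mean_v = phases.mean(axis=1)
    max_dev = phases.sub(mean_v, axis=0).abs().max(axis=1)
    return max_dev / mean_v

readings = pd.read_csv("meter_voltages.csv", parse_dates=["timestamp"])
readings["unbalance"] = voltage_unbalance(readings)

# Flag feeders whose 95th-percentile unbalance exceeds an (assumed) 2% threshold.
per_feeder = readings.groupby("feeder_id")["unbalance"].quantile(0.95)
suspect_feeders = per_feeder[per_feeder > 0.02].sort_values(ascending=False)
print(suspect_feeders)
```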

Integration

Once an issue has been discovered and resolved, the idea is to integrate the fix into the system. However, each vendor offers a different application, based on a different data model and using different visualisation and streaming tools, for instance. Teradata and OMNETRIC Group suggest that these silos be bridged using the following “ingredients”:

  • Smart Grid Apps – a flexible application container; apps are easy to port onto any customer-chosen development framework

  • Smart Grid Logical Data Model (LDM) – a platform-agnostic logical data model with full integration. It offers CIM-compliant (Common Information Model) application extensions to the Utility Data Model (UDM), provides easy-access models for end users in business units, and is driven solely by joint use case development with utilities.

  • OMNETRIC Data Quality Services

Winterauer points out that OMNETRIC Group supports the development of both cloud and on-premise solutions, and that these should enable customer choice and provide a seamless transfer between solutions. However, he does not believe in “black box” applications in the cloud that promise everything. “Your application and data model is only as good as the data you load into it. The garbage in, garbage out principle still applies, whatever application you have in the cloud. It can only visualise things based on the data that you feed into it.”
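
As a rough illustration of the kind of data-quality gate implied by the garbage in, garbage out principle, the sketch below applies a few plausibility checks before meter readings are loaded. The rules, column names and thresholds are assumptions for illustration and are not a description of OMNETRIC's Data Quality Services.

```python
# Minimal, assumed data-quality gate for meter readings before loading;
# illustrative only, not the OMNETRIC Data Quality Services.
import pandas as pd

def quality_report(readings: pd.DataFrame) -> dict:
    """Count records failing a few simple plausibility rules."""
    return {
        "total_records": len(readings),
        "missing_meter_id": readings["meter_id"].isna().sum(),
        "duplicate_readings": readings.duplicated(subset=["meter_id", "timestamp"]).sum(),
        # Assumed plausible band for a 230 V low-voltage network.
        "implausible_voltage": (~readings["voltage"].between(180, 260)).sum(),
        "negative_consumption": (readings["energy_kwh"] < 0).sum(),
    }

readings = pd.read_csv("meter_readings.csv", parse_dates=["timestamp"])
report = quality_report(readings)
print(report)  # load into the analytics platform only if the failure counts are acceptable
```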

The idea is to leverage an enterprise reference data model and add CIM-compliant application extensions, which can be done using Teradata’s Utility Data Model and OMNETRIC Group’s Smart Grid Logical Data Model, says Winterauer.
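
For a feel of what a CIM-aligned logical data model with application extensions looks like, here is a deliberately simplified sketch in code. The entities echo common CIM class names, but the structure is an illustrative assumption and not the actual Teradata Utility Data Model or the OMNETRIC Smart Grid LDM.

```python
# Highly simplified, CIM-inspired entity sketch (illustrative assumption only;
# real utility data models are far richer and typically relational).
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class UsagePoint:          # CIM: the logical point where a meter is installed
    mrid: str              # CIM master resource identifier
    feeder_id: str          # link into the grid topology

@dataclass
class Meter:               # CIM: the physical metering device
    mrid: str
    usage_point: UsagePoint

@dataclass
class MeterReading:        # smart-grid extension: interval readings from the meter
    meter: Meter
    timestamp: datetime
    energy_kwh: float
    voltage: float

@dataclass
class Feeder:              # application extension: feeder-level roll-up for analytics
    feeder_id: str
    readings: list = field(default_factory=list)

# A shared model along these lines is what lets applications from different
# vendors exchange data without each bringing its own incompatible schema.
```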

As an example of the data model put into practice, Winterauer points to the development of a smart city – Aspern, Vienna’s Urban Lakeside. The project forms part of Vienna’s ambition to become one of the world’s “smartest” and greenest cities. “In parallel, we accompanied this vision by creating a single data model around all the data applications that have been built into this city.”

Optimisation

Socha describes how power line communication (PLC) issues can be diagnosed on a grid. A team of experts from power supplier Kärnten Netz was investigating communication interference and questioned whether PV or a sawmill operation could be the cause. Due to heavy loads in particular areas, and with some PV feed-in, the system was not communicating effectively and the smart meters were simply unreachable. “We were able to investigate what the issues were using analytics on the smart meter and PLC data, and improve the quality of the PLC investigations by automating a process based on the information that we had gathered. After four months of doing analytics and delivering the automated solution, we had solved that problem for Kärnten Netz.”
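
A hedged sketch of the kind of automation this suggests: join PLC read-attempt logs with feeder load and PV feed-in data, then rank the feeders whose communication failures most closely track heavy load. The input files, column names and the correlation-based ranking are assumptions for illustration, not the solution actually delivered to Kärnten Netz.

```python
# Illustrative sketch: correlate PLC read failures with feeder load and PV
# feed-in to prioritise investigations. All inputs are assumed.
import pandas as pd

# Assumed columns: meter_id, feeder_id, timestamp, success (1 = meter reached)
comms = pd.read_csv("plc_read_attempts.csv", parse_dates=["timestamp"])
# Assumed columns: feeder_id, timestamp, load_kw, pv_feed_in_kw
load = pd.read_csv("feeder_load.csv", parse_dates=["timestamp"])

# Hourly failure rate per feeder.
comms["hour"] = comms["timestamp"].dt.floor("h")
fail = (comms.assign(failed=1 - comms["success"])
             .groupby(["feeder_id", "hour"])["failed"].mean()
             .reset_index())

load["hour"] = load["timestamp"].dt.floor("h")
merged = fail.merge(load, on=["feeder_id", "hour"])

# Rank feeders where failure rates track combined load and PV feed-in most strongly.
corr = (merged.groupby("feeder_id")
              .apply(lambda g: g["failed"].corr(g["load_kw"] + g["pv_feed_in_kw"]))
              .sort_values(ascending=False))
print(corr.head(10))  # candidate feeders for targeted PLC investigation
```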

Proven analytical applications

Winterauer has the following advice for utilities:

  • Adopt proven analytical apps built on trustworthy data – this is key to attaining value from utilities’ big data

  • Data experimentation skills are probably the weakest capability to be developed in the industry

  • True domain expertise is required not only to provide actionable insights but to act on these insights

  • Proper data integration and data quality are needed more than ever in the big data era to make apps work

Utilities should not:

  • Buy a tempting “magical app” off the shelf which the whole market tries to sell

  • Try to beat the “garbage in garbage out” principle

  • Trust pre-calculated ROI promises from apps vendors

  • Install some flashy dashboards without putting effort into data quality and data integration first

  • Confuse high-speed data exploration with high-quality data integration

Socha says, “The point of analytics is to take action on something that has been learnt. It provides actionable insights that will improve your business.”