What’s Next After The Smart Grid?

Moving automation to the edge creates knowledge repositories and frees human assets, writes Alex Clark of Bit Stew Systems.
Published: Wed 28 Oct 2015

The Industrial Internet of Things (IIoT) is transforming the way a modern energy business is operated. Millions of connected devices, sensors and smart meters operating throughout smart grids are generating billions of data points that feed utilities with measurements. Smart grids are increasingly being proposed, planned and implemented, and utilities all over the world are reaping the benefits. Often, though, those benefits are only the lowest-hanging fruit.

With almost US$9 trillion in new revenue predicted to be generated by the Internet of Things by 2020, there must be additional opportunities to derive revenue and operational benefits from our existing smart grid infrastructure. So what is next after a smart grid deployment?

Big data challenges

First, let’s understand what we are currently dealing with. The variety of operational data generated in the utility industry is orders of magnitude more complex than what is typically found in IT environments. Millions of connected devices, sensors and network systems operating throughout smart grids are generating billions of data points, feeding utilities with measurements on power quality, voltage, energy consumption and more.

This data can be a valuable resource for driving operational efficiency and increasing productivity, but only if it can be transformed into meaningful intelligence. Without the ability to integrate raw industrial data into a common model and gain a holistic view of smart grid operations, including real-time visibility into potential risks, utilities will continue to struggle to realize the true business benefits of the sensor technology and the data integration and management architectures they implement.
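To make the "common model" idea concrete, here is a minimal sketch in Python. It assumes two illustrative source formats, an AMI meter read and a SCADA point, and maps both into one shared measurement schema so downstream analytics see a single representation. The field names and formats are assumptions for the example, not any specific vendor's data model.

```python
# Minimal sketch: mapping records from two assumed source formats into a
# common measurement model. Field names are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class CommonMeasurement:
    asset_id: str
    quantity: str          # e.g. "voltage", "energy_consumption"
    value: float
    unit: str
    timestamp: datetime

def from_ami_read(raw: dict) -> CommonMeasurement:
    # Assumed AMI format: {"meter": "M-123", "kwh": 1.42, "ts": 1445990400}
    return CommonMeasurement(
        asset_id=raw["meter"],
        quantity="energy_consumption",
        value=raw["kwh"],
        unit="kWh",
        timestamp=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
    )

def from_scada_point(raw: dict) -> CommonMeasurement:
    # Assumed SCADA format: {"tag": "FDR7.V", "val": 7240.0, "time": "2015-10-28T09:00:00+00:00"}
    return CommonMeasurement(
        asset_id=raw["tag"],
        quantity="voltage",
        value=raw["val"],
        unit="V",
        timestamp=datetime.fromisoformat(raw["time"]),
    )
```

Once both feeds arrive in the same shape, the same analytics, rules and visualizations can be applied to either without source-specific handling.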

Traditional data warehousing models and open-source alternatives simply can’t scale to the level needed to gain benefits beyond the initial ones garnered at a smart grid deployment. These models not only fail to scale, they also fail to exploit a smart grid’s real potential: to act as a real-time virtual agent for the utility company, already deployed right where you need it. It is one thing to have an agent on site reporting on conditions. It is another to have an agent on site acting on those conditions.

As an example, managing low voltage networks is becoming increasingly important for electric utilities as energy efficiency takes centre stage and distributed energy resources come online. This is a new operating paradigm, and industry leaders are looking for operational intelligence that can support new business models, decrease operating costs and, in some cases, tap into additional revenue streams. In this scenario, the low voltage network has an upstream impact on medium and high voltage transmission. In a modern world, the electrical grid is no longer isolated or top-down - it is now an interdependent mesh in which utilities and regulators rely on sensors and controls to maintain the energy balance.

Machine learning to drive situational intelligence

This reality means that utilities have a persistent need to manage processes through automation in real time. This requires machine learning to automate the data integration process and to actively manage the transfer of energy, and its impact, across low, medium and high voltage transmission. Doing so reduces costs, shortens the response to outages and asset failures, and also benefits the top line.

With actionable intelligence, a utility can begin creating rules-based processes that dictate a specific response. Through machine learning, software can learn patterns, automatically detect anomalies, and create and store new intelligence to address parameters outside of normal “triggers.” This requires your virtual agents to be positioned at the edge of a network.
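As a rough illustration of how rules-based triggers and learned anomaly detection can coexist in an edge agent, consider the following Python sketch. The class name, thresholds and actions are hypothetical examples, not Bit Stew's implementation: known conditions get a prescribed response, and readings far outside the learned norm are flagged even when no explicit rule covers them.

```python
# Hypothetical edge-agent loop: rules-based triggers first, then a simple
# statistical anomaly check as a fallback. All names and limits are
# illustrative assumptions.
from statistics import mean, stdev

class EdgeAgent:
    def __init__(self, window=100):
        self.history = []          # recent readings kept on the device
        self.window = window
        self.rules = [
            # (condition, action) pairs codified from operating procedures
            (lambda v: v > 253.0, "raise_overvoltage_alarm"),
            (lambda v: v < 207.0, "raise_undervoltage_alarm"),
        ]

    def handle(self, reading_v):
        # 1. Rules-based triggers: known conditions invoke a specific response.
        for condition, action in self.rules:
            if condition(reading_v):
                return action

        # 2. Anomaly detection: flag readings far outside the learned norm,
        #    even if no explicit rule covers them.
        if len(self.history) >= self.window:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading_v - mu) > 3 * sigma:
                return "flag_anomaly_for_review"

        self.history = (self.history + [reading_v])[-self.window:]
        return "no_action"
```

Because the loop runs on the edge device itself, the response does not depend on a round trip to a central data warehouse.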

Once utilities adopt an architecture that is purpose-built for the Industrial Internet, they can then begin deriving real value from their data by using data-enabled applications that have been specifically developed to solve the unique challenges in the utilities sector. Data-enabled applications (these are your virtual agents) operating at the edge of the network can provide utilities with a real-time understanding of their operations and the situations occurring at any given time. We call this “situational intelligence” because it gives operators the contextual awareness they need in order to make real-time decisions that improve operations, increase uptime, optimize the dispatch of resources, enhance asset performance and more.

Virtual edge agents improve operations

Through the use of these data-enabled virtual agents, utilities can not only improve operations but also begin codifying and sharing information throughout the organization to create context-aware operations for all employees, from the network operations centre to the field.

Using artificial intelligence, machine learning and rules-based analysis of events, a utility can train the software to respond automatically to certain triggers and automate most manual workflows. As a result, the utility can create instinctive virtual operators that can assess and handle most situations that occur, and adapt to scenarios never seen before. The software conducts a rules-based analysis of historical data to create a knowledge repository that invokes a response based on specific processes. If a situation doesn’t directly match a stored case, the analysis engine can weigh all of the present factors and reach an intuitive decision that maintains operational performance and reliability.
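A minimal sketch of that knowledge-repository idea, assuming a small set of historical cases and a nearest-match fallback, might look like the Python below. The feature names, values and responses are invented for illustration; they do not reflect any particular utility's data or Bit Stew's internal logic.

```python
# Illustrative knowledge repository built from historical events, with a
# nearest-neighbour fallback when an incoming situation has no exact match.
from math import dist

# Each historical case: (feature vector, response that restored normal operation).
# Features here are assumed to be (voltage deviation %, load %, transformer temp C).
knowledge_repository = [
    ((8.0, 95.0, 70.0), "shed_noncritical_load"),
    ((12.0, 60.0, 40.0), "switch_to_backup_feeder"),
    ((2.0, 85.0, 90.0), "dispatch_crew_to_transformer"),
]

def respond(situation, exact_tolerance=0.5):
    """Return the stored response for an exact match, otherwise the response
    associated with the closest historical situation."""
    best_response, best_distance = None, float("inf")
    for features, response in knowledge_repository:
        d = dist(situation, features)
        if d <= exact_tolerance:
            return response            # known situation: invoke the codified process
        if d < best_distance:
            best_response, best_distance = response, d
    return best_response               # unseen situation: act on the nearest known case

print(respond((7.5, 93.0, 68.0)))      # close to the first case -> shed_noncritical_load
```

In practice the matching would use a richer model than Euclidean distance, but the shape is the same: codified responses for known situations, and a reasoned best guess for new ones.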

Both of these elements aid in the very real challenge of the pending retirement boom. We are losing a lot of tribal knowledge and qualified people as baby boomers exit the workforce. [Engerati: Train Technology (in Addition to People) to Realize the Smart Utility of the Future] By bringing our edge agents into the fold, we are able to codify intelligence for all agents across IT and OT, while also offloading some of the decision-making to machine intelligence at the edge, freeing our human operators for more mission-critical activities.

Exceeding current smart grid benefits

Utilities can exceed the current benefits of the smart grid by implementing software purpose-built for the massive data volumes of the Industrial Internet, and by empowering the edge of the network with machine learning and complex event processing.

By doing so, utilities can integrate data from disparate sources more quickly, automate predictable operations and give operators the situational awareness they need to identify and handle the anomalous or most critical events effectively.

This real-time intelligence and operational automation helps utilities improve power distribution, lower operational costs, proactively identify and address risks and accommodate new distributed energy resources.

These and other topics will be discussed by Bit Stew CEO, Kevin Collins, in a presentation “The Smart Grid and Beyond: Lessons learned with BC Hydro”, and in the Big Data & Analytics and IoT panel at European Utility Week 2015.

Alex Clark is a seasoned architect and leading expert in big data technologies, global class computing and building high-performance, secure, scalable and distributed architectures. These skills have been instrumental in developing Bit Stew Systems’ purpose-built data integration and analytics platform for the Industrial Internet. Alex also oversees all software architecture functions within the business and collaborates on the evolution of the product roadmap. In his position as Chief Software Architect at Bit Stew, he combines his extensive experience across various industries with his deep knowledge of software design, artificial intelligence, machine learning and data architectures to enable organizations to take full advantage of the Industrial Internet.
