Data Analysis Just Got Faster

Utilities are turning to event stream processing to make accurate decisions more efficiently.
Published: Tue 18 Mar 2014

Most utilities know that data analytics has an important place in their business models. Without it, ever-increasing quantities of data will go to waste, and decision-makers will be left making shaky decisions based on inaccurate and incomplete data.

Applying the best data processing system

Because data volumes will continue to grow, and the data itself will become more complex, it is important that utilities adopt the best data processing system for the job.

Today, most utilities have an installed base of business intelligence software that is not capable of providing business insights from smart grid data. Traditional business intelligence tools are useful only for reporting on what has occurred in the past, drawing on relatively limited historical data. For smart grid workloads, these systems are very limiting.

The big data opportunity for utility Chief Information Officers is to evolve analytics so that they can better describe the current state (what is happening) as well as predict the future (what is likely to happen). This evolution calls for more powerful analytics technologies capable of processing big data to generate better insights.

Event Stream Processing Engine

For fast-streaming data, an Event Stream Processing engine gives utilities real-time analysis and insight the moment it is needed. The idea behind stream data processing is to analyze data as it pours out of a source system and into the memory of a distributed computing platform. The system continuously analyzes the data it receives and triggers actions based on the information flow, letting analysts spot patterns and make decisions faster than ever before.
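The pattern is easier to see in code. Below is a minimal sketch of that loop in plain Python: events are analyzed in memory as they arrive, and an action is triggered when a pattern emerges. The window size, alert threshold, and simulated meter feed are illustrative assumptions, not part of any particular vendor's product.

```python
# A minimal sketch of in-stream analysis: keep a sliding in-memory
# window over the incoming feed and trigger an action the moment a
# pattern (here, a high rolling average) appears.
from collections import deque
from statistics import mean

WINDOW = 10          # number of recent readings kept in memory (assumed)
THRESHOLD = 240.0    # hypothetical voltage level that should trigger action

def process_stream(readings):
    """Analyze each reading as it arrives and trigger actions in-flight."""
    window = deque(maxlen=WINDOW)   # sliding in-memory window
    for value in readings:
        window.append(value)
        rolling_avg = mean(window)
        if rolling_avg > THRESHOLD:              # pattern spotted in-stream
            yield ("ALERT", value, rolling_avg)  # trigger an action
        else:
            yield ("OK", value, rolling_avg)

# Example: feed a simulated meter stream through the processor.
feed = [230.0, 231.5, 252.0, 255.0, 258.3, 257.1]
for action, value, avg in process_stream(feed):
    print(action, value, round(avg, 1))
```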

For Event Stream Processing to work most effectively, utilities should adopt more than a basic "if-then" business rule engine. What is more important is the ability to apply a variety of predictive or prescriptive analytics that will "learn" on the fly and update models in place, explains David Pope of SAS US Energy pre-sales.
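For contrast, a basic if-then rule is just a fixed test that never changes. The sketch below (with a hypothetical threshold) shows the kind of static logic that is not enough on its own; the sketches in the following paragraphs show the adaptive alternative.

```python
# A static "if-then" business rule: the threshold is hypothetical and,
# crucially, fixed. Nothing in the system ever updates it.
def basic_rule(reading):
    if reading > 250.0:
        return "ALERT"
    return "OK"
```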

Predictive analytics helps utilities anticipate occurrences based on historical patterns and new data. For instance, a series of readings leading up to an asset failure can establish a pattern that can be used to predict when other, similar assets are likely to fail.
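As an illustration of that asset-failure example, here is a minimal sketch in Python. It learns an average "pre-failure" trend from historical readings and flags assets whose recent readings trend the same way; the sample data, slope test, and margin are all illustrative assumptions.

```python
# A minimal sketch of failure prediction: learn a "pre-failure"
# signature from historical readings, then compare new assets to it.
from statistics import mean

def slope(series):
    """Average step-to-step change across a series of readings."""
    return mean(b - a for a, b in zip(series, series[1:]))

# Hypothetical readings recorded just before similar assets failed.
pre_failure_history = [
    [70, 74, 79, 85, 92],
    [68, 71, 77, 84, 90],
]
failure_slope = mean(slope(s) for s in pre_failure_history)

def likely_to_fail(recent_readings, margin=0.8):
    """Flag an asset whose recent trend resembles the pre-failure pattern."""
    return slope(recent_readings) >= failure_slope * margin

print(likely_to_fail([66, 70, 75, 82, 89]))  # True: same rising pattern
print(likely_to_fail([71, 70, 72, 71, 70]))  # False: stable readings
```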

The benefit of machine learning (a catch-all term for a range of algorithmic approaches) is that it enables a system to "learn" from experience by comparing predicted outcomes with actual outcomes. Using this process, utilities can tighten screening criteria to reduce future errors – this is how the data system "learns." It is an automated way of updating the algorithm.
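A minimal sketch of that feedback loop, assuming a simple threshold-based screen and a fixed adjustment step (both hypothetical): each prediction is compared with the actual outcome, and the screening criterion is nudged to reduce the kinds of errors that were observed.

```python
# "Learning" by comparing predicted outcomes with actual outcomes:
# each observed error automatically tightens or relaxes the screen.
def update_threshold(threshold, predicted_failure, actually_failed, step=0.5):
    """Adjust the screening criterion based on the observed error."""
    if predicted_failure and not actually_failed:
        return threshold + step   # false alarm: make the test stricter
    if actually_failed and not predicted_failure:
        return threshold - step   # missed failure: make the test more sensitive
    return threshold              # prediction was right: leave it alone

threshold = 4.4
for predicted, actual in [(True, False), (True, True), (False, True)]:
    threshold = update_threshold(threshold, predicted, actual)
print(threshold)  # the criterion drifts toward fewer repeat errors
```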

A good Event Stream Processing solution also performs off-stream analysis, which provides a feedback loop into the production event stream processing jobs. The smart grid in utilities and drilling-control projects in oil and gas are prime examples of where this kind of process and technology should be applied.
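One way to picture that feedback loop, as a hedged sketch: the streaming job archives events as it scores them, and a periodic off-stream job refits the model parameters and swaps them back into the live jobs. All names and the refit rule here are illustrative assumptions, not any vendor's actual design.

```python
# A minimal sketch of off-stream analysis feeding back into the
# in-stream production job. The refit rule (mean + 2 standard
# deviations) is an assumed placeholder for deeper offline analytics.
from statistics import mean, stdev

archive = []                  # events copied off-stream for deeper analysis
model = {"threshold": 250.0}  # parameter used by the live streaming job

def on_event(value):
    """In-stream processing: score the event, then archive it."""
    action = "ALERT" if value > model["threshold"] else "OK"
    archive.append(value)
    return action

def offstream_refit():
    """Batch job: recompute the threshold from history, update in place."""
    if len(archive) >= 2:
        model["threshold"] = mean(archive) + 2 * stdev(archive)

for v in [231.0, 229.5, 233.2, 230.8]:
    on_event(v)
offstream_refit()             # feedback loop into the production job
print(model["threshold"])     # live jobs now use the refitted parameter
```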

The Event Stream Processing engine provides the utility with a holistic view of its business by capturing and analyzing data from the past and present to predict the future.