With the introduction of smart grids and related technologies, utilities must manage and analyze a significant and ever-growing quantity of data. Proper analysis of this “big data” helps a utility run more efficiently and cost-effectively and improve customer satisfaction; without proper analysis, the data is worthless.
According to Sanjeev Kumar Singh, senior consultant, utilities solutions at Mahindra Satyam, utilities need data analytics for:
- Real-time access to utility data bases
- Reading all data fields in all systems
- Checking equipment (meters, transformers, generators, etc.)
- Real-time tracking of power losses (technical and non-technical) and imbalances
- Decision-making based on cost, staff availability, pre-set performance targets
- Ordering of equipment for future work and monitoring equipment inventory
- Real-time monitoring of all network operations
- Scheduling of field work
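One item in the list above, real-time tracking of losses and imbalances, can be illustrated with a minimal sketch: compare the energy injected at a feeder head with the sum of downstream meter readings held in memory. All names, data values, and thresholds below are hypothetical assumptions, not a description of any particular utility system.

```python
# Hypothetical sketch: flag feeder-level imbalances by comparing energy
# injected at the feeder head with the sum of downstream meter readings.
# All names, data, and thresholds here are illustrative assumptions.

def imbalance_pct(feeder_input_kwh, meter_readings_kwh):
    """Return losses as a percentage of energy injected into the feeder."""
    delivered = sum(meter_readings_kwh)
    return 100.0 * (feeder_input_kwh - delivered) / feeder_input_kwh

def flag_feeders(snapshot, threshold_pct=8.0):
    """Return feeders whose losses exceed the threshold, which may point
    to technical losses or non-technical losses such as theft."""
    return {
        feeder: round(imbalance_pct(inp, meters), 2)
        for feeder, (inp, meters) in snapshot.items()
        if imbalance_pct(inp, meters) > threshold_pct
    }

# In-memory snapshot of the latest interval: feeder -> (input kWh, meter kWh list)
snapshot = {
    "F1": (1000.0, [320.0, 310.0, 305.0]),  # ~6.5% loss: within tolerance
    "F2": (1000.0, [280.0, 300.0, 250.0]),  # 17% loss: flagged
}
print(flag_feeders(snapshot))  # {'F2': 17.0}
```

Because the snapshot lives in memory rather than in a warehouse table, a check like this can run every metering interval rather than in an overnight batch.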
However, as data volumes grow, utilities are struggling to process this flood of information. To remain competitive, utilities will need analytics that are faster, more accurate, and more cost-effective, along with greater computing power. In-memory computing may be exactly the solution utilities are looking for.
The most evident benefit of in-memory processing is speed. It allows utilities to analyze entire data sets at high speed in a single pass. In-memory computing tools give utilities the power to analyze vast quantities of data from a variety of sources as soon as it is received. As a result, data access and analysis are far faster. Utilities no longer have to pull data from a data warehouse; instead, transactional data is held in memory, so analytics can be carried out in real time. In-memory computing helps utilities detect patterns quickly, analyze massive data volumes on the fly, and run their operations faster, supporting the best possible strategies.
This speed is essential. Instead of analyzing out-of-date data, utilities can run complex queries in a matter of minutes, so operations can be investigated and improved based on the immediate situation. In-memory computing also lets utilities examine entire data sets rather than representative samples, so they can be sure they have all the facts before acting. And instead of streamlining analysis by forcing data into a fixed format that answers only certain predefined queries, utilities can store data in a less structured form, relying on the power of in-memory computing to compensate and gaining more flexibility in how the data is accessed. In-memory computing also accelerates routine reporting: end-of-period reports can run in seconds or minutes, instead of requiring an hours-long batch pull from a database.
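The reporting contrast above can be sketched in a few lines: when interval readings are already held in memory, an end-of-period total is a single pass over the data rather than a scheduled batch extract. The record fields and figures here are hypothetical, chosen only to show the shape of the computation.

```python
# Hypothetical sketch: an end-of-period report computed directly over
# readings held in memory, instead of a batch job against a warehouse.
# Field names and data values are illustrative assumptions.
from collections import defaultdict

def period_report(readings):
    """Aggregate kWh per customer in one pass over in-memory readings."""
    totals = defaultdict(float)
    for reading in readings:
        totals[reading["customer"]] += reading["kwh"]
    return dict(totals)

# Interval readings as they arrived during the billing period
readings = [
    {"customer": "A", "kwh": 12.5},
    {"customer": "B", "kwh": 7.0},
    {"customer": "A", "kwh": 10.0},
]
print(period_report(readings))  # {'A': 22.5, 'B': 7.0}
```

The point is not the aggregation itself, which is trivial, but that it runs on demand against live data: the same query that once waited for an overnight extract can be answered whenever it is asked.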
The drop in memory prices in today’s market is a major factor contributing to the increasing popularity of in-memory computing, making the technology economical for a wide variety of applications.
It seems, therefore, that in-memory computing is a natural fit for performing real-time analytics on this tidal wave of data. Utilities should adopt it to avoid being left behind.
With the continued growth of big data, it is inevitable that utilities will have to adopt in-memory computing in some form in order to keep functioning effectively.