Ours is an era of unstructured information. With the growing number of devices, sensors and enterprise systems connected to the Industrial Internet of Things (IIoT), the data we seek to manage, analyze and extract value from is available on a scale previously unfathomable – creating both opportunities and challenges for CIOs in every industry.
If we can manage that influx of data and coordinate the disparate systems that generate it, we can achieve a real-time view of operations and extract meaningful insights that can improve the bottom line. If not, we’ll find ourselves drowning in data and lagging behind in a relentlessly progressive, high-tech environment, where winners are often the new entrants, not the incumbents.
CIO Challenge #1: Seeking a real-time view of global operations
When I worked for Teekay Shipping, an oil and gas transport company with a fleet of more than 150 oil and liquefied natural gas tankers operating around the world, keeping close tabs on those vessels from shore was paramount. A minor mechanical failure or lost data transmission could cost tens of thousands of dollars a day in downtime, and safety breaches could be fatal.
In those days, we relied on manual maintenance reports communicated from the vessels via satellite, then looked to on-shore personnel to make sense of the data. We were continually searching for a way to automate this process – a system that could provide a more comprehensive, on-shore view of our fleet in real time. If we had had real-time information about what was happening on our vessels and the ability to compare each vessel against the others, it would have been enormously powerful. We could have recognized patterns and predicted maintenance and safety issues before they arose, resulting in cost savings and risk mitigation for the business.
Today's tanker technology – like that of many industries that monitor field devices remotely – can relay detailed information from sensors and devices on a vessel instantaneously. But capturing all that data is only half the problem. Personnel still struggle to sift through millions of data points to identify meaningful, actionable insights – a challenge common across many industries.
Regardless of your industry or where you collect data, the key is to look for software that applies machine learning and rule-based analysis of events, responding automatically to certain triggers and automating most manual workflows. Utilities and other companies in the energy sector can use this type of artificial intelligence to create, in effect, a virtual operator: one that assesses and handles most situations by sifting through the "noise" and alerting the human operator only to actionable issues. Such technology should be capable of "predictive operations" – recognizing patterns and adapting to scenarios it has not seen before so organizations can mitigate issues proactively.
By using technology to automate processes and move toward software defined operations (SDO), more of the human operator’s time is freed up. Instead of responding to an overwhelming number of alarms and alerts from various systems throughout the operations, the operator can take a triage approach, using their knowledge of the situation to identify which are the most important and prioritizing their actions accordingly.
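The rule-based triage described above can be sketched in a few lines of code. This is a minimal illustration only, not any vendor's implementation; all metrics, thresholds and field names here are hypothetical:

```python
# Minimal sketch of rule-based event triage: suppress routine "noise"
# and surface only actionable alerts to the human operator.
# All metric names and thresholds below are hypothetical.

from dataclasses import dataclass

@dataclass
class Event:
    device_id: str
    metric: str
    value: float

# Rules: (metric, predicate, severity), most severe rule first.
# Any event matching no rule is treated as noise and suppressed.
RULES = [
    ("engine_temp_c", lambda v: v > 110.0, "critical"),
    ("engine_temp_c", lambda v: v > 95.0, "warning"),
    ("vibration_mm_s", lambda v: v > 7.1, "critical"),
]

def triage(events):
    """Return only actionable events, most severe first."""
    order = {"critical": 0, "warning": 1}
    actionable = []
    for e in events:
        for metric, predicate, severity in RULES:
            if e.metric == metric and predicate(e.value):
                actionable.append((severity, e))
                break  # first (most severe) matching rule wins
    actionable.sort(key=lambda pair: order[pair[0]])
    return actionable

events = [
    Event("pump-1", "engine_temp_c", 88.0),    # routine noise, suppressed
    Event("pump-2", "engine_temp_c", 97.5),    # warning
    Event("tanker-7", "vibration_mm_s", 9.3),  # critical
]
for severity, e in triage(events):
    print(severity, e.device_id, e.metric, e.value)
```

A real platform would layer machine-learned models on top of static rules like these, but even this sketch shows the triage principle: the operator sees two alerts, not a feed of every reading.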
CIO Challenge #2: Coordinating disparate systems to provide a single source of truth
Moving into the finance industry as the CIO for First West, a credit union in British Columbia, Canada, with over $6 billion in assets, I landed in an environment rife with mergers and partnerships. This meant that numerous financial institutions needed access to members’ account information via each other’s systems. The problem? Each credit union operated its own system. To pull information from another credit union’s system, we used complex and cumbersome integration tools. This forced IT resources to play catch-up behind the scenes following a merger or partnership, rather than focus their efforts on innovation and progress in a highly competitive industry where new market disruptors were emerging daily.
Our goal was specific: to quickly and seamlessly pull clean data from multiple systems, gaining a 360-degree view of our customers so we could target our products and services to them. But that goal was hindered by an underlying challenge I've faced in every industry I've worked in: the need to coordinate disparate operational systems. Doing so (in any context) requires integration software that can communicate with each system in its native formats and protocols, and is thus able to query and retrieve information from any system directly without reconfiguring the systems themselves. Such a solution is not only faster and more cost-effective to set up, but it also helps identify the existence and source of bad data, so you can make better decisions based on reliable, more easily accessible information.
Organizations in the energy sector should look to a data integration platform that leverages a semantic data modeling approach for faster integration of data from disparate sources. Data integration has long been the Achilles' heel of industrial organizations. According to IDC research, 80% of an IT project's effort is typically spent integrating and preparing data, and 50% of projects will fail to deliver the desired results or will exceed budget due to data integration issues. The time-to-value and ROI of IT projects lie in their ability to quickly integrate data and intelligently populate systems with clean data.
Without a semantic data model, there is little a machine can use to baseline data, so the organization becomes reliant on human interpretation, which can be inconsistent, time-consuming and costly. A semantic data modeling platform integrates data more quickly and creates consistency in the interpretation of data, thereby driving consistency throughout all processes and functions that depend on that data.
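The core idea of a semantic model can be illustrated with a toy mapping layer: records from two disparate systems are translated into one canonical schema so everything downstream interprets them the same way. All system and field names here are hypothetical, and real platforms are far richer than this:

```python
# Toy illustration of semantic data mapping: source-specific records
# are normalized into one canonical model ("meter_id", "kwh", "source")
# so downstream processes interpret them consistently.
# All system and field names below are hypothetical.

MAPPINGS = {
    "legacy_cis": {"meter_id": "MTR_NO", "kwh": "CONSUMPTION"},
    "smart_meter_headend": {"meter_id": "deviceId", "kwh": "energy_kwh"},
}

def normalize(record, source):
    """Translate a source-specific record into the canonical model."""
    mapping = MAPPINGS[source]
    return {
        "meter_id": str(record[mapping["meter_id"]]),
        "kwh": float(record[mapping["kwh"]]),
        "source": source,
    }

# The same meter reported by two different systems, two different schemas:
a = normalize({"MTR_NO": "001", "CONSUMPTION": "12.5"}, "legacy_cis")
b = normalize({"deviceId": "001", "energy_kwh": 12.5}, "smart_meter_headend")
assert a["meter_id"] == b["meter_id"]  # one meter, two systems, one view
```

The payoff is that a new source system requires only a new mapping entry, not a rework of every process that consumes the data.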
CIO Challenge #3: A simple interface anyone can use – catering to the business user
It wasn’t until I reached the utility sector, where I was tasked with choosing an integration tool for a massive smart metering project at BC Hydro, British Columbia’s primary utility company, that I saw the solutions I’d been looking for in action. I was faced with the same challenges as before: needing a real-time, contextual view of my “fleet” (2 million smart meters in the field) and integrating disparate systems (work order routing, asset management, etc.). In fact, the utility sector has pioneered machine intelligence technology for the Industrial Internet with the ability to capture, analyze and interpret data from any kind of sensor-equipped field device.
However, this time I faced the added challenge of needing a solution engineered for business users rather than IT staff and other technical experts. With the smart meter rollout being new to the whole company, we needed a tool that was easy to use and quick to learn so non-technical business users could leverage its full potential with a minimal learning curve. No matter what type of data or devices you’re looking to manage, a simple interface is key. It should take no more than a single day’s training to learn to use it to its full capacity, and everyone from the most junior operations staffer to the company CEO should be able to use plain English query language to retrieve the valuable insights and actionable intelligence they need to make operational decisions.
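To make the "plain English query" idea concrete, here is a deliberately tiny sketch of how such an interface might map a constrained English phrase onto a data filter. The grammar, field names and data are all hypothetical; commercial products handle far more varied language:

```python
# Tiny sketch of a plain-English-style query over device data.
# The grammar, field names and sample data are all hypothetical.
import re

METERS = [
    {"id": "m1", "region": "north", "status": "offline"},
    {"id": "m2", "region": "north", "status": "online"},
    {"id": "m3", "region": "south", "status": "offline"},
]

def query(text, rows):
    """Handle queries of the form 'show <status> meters in <region>'."""
    m = re.match(r"show (\w+) meters in (\w+)", text.lower())
    if not m:
        raise ValueError("unrecognized query")
    status, region = m.groups()
    return [r for r in rows if r["status"] == status and r["region"] == region]

offline_north = query("show offline meters in north", METERS)
```

The point of the sketch is the design goal, not the parser: a business user types a sentence and gets rows back, with no SQL and no knowledge of the underlying systems.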
Solving the data challenges in the energy sector
As the volume and complexity of data in the energy sector continue to increase, the key to solving these longstanding data challenges lies in the ability to streamline data integration, increase operational automation and gain a holistic, contextual view of data across an entire organization – from the field to the operations centre, encompassing both IT and operational data.
Rest assured that the technology to solve these challenges exists. Whether or not it has been applied in your organization, machine intelligence technology for the Industrial Internet is in daily use in the utilities and oil and gas sectors. Leveraging machine learning and semantic data modeling helps industrial organizations speed data integration from disparate sources and apply real-time analysis. Rule-based analysis of events and software defined operations help them automate common processes and reduce the amount of human intervention needed.
Lastly, a user interface that provides rich visualizations of data gives personnel the contextual understanding and situational awareness of enterprise operations that they need for intelligent decision making. Together, these capabilities enable energy companies to improve operational performance, identify efficiencies, increase uptime, optimize asset management and reduce operational costs.
Such an approach was taken by BC Hydro, which leveraged Bit Stew’s purpose-built platform to quickly integrate massive volumes of data and apply real-time analytics at scale during its 2 million smart meter rollout. Kevin Collins, CEO at Bit Stew, will lead a case study presentation on BC Hydro, addressing the common issues utilities experience throughout the lifecycle of a smart grid rollout and lessons learned at European Utility Week 2015.
A former director of the CIO Association of Canada and current member of CIONET Spain, Michele Morgan has over 30 years' experience leading information technology functions and large, complex programmes across multiple industries worldwide. Through her extensive global experience, Michele brings a proven track record in restructuring, creating and leading high-performance teams and large multi-disciplinary project teams. She has a strong strategic planning background and is skilled at integration and alignment with business strategy and goals. Her domain expertise includes utilities, smart grid, IoT, software, SaaS, real-time operational systems and data analytics. Michele leads business development for Bit Stew Systems in Europe.