
Opinion: Drowning in data? Tackle it at the edge, not in the cloud

Don't dump data in the cloud; process it at the edge, says Daniel Quant, Vice President of Product Management & Strategic Marketing at Multi-Tech Systems.
Published: Fri 10 Nov 2017

Every day, an estimated 2.5 quintillion bytes of data are created, according to an IBM Marketing Cloud study. Yet too many data management systems use the cloud as an all-purpose dump for every raw signal or data output generated by local sensors and assets.

This type of data management architecture means that huge amounts of laborious and costly data processing have to be performed centrally, either in the cloud or in an enterprise data centre.

This approach, says Quant, has the following drawbacks:

  • Cost of backhauling data between edge nodes and the cloud or server. Traffic charges are levied at a cost per byte. These charges can rapidly mount up, particularly when using cellular networks.
  • Cost of cloud services for processing raw field-bus data from industrial, often proprietary assets, into a format that can be handled more efficiently by internet-based platforms.
  • Cost of data storage. Storage charges are typically levied by cloud service providers at a cost per byte of data storage capacity. The more raw data is accumulated in the cloud, the higher the storage cost.
  • Latency introduced into often simple business processes. It takes time to upload data to the cloud, process the data and produce a decision, and then transmit the decision back to the edge node. Simple decisions, such as whether to accept delivery of a shipment of frozen food, or whether to perform a truck roll to deliver or inspect an asset, can often be made much more quickly if the processing is performed locally.
  • Risk of downtime, which can disable or impair a local business process. Every decision or business process that relies on an application running in the cloud is vulnerable to a failure anywhere in the link between the edge and the cloud server. Internet connections are prone to downtime.
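The latency point lends itself to a concrete sketch. A minimal edge rule for the frozen-food example might look like the following (the threshold, function name and sample readings are illustrative assumptions, not taken from any MultiTech product):

```python
# Hypothetical edge rule: accept or reject a frozen-food shipment locally,
# with no round trip to the cloud. The -18 degC limit is an assumed value.

FROZEN_LIMIT_C = -18.0  # assumed maximum acceptable temperature

def accept_shipment(temperature_log_c):
    """Accept only if every in-transit reading stayed at or below the limit."""
    return all(t <= FROZEN_LIMIT_C for t in temperature_log_c)

readings = [-21.5, -20.0, -19.2, -18.4]  # samples logged during transport
print(accept_shipment(readings))  # decided on-site, in microseconds
```

The same check routed through a cloud application would incur an upload, remote processing and a response over an internet link that may be down at the moment of delivery.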

Just as the future of electricity grids lies in decentralised architectures, the alternative approach for the IoT is efficient data management at the edge of the network.

This eliminates the need for time-consuming data backhauls and for the parsing required to prepare data for business decisions.
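As a rough sketch of what eliminating the backhaul can mean in bytes, an edge node can summarise a window of raw samples and transmit only the aggregate (the field names and sample values are illustrative):

```python
import json
import statistics

# Hypothetical edge pre-processing: rather than backhauling every raw sample,
# transmit a compact summary of the measurement window.

raw_samples = [20.1, 20.3, 19.8, 20.0, 20.2, 20.4, 19.9, 20.1]

raw_payload = json.dumps({"samples": raw_samples})
summary_payload = json.dumps({
    "mean": round(statistics.mean(raw_samples), 2),
    "min": min(raw_samples),
    "max": max(raw_samples),
    "n": len(raw_samples),
})

# The summary is a fraction of the raw payload's size, and the saving grows
# with the sampling rate while per-byte traffic charges shrink accordingly.
print(len(raw_payload), len(summary_payload))
```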

This raises the question of how best to provide the hardware and software platform that will enable the provision of intelligence at the edge of the IoT while maintaining built-in support for cloud connectivity, services and applications.

One approach to this question is to build intelligence into connectivity components at the edge.

Gateways and embedded modems, when backed by programmable real-time operating systems or application development tool chains such as Arm's Mbed and IBM's Node-RED, can implement and deploy applications locally, and support rapid, intuitive application development in environments familiar to IT developers.

Intelligent edge connectivity solutions also provide a complete software suite for cloud connectivity, including security and authentication capabilities and the standards-based cloud middleware and telecoms protocols required to scale deployments.
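At the transport level, the security piece of such a suite can be sketched with Python's standard library. A real gateway would typically wrap a TLS context like this in an MQTT or HTTPS client; the helper below is a hypothetical illustration, not a MultiTech API:

```python
import ssl

# Illustrative sketch of the transport security an intelligent gateway
# provides when talking to cloud middleware: a TLS context that verifies
# the cloud endpoint against trusted CAs.

def make_client_context(ca_file=None):
    # Server authentication against trusted CAs; a deployment using client
    # certificates would add context.load_cert_chain(...) so the cloud side
    # can authenticate the device as well.
    context = ssl.create_default_context(cafile=ca_file)
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    return context

ctx = make_client_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)
```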


Embedded application development

In the world of electronics engineering, the concept of local processing and intelligence at the edge is in fact nothing new.

The microcontroller, the core component in the majority of embedded devices, has been steadily growing in capability over the past three decades while falling dramatically in price.

Today, a microcontroller with a powerful Arm Cortex-M processor costs as little as $1.50, yet provides the capability to execute rules-based algorithms and to run a sophisticated operating system such as Arm Mbed OS, plus all the application code required to process sensor data at the edge before forwarding results to the cloud.
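As an illustration of the kind of rules-based algorithm such a device can execute, here is a thermostat-style hysteresis rule sketched in Python (the setpoints and names are assumptions; on a Cortex-M device the equivalent would typically be written in C or C++ against Mbed OS):

```python
# Hypothetical rules-based control loop of the kind a Cortex-M class device
# runs at the edge: hysteresis avoids rapid on/off cycling of an actuator.

SET_ON_C = 5.0   # assumed: turn cooling on above this temperature
SET_OFF_C = 3.0  # assumed: turn cooling off below this temperature

def next_state(cooling_on, temperature_c):
    if temperature_c > SET_ON_C:
        return True
    if temperature_c < SET_OFF_C:
        return False
    return cooling_on  # inside the dead band, keep the current state

state = False
for t in [2.0, 4.0, 6.0, 4.0, 2.5]:
    state = next_state(state, t)
    print(t, state)
```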

If the vision of an IoT containing billions of connected Things is to be realised, however, it cannot rely on the relatively small pool of electronics design engineers who have deep experience of M2M communications and telematics, and who are comfortable developing applications in a microcontroller's integrated development environment.

The rapid development of applications to control 'things' locally, and to manage data transfers to and from the cloud efficiently, must be accessible to the millions of web developers, systems developers and computer scientists who understand the enterprise IT environment and are accustomed to programming with high-level APIs that provide direct access to services.

Today an ecosystem is in place to support this model of development for intelligence at the edge, comprising hardware, software and wireless connectivity. In MultiTech's case, it is enabled by a suite of products available to optimise enterprise digital transformation.

 

"Data management systems use the cloud as an all-purpose dump for every type of raw signal or data output generated by local sensors and assets"

Expansion of IoT implementation

Another argument for moving data processing to the edge is that, today, most resources and effort are devoted to collecting data rather than to processing it. The highly centralised data management systems in place cause opportunities for business process improvement to be missed, and result in substantial, unnecessary operating expenses.

By incorporating intelligence into devices at the edge of IoT networks, businesses can rapidly accelerate processes, reduce data storage and data transfer costs, and improve system resilience.

Such an approach calls for the deployment of smart gateways and modems which support IT-friendly development ecosystems such as Arm Mbed OS. MultiTech products are available today to support such an approach.
