
How edge analytics will drive smarter computing

Exploiting edge computing and IoT devices for real-time analytics is promising, but designing analytical models for edge deployment is a challenge.

Most analytics and machine learning today connect to data stored in a data warehouse or data lake, run algorithms against a complete data set or a subset of the data, and compute results on cloud architectures. This approach works well when the data doesn’t change frequently. But what happens when the data does change frequently?

Today, more businesses need to process data and compute analytics in real time. IoT drives much of this paradigm shift, because data streaming from sensors requires immediate processing and analytics to control downstream systems. Real-time analytics is also important in many industries, including healthcare, financial services, manufacturing, and advertising, where small changes in the data can have significant financial, health, safety, and other business impacts.

If you’re interested in enabling real-time analytics, and in emerging technologies that leverage a mix of edge computing, AR/VR, IoT sensors at scale, and machine learning at scale, then it’s important to understand the design considerations for edge analytics. Edge computing use cases such as autonomous drones, smart cities, retail supply chain management, and augmented reality gaming networks all target deploying highly reliable edge analytics at scale.


Edge analytics, streaming analytics, and edge computing
Several different analytics, machine learning, and computing paradigms are related to edge analytics:

Edge analytics refers to analytics and machine learning algorithms deployed to infrastructure outside of the cloud and “on the edge” in geographically localized infrastructure.
Streaming analytics refers to computing analytics in real time as data is processed. Streaming analytics can be performed in the cloud or on the edge, depending on the use case.
Event processing is a way to process data and drive decisions in real time. It is a subset of streaming analytics, and developers use event-driven architectures to identify events and trigger downstream actions; a minimal sketch of the pattern follows this list.
Edge computing refers to deploying computation to edge devices and network infrastructure.
Fog computing is a more generalized architecture that splits computation among edge, near-edge, and cloud computing environments.
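
To make the event-processing pattern concrete, here is a minimal sketch in Python of an event-driven rule running on an edge device: it watches a stream of temperature readings and triggers a downstream action when a threshold is crossed. The sensor names, threshold, and shutdown action are illustrative assumptions, not part of any specific product.

```python
# Minimal event-processing sketch for an edge device (illustrative only).
# Each reading is evaluated as it arrives; crossing a threshold triggers a
# downstream action locally, with no round trip to the cloud.

from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class SensorEvent:
    sensor_id: str
    temperature_c: float  # hypothetical reading

def shut_down_line(event: SensorEvent) -> None:
    # Stand-in for a real downstream action (actuator call, alert, work order)
    print(f"ALERT: {event.sensor_id} reads {event.temperature_c} C, stopping line")

def process_stream(events: Iterable[SensorEvent],
                   threshold_c: float,
                   on_breach: Callable[[SensorEvent], None]) -> None:
    """Evaluate each event as it arrives and act locally when the rule fires."""
    for event in events:
        if event.temperature_c >= threshold_c:
            on_breach(event)

if __name__ == "__main__":
    readings = [SensorEvent("press-04", t) for t in (72.1, 78.5, 91.3, 95.0)]
    process_stream(readings, threshold_c=90.0, on_breach=shut_down_line)
```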
When designing solutions that require edge analytics, architects must consider physical and power constraints, network costs and reliability, security considerations, and processing requirements.

Reasons to deploy edge analytics
You might ask, why would you deploy infrastructure for edge analytics? There are technical, cost, and compliance considerations that factor into these decisions.

Applications that impact human safety and require resiliency in the computing architecture are one use case for edge analytics. Applications that require low latency between data sources such as IoT sensors and the analytics computing infrastructure are a second use case that often calls for edge analytics. Examples of these use cases include:

Self-driving cars, automated machines, or any mode of transportation where the control system is automating all or parts of the navigation.
Smart buildings that have real-time security controls and want to avoid dependencies on network and cloud infrastructure so that people can enter and exit the building safely.
Smart cities that track public transportation, deploy smart meters for utility billing, and implement smart waste management solutions.

Cost considerations are a significant factor in using edge analytics in production systems. Consider a set of cameras scanning manufactured products for defects as they move along fast-moving conveyor belts. It can be more cost-effective to deploy edge computing devices in the factory to perform the image processing than to install high-speed networks to transmit the video images to the cloud.
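
As a rough illustration of that trade-off, the sketch below estimates the upstream bandwidth of streaming raw camera frames to the cloud versus sending only per-item inspection results computed at the edge. Every figure in it (frame size, frame rate, result payload, item rate) is an illustrative assumption, not a measurement from a real deployment.

```python
# Back-of-envelope comparison: ship raw video to the cloud vs. ship only
# edge-computed inspection results. All figures are illustrative assumptions.

FRAME_BYTES = 2_000_000    # assumed ~2 MB per compressed high-resolution frame
FRAMES_PER_SECOND = 30     # assumed camera frame rate
RESULT_BYTES = 200         # assumed size of one pass/fail result message
ITEMS_PER_SECOND = 5       # assumed items passing the camera per second

raw_mbps = FRAME_BYTES * FRAMES_PER_SECOND * 8 / 1_000_000
edge_mbps = RESULT_BYTES * ITEMS_PER_SECOND * 8 / 1_000_000

print(f"Raw video upstream:    {raw_mbps:,.1f} Mbit/s per camera")
print(f"Edge results upstream: {edge_mbps:,.4f} Mbit/s per camera")
print(f"Reduction factor:      ~{raw_mbps / edge_mbps:,.0f}x")
```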

I spoke with Achal Prabhakar, vice president of engineering at Landing AI, an industrial AI company with solutions that focus on computer vision. “Manufacturing plants are quite different from mainstream analytics applications and therefore require rethinking AI, including deployment,” Prabhakar told me. “A major focus area for us is deploying complex deep learning vision models, with continuous learning, directly on production lines using capable but commodity edge devices.”

Deploying analytics to remote locations such as construction and drilling sites is another case where edge analytics and computing prove beneficial. Instead of relying on expensive and potentially unreliable wide area networks, engineers deploy edge analytics infrastructure on-site to support the required data processing and analytics. For example, one oil and gas company implemented an edge analytics solution with a distributed in-memory computing platform and cut drilling time by as much as 20 percent, from a typical 15 days down to 12 days.

Compliance and data governance are another reason to use edge analytics. Deploying localized infrastructure can help meet GDPR compliance and other data sovereignty regulations by storing and processing restricted data in the countries where the data is collected.

Designing analytics for the edge
Unfortunately, taking models and other analytics and deploying them to edge computing infrastructure is not always trivial. Models with large computing requirements for processing big data sets through computationally deep calculations may require re-engineering before they can run and be deployed on edge computing infrastructure.

First, many data scientists and data engineers have grown accustomed to the higher-level analytics platforms available in public and private clouds. IoT devices and sensors, by contrast, often run embedded applications written in C/C++, which can be unfamiliar and challenging terrain for cloud-native data scientists and data engineers.

Another issue may be the models themselves. When data scientists work in the cloud and can scale computing resources on demand at relatively low cost, they are able to develop complex machine learning models, with many features and parameters, to fully optimize the results. But when deploying models to edge computing infrastructure, an overly complex algorithm can dramatically increase infrastructure costs, device size, and power requirements.
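
A quick way to see why model complexity matters at the edge is to estimate a model’s memory footprint from its parameter count and numeric precision. The sketch below runs that arithmetic for a few hypothetical cases; the parameter counts, and the assumption that quantizing 32-bit floats to 8-bit integers roughly quarters the footprint, are illustrative and do not describe any particular model or toolchain.

```python
# Rough memory-footprint arithmetic for deploying models at the edge.
# Parameter counts and precisions below are illustrative assumptions.

def model_megabytes(num_parameters: int, bytes_per_parameter: int) -> float:
    """Approximate in-memory size of the weights alone (ignores activations)."""
    return num_parameters * bytes_per_parameter / 1_000_000

scenarios = [
    ("Large cloud-trained model, float32", 350_000_000, 4),
    ("Same model quantized to int8",        350_000_000, 1),
    ("Compact edge-targeted model, int8",    10_000_000, 1),
]

for name, params, width in scenarios:
    print(f"{name:40s} ~{model_megabytes(params, width):8,.0f} MB")
```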

I discussed the challenges of deploying AI models successfully with Marshall Choy, vice president of product at SambaNova Systems. “Model developers for edge AI applications are increasingly focusing on highly detailed models to achieve improvements in parameter reduction and compute requirements,” he noted. “The training requirements for these smaller, highly detailed models are still challenging, however.”

Another consideration is that deploying a highly reliable and secure edge analytics system requires designing and implementing highly fault-tolerant architectures, systems, networks, software, and models.
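
One common pattern behind such fault-tolerant designs is store-and-forward: the edge node keeps computing locally and buffers results whenever the upstream link is down, then drains the buffer once connectivity returns. The sketch below illustrates the idea with an in-memory queue and a stand-in send function; a production system would use durable storage and a real transport, so treat it as an assumed, simplified design rather than a recipe.

```python
# Store-and-forward sketch: keep analyzing locally and buffer results while
# the upstream link is unavailable, then flush when it comes back.
# The transport and the link check below are stand-ins, not a real API.

from collections import deque

class StoreAndForward:
    def __init__(self, send_upstream, link_is_up):
        self.send_upstream = send_upstream   # callable(result) -> None
        self.link_is_up = link_is_up         # callable() -> bool
        self.buffer = deque()                # in-memory; use durable storage in practice

    def publish(self, result):
        """Queue a result, then try to drain the queue if the link allows it."""
        self.buffer.append(result)
        self.flush()

    def flush(self):
        while self.buffer and self.link_is_up():
            self.send_upstream(self.buffer.popleft())

if __name__ == "__main__":
    sent = []
    link_state = {"up": False}
    sf = StoreAndForward(send_upstream=sent.append,
                         link_is_up=lambda: link_state["up"])
    sf.publish({"sensor": "pump-7", "anomaly_score": 0.92})  # buffered: link is down
    link_state["up"] = True
    sf.flush()                                               # drained on reconnect
    print(sent)
```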

I spoke with Dale Kim, senior director of product marketing at Hazelcast, about the use cases and constraints of processing data at the edge. He noted that, while equipment optimization, preventive maintenance, quality assurance checks, and critical alerts are all well suited to the edge, there are new challenges such as limited hardware space, limited physical accessibility, limited bandwidth, and greater security concerns.

“This means that the infrastructure you’re accustomed to in your data center won’t necessarily work,” Kim said. “So you need to explore new technologies that are designed with edge computing architectures in mind.”

The next frontier in analytics
The main use cases for edge analytics today are data processing functions, including data filtering and aggregation. But as more companies deploy IoT sensors at scale, the need to apply analytics, machine learning, and artificial intelligence algorithms in real time will demand more deployments at the edge.
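
As a concrete example of that kind of edge-side filtering and aggregation, the sketch below collapses a burst of raw sensor readings into one summary record per sensor per window before anything is sent upstream. The readings, validity range, and field names are illustrative assumptions.

```python
# Edge-side filtering and aggregation sketch: drop bad samples, then collapse
# raw readings into one summary record per sensor per time window before
# sending anything upstream. All values below are illustrative assumptions.

from collections import defaultdict
from statistics import mean

def aggregate(readings, min_valid=0.0, max_valid=150.0):
    """Filter out-of-range readings, then summarize the rest per sensor."""
    grouped = defaultdict(list)
    for sensor_id, value in readings:
        if min_valid <= value <= max_valid:   # discard obviously bad samples
            grouped[sensor_id].append(value)
    return {
        sensor_id: {"count": len(vals), "min": min(vals),
                    "max": max(vals), "mean": round(mean(vals), 2)}
        for sensor_id, vals in grouped.items()
    }

if __name__ == "__main__":
    window = [("temp-01", 71.2), ("temp-01", 73.9), ("temp-01", 999.0),  # bad sample
              ("temp-02", 68.4), ("temp-02", 69.1)]
    print(aggregate(window))
```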

These capabilities at the edge make for a very interesting future of smart computing, as sensors become cheaper, applications require more real-time analytics, and developing optimized, cost-effective algorithms for the edge becomes easier.


