Is IoT Defining Edge Computing? Or Is It the Other Way Around?

The only thing more impressive than the growth of IoT in the last decade is its predicted explosive growth in the next. ARM predicts that one trillion IoT devices will be produced by 2035, up from 46 billion in 2021.

That’s over 100 IoT devices for every person on Earth. As these devices continue to become smarter and more capable, organizations are finding creative new uses for them, as well as new locations in which to operate them.

With IoT spending predicted to hit $1 trillion in 2022, companies are seeing the value of IoT as an investment. That’s because every location in which IoT devices are present has the potential to become a data collection site, providing invaluable insights for virtually every industry. With new and more accurate insights, retailers can reduce shrinkage and streamline distribution system processes, manufacturers can detect visual anomalies on high-speed product lines, and hospitals can provide contact-free patient interactions.  

What is AI for IoT?

Organizations have rallied around the power of computer vision to generate insights from IoT devices. Why?

Computer vision is a broad term for the work done with deep neural networks to develop human-vision capabilities for applications. It uses images and videos to automate tasks and generate insights. Devices, infrastructure, and spaces can leverage this power to enhance their perception, in much the same way the field of robotics has benefited from the technology. 
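
As a concrete illustration, here is a minimal sketch of computer vision inference, assuming a pretrained ResNet-18 classifier from torchvision; frame.jpg is a hypothetical placeholder for a frame captured by an IoT camera.

```python
# A minimal sketch of computer vision inference: classify a single
# camera frame with a pretrained ImageNet model. "frame.jpg" is a
# hypothetical placeholder for a frame captured by an IoT camera.
import torch
from PIL import Image
from torchvision import models, transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()  # inference mode, no gradient bookkeeping needed

image = Image.open("frame.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)  # add a batch dimension

with torch.no_grad():
    logits = model(batch)
class_id = int(logits.argmax(dim=1))
print(f"Predicted ImageNet class id: {class_id}")
```

Real deployments would swap the classifier for a task-specific model such as an object detector, but the capture-preprocess-infer pattern stays the same.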

While every computer vision setup is different, they all have one thing in common: they generate a ton of data. IDC predicts that IoT devices alone will generate over 90 zettabytes of data. The typical smart factory generates about 5 petabytes of video data per day and a smart city could generate 200 petabytes of data per day.
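
To put the factory figure in perspective, a quick back-of-the-envelope conversion shows the sustained bandwidth it would take to ship that video off-site. The 5 PB/day number comes from the text above; decimal units are assumed.

```python
# Back-of-the-envelope: sustained bandwidth needed to stream a smart
# factory's ~5 PB of daily video to the cloud (decimal units assumed).
petabytes_per_day = 5
bits_per_day = petabytes_per_day * 10**15 * 8
seconds_per_day = 24 * 60 * 60

gbit_per_s = bits_per_day / seconds_per_day / 10**9
print(f"~{gbit_per_s:.0f} Gbit/s sustained")  # roughly 463 Gbit/s
```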

The sheer number of devices installed and the amount of data collected are putting a strain on traditional cloud and data center infrastructure. Computer vision algorithms running in the cloud often cannot process data fast enough to return real-time insights, and for many organizations that high latency presents a significant safety concern.

Take the example of an autonomous forklift in a fulfillment center for a major retailer. The forklift uses a variety of sensors to perceive the world around it, making decisions based on the data it collects. It understands where it can and cannot drive, it can identify objects to move around the warehouse, and it knows when to stop abruptly to avoid colliding with a human worker in its path.

If the forklift has to send its data to the cloud, wait for it to be processed, and then act on the insights sent back, it might not be able to stop in time to avoid a collision with a human worker.
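
A rough latency budget makes the point. Every number in the sketch below is an illustrative assumption, not a measurement from a real deployment.

```python
# Illustrative latency budget for the forklift scenario. Every number
# here is an assumption for the sake of the example, not a measurement.
SPEED_M_PER_S = 2.0          # assumed forklift travel speed
DETECTION_DISTANCE_M = 3.0   # assumed distance at which a person appears
BRAKING_TIME_S = 0.8         # assumed time to come to a full stop

def can_stop(processing_latency_s: float) -> bool:
    """Is there time to process the frame and still brake before impact?"""
    time_to_impact_s = DETECTION_DISTANCE_M / SPEED_M_PER_S
    return processing_latency_s + BRAKING_TIME_S < time_to_impact_s

print(can_stop(0.05))  # ~50 ms on-device (edge) inference -> True
print(can_stop(1.2))   # ~1.2 s cloud round trip under load -> False
```

Under these assumptions, on-device inference leaves comfortable margin to brake, while a cloud round trip does not.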

In addition to latency concerns, sending the massive amount of data collected by IoT devices to the cloud to be processed is extremely costly. In its study “Voice of the Enterprise: Internet of Things, Organizational Dynamics – Quarterly Advisory Report,” 451 Research found that respondents store only about half of the IoT data they create, and analyze only about half of the data they store, which works out to only about 25% of IoT data ever being analyzed. By choosing not to process data because of high transit costs, organizations are neglecting valuable insights that could have a significant impact on their business.

These are some of the reasons why organizations have started using edge computing. 

What is edge computing, and why is it important for IoT?

Edge computing is the concept of capturing and processing data as close to the source of the data as possible. This is done by deploying servers or other hardware to process data at the physical location of the IoT sensors. Since edge computing processes data locally—on the “edge” of a network, instead of in the cloud or a data center—it minimizes latency and data transit costs, allowing for real-time feedback and decision-making. 
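
In practice, an edge deployment often looks like the loop sketched below: frames are captured and analyzed on-site, and only compact results leave the device. The detect_people() helper and CLOUD_ENDPOINT are hypothetical placeholders; a real deployment would run an optimized detection model locally.

```python
# A minimal sketch of an edge-processing loop: heavy computer vision
# work happens on-site, and only a few bytes of insight leave the
# device. detect_people() and CLOUD_ENDPOINT are hypothetical
# placeholders, not a real API.
import json
import time

import cv2  # OpenCV, for reading frames from a local camera

CLOUD_ENDPOINT = "https://example.com/iot/insights"  # hypothetical

def detect_people(frame) -> int:
    """Stand-in for a local inference step (e.g., a person detector)."""
    return 0  # a real deployment would run an optimized model here

camera = cv2.VideoCapture(0)  # the on-site IoT camera
try:
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        count = detect_people(frame)  # raw video never leaves the site
        payload = json.dumps({"timestamp": time.time(), "people": count})
        print("would POST to", CLOUD_ENDPOINT, ":", payload)
        time.sleep(1.0)  # one compact insight per second
finally:
    camera.release()
```

Compare the payload here, a few dozen bytes per second, with the raw video stream it summarizes; that difference is where the latency and transit-cost savings come from.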

Edge computing allows organizations to process more data and generate more complete insights, which is why it is quickly becoming standard technology for organizations heavily invested in IoT. In fact, IDC reports that the edge computing market will be worth $34 billion by 2023.

Although the benefits of edge computing for AI applications using IoT are tangible, the combination of edge and IoT solutions has been an afterthought for many organizations. Ideally, the convergence of these technologies is baked into the design from the start, allowing the full potential of computer vision to be realized and new levels of automation and efficiency to be reached.

To learn more about how edge computing works and the benefits of edge computing, read the edge computing introduction post.

Check out Considerations for Deploying AI at the Edge to learn more about the technologies involved in an edge deployment.
