Adding artificial intelligence (AI) functionality presents a remarkable opportunity to change the way systems behave and are controlled. What excites me is the new opportunity it brings for computers (robots) and humans to work together, harnessing the best of the capabilities of each.
The emergence of collaborative robots, or ‘cobots’, intended to interact with humans in a shared space or to operate safely in close proximity to them, is an important step in this direction. Cobots stand in stark contrast to traditional industrial robots, which work autonomously with safety assured by isolation from human contact. Cobots are currently a very small percentage of the overall robotics market. However, this is an area we (and a number of recognized analysts) believe will grow rapidly over the next five years, for applications in manufacturing, healthcare and retail. The Covid-19 pandemic is accelerating this trend, building acceptance of increased digital transformation and automation.
Cobots need much closer control for real-time implementation of complex decisions in co-working environments, and this is where much of the focus on AI to improve the user experience of these machines lies. That decision making has to happen on the robot itself, as an edge device, both to achieve the speed and latency needed to cope with growing volumes of data from internet of things (IoT) sensors and because the consequences of getting a decision wrong are severe. The more pioneering manufacturing plants are already starting to rethink processes to make more efficient use of humans and robots together.
Edge computing will have a big impact on the development of AI. Right now, AI training produces and consumes vast volumes of data, almost all of which is processed and stored in the cloud. Placing compute at the edge changes this: data can be processed, and patterns spotted, locally. I believe this can allow training models to evolve to become simpler and more effective. I’ve seen car manufacturers dramatically improve their quality-control processes through inference at the edge, because they are able to catch defects in real time, before the product ships.
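To make the edge-inference idea concrete, here is a toy Python sketch of how a unit on the line might score camera frames locally and flag defects immediately, rather than shipping raw frames to the cloud. The brightness-based anomaly check is only a stand-in for a real trained model, and every name and threshold here is illustrative.

```python
def defect_score(frame):
    """Return an anomaly score in [0, 1] for one grayscale frame.

    A real deployment would run a trained network here; this toy
    version just measures how many pixels sit far from the frame's
    mean brightness.
    """
    mean = sum(frame) / len(frame)
    outliers = sum(1 for p in frame if abs(p - mean) > 60)
    return outliers / len(frame)

def inspect(frames, threshold=0.1):
    """Yield (index, score) for frames that look defective."""
    for i, frame in enumerate(frames):
        score = defect_score(frame)
        if score > threshold:
            yield i, score

# A clean frame versus one with a bright, scratch-like artifact.
clean = [100] * 100
scratched = [100] * 85 + [250] * 15
flagged = list(inspect([clean, scratched]))
print(flagged)  # only the scratched frame (index 1) is flagged
```

Because each frame is scored on-device, a defect can stop the line within milliseconds of being seen, which is the latency advantage the paragraph above describes.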
In the first phase of deployments, ‘edge computing’ has often meant a standard server blade deployed at a facility. This will evolve toward optimized-for-purpose hardware deployed within the units themselves, such as the robots. Typically, the heart of these systems will be a very powerful processor running a number of different workloads concurrently. Think of the traditional, highly virtualized computing environment of the cloud coming to hardware that fits in the palm of your hand: one that ensures applications are isolated from each other, that one application cannot impact another, and that the functions which simply must always operate in a responsive, deterministic way, always do so.
From an AI perspective, the ‘eyes’ of these systems must be computer-driven: there is simply too much data to process in too short a period to consider any other approach. But that doesn’t mean we should rely fully on AI to protect these systems and ensure they continue to work as necessary. With human lives at stake, I cannot yet see AI being introduced into the mission-critical electronics of these systems.
Before we see a future without human collaboration, AI-enabled systems will continue to be rolled out in a more controlled way. As the accuracy of models and datasets for specific functions improves, those functions will become AI-enabled, with in-field upgrades delivered safely and securely via containers.
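As one hypothetical illustration of a container-delivered in-field upgrade, an operator could replace a single AI function without disturbing the other workloads on the device. The registry, image name, and tag below are placeholders, not a real product workflow.

```shell
# Hypothetical field upgrade: fetch a newer image for one AI
# workload, then swap it in while the rest of the system runs on.
# All names and version tags are illustrative placeholders.
docker pull registry.example.com/vision-inference:2.4.1

# Stop and remove only the old inference container...
docker stop vision-inference
docker rm vision-inference

# ...then start the upgraded one, isolated from other workloads.
docker run -d --name vision-inference --restart=always \
  registry.example.com/vision-inference:2.4.1
```

Because each function ships as its own container, a bad update can be rolled back by restarting the previous image, which is part of what makes this delivery path safe for fielded units.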
Artificial intelligence has been widely covered as of late, especially as the pandemic has prompted enterprises in some sectors to accelerate digital transformation efforts. In the post Covid-19 world, I believe the market for cobots will grow quickly, as they help preserve physical separation between human workers. Until now, robots have largely been deployed on factory floors and assigned a single, specific task. Cobots that work collaboratively and flexibly with humans are an exciting vision, one whose promise relies on the safe, secure implementation of AI in these systems.
Ian Ferguson is the VP of sales and marketing at Lynx Software Technologies. Prior to Lynx, he spent nearly eleven years at Arm, where he held roles leading teams in vertical marketing, corporate marketing and strategic alliances. Ian is a graduate of Loughborough University (UK) with a BSc in electrical and electronic engineering.