Embedded technologies are evolving: they perform faster, take up less space, and cost less. A supercomputer with a graphics processing unit (GPU) can now be held in one hand and purchased for less than US$100, making it suitable for artificial intelligence (AI) development and machine learning. The AI modules that autonomous vehicles need for control now fit inside the car itself. Let's look at the top five trends.
Explosive growth is happening for wireless connections
According to Grand View Research, the internet of things (IoT) is projected to be a US$1 trillion market by 2025. This represents a massive number of connections, which will improve many areas of our lives, including manufacturing, retail, energy, smart cities, health care, and transportation. The manufacturing sector's Industry 4.0 initiative could be the first to reap the benefits. With the preventive maintenance, real-time data monitoring, and reporting that IoT connections make possible, smart factories will produce more with less.
The 5G era will push transmission speeds to 20 Gbps; 6G promises to be even faster. And with the availability of low-power wide-area networks (LPWANs), along with chips and devices from suppliers such as NimbeLink, Sequans, MultiTech, Elektronik, Digi, and Telit, the explosive growth of wireless connections will continue to accelerate. The IoT will be everywhere.
Speed is everything
Multicore processors perform better because they can dedicate individual cores to performing unique computational tasks such as graphics processing. Just about everyone is familiar with 8- or 16-core processors. Lately, though, Intel and AMD have been pushing 128-core processors. Arm, an IP company, has put the development of a 5-nm-node–based 192-core processor on its roadmap, determined to outperform everyone else. No wonder Nvidia, a GPU leader, is currently pursuing the purchase of Arm for a price tag of US$40 billion.
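The idea of dedicating cores to independent tasks can be sketched with Python's standard process pool (a minimal illustration of splitting CPU-bound work across cores, not tied to any particular processor; the checksum function is an arbitrary stand-in for real work):

```python
from concurrent.futures import ProcessPoolExecutor

def checksum(chunk):
    """CPU-bound work that can run independently on its own core."""
    total = 0
    for value in chunk:
        total = (total + value * value) % 65521
    return total

def parallel_checksum(data, workers=4):
    """Split the data into chunks and let the pool schedule them across cores."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Summing the per-chunk results modulo 65521 matches the
        # sequential checksum, so the split does not change the answer.
        return sum(pool.map(checksum, chunks)) % 65521
```

Because each chunk is independent, adding cores scales the throughput of this kind of workload almost linearly, which is exactly the appeal of 128- and 192-core parts.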
Everyone is chasing super speed in anticipation of 5G internet (currently demonstrated to be capable of 10-Gbps downlink speeds, versus 0.5 Gbps for 4G LTE). Processors, the internet, and systems will become not just faster, but superfast.
Separately, quantum computing’s mission is to provide very fast computing capability for scientific research and machine learning. Quantum computing can enable problem solving in seconds rather than the days that conventional computers would require for similar problems. Instead of using bits to represent a state of 0 or 1, quantum computing’s approach is to use a quantum bit, or qubit, to represent 0 and 1 at the same time.
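The "0 and 1 at the same time" idea can be made concrete with a tiny state-vector sketch (a toy model in plain Python, not a real quantum simulator; a qubit is represented by two amplitudes whose squares give the measurement probabilities):

```python
import math
import random

def plus_state():
    """Equal superposition: amplitudes for |0> and |1> are both 1/sqrt(2)."""
    a = 1 / math.sqrt(2)
    return (a, a)

def probabilities(state):
    """Born rule: the probability of each outcome is the squared amplitude."""
    a0, a1 = state
    return (a0 * a0, a1 * a1)

def measure(state, rng=random.random):
    """Measurement collapses the superposition to a single classical bit."""
    p0, _ = probabilities(state)
    return 0 if rng() < p0 else 1
```

Measuring the equal superposition yields 0 or 1 with 50% probability each; before measurement, however, both outcomes coexist in the state, which is what lets a register of n qubits represent 2^n values at once.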
While quantum computing is fast, measuring the qubit is a challenge. However, a team at Finland's Aalto University recently managed to measure a qubit using only a very small amount of energy, potentially pushing the technology forward.
Controlling the route a qubit will be taking is another challenge. But researchers at the Massachusetts Institute of Technology (MIT) and City University of New York in the U.S. reported the generation and direct integration of quantum emitters in an aluminum nitride-based photonic IC platform.
No doubt, quantum computing will play a role in increasing computing speed, but widespread quantum computing is still a few years away.
Cyberattacks are becoming more frequent and more dangerous
Cyberattacks have been on the rise, and while such attacks have always been disruptive, they are becoming increasingly dangerous. As the pandemic spread in the spring, for example, the World Health Organization warned that cyberattacks on the organization had increased fivefold compared with the number a year earlier. The infamous attack on Ukraine’s power grid left 200,000 customers without power, a major disruption to daily life — and it was not an isolated incident.
There are many different types of malware. In ransomware attacks, hackers commandeer a system and demand a ransom from their victims to return control to the rightful owners. Ransomware hackers are becoming more sophisticated and getting greedier. They have attacked businesses, government facilities, and even hospitals.
In September, the death of a patient in Germany was directly linked to a cyberattack. She needed urgent medical care, but a ransomware attack on the Düsseldorf hospital to which she had been rushed for care prevented the facility from providing the needed services to save her life. The patient was rerouted to a hospital 20 miles away, but it was too late. This was the first report of a tragedy of this kind, but it may not be the last.
With the establishment of more connections and higher 5G speeds, such attacks will likely become more frequent. To counter them, organizations are stepping up their cybersecurity efforts. Projections are that cybersecurity revenues will reach US$254 billion by 2025. Embedded security software and hardware are growing rapidly, and new hardware designs increasingly have built-in security silicon from Infineon, Microchip, STMicroelectronics, Micron, Winbond, and others.
The race between cyberattacks and counterattacks will heat up.
Virtual reality is becoming a reality
Since the early development of 3D virtual reality for gaming, VR and its offshoots have evolved into a serious, and seriously profitable, business. The initial VR concept has expanded to include augmented reality (AR), mixed or merged reality (MR), and extended reality (XR), taking the technology into a range of industrial and commercial applications.
AR is ideal for training. Users wear see-through goggles that superimpose information on the scene in front of them. The information could be, for example, instructions for carrying out an equipment repair procedure. MR carries this a step further by allowing users to view both a real situation and other possible scenarios, mixing real and computer-generated images. Finally, XR is the ultimate application, projecting holographic-like images that look like the real, physical thing. For example, XR would lend a new dimension to remote conferencing: unlike traditional video conferencing, XR would place an image of the person in front of the user as if the meeting were being held face to face.
VR applications that can increase productivity span a broad range of fields, including medicine, manufacturing, training, and entertainment. For example, a doctor-in-training requires extensive hands-on experience, which can be costly to arrange, if available at all. With VR, a simulation lets the medical intern practice without the risk of making a serious error on a real patient.
Microsoft HoloLens mixed-reality smart glasses (Image: Microsoft)
Tier 1 organizations investing in VR include Microsoft, Google, and Intel, among others. Market watchers expect new VR innovations and opportunities to emerge in the coming months. As the pandemic grinds on, for example, holographic-like virtual meetings might be the next best thing to meeting with colleagues face to (real) face.
Artificial intelligence will be in every embedded design
Artificial intelligence and its practical application, machine learning, are gaining momentum, with more embedded designs expected to include these capabilities over time. AI today is used in almost every segment you can think of, including retail, health care, autonomous driving, e-commerce, manufacturing, supply chain, industrial control, entertainment, and banking.
For example, an AI-based system can be equipped with a video camera to ensure that workers are wearing the proper protective gear during an operation. Today, the smart factory equipped with IoT and AI can increase productivity by monitoring the operation in real time and having AI make decisions that avoid operational errors. In the long run, AI could do much more. In one vision of the future factory, the facility would use AI and robotics to retool itself on the fly for production of a different product. An assembly line set up to build medical devices one day, for example, might build wearable smartwatches the next day.
AI can also make banking and other transactions more secure. Flowchain, an open-source–based blockchain organization based in Taiwan, has proposed a way to combine IoT, AI, and blockchain technologies to create a more secure data-mining approach. Blockchain is a relatively new electronic-ledger technology for ensuring security: when a transaction is pending, all stakeholders are notified and must agree before it is allowed to take place.
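The ledger idea behind that description can be sketched as a minimal hash chain (illustrative only; a real blockchain adds consensus among stakeholders, digital signatures, and peer-to-peer replication on top of this linking scheme):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, including the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transaction):
    """Append a transaction, linking it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "tx": transaction}
    block["hash"] = block_hash({"prev": prev, "tx": transaction})
    chain.append(block)
    return chain

def verify(chain):
    """Tampering with any earlier block breaks every later link."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block_hash({"prev": block["prev"], "tx": block["tx"]}) != block["hash"]:
            return False
        prev = block["hash"]
    return True
```

Because each block's hash covers the previous block's hash, altering one recorded transaction invalidates the entire chain after it, which is what makes the ledger tamper-evident.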
Partnerships in AI include a collaboration between Nvidia and VMware, a software company that helps industrial customers connect to multiple clouds. The partners intend to deliver an end-to-end enterprise platform for AI. Intel has aligned itself with the Heidelberg University Computing Center in a bid to catch up to de facto leader Nvidia.
At the same time, more AI hardware is becoming available at more affordable prices. Nvidia recently announced a repackaged GPU module with a US$59 price tag, intended for the general public. The compact module offers many I/O ports, including USB, HDMI, and gigabit Ethernet, and supports open-source projects based on the module.
Nvidia’s Jetson Nano 2GB (Image: Nvidia)
AI startup Hailo recently launched an AI acceleration module based on its own Hailo-8 chip to compete with Google and Intel.
The uses to which AI can be put are limited only by one’s imagination. And with AI hardware decreasing in price, embedded designs are expected to add AI capability.
The pandemic has forced us to look at everything differently. Some industries have been turned upside down and might be forever changed. Businesses in the travel and entertainment industries have struggled to reinvent themselves, with some forced to shut down permanently. But there has been a telemedicine boom, enabled in part by new medical devices that connect doctors and patients online. More vital signs now can be measured at home and the data made available to doctors remotely.
Embedded technologies will continue to grow, even though the pandemic may have altered the paths to that expansion. Looking ahead, expect to see more explosive growth in wireless connections, faster embedded processing and computing, more cyberattacks and counterattack solutions, more sophisticated virtual reality, and artificial intelligence in every embedded design.
>> This article was originally published on our sister site, EE Times Europe.
John Koon is a contributing writer for the AspenCore Media network.