
Data harvest feeds Agriculture 4.0

Farmers are harvesting sensor data to shift from preventive to predictive agriculture.

Since its inception, the Industrial Revolution has centered on automating production processes. Now that we have entered the era of Industry 4.0, most industrial processes have become data-centric, generally involving five steps of data manipulation: collection, transmission, storage, analysis, and, finally, display. This last step is to keep humans in the loop, but data can also be fed back to some actuating device, bringing the process into the realm of robotics.
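
To make those five steps concrete, here is a minimal sketch that strings them together for a single simulated sensor reading. The soil-moisture sensor, JSON uplink, and in-memory store are illustrative stand-ins, not any particular vendor's stack.

```python
import json
import statistics
import time

def collect() -> dict:
    # 1. Collection: read a (simulated) soil-moisture sensor
    return {"ts": time.time(), "moisture_pct": 31.4}

def transmit(sample: dict) -> bytes:
    # 2. Transmission: serialize for the uplink (stand-in for MQTT, LoRa, etc.)
    return json.dumps(sample).encode()

DATABASE: list[dict] = []

def store(payload: bytes) -> None:
    # 3. Storage: append to an in-memory log (stand-in for a real database)
    DATABASE.append(json.loads(payload))

def analyze() -> float:
    # 4. Analysis: summarize the stored readings
    return statistics.mean(s["moisture_pct"] for s in DATABASE)

def display(mean_moisture: float) -> None:
    # 5. Display: keep the human in the loop
    print(f"Mean soil moisture: {mean_moisture:.1f}%")

store(transmit(collect()))
display(analyze())
```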

Agriculture has not been immune to industrialization over the past two centuries, and in recent years, Agriculture 4.0 has gained momentum. Just as industrial production made the transition toward data management, agriculture is now following that path. Companies that traditionally served industrial segments now offer similar data-centric approaches to the agriculture sector, and we are even seeing agricultural-equipment manufacturers expand into industrial-equipment manufacture. Although agriculture operates in a far less structured environment than traditional manufacturing, the versatility of new data-centric technologies is helping it become an industry piloted in much the same manner as automotive or aerospace. The farmer has become an engineer like any other.

It all started in the 1990s with the first automation equipment for the high-value dairy industry – primarily milking machines from the likes of Swedish manufacturer DeLaval and Netherlands-based Lely. At the same time, optical sorters for grains, particularly rice, were developed by companies such as Satake, headquartered in Japan, and Bühler, based in Switzerland. Some of these sorting techniques ended up in the field again for high-end agricultural products, such as vineyard grapes. Pellenc, in southern France, developed such robotic gear, which transformed farmers into data scientists.

Indeed, once automation was in place for this new generation of farmers, they had the opportunity to go the extra step, not just looking passively at their yield but acting proactively to improve the quality and quantity of their agricultural produce. Whereas the small-scale farming operations of the past could rely on the farmer’s eyes and intuition to monitor everyday activities, today’s gigantic farming operations can no longer rely on human senses. Data technology has become central to steering the farm in the right direction. Whether it is for herding, crop production, or high-end production such as wine, data is the focus of Agriculture 4.0.

Camera utilization in agriculture

One of the best examples of agricultural data management is the monitoring of fields using drones. Paris-based Parrot is a key player in that domain, largely thanks to its U.S. subsidiary, MicaSense. However, the French company announced in January that it had agreed to sell MicaSense to AgEagle Aerial, a U.S.-based drone company and provider of data collection, analytics, and aerial-imaging services, for US$23 million. MicaSense developed a multispectral camera that captures reflectance at several wavelengths to compute normalized difference vegetation index (NDVI) maps, which have become the accepted way to monitor crop growth and spot problem areas. The state-of-the-art methodology is now to download the NDVI maps to tractors, which use them to adjust the fertilizer delivered to each part of the field.
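
NDVI itself is a simple per-pixel ratio: NDVI = (NIR − red) / (NIR + red), where NIR and red are reflectance in the near-infrared and red bands. Below is a rough sketch of how such a map could be computed from two co-registered band images; this is generic NumPy code, not MicaSense's actual processing pipeline.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel normalized difference vegetation index.

    nir, red: co-registered reflectance bands as arrays scaled to [0, 1].
    Returns values in [-1, 1]; dense, healthy vegetation trends toward +1.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)                       # 0 where both bands are 0
    np.divide(nir - red, denom, out=out, where=denom > 0)
    return out

# Illustrative 2x2 "field": healthy canopy (top row) vs. bare soil (bottom row)
nir = np.array([[0.50, 0.55], [0.30, 0.28]])
red = np.array([[0.08, 0.06], [0.25, 0.24]])
print(ndvi(nir, red))  # high values on top, near zero below
```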

The U.S. Federal Aviation Administration (FAA) recently reported that 7% of the 1.6 million registered drones in the U.S. are used for agricultural purposes: more than 100,000 active agricultural drones in the United States. While accounting for only a small portion of the overall commercial drone market, the agricultural drone segment has become a significant source of revenue. The collection of data is increasingly the role of robots. Whether for an automated barn, an agricultural drone, or an autonomous tractor, data is no longer the new oil; it is the new crop.

IMU utilization in agriculture

The robots used in smart agriculture fall into two main categories: aerial (drones) and land-based (such as tractors and harvesters). In both cases, the robots’ functionalities rely on various types of sensors. One such functionality is the inertial system for navigation and stabilization, which must meet requirements for high performance, reliability, and accuracy; low bias drift; low bias instability; and stable performance over temperature – all at an affordable price – to justify the investment.
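
A back-of-the-envelope calculation shows why low bias drift tops that list: any uncompensated gyro bias integrates directly into heading error over time. The bias value below is purely illustrative and not drawn from any specific IMU datasheet.

```python
# Hypothetical uncompensated gyro bias, in degrees per second.
GYRO_BIAS_DEG_PER_S = 0.01

for minutes in (1, 10, 60):
    drift_deg = GYRO_BIAS_DEG_PER_S * minutes * 60  # bias integrates linearly
    print(f"{minutes:3d} min without correction -> {drift_deg:5.1f} deg of heading error")
```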

[Figure (Source: Yole Développement)]

[Figure (Source: Yole Développement)]

Drones make it possible to monitor the health and status of crop fields (via cameras) and are typically used for fertilization of small to medium fields (<20 hectares) as an alternative to costlier airplane-based fertilization. Accurate navigation and stabilization are critical when pointing the camera at the ground, because the operator must know exactly what the camera is capturing: at a height of 10 meters, a pointing error of just 5° shifts the camera's footprint by nearly 90 cm.
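
The geometry behind that figure is straightforward: the ground offset is the altitude times the tangent of the tilt error. A quick check, as illustrative code rather than anything from the article:

```python
import math

def footprint_offset(altitude_m: float, tilt_error_deg: float) -> float:
    """Lateral shift of the camera's ground footprint for a given tilt error."""
    return altitude_m * math.tan(math.radians(tilt_error_deg))

print(f"{footprint_offset(10.0, 5.0):.2f} m")  # ~0.87 m, the figure cited above
```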

While GPS could be accurate enough for drone navigation, robust inertial measurement unit (IMU) solutions are needed for camera stabilization.

Land-based robotic vehicles for agriculture navigate crop rows and need centimeter-level precision to avoid damaging the plants. Most of these machines carry an accurate GPS system, which lets the driver know the vehicle's location and prevents double fertilization or missed strips. However, GPS can be limiting when the robot drives under trees, for example, where the signal may be lost. That's where IMU or attitude and heading reference system (AHRS) solutions come in. IMUs based on microelectromechanical systems (MEMS) are well-suited to meet land-based applications' requirements for high performance and low size, weight, power, and cost (SWaP-C).
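
As a rough illustration of how an IMU or AHRS can bridge such GPS outages, the sketch below dead-reckons the vehicle's pose from yaw rate and wheel speed on every control cycle and blends a GPS fix back in whenever one is available. The interfaces, units, and blending gain are assumptions for the example, not any vendor's API.

```python
import math
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Pose:
    x: float = 0.0        # meters east
    y: float = 0.0        # meters north
    heading: float = 0.0  # radians, counterclockwise from east

def step(pose: Pose, gyro_z: float, speed: float, dt: float,
         gps: Optional[Tuple[float, float]] = None) -> Pose:
    """Advance the pose by one control cycle.

    gyro_z: yaw rate from the IMU (rad/s); speed: wheel odometry (m/s);
    gps: (x, y) fix in meters when available, or None under canopy.
    """
    heading = pose.heading + gyro_z * dt           # integrate yaw rate
    x = pose.x + speed * math.cos(heading) * dt    # dead-reckon position
    y = pose.y + speed * math.sin(heading) * dt
    if gps is not None:                            # fix available: blend it in
        alpha = 0.2                                # illustrative blending gain
        x += alpha * (gps[0] - x)
        y += alpha * (gps[1] - y)
    return Pose(x, y, heading)
```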


Dimitrios Damianos is a technology and market analyst at Yole Développement’s Photonics & Sensing Division.
Pierre Cambou is principal analyst in the Photonics & Sensing Division at Yole Développement.
