For an intelligent agricultural robot to operate reliably on a large-scale farm, it is crucial to estimate its pose accurately. In large outdoor environments, 3D LiDAR is a preferred sensor. Urban and agricultural scenarios are characteristically different: the latter contains many poorly defined objects, such as grass and trees with leaves, that generate noisy sensor signals. While state-of-the-art LiDAR-based state estimation methods, such as LiDAR odometry and mapping (LOAM), work well in urban scenarios, they fail in the agricultural domain.
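As background on why feature-based LiDAR odometry degrades here: LOAM selects edge and planar features using a per-point smoothness score computed along each scan line, and noisy returns from grass and foliage can make that score unreliable. Below is a minimal sketch of the score, assuming an (N, 3) array of points ordered by azimuth; the function name and neighborhood size `k` are illustrative, not code from any of the systems discussed.

```python
import numpy as np

def smoothness(scan_line: np.ndarray, k: int = 5) -> np.ndarray:
    """LOAM-style smoothness c for each point in one LiDAR scan line.

    scan_line: (N, 3) points ordered by azimuth; k neighbors per side.
    High c marks edge candidates, low c planar candidates.
    """
    n = len(scan_line)
    c = np.full(n, np.nan)  # end points lack a full neighborhood
    for i in range(k, n - k):
        # sum of (X_i - X_j) over the 2k surrounding points
        diff = 2 * k * scan_line[i] - (
            scan_line[i - k:i].sum(axis=0)
            + scan_line[i + 1:i + k + 1].sum(axis=0)
        )
        c[i] = np.linalg.norm(diff) / (2 * k * np.linalg.norm(scan_line[i]))
    return c

c = smoothness(np.random.rand(64, 3))  # toy scan line
```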
3D Move To See (3DMTS) is a multi-perspective visual servoing method for unstructured and occluded environments, such as those encountered in robotic crop harvesting. This paper presents a deep learning method, Deep-3DMTS, that creates a single-perspective approach to 3DMTS through the use of a convolutional neural network (CNN). The novel method is developed and validated in simulation against the standard 3DMTS approach.
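To make the single-perspective idea concrete, here is a hypothetical sketch (PyTorch) of a CNN that regresses, from one RGB image, the 3D end-effector direction that multi-camera 3DMTS obtains from its gradient estimate; training pairs could come from simulation. The architecture, layer sizes, and names are assumptions for illustration, not the paper's network.

```python
import torch
import torch.nn as nn

class DirectionCNN(nn.Module):
    """Toy stand-in for a Deep-3DMTS-style regressor (sizes are arbitrary)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 3)  # raw (dx, dy, dz) step

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        v = self.head(self.features(img).flatten(1))
        return v / (v.norm(dim=1, keepdim=True) + 1e-8)  # unit direction

model = DirectionCNN()
step = model(torch.rand(1, 3, 128, 128))  # one servoing step from one image
```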
This paper proposes a bionic electric spraying rod for crop watering and spraying on a farm. The design mimics the multi-vertebra structure of a snake: a snake-bone arm is actuated by "muscles" realized as multiple sets of thin wires, which a driver module pulls and releases to put the arm into different attitudes. A water pipe installed inside the snake arm connects to a spray nozzle for spraying. A mobile application (app) interface is designed to let the user control the arm remotely.
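For intuition about how pulling and releasing the wires produces different attitudes, the sketch below applies the common constant-curvature model for tendon-driven continuum arms: for a bend of angle theta in plane phi, each wire offset at radius r changes length in proportion to its angular position around the backbone. The wire count, radius, and names are illustrative assumptions, not dimensions from the paper.

```python
import math

def tendon_deltas(theta: float, phi: float, r: float = 0.01, n_wires: int = 4):
    """Per-wire length changes for a constant-curvature bend.

    theta: bend angle (rad); phi: bend-plane direction (rad);
    r: wire offset from the backbone (m). Negative delta = pull in.
    """
    deltas = []
    for i in range(n_wires):
        ang = 2 * math.pi * i / n_wires  # wire position around the backbone
        deltas.append(-r * theta * math.cos(ang - phi))
    return deltas

# Bend 45 degrees toward wire 0: it is pulled, the opposite wire released.
print(tendon_deltas(math.radians(45), 0.0))
```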
Agricultural machinery manufacturers have historically relied on intermediate players for the sale, maintenance, customer service, and/or training of equipment; it is these intermediaries who interact with farmers and end-users. Intermediate players have therefore borne the burden of mastering constantly evolving technology, and the associated training needs, at the interface between sophisticated equipment and end-users with their sociological characteristics (age, education, background, etc.).