SmartWeeder Details

Detailed Project Description

Rumex obtusifolius (broad-leaved dock) is one of the most competitive and persistent weeds in agriculture. An automatic recognition and plant-treatment system is currently under development as an alternative treatment technique. An infrared laser-triangulation sensor combined with a high-resolution smart camera is used to generate 3D images of the weeds and their natural environment. In a segmentation process, contiguous surface patches are separated from one another. These 3D surface patches are compared against criteria from a plant database containing surface parameters such as shape and surface state. If an object is recognized as a dock leaf, its coordinates in the vehicle coordinate system are computed and the leaf is sprayed with herbicide.
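
To make this detection-and-treatment loop concrete, here is a minimal sketch of the data flow. Every name in it (fuse_sensors, segment_patches, PlantDatabase, Sprayer) and the toy matching criterion are hypothetical placeholders, not the project's actual software interface.

    import numpy as np

    def fuse_sensors(range_image, camera_image):
        """Combine laser-triangulation depth and the smart-camera image into a 3D scene.
        (Toy fusion: the camera image is carried along but not used here.)"""
        h, w = range_image.shape
        xs, ys = np.meshgrid(np.arange(w), np.arange(h))
        return np.dstack([xs, ys, range_image])            # (H, W, 3) pseudo point cloud

    def segment_patches(scene):
        """Split the scene into contiguous surface patches (placeholder: one patch)."""
        return [scene.reshape(-1, 3)]

    class PlantDatabase:
        def matches_dock_leaf(self, patch):
            # Real system: compare shape and surface-state parameters against stored criteria.
            return patch[:, 2].std() > 0.01                 # toy criterion on height variation

    class Sprayer:
        def treat(self, xyz):
            print("spray at coordinates", xyz)

    def process_frame(range_image, camera_image, db, sprayer):
        scene = fuse_sensors(range_image, camera_image)
        for patch in segment_patches(scene):
            if db.matches_dock_leaf(patch):
                sprayer.treat(patch.mean(axis=0))           # patch centroid stands in for vehicle-frame coordinates

    process_frame(np.random.rand(120, 160) * 0.05 + 1.0,
                  np.zeros((120, 160, 3)), PlantDatabase(), Sprayer())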

Analyzing surfaces in space can boost segmentation performance under conditions where state-of-the-art 2D recognition systems are not successful, e.g. low contrast, green-on-green images, noisy images, or images taken from unfavourable positions. Initial results have been promising. System development focuses on a more robust imaging-sensor technique and refinement of the different algorithms. Looking to the future, the system design allows the flexible integration of other plant species.

Sustainable land management is characterized by a systematic approach: plant-production measures are brought into harmony with ecological, work-economic and business-management requirements. In practice this means, for example, that plant-protection products are applied only above a certain damage threshold, or that individual plants are treated instead of broadcast spraying. At present, the mechanization and automation of tasks such as infestation recognition and plant recognition are seldom possible, given the lack of sensor technologies that work reliably under agricultural conditions (grime, dust, vapor, vibrations, natural light, plants at different growth stages, etc.) and that are also affordable.

With advances in the IT disciplines of image processing and localization, constantly improving tools are available for developing applications in the agricultural and natural-space sectors. These tools can help replace the human capacity for observation and decision-making with technology, freeing people from monotonous, physically and mentally taxing chores. The environment benefits, for example, from mechanical alternatives to chemical plant protection, whose use is currently impracticable owing to time and ergonomic constraints.

In geodesy and photogrammetry, the collected data are three-dimensional, whereas the products and interpretations, e.g. topographic maps, are two-dimensional. When it comes to extracting and recognizing plants in their natural environment, analyzing and processing 3D point clouds has several advantages over 2D image-processing approaches. Segmentation is the crucial part of the data analysis and ranges from simple binarization of images to complex analysis of multispectral images, multidimensional data, etc. The challenge of real-time data analysis lies in reliable and fast segmentation of the raw data. An initial step of the segmentation is edge extraction.
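
As an illustration, here is a minimal sketch of jump-edge extraction on a range image, followed by labeling of the remaining pixels into contiguous regions. The 5 cm depth threshold and the synthetic depth data are assumptions for the example, not project parameters.

    import numpy as np
    from scipy import ndimage

    def jump_edges(depth, step=0.05):
        """Mark pixels whose depth differs from a neighbour by more than `step` metres."""
        edges = np.zeros(depth.shape, dtype=bool)
        edges[:, 1:] |= np.abs(np.diff(depth, axis=1)) > step   # horizontal discontinuities
        edges[1:, :] |= np.abs(np.diff(depth, axis=0)) > step   # vertical discontinuities
        return edges

    # Synthetic range image: flat ground at 1.0 m with a raised leaf-like region.
    depth = np.full((120, 160), 1.0)
    depth[30:70, 40:90] = 0.85

    edges = jump_edges(depth)
    regions, n_regions = ndimage.label(~edges)   # connected non-edge pixels = surface patches
    print("surface patches found:", n_regions)

The pre-processing steps below list the analogous operation on 3D connected components; the 2D pixel version above is only meant to illustrate the principle.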

The data are processed in the following sequential steps:

1. Pre-processing:

  • Filtering.
  • Edge detection.
  • Labeling of 3D connected components.

2. Recognition:

  • Joining the surface patches.
  • Projection of the surface boundary onto the plane.
  • Analysis of the boundary with the help of elliptic Fourier descriptors.
  • Classification with the help of a support vector machine (SVM); a sketch of this stage follows the list.
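
Below is a minimal sketch of the recognition stage, assuming the Kuhl-Giardina formulation of elliptic Fourier descriptors and a scikit-learn SVM; the toy contours, descriptor order and classifier settings are illustrative, not the project's values.

    import numpy as np
    from sklearn.svm import SVC

    def elliptic_fourier_descriptors(contour, order=10):
        """Return the (order, 4) coefficients (a_n, b_n, c_n, d_n) of a closed 2D contour."""
        d_xy = np.diff(contour, axis=0, append=contour[:1])   # increments along the closed boundary
        dt = np.linalg.norm(d_xy, axis=1)
        dt[dt == 0] = 1e-12                                    # guard against repeated points
        t = np.concatenate(([0.0], np.cumsum(dt)))
        T = t[-1]
        coeffs = np.zeros((order, 4))
        for n in range(1, order + 1):
            phi = 2.0 * np.pi * n * t / T
            const = T / (2.0 * n * n * np.pi ** 2)
            d_cos = np.cos(phi[1:]) - np.cos(phi[:-1])
            d_sin = np.sin(phi[1:]) - np.sin(phi[:-1])
            coeffs[n - 1] = const * np.array([
                np.sum(d_xy[:, 0] / dt * d_cos),   # a_n
                np.sum(d_xy[:, 0] / dt * d_sin),   # b_n
                np.sum(d_xy[:, 1] / dt * d_cos),   # c_n
                np.sum(d_xy[:, 1] / dt * d_sin),   # d_n
            ])
        return coeffs

    # Toy training data: elongated ellipses stand in for projected dock-leaf boundaries,
    # near-circular contours for other vegetation.
    theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
    def ellipse(a, b):
        return np.column_stack([a * np.cos(theta), b * np.sin(theta)])

    contours = [ellipse(2.0 + 0.1 * i, 1.0) for i in range(5)] + \
               [ellipse(1.0 + 0.05 * i, 1.0 + 0.05 * i) for i in range(5)]
    labels = [1] * 5 + [0] * 5                                  # 1 = dock leaf, 0 = other
    features = np.vstack([elliptic_fourier_descriptors(c).ravel() for c in contours])

    clf = SVC(kernel="rbf").fit(features, labels)
    query = elliptic_fourier_descriptors(ellipse(2.5, 1.0)).ravel().reshape(1, -1)
    print("predicted class:", clf.predict(query)[0])

Flattening the coefficient matrix into one feature vector is one simple choice; normalising the coefficients for rotation and scale before classification is a common refinement.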

The prototype currently works under real-time conditions in a natural environment. Under simple conditions (broad-leaved dock among grasses), the detection rate is very high. Under more complicated conditions (dock among clover or other broad-leaved plants), the recognition rate still has to be improved. We are working on parts of this issue in student projects.

More Information

Would you like even more detailed information? Contact the Team Leader Vision & Navigation.