Volume 21 Issue 01 - Publication Date: 1 January 2002
 
An Autonomous Crop Treatment Robot: Part II. Real Time Implementation
 
T. Hague, Silsoe Research Institute, Silsoe, UK; B. Southall, Sarnoff Corporation, USA; and N. D. Tillett, Silsoe Research Institute, Silsoe, UK
 
Implementation of an autonomous vehicle for precision treatment of crop plants is described. The navigation system integrates the vision system described in Part I with inertial and odometric sensing. A modular approach is adopted, where the crop grid observation model is re-formulated as a non-linear compression filter, which combines a set of observations of crop plants into a single pseudo-observation of the position of the crop planting grid relative to the vehicle position. The compression filter encapsulates all internal detail of the vision system (camera calibration, crop layout, etc.). The output from this vision module can be used as an observation in the conventional way by the vehicle's navigation Kalman filter.
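The idea of the compression filter can be illustrated with a minimal sketch: several noisy per-plant observations of the same quantity (here reduced to a single scalar, the lateral offset of the crop grid relative to the vehicle) are fused by inverse-variance weighting into one pseudo-observation, which is then consumed by an ordinary Kalman measurement update. This is a deliberate simplification; the paper's actual compression filter is non-linear and also encapsulates camera calibration and crop layout, none of which is modelled here.

```python
import numpy as np

def compress_observations(offsets, variances):
    """Fuse several noisy measurements of one quantity into a single
    pseudo-observation by inverse-variance weighting (a hypothetical,
    linear stand-in for the paper's non-linear compression filter)."""
    w = 1.0 / np.asarray(variances, dtype=float)
    var = 1.0 / w.sum()                          # pseudo-observation variance
    z = var * (w * np.asarray(offsets, dtype=float)).sum()
    return z, var

def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update with unit observation matrix."""
    K = P / (P + R)
    return x + K * (z - x), (1.0 - K) * P

# Five plant observations of the grid offset (metres), equal variance
z, R = compress_observations([0.03, 0.025, 0.035, 0.028, 0.032], [1e-3] * 5)
x, P = kalman_update(0.0, 1e-2, z, R)   # fold the pseudo-observation in
```

Because the pseudo-observation carries its own covariance, the navigation filter treats the vision module like any other sensor, which is the modularity the paper aims for.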
Plant features classified into crop and weed by the vision module are registered into a treatment map. The need to treat targets from a moving platform requires time delays in the processing of observations to be considered. A method of compensation is introduced which allows time delayed vision observations to be used; the effectiveness of the technique is illustrated by an example with an artificially extended time delay.
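One common way to use a time-delayed observation, sketched below under the assumption of a scalar state and known per-step control inputs, is to keep a short history of filter states, roll back to the state at image-capture time, fuse the delayed measurement there, and replay the intervening predictions. This is a generic delay-compensation scheme for illustration, not a reproduction of the paper's specific method.

```python
from collections import deque

class DelayedFilter:
    """Scalar Kalman filter with a state history, so a vision observation
    arriving delay_steps after capture can be applied at capture time and
    the intervening predictions replayed. Simplified, hypothetical sketch."""

    def __init__(self, x0, P0, q, history_len=50):
        self.x, self.P = x0, P0
        self.q = q                                  # process noise per step
        self.history = deque(maxlen=history_len)    # (x, P) before each predict

    def predict(self, u):
        # Save the pre-prediction state, then propagate with control input u.
        self.history.append((self.x, self.P))
        self.x += u
        self.P += self.q

    def update_delayed(self, z, R, delay_steps, controls):
        # Roll back to the capture-time state, fuse the observation, replay.
        assert delay_steps <= len(self.history)
        self.x, self.P = self.history[-delay_steps]
        K = self.P / (self.P + R)
        self.x += K * (z - self.x)
        self.P *= (1.0 - K)
        for u in controls[-delay_steps:]:           # replay missed predictions
            self.x += u
            self.P += self.q

f = DelayedFilter(x0=0.0, P0=1.0, q=0.01)
controls = [0.1] * 5
for u in controls:
    f.predict(u)
# An observation of the position three steps ago arrives now:
f.update_delayed(z=0.2, R=0.05, delay_steps=3, controls=controls)
```

The replay step is what lets a slow vision pipeline still tighten the estimate of the *current* pose, which matters when spray targets must be hit from a moving platform.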
Field trials of the vehicle have been performed, and the accuracy of both vehicle navigation and crop treatment is reported. We conclude that the navigation accuracy falls within the 20 mm root mean square error region thought appropriate for precise horticultural operations, and that spray application is sufficiently accurate for the treatment of individual crop plants.
 