Automated weed and feed

Conventional crop-spraying with herbicide to kill weeds among a crop wastes much of the herbicide and raises environmental concerns. A smart crop sprayer might instead identify weeds growing through the crop and spot-spray only the unwanted plants. Work from a team in China, published in the International Journal of Computational Science and Engineering, looks at real-time segmentation of cornfield images to detect weeds, an approach that could be used to control such a smart crop sprayer.

Uncontrolled weed growth in a crop reduces the yield of that crop. However, the herbicides used to selectively kill weeds are expensive and cause pollution. It is in the best interests of farmers the world over, and of the environment, that herbicides are used as efficiently and effectively as possible.

Hao Guo, Shengsheng Wang, and Yinan Lu of Jilin University in Changchun have proposed SResNet, a lightweight network based on an encoder-decoder architecture and built from depthwise separable convolution residual blocks. They optimized the model so that it can quickly distinguish weed plants from crop plants in an image.
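To give a flavour of the building block named in the paper's title, the sketch below shows what a residual block made of depthwise separable convolutions might look like in PyTorch. This is an illustrative assumption, not the authors' exact design: the layer sizes, names, and ordering here are hypothetical.

```python
import torch
import torch.nn as nn

class DepthwiseSeparableResBlock(nn.Module):
    """Hypothetical residual block built from a depthwise separable
    convolution, in the spirit of the SResNet encoder; the exact
    layer configuration is an assumption, not taken from the paper."""
    def __init__(self, channels: int):
        super().__init__()
        # Depthwise step: one 3x3 filter per channel (groups=channels)
        self.depthwise = nn.Conv2d(channels, channels, kernel_size=3,
                                   padding=1, groups=channels, bias=False)
        # Pointwise step: 1x1 convolution mixes information across channels
        self.pointwise = nn.Conv2d(channels, channels, kernel_size=1,
                                   bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn(self.pointwise(self.depthwise(x))))
        # Residual (skip) connection around the separable convolution
        return out + x

# Example: a batch of 64-channel feature maps passes through unchanged in shape
block = DepthwiseSeparableResBlock(64)
features = torch.randn(1, 64, 128, 128)
print(block(features).shape)  # torch.Size([1, 64, 128, 128])
```

The appeal of this factorisation is that a depthwise plus pointwise pair needs far fewer parameters and multiply-accumulates than a standard 3x3 convolution over the same channels, which is what makes such networks "lightweight".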

“In weed identification, the recognition effect is susceptible to factors like light, occlusion, and image quality, so improving the robustness of weed recognition is still a challenging subject in traditional machine vision,” the team explains. Their approach offers a lightweight semantic segmentation model based on an encoder-decoder architecture that balances accuracy and processing speed. To demonstrate the benefits of their system, they compared its results with those of classical semantic segmentation models (SegNet and U-Net) and showed that it performs competitively. The test frame rate is almost 70 frames per second, making the system capable of real-time weed identification in a cornfield, and the model's average accuracy is almost 99 percent.
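For context on how a frame-rate figure like this is typically obtained, the following is a minimal timing-loop sketch, not the authors' benchmark; the input resolution and run counts are assumptions made for illustration.

```python
import time
import torch
import torch.nn as nn

def measure_fps(model: nn.Module, input_size=(1, 3, 360, 480), n_runs=100):
    """Rough frames-per-second estimate for a segmentation model.
    The 360x480 input resolution is an assumption, not the paper's setting."""
    model.eval()
    x = torch.randn(*input_size)
    with torch.no_grad():
        for _ in range(10):                 # warm-up passes
            model(x)
        start = time.perf_counter()
        for _ in range(n_runs):
            model(x)
        elapsed = time.perf_counter() - start
    return n_runs / elapsed

# Stand-in model: a single 1x1 conv mapping RGB to two classes (crop vs. weed)
print(f"{measure_fps(nn.Conv2d(3, 2, kernel_size=1)):.1f} fps")
```

Anything comfortably above typical camera rates of 25 to 30 frames per second, as the reported 70 frames per second is, leaves headroom for a sprayer controller to act on each frame in real time.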

Guo, H., Wang, S. and Lu, Y. (2020) ‘Real-time segmentation of weeds in cornfields based on depthwise separable convolution residual network’, Int. J. Computational Science and Engineering, Vol. 23, No. 4, pp.307–318.