Authors: Ali M. Nafchi1, Karishma Kumari1, Ahmed Abdalla1, Karl Glover1, Sunish Kumar Sehgal1, Shaukat Ali1, and Kwanghee Won2
1. South Dakota State University, College of Agriculture, Food and Environmental Sciences/Department of Agronomy, Horticulture, & Plant Science, Brookings, SD
2. South Dakota State University, Department of Computer Science, Brookings, SD
Corresponding Author: Ali Nafchi, ali.nafchi@sdstate.edu
Presenting Author: Ali M. Nafchi
Abstract
Fusarium Head Blight (FHB) causes substantial annual yield and quality losses in wheat and barley. Evaluating and estimating infection levels on FHB-resistant lines is time-consuming, labor-intensive, and requires expertise. This paper describes a method for detecting and assessing the stage of FHB in wheat and barley that combines advances in artificial intelligence (AI) and image processing with an innovative imaging system. The 360° phenotyping robot, developed by the Precision Ag team at SDSU, captures close-up images of wheat heads to detect FHB symptoms even at early stages. In this study, 10,000 images were captured from an FHB-inoculated wheat field at SDSU's research farm in Volga, SD, forming a dataset for deep learning models. Images were annotated using the Roboflow platform's smart polygon tool, which precisely outlines irregular shapes and improves detection accuracy. The dataset was categorized into "Healthy" and "Unhealthy" classes and split into training (70%), validation (15%), and testing (15%) sets. Faster R-CNN was initially used for object detection, predicting bounding boxes that localize infected areas. Training was scheduled for 80 epochs, with early stopping triggered at epoch 38 to prevent overfitting. Key hyperparameters included a learning rate of 0.005 and a momentum of 0.9 with a stochastic gradient descent (SGD) optimizer. Model performance was evaluated with precision, recall, mean average precision (mAP), and Intersection over Union (IoU): over training, IoU improved from 0.1660 to 0.5942, precision from 0.0602 to 0.2961, and recall from 0.1328 to 0.4201. Training loss decreased from 1.7762 to 1.1850 and validation loss from 1.7436 to 1.2841, indicating effective learning and generalization. Future work could integrate additional models, such as Attention U-Net, U-Net, PSPNet, MANet, SSD, and DeepLabV3+, to improve segmentation and detection accuracy; architectures incorporating attention mechanisms, in particular, could focus on disease-affected regions and improve detection sensitivity.
Keywords: FHB, Wheat, Disease Detection, Artificial Intelligence (AI), Image Processing, 360° Phenotyping Robot, Deep Learning Models, Faster R-CNN
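Implementation note: the training configuration reported in the abstract (Faster R-CNN, SGD with a learning rate of 0.005 and momentum of 0.9, 80 scheduled epochs with early stopping) maps directly onto a standard torchvision detection pipeline. The sketch below is illustrative only, not the code used in this study; the ResNet-50 FPN backbone, the patience value, and the train_loader, val_loader, and evaluate helpers are assumptions introduced for the example.

import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # background + "Healthy" + "Unhealthy"

# Pretrained Faster R-CNN; the ResNet-50 FPN backbone is an assumption,
# since the abstract does not specify one.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# SGD optimizer with the hyperparameters reported in the abstract.
optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)

best_val_loss, bad_epochs, patience = float("inf"), 0, 5  # patience is an assumption

for epoch in range(80):  # training scheduled for 80 epochs
    model.train()
    for images, targets in train_loader:  # train_loader: hypothetical DataLoader
        loss_dict = model(images, targets)  # dict of classification/box-regression losses
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    val_loss = evaluate(model, val_loader)  # evaluate(): hypothetical validation-loss helper
    if val_loss < best_val_loss:
        best_val_loss, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break  # early stopping (triggered at epoch 38 in this study)

In training mode, torchvision's detection models return a dictionary of component losses, so the total loss is simply their sum; early stopping monitors the validation loss, which in this study plateaued at epoch 38.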