Weed and chilli crop discrimination for evaluation of an inter- and intra-row weeder
Abstract
Automated crop and weed detection is a crucial advancement in precision agriculture, enabling efficient weed management while reducing herbicide use and labour costs. This study focuses on developing a machine learning model using Google Teachable Machine to distinguish between chilli crops and weeds. The study evaluates the performance of three camera systems (iPhone 15, Samsung M32, and Moto G64) for plant and weed detection using key metrics such as accuracy, F1 score, and recall. Results indicate that the iPhone 15 camera consistently outperforms the others, achieving an average accuracy of approximately 95% for plant detection and 90% for weed detection. Its F1 scores of around 0.94 for plants and 0.89 for weeds, coupled with high recall rates of about 96% and 92%, demonstrate a strong balance between precision and sensitivity, ensuring reliable identification of true positives. In comparison, the Samsung M32 camera shows moderate performance, with accuracy near 88% (plants) and 82% (weeds), F1 scores around 0.86 and 0.81, and recall rates of approximately 89% and 83%. The Moto G64 camera exhibits the lowest performance, with accuracy of about 80% (chilli plants) and 75% (weeds), F1 scores around 0.78 and 0.73, and recall rates near 81% and 77%. These findings highlight the importance of high-quality imaging and robust detection algorithms in agricultural monitoring systems. The superior performance of the iPhone 15 underscores its suitability for precise plant and weed detection, which is essential for optimizing crop management and sustainable farming practices. Improving the capabilities of the Samsung M32 and Moto G64 cameras could further enhance their effectiveness, but the current results indicate that the iPhone 15 camera is the most reliable option for discriminating weeds from chilli crop.
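As a rough illustration of the evaluation step (a minimal sketch, not the authors' actual pipeline), the per-class metrics reported above can be computed from a trained classifier's predictions with scikit-learn; the label arrays below are hypothetical placeholders standing in for the outputs of a Teachable Machine model run on each camera's test images.

```python
# Hypothetical sketch: computing accuracy, recall, and F1 for a two-class
# (chilli vs. weed) image classifier such as one exported from Google
# Teachable Machine. Labels and predictions here are illustrative only.
import numpy as np
from sklearn.metrics import accuracy_score, recall_score, f1_score

# Ground-truth labels and model predictions for a held-out test set
# (0 = chilli plant, 1 = weed); in practice these come from running the
# exported model on test images captured with each phone camera.
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])
y_pred = np.array([0, 0, 1, 0, 0, 1, 0, 1])

print("Accuracy:", accuracy_score(y_true, y_pred))
# Per-class scores, reported separately for plants and weeds
print("Recall (plant, weed):", recall_score(y_true, y_pred, average=None))
print("F1     (plant, weed):", f1_score(y_true, y_pred, average=None))
```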