The You Only Look Once (YOLO) deep learning model iterations YOLOv7 and YOLOv8 were rigorously evaluated for their ability to detect oil palm trees. Precision, recall, F1-score, and detection time were analyzed across a range of configurations, including YOLOv7x, YOLOv7-W6, YOLOv7-D6, YOLOv8s, YOLOv8n, YOLOv8m, YOLOv8l, and YOLOv8x. YOLO Label v1.2.1 was used to annotate a training dataset of 80,486 images, and 482 drone-captured images containing 5,233 oil palm trees were used for testing. The YOLOv8 series showed notable improvements: YOLOv8m obtained the highest F1-score, 99.31%, indicating the greatest detection accuracy, while YOLOv8s achieved a marked reduction in detection time, making it better suited to large-scale environmental surveys and real-time monitoring. Precise identification of oil palm trees supports improved resource management and reduced environmental impact, and these results support combining such models with drone and satellite imaging technologies for agricultural economic sustainability and optimal crop management.
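The reported precision, recall, and F1-score follow the standard single-class detection definitions. The minimal Python sketch below illustrates how these metrics relate; the counts shown are hypothetical examples, not values from this study, and the IoU-based matching step that produces them is assumed rather than specified here.

```python
def detection_metrics(tp: int, fp: int, fn: int) -> dict:
    """Compute precision, recall, and F1-score from detection counts.

    tp: true positives  (predicted boxes matched to a ground-truth oil palm)
    fp: false positives (predicted boxes with no matching ground truth)
    fn: false negatives (ground-truth oil palms missed by the detector)
    The matching rule (e.g., an IoU threshold) is an assumption, not taken
    from the paper.
    """
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}


# Illustrative counts only (not the study's results):
print(detection_metrics(tp=5180, fp=40, fn=53))
```

An F1-score near 99% therefore requires both precision and recall to be high simultaneously, which is why it is used as the headline comparison metric across the YOLOv7 and YOLOv8 configurations.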
Keywords: Accuracy; Detection time; Oil palm tree; Precision; YOLO.