Potato plant phenotyping and characterisation utilising machine learning techniques: A state-of-the-art review and current trends

Ciarán Miceal Johnson, Juan Sebastian Estrada, Fernando Auat Cheein

Research output: Contribution to journal › Article › peer-review

Abstract

Globally, potatoes are the fourth most produced food crop, and in the United Kingdom alone they generated approximately £705 million in 2022. However, to achieve the United Nations (UN) Sustainable Development Goals (SDGs), potato farmers need to increase yields sustainably, meeting growing demand for food while easing pressure on land. Crop yield can be affected by various factors, including disease, pests, and nutrient deficiencies. To tackle these challenges and optimise yields, researchers have leveraged remote sensing platforms for high-throughput, non-destructive phenotyping. Data collected from these platforms can be used to develop machine learning (ML) models that address the aforementioned issues. To summarise recent developments in ML models applied to potato plant phenotyping, a systematic review of journal articles from the last seven years was conducted. The review underscores the advantages of Deep Learning (DL) approaches and the rising trend of Convolutional Neural Network (CNN)-based architectures, while also noting the limited availability of data for training such models. It is intended to benefit researchers and farmers alike by providing an up-to-date overview of ML models in potato plant phenotyping.
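For readers unfamiliar with the CNN-based architectures the abstract refers to, the following is a minimal sketch of such a model in PyTorch, assuming an image-classification task on crops gathered by a remote sensing platform (e.g., separating healthy from blighted foliage). The class labels, layer sizes, and 128×128 input resolution are illustrative assumptions, not architectures drawn from the review itself.

    import torch
    import torch.nn as nn

    class PotatoLeafCNN(nn.Module):
        """Minimal CNN for classifying potato foliage images.
        Layer sizes are hypothetical; surveyed models are typically far deeper."""
        def __init__(self, num_classes: int = 3):  # e.g. healthy / early blight / late blight (illustrative)
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1),   # RGB input
                nn.ReLU(),
                nn.MaxPool2d(2),                              # 128 -> 64
                nn.Conv2d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),                              # 64 -> 32
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),                                 # 32 channels * 32 * 32 = 32768 features
                nn.Linear(32 * 32 * 32, num_classes),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x))

    # Usage on a dummy batch of four 128x128 RGB image crops.
    model = PotatoLeafCNN()
    logits = model(torch.randn(4, 3, 128, 128))
    print(logits.shape)  # torch.Size([4, 3])

In practice such a backbone would be trained on labelled field imagery; the review notes that the limited availability of exactly this kind of training data remains a key constraint.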
Original language: English
Article number: 110304
Journal: Computers and Electronics in Agriculture
Volume: 234
Early online date: 5 Apr 2025
Publication status: E-pub ahead of print - 5 Apr 2025

Keywords

  • Phenotype
  • Potato
  • Machine learning
  • Deep learning
  • Remote sensing
  • Artificial intelligence
  • Agriculture
