Getting to the Root of Plant Morphometrics, one DEEP Picture at a Time
Hüther et al. develop a new pipeline to extract phenotypic traits from plant pictures with minimal supervision from the user. The Plant Cell (2020) https://doi.org/10.1105/tpc.20.00318
By Patrick Hüther, Niklas Schandry, Katharina Jandrasits, Ilja Bezrukov and Claude Becker
Background: To understand the function of a gene, we need to be able to link genetic information to phenotype. For many research questions in plant biology, it is therefore important to accurately measure the size, shape, and color of the plant. Plant phenotyping often means collecting vast numbers of images, which then need to be processed and annotated to extract plant morphometric information. In recent years, the speed and scale at which we can decode genomic information have increased exponentially; by contrast, the bottleneck of large-scale plant research now often lies in reliable automated image processing. Our goal was to develop a pipeline that accurately annotates plants in top-view images and extracts phenotypic parameters while requiring minimal user input.
Question: We set out to build an unsupervised, machine-learning-based pipeline for plant phenotyping that would not only segment plants from top-view images but also return measurements for a range of phenotypic traits. The pipeline, named ARADEEPOPSIS, had to be able to process large datasets, be accessible to non-expert users, and run on common computing environments.
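To make the task concrete, the sketch below shows what "segmenting a plant from a top-view image and extracting a trait" means in its simplest form. It uses a classical excess-green color index (ExG = 2G − R − B) on a toy image; this is a naive hand-crafted baseline for illustration only, not the deep-learning segmentation that ARADEEPOPSIS actually performs, and the function name `segment_rosette` is our own.

```python
import numpy as np

def segment_rosette(rgb, threshold=20):
    """Segment green tissue from a top-view RGB image.

    Uses the excess-green index (ExG = 2G - R - B), a classical
    color-based baseline -- illustrative only, not the deep-learning
    method of the pipeline. rgb is an (H, W, 3) uint8 array.
    Returns a boolean mask and the segmented area in pixels.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (2 * g - r - b) > threshold
    return mask, int(mask.sum())

# Toy image: uniform gray "soil" background with a green square "plant".
img = np.full((100, 100, 3), 120, dtype=np.uint8)
img[40:60, 40:60] = [40, 180, 50]   # 20x20 green patch = 400 pixels

mask, area = segment_rosette(img)
print(area)  # 400
```

Rosette area in pixels is the most basic trait one can read off such a mask; the published pipeline goes much further, returning many morphometric and color traits per plant.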
Findings: We show that a relatively small number of manually annotated images of Arabidopsis thaliana suffices to successfully re-train an already established deep neural network, underscoring the power of transfer-learning approaches to address challenges in plant phenotyping. We demonstrate the versatility of the pipeline by analyzing a collection of 150,000 images taken across plant development, not only correctly segmenting rosette areas but also automatically classifying leaves based on their health status. Our pipeline thus allows automated segmentation of anthocyanin-rich and senescent areas, which we were then able to use in genome-wide association studies to link these phenotypes to standing genetic variation in the species.
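The transfer-learning recipe described above (keep a pretrained network's feature extractor frozen and retrain only a small head on few annotated examples) can be sketched in miniature. Here the "pretrained backbone" is stood in for by a frozen random projection and the two classes are toy Gaussian blobs (think healthy vs. senescent leaves); everything here is a didactic assumption, not the architecture or training setup of the actual paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "backbone": in practice this would be an established pretrained
# deep network; a fixed random projection stands in for it here.
W_frozen = rng.normal(size=(2, 16))

def features(x):
    # Fixed feature extractor -- never updated during retraining.
    return np.tanh(x @ W_frozen)

# Small "annotated" dataset: two blobs standing in for two leaf states.
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Retrain only the head (a logistic-regression layer) by gradient descent.
w, b = np.zeros(16), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(features(X) @ w + b)))  # predicted probabilities
    grad = p - y                                   # logistic-loss gradient
    w -= 0.1 * features(X).T @ grad / len(y)
    b -= 0.1 * grad.mean()

pred = (1 / (1 + np.exp(-(features(X) @ w + b)))) > 0.5
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The point of the exercise is the division of labor: the expensive representation is reused as-is, so only a tiny number of parameters must be fit to the small annotated set, which is why a few hundred labeled images can be enough.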
Next steps: ARADEEPOPSIS was designed to segment plant rosettes specifically from top-view images of individual plants. The pipeline illustrates that transfer learning is a powerful approach to solve plant phenotyping problems. In the future, we wish to expand image segmentation to recognize a wider array of plant species and tissues. Moreover, we are working on routines to identify plants from images that contain more than one individual.
Patrick Hüther, Niklas Schandry, Katharina Jandrasits, Ilja Bezrukov and Claude Becker (2020). ARADEEPOPSIS, an Automated Workflow for Top-View Plant Phenomics using Semantic Segmentation of Leaf States. The Plant Cell. https://doi.org/10.1105/tpc.20.00318