Measuring Stem Diameter of Sorghum Plants in the Field Using a High-Throughput Stereo Vision System

2021 ◽  
Vol 64 (6) ◽  
pp. 1999-2010
Author(s):  
Lirong Xiang ◽  
Lie Tang ◽  
Jingyao Gai ◽  
Le Wang

Highlights:
- A custom-built camera module named PhenoStereo was developed for high-throughput field-based plant phenotyping.
- Novel integration of strobe lights facilitated application of PhenoStereo in various environmental conditions.
- Image-derived stem diameters were found to have high correlations with ground truth, which outperformed any previously reported sensing approach.
- PhenoStereo showed promising potential to characterize a broad spectrum of plant phenotypes.

Abstract. The stem diameter of sorghum plants is an important trait for evaluation of stalk strength and biomass potential, but it is a challenging sensing task to automate in the field due to the complexity of the imaging object and the environment. In recent years, stereo vision has offered a viable three-dimensional (3D) solution due to its high spatial resolution and wide selection of camera modules. However, the performance of in-field stereo imaging for plant phenotyping is adversely affected by textureless regions, occlusion of plants, variable outdoor lighting, and wind conditions. In this study, a portable stereo imaging module named PhenoStereo was developed for high-throughput field-based plant phenotyping. PhenoStereo features a self-contained embedded design, which makes it capable of capturing images at 14 stereoscopic frames per second. In addition, a set of customized strobe lights is integrated to overcome lighting variations and enable the use of high shutter speed to overcome motion blur. PhenoStereo was used to acquire a set of sorghum plant images, and an automated point cloud data processing pipeline was developed to automatically extract the stems and then quantify their diameters via an optimized 3D modeling process. The pipeline employed a mask region convolutional neural network (Mask R-CNN) for detecting stalk contours and a semi-global block matching (SGBM) stereo matching algorithm for generating disparity maps.
The correlation coefficient (r) between the image-derived stem diameters and the ground truth was 0.97 with a mean absolute error (MAE) of 1.44 mm, which outperformed any previously reported sensing approach. These results demonstrate that, with proper customization, stereo vision can be an effective sensing method for field-based plant phenotyping using high-fidelity 3D models reconstructed from stereoscopic images. Based on the results from sorghum plant stem diameter sensing, this proposed stereo sensing approach can likely be extended to characterize a broad range of plant phenotypes, such as the leaf angle and tassel shape of maize plants and the seed pods and stem nodes of soybean plants.

Keywords: Field-based high-throughput phenotyping, Point cloud, Stem diameter, Stereo vision.
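The geometry behind turning an SGBM disparity map into metric stem diameters is standard pinhole stereo: depth follows Z = f·B/d, and an image-plane width back-projects to a metric width at that depth. A minimal sketch with hypothetical camera parameters (the focal length, baseline, and pixel values below are illustrative, not from the paper):

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth from stereo disparity: Z = f * B / d (pinhole stereo model)."""
    d = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / d

def stem_diameter(width_px, depth_m, focal_px):
    """Back-project an image-plane width to a metric diameter: D = w * Z / f."""
    return width_px * depth_m / focal_px

# Hypothetical camera parameters (not from the paper):
f_px, B_m = 1400.0, 0.06                    # focal length in pixels, 6 cm baseline
Z = disparity_to_depth(112.0, f_px, B_m)    # 1400 * 0.06 / 112 = 0.75 m
D = stem_diameter(28.0, Z, f_px)            # a 28-px stem width at 0.75 m
print(round(Z, 3), round(D * 1000, 1))      # depth in m, diameter in mm
```

In the paper's pipeline the stem width would come from the Mask R-CNN contour and the disparity from SGBM; only this back-projection step is shown here.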

2019 ◽  
Author(s):  
Xu Wang ◽  
Hong Xuan ◽  
Byron Evers ◽  
Sandesh Shrestha ◽  
Robert Pless ◽  
...  

Abstract.
Background: Precise, rapid measurement of plant traits on large populations has emerged as a critical bottleneck in connecting genotype to phenotype in genetics and breeding. This bottleneck limits advancements in understanding plant genomes and the development of improved, high-yielding crop varieties.
Results: Here we demonstrate the application of deep learning to proximal imaging from a mobile field vehicle to directly score plant morphology and developmental stages in wheat under field conditions. We developed and trained a convolutional neural network with image datasets labeled from expert visual scores and used this 'breeder-trained' network to directly score wheat morphology and developmental stages. For both morphological (awned) and phenological (flowering time) traits, we demonstrate high heritability and extremely high accuracy against the 'ground-truth' values from visual scoring. Using the traits scored by the network, we tested genotype-to-phenotype associations and uncovered novel epistatic interactions for flowering time. Enabled by time-series high-throughput phenotyping, we describe a new phenotype, the rate of flowering, and show that it is under heritable genetic control.
Conclusions: We demonstrated a field-based high-throughput phenotyping approach using deep learning that can directly score morphological and developmental phenotypes in genetic populations. Most powerfully, the deep learning approach presented here offers a conceptual advance in high-throughput plant phenotyping, as it can potentially score any trait in any plant species by leveraging expert knowledge from breeders, geneticists, pathologists, and physiologists.
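The heritability the abstract reports is typically estimated from replicated line scores via variance components. A minimal broad-sense heritability sketch on a line-mean basis (a generic one-way ANOVA formulation, not the authors' statistical model; the toy scores are fabricated for illustration):

```python
import numpy as np

def broad_sense_heritability(scores):
    """Line-mean broad-sense heritability H^2 = Vg / (Vg + Ve / r)
    from a (lines x replicates) array, via one-way ANOVA variance components."""
    y = np.asarray(scores, dtype=float)
    n_lines, r = y.shape
    line_means = y.mean(axis=1)
    grand = y.mean()
    msb = r * np.sum((line_means - grand) ** 2) / (n_lines - 1)         # between-line MS
    msw = np.sum((y - line_means[:, None]) ** 2) / (n_lines * (r - 1))  # within-line MS
    vg = max((msb - msw) / r, 0.0)                                      # genetic variance
    return vg / (vg + msw / r)

# Toy example with hypothetical network-derived trait scores:
rng = np.random.default_rng(0)
g = rng.normal(0, 2.0, size=(50, 1))           # simulated line effects
scores = g + rng.normal(0, 1.0, size=(50, 3))  # 3 replicates, residual sd 1
print(round(broad_sense_heritability(scores), 2))
```

With a genetic variance of 4 and a residual variance of 1 over three replicates, the estimate should land near 0.9, illustrating why network-scored traits with low residual noise can show high heritability.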


2019 ◽  
Author(s):  
Anand Seethepalli ◽  
Haichao Guo ◽  
Xiuwei Liu ◽  
Marcus Griffiths ◽  
Hussien Almtarfi ◽  
...  

Abstract. Root crown phenotyping measures the top portion of crop root systems and can be used for marker-assisted breeding, genetic mapping, and understanding how roots influence soil resource acquisition. Several imaging protocols and image analysis programs exist, but they are not optimized for high-throughput, repeatable, and robust root crown phenotyping. The RhizoVision Crown platform integrates an imaging unit, image capture software, and image analysis software that are optimized for reliable extraction of measurements from large numbers of root crowns. The hardware platform utilizes a backlight and a monochrome machine vision camera to capture root crown silhouettes. RhizoVision Imager and RhizoVision Analyzer are free, open-source software that streamline image capture and image analysis with intuitive graphical user interfaces. RhizoVision Analyzer was physically validated using copper wire, and features were extensively validated using 10,464 ground-truth simulated images of dicot and monocot root systems. This platform was then used to phenotype soybean and wheat root crowns. A total of 2,799 soybean (Glycine max) root crowns of 187 lines and 1,753 wheat (Triticum aestivum) root crowns of 186 lines were phenotyped. Principal component analysis indicated similar correlations among features in both species. The maximum heritability was 0.74 in soybean and 0.22 in wheat, indicating that differences in species and populations need to be considered. The integrated RhizoVision Crown platform facilitates high-throughput phenotyping of crop root crowns, and sets a standard by which open plant phenotyping platforms can be benchmarked.


2020 ◽  
Vol 2020 ◽  
pp. 1-15 ◽  
Author(s):  
Anand Seethepalli ◽  
Haichao Guo ◽  
Xiuwei Liu ◽  
Marcus Griffiths ◽  
Hussien Almtarfi ◽  
...  

Root crown phenotyping measures the top portion of crop root systems and can be used for marker-assisted breeding, genetic mapping, and understanding how roots influence soil resource acquisition. Several imaging protocols and image analysis programs exist, but they are not optimized for high-throughput, repeatable, and robust root crown phenotyping. The RhizoVision Crown platform integrates an imaging unit, image capture software, and image analysis software that are optimized for reliable extraction of measurements from large numbers of root crowns. The hardware platform utilizes a backlight and a monochrome machine vision camera to capture root crown silhouettes. The RhizoVision Imager and RhizoVision Analyzer are free, open-source software that streamline image capture and image analysis with intuitive graphical user interfaces. The RhizoVision Analyzer was physically validated using copper wire, and features were extensively validated using 10,464 ground-truth simulated images of dicot and monocot root systems. This platform was then used to phenotype soybean and wheat root crowns. A total of 2,799 soybean (Glycine max) root crowns of 187 lines and 1,753 wheat (Triticum aestivum) root crowns of 186 lines were phenotyped. Principal component analysis indicated similar correlations among features in both species. The maximum heritability was 0.74 in soybean and 0.22 in wheat, indicating that differences in species and populations need to be considered. The integrated RhizoVision Crown platform facilitates high-throughput phenotyping of crop root crowns and sets a standard by which open plant phenotyping platforms can be benchmarked.
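The backlit silhouette approach described above reduces segmentation to a simple threshold: root pixels are dark against a bright field. A minimal numpy sketch of silhouette extraction and a few basic shape features (this is an illustration of the general technique, not the RhizoVision Analyzer code, and the feature names are simplified):

```python
import numpy as np

def crown_features(gray, thresh=128):
    """Segment a backlit root crown (dark object on bright background) and
    return projected area (px), width (px), and depth (px) of the silhouette."""
    mask = np.asarray(gray) < thresh      # root pixels are darker than the backlight
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return 0, 0, 0
    area = int(mask.sum())
    width = int(xs.max() - xs.min() + 1)  # horizontal extent of the crown
    depth = int(ys.max() - ys.min() + 1)  # vertical (rooting depth) extent
    return area, width, depth

# Synthetic 8x8 "image": a 3-px-wide, 5-px-deep dark root on a white field
img = np.full((8, 8), 255, dtype=np.uint8)
img[1:6, 3:6] = 20
print(crown_features(img))   # → (15, 3, 5)
```

The monochrome camera plus backlight makes this thresholding robust, which is what enables repeatable batch analysis of thousands of crowns.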


Author(s):  
Bikram Pratap Banerjee ◽  
German Spangenberg ◽  
Surya Kant

Phenotypic characterization of crop genotypes is an essential yet challenging aspect of crop management and agriculture research. Digital sensing technologies are rapidly advancing plant phenotyping and speeding up crop breeding outcomes. However, off-the-shelf sensors might not be fully applicable and suitable for agriculture research due to the diversity of crop species and specific needs during plant breeding selections. Customized sensing systems with specialized sensor hardware and software architecture provide a powerful and low-cost solution. This study designed and developed a fully integrated Raspberry Pi-based LiDAR sensor named CropBioMass (CBM), enabled by the Internet of Things to provide a complete end-to-end pipeline. The CBM is a low-cost sensor that provides seamless high-throughput data collection in the field, a small data footprint, injection of data onto a remote server, and automated data processing. Phenotypic traits of crop fresh biomass, dry biomass, and plant height estimated from CBM data had high correlations with ground-truth manual measurements in a wheat field trial. The CBM is readily applicable to high-throughput plant phenotyping, crop monitoring, and management for precision agriculture applications.
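Plant height from a LiDAR point cloud is commonly estimated as a high percentile of return heights above a ground reference, which is robust to stray high returns. A minimal numpy sketch of that step (a generic illustration under stated assumptions, not the CBM processing pipeline; the toy scan values are fabricated):

```python
import numpy as np

def canopy_height(points_z, ground_z=None, pct=99):
    """Estimate plant height from LiDAR returns as the difference between a
    high percentile of z-values and the ground level (min z by default)."""
    z = np.asarray(points_z, dtype=float)
    if ground_z is None:
        ground_z = z.min()   # assume the lowest returns hit the soil surface
    return float(np.percentile(z, pct) - ground_z)

# Toy scan: ground returns near 0 m, canopy returns near 0.8 m
z = np.concatenate([np.zeros(50), np.full(50, 0.8)])
print(round(canopy_height(z), 2))   # → 0.8
```

Using the 99th percentile rather than the maximum avoids inflating height estimates from isolated noise returns.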


Sensors ◽  
2020 ◽  
Vol 20 (16) ◽  
pp. 4550
Author(s):  
Huajian Liu ◽  
Brooke Bruning ◽  
Trevor Garnett ◽  
Bettina Berger

The accurate and high throughput quantification of nitrogen (N) content in wheat using non-destructive methods is an important step towards identifying wheat lines with high nitrogen use efficiency and informing agronomic management practices. Among various plant phenotyping methods, hyperspectral sensing has shown promise in providing accurate measurements in a fast and non-destructive manner. Past applications have utilised non-imaging instruments, such as spectrometers, while more recent approaches have expanded to hyperspectral cameras operating in different wavelength ranges and at various spectral resolutions. However, despite the success of previous hyperspectral applications, some important research questions regarding hyperspectral sensors with different wavelength centres and bandwidths remain unanswered, limiting wide application of this technology. This study evaluated the capability of hyperspectral imaging and non-imaging sensors to estimate N content in wheat leaves by comparing three hyperspectral cameras and a non-imaging spectrometer. This study answered the following questions: (1) How do hyperspectral sensors with different system setups perform when conducting proximal sensing of N in wheat leaves and what aspects have to be considered for optimal results? (2) What types of photonic detectors are most sensitive to N in wheat leaves? (3) How do the spectral resolutions of different instruments affect N measurement in wheat leaves? (4) What are the key wavelengths with the highest correlation to N in wheat? Our study demonstrated that hyperspectral imaging systems with satisfactory system setups can be used to conduct proximal sensing of N content in wheat with sufficient accuracy. The proposed approach could reduce the need for chemical analysis of leaf tissue and lead to high-throughput estimation of N in wheat. The methodologies here could also be validated on other plants with different characteristics.
The results can provide a reference for users wishing to measure N content at either plant or leaf scale using hyperspectral sensors.
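Finding "key wavelengths" of the kind asked about in question (4) usually means scanning each band of the reflectance matrix for its correlation with the measured N content. A minimal numpy sketch (a generic band-ranking illustration, not the authors' analysis; the synthetic spectra below are fabricated so that one band tracks N by construction):

```python
import numpy as np

def key_wavelengths(spectra, n_content, top_k=3):
    """Rank bands of a (samples x bands) reflectance matrix by the absolute
    Pearson correlation of each band with measured N content."""
    X = np.asarray(spectra, dtype=float)
    y = np.asarray(n_content, dtype=float)
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    r = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()))
    order = np.argsort(-np.abs(r))   # bands sorted by |correlation|, descending
    return order[:top_k], r

# Synthetic data: band 2 is built to track N, the rest are noise
rng = np.random.default_rng(1)
n = rng.uniform(1, 4, size=40)                  # N content, % dry weight
spectra = rng.normal(0.3, 0.05, size=(40, 5))   # 5 noise bands
spectra[:, 2] = 0.1 + 0.05 * n                  # perfectly N-correlated band
top, r = key_wavelengths(spectra, n)
print(top[0], round(abs(r[2]), 2))   # → 2 1.0
```

In a real analysis the band indices would map back to wavelength centres, and the correlation profile across wavelengths would identify the N-sensitive spectral regions.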


Plant Methods ◽  
2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Shuo Zhou ◽  
Xiujuan Chai ◽  
Zixuan Yang ◽  
Hongwu Wang ◽  
Chenxue Yang ◽  
...  

Abstract.
Background: Maize (Zea mays L.) is one of the most important food sources in the world and has been one of the main targets of plant genetics and phenotypic research for centuries. Observation and analysis of various morphological phenotypic traits during maize growth are essential for genetic and breeding studies. The generally huge number of samples produces an enormous amount of high-resolution image data. While high-throughput plant phenotyping platforms are increasingly used in maize breeding trials, there is a real need for software tools that can automatically identify visual phenotypic features of maize plants and implement batch processing on image datasets.
Results: On the boundary between computer vision and plant science, we utilize advanced deep learning methods based on convolutional neural networks to empower the workflow of maize phenotyping analysis. This paper presents Maize-IAS (Maize Image Analysis Software), an integrated application supporting one-click analysis of maize phenotypes, embedding multiple functions: (I) Projection, (II) Color Analysis, (III) Internode Length, (IV) Height, (V) Stem Diameter, and (VI) Leaves Counting. Taking an RGB image of maize as input, the software provides a user-friendly graphical interface and rapid calculation of multiple important phenotypic characteristics, including leaf sheath point detection and leaf segmentation. In the Leaves Counting function, the mean and standard deviation of the difference between prediction and ground truth are 1.60 and 1.625.
Conclusion: Maize-IAS is easy to use and demands neither professional knowledge of computer vision nor deep learning. All functions support batch processing, enabling automated and labor-reduced recording, measurement, and quantitative analysis of maize growth traits on large datasets.
We demonstrate the efficiency and potential of our techniques and software for image-based plant research, which also shows the feasibility and capability of AI technology in agriculture and plant science.
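The counting accuracy the abstract reports (mean 1.60, standard deviation 1.625 of the prediction-truth difference) is a straightforward error statistic. A minimal sketch of that computation on hypothetical leaf counts (the counts below are fabricated; only the statistic matches the abstract's definition):

```python
import numpy as np

def count_error_stats(predicted, truth):
    """Mean and standard deviation of the prediction error for a counting task,
    as reported for the Leaves Counting function."""
    diff = np.asarray(predicted, dtype=float) - np.asarray(truth, dtype=float)
    return float(diff.mean()), float(diff.std(ddof=0))

# Hypothetical leaf counts for five plants
pred = [8, 10, 12, 9, 11]
true = [8, 11, 10, 9, 12]
mean_err, sd_err = count_error_stats(pred, true)
print(round(mean_err, 2), round(sd_err, 2))   # → 0.0 1.1
```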


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1137
Author(s):  
Ondřej Holešovský ◽  
Radoslav Škoviera ◽  
Václav Hlaváč ◽  
Roman Vítek

We compare event-cameras with fast (global-shutter) frame-cameras experimentally, asking: "What is the application domain in which an event-camera surpasses a fast frame-camera?" Surprisingly, finding the answer has been difficult. Our methodology was to test event- and frame-cameras on generic computer vision tasks where event-camera advantages should manifest. We used two methods: (1) a controlled, cheap, and easily reproducible experiment (observing a marker on a rotating disk at varying speeds); (2) a challenging practical ballistic experiment (observing a flying bullet, with ground truth provided by an expensive ultra-high-speed frame-camera). The experimental results include sampling/detection rates and position estimation errors as functions of illuminance and motion speed, and the minimum pixel latency of two commercial state-of-the-art event-cameras (ATIS, DVS240). Event-cameras respond more slowly to positive than to negative large and sudden contrast changes. They outperformed the frame-camera in bandwidth efficiency in all our experiments. Both camera types provide comparable position estimation accuracy. The better event-camera was limited by pixel latency when tracking small objects, resulting in motion-blur effects. Sensor bandwidth limited the event-camera in object recognition. However, future generations of event-cameras might alleviate bandwidth limitations.
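The bandwidth-efficiency finding comes down to simple arithmetic: a frame-camera streams every pixel every frame, while an event-camera reports only pixels whose brightness changed. A back-of-envelope sketch (the resolutions, event rate, and event encoding size below are illustrative assumptions, not figures from the paper):

```python
def frame_bandwidth_bps(width, height, fps, bits_per_px=8):
    """Raw stream rate of a frame-camera: every pixel, every frame."""
    return width * height * fps * bits_per_px

def event_bandwidth_bps(events_per_s, bits_per_event=64):
    """Event-camera rate; a typical event encodes (x, y, timestamp, polarity)."""
    return events_per_s * bits_per_event

frame = frame_bandwidth_bps(640, 480, 1000)   # hypothetical 1000-fps global-shutter camera
event = event_bandwidth_bps(2_000_000)        # hypothetical 2 Mevents/s under fast motion
print(frame // event)                         # how many times larger the frame stream is
```

Even under heavy event load the event stream stays well below the raw frame stream, which is why the event-cameras won on bandwidth efficiency in all the experiments above; the paper's point is that other limits (pixel latency, sensor bandwidth under saturation) still constrain them.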


2017 ◽  
Vol 114 (13) ◽  
pp. 3393-3396 ◽  
Author(s):  
Narangerel Altangerel ◽  
Gombojav O. Ariunbold ◽  
Connor Gorman ◽  
Masfer H. Alkahtani ◽  
Eli J. Borrego ◽  
...  

Development of a phenotyping platform capable of noninvasive biochemical sensing could offer researchers, breeders, and producers a tool for precise response detection. In particular, the ability to measure plant stress responses in vivo is becoming increasingly important. In this work, a Raman spectroscopic technique is developed for high-throughput stress phenotyping of plants. We show early (within 48 h) in vivo detection of plant stress responses. Coleus (Plectranthus scutellarioides) plants were subjected to four common abiotic stress conditions individually: high soil salinity, drought, chilling exposure, and light saturation. Plants were examined in vivo after stress induction, and changes in the concentration levels of reactive oxygen-scavenging pigments were observed with Raman microscopic and remote spectroscopic systems. The molecular concentration changes were further validated by commonly accepted (destructive) chemical extraction methods. Raman spectroscopy also allows simultaneous interrogation of various pigments in plants. For example, we found a unique negative correlation in the concentration levels of anthocyanins and carotenoids, which clearly indicates that the plant stress response is fine-tuned to protect against stress-induced damage. This precision spectroscopic technique holds promise for the future development of high-throughput screening for plant phenotyping and the quantification of biologically or commercially relevant molecules, such as antioxidants and pigments.


Author(s):  
M. Herrero-Huerta ◽  
V. Meline ◽  
A. S. Iyer-Pascuzzi ◽  
A. M. Souza ◽  
M. R. Tuinstra ◽  
...  

Abstract. Breakthrough imaging technologies are a potential solution to the plant phenotyping bottleneck in marker-assisted breeding and genetic mapping. X-ray CT (computed tomography) technology is able to acquire a digital twin of root system architecture (RSA); however, advances in computational methods to digitally model the spatial disposition of root system networks are urgently required. We extracted the root skeleton of the digital twin based on 3D data from X-ray CT, optimized for high-throughput and robust results. Significant root architectural traits such as number, length, growth angle, elongation rate, and branching map can be easily extracted from the skeleton. The curve-skeleton extraction is computed with a constrained Laplacian smoothing algorithm. This skeletal structure drives the registration procedure in temporal series. The experiment was carried out at the Ag Alumni Seed Phenotyping Facility (AAPF) at Purdue University in West Lafayette (IN, USA). Three samples of tomato root at two different times and three samples of corn root at three different times were scanned. On visual inspection, the skeleton accurately matches the shape of the RSA. These results confirm the feasibility of the proposed methodology and its scalability to comprehensive, high-throughput root phenotyping.
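The core of Laplacian smoothing, constrained so that selected points stay fixed, can be shown on a polyline: each free point moves toward the mean of its neighbours each iteration. A minimal numpy sketch (an illustration of the general technique on a 2D polyline, not the authors' 3D curve-skeleton implementation; the zig-zag data is fabricated):

```python
import numpy as np

def laplacian_smooth(polyline, anchors, lam=0.5, iters=100):
    """Iteratively move each interior point toward the mean of its neighbours
    (discrete Laplacian), keeping anchored points (e.g. root tips) fixed."""
    p = np.asarray(polyline, dtype=float).copy()
    free = np.ones(len(p), dtype=bool)
    free[list(anchors)] = False
    for _ in range(iters):
        lap = 0.5 * (p[:-2] + p[2:]) - p[1:-1]     # neighbour mean minus point
        p[1:-1][free[1:-1]] += lam * lap[free[1:-1]]
    return p

# A noisy zig-zag between two fixed endpoints relaxes toward the straight chord
poly = np.array([[0, 0], [1, 1.0], [2, -1.0], [3, 1.0], [4, 0]])
smooth = laplacian_smooth(poly, anchors=[0, 4])
print(np.abs(smooth[1:-1, 1]).max() < 0.01)   # → True
```

The constraint (anchored points) is what keeps the skeleton from shrinking away from the root tips; the same fixed structure is what lets it drive registration across the time series.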

