Field phenotyping of grapevine growth using dense stereo reconstruction.

Klodt M, Herzog K, Töpfer R, Cremers D - BMC Bioinformatics (2015)

Bottom Line: The method has been successfully applied to objective assessment of growth habits of new breeding lines. To this end, leaf areas of two breeding lines were monitored and compared with traditional cultivars. A statistical analysis of the method shows a significant (p < 0.001) coefficient of determination R² = 0.93 and a root-mean-square error of 3.0%.

View Article: PubMed Central - PubMed

Affiliation: Department of Informatics, Technische Universität München, Boltzmannstraße 3, 85748, Garching, Germany. klodt@tum.de.

ABSTRACT

Background: The demand for high-throughput and objective phenotyping in plant research has been increasing in recent years due to large experimental sites. Sensor-based, non-invasive and automated processes are needed to overcome the phenotypic bottleneck, which limits data volumes because evaluations are performed manually. A major challenge for sensor-based phenotyping in vineyards is distinguishing the grapevine in the foreground from the field in the background. This is especially the case for red-green-blue (RGB) images, where similar color distributions occur both in the foreground plant and in the field and background plants. However, RGB cameras are a suitable tool in the field because they provide high-resolution data at fast acquisition rates and are robust to outdoor illumination.

Results: This study presents a method to segment the phenotypic classes 'leaf', 'stem', 'grape' and 'background' in RGB images that were taken with a standard consumer camera in vineyards. Background subtraction is achieved by taking two images of each plant for depth reconstruction. The color information is furthermore used to distinguish leaves from stems and grapes in the foreground. The presented approach allows for objective computation of phenotypic traits like 3D leaf surface areas and fruit-to-leaf ratios. The method has been successfully applied to objective assessment of growth habits of new breeding lines. To this end, leaf areas of two breeding lines were monitored and compared with traditional cultivars. A statistical analysis of the method shows a significant (p < 0.001) coefficient of determination R² = 0.93 and a root-mean-square error of 3.0%.
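The depth-based background subtraction described above can be illustrated with a minimal sketch. This is an assumption-laden simplification: the paper integrates dense depth maps into a variational segmentation rather than applying a hard cutoff, and `max_depth` is a hypothetical scene-specific threshold separating the vine row from the field behind it.

```python
import numpy as np

def foreground_mask(depth, max_depth):
    """Keep pixels whose reconstructed depth lies in front of a cutoff.

    depth: per-pixel depth map from stereo reconstruction (NaN where
    matching failed). max_depth: hypothetical threshold between the
    vine row and the background field.
    """
    valid = np.isfinite(depth)          # stereo matching may leave gaps
    return valid & (depth < max_depth)  # True = foreground plant
```

In the paper's setting the threshold is not fixed by hand; this sketch only conveys why a depth map resolves foreground/background ambiguity that color alone cannot.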

Conclusions: The presented approach allows for non-invasive, fast and objective assessment of plant growth. The main contributions of this study are 1) the robust segmentation of RGB images taken from a standard consumer camera directly in the field, 2) in particular, the robust background subtraction via reconstruction of dense depth maps, and 3) phenotypic applications to monitoring of plant growth and computation of fruit-to-leaf ratios in 3D. This advance provides a promising tool for high-throughput, automated image acquisition, e.g., for field robots.

Fig3: Generalization to the phenotypic class ‘grape’ and computation of fruit-to-leaf ratios. The input image (A) is segmented to ‘grape’, ‘leaf’ and ‘background’. The segmentation can be used to compute the grape-to-leaf ratio in the 2D image domain (B). A more accurate ratio can be computed in the depth weighted 3D space using the reconstructed surface (C).

Mentions: The use of dense depth maps enables scaling each pixel according to its depth, which corresponds to the actual size of the area captured in that pixel. Computing areas in 3D in this way can increase the accuracy of the resulting phenotypic data compared to area computation in the 2D image plane. Figure 3 shows a comparison of the average grape-to-leaf ratio in 2D (pixels) and 3D (actual size). Computing the ratio in 3D yields a 10% lower ratio than the computation in 2D. It balances out the fact that some leaves are closer to the camera and thus occupy a disproportionately larger area in the 2D image plane than the grapes that are farther away. This effect can also be observed in Additional file 1, which shows a 3D view of the surface shown in Figure 3C.
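The depth-weighted area computation above can be sketched as follows. This is a minimal illustration under a pinhole-camera assumption (square pixels, roughly fronto-parallel surface patches); the function names and the focal-length parameter are hypothetical, not taken from the paper.

```python
import numpy as np

def pixel_area(depth, focal_px):
    # Pinhole model: a pixel at depth z images a scene patch with area
    # proportional to (z / f)^2, so distant surfaces receive larger
    # per-pixel weights than nearby ones.
    return (depth / focal_px) ** 2

def grape_to_leaf_ratio_2d(grape_mask, leaf_mask):
    # Plain pixel-count ratio in the image plane.
    return grape_mask.sum() / leaf_mask.sum()

def grape_to_leaf_ratio_3d(grape_mask, leaf_mask, depth, focal_px):
    # Depth-weighted ratio: each pixel contributes its actual scene area,
    # compensating for objects that are nearer to or farther from the camera.
    w = pixel_area(depth, focal_px)
    return (w * grape_mask).sum() / (w * leaf_mask).sum()
```

With this weighting, leaves close to the camera no longer dominate the ratio merely because they cover more pixels, which is the effect the 3D computation in Figure 3C corrects for.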

