Species Identification of Food Contaminating Beetles by Recognizing Patterns in Microscopic Images of Elytra Fragments.

Park SI, Bisgin H, Ding H, Semey HG, Langley DA, Tong W, Xu J - PLoS ONE (2016)

Bottom Line: A crucial step of food contamination inspection is identifying the species of beetle fragments found in the sample, since the presence of some storage beetles is a good indicator of insanitation or potential food safety hazards. Both global and local characteristics were quantified and used as feature inputs to artificial neural networks for species classification. Through examining the overall and per-species accuracies, we further demonstrated that the local features are better suited than the global features for species identification.


Affiliation: Department of Computer Science, Texas A&M University, College Station, Texas, United States of America.

ABSTRACT
A crucial step of food contamination inspection is identifying the species of beetle fragments found in the sample, since the presence of some storage beetles is a good indicator of insanitation or potential food safety hazards. The current practice, visual examination by human analysts, is time-consuming and requires several years of experience. Here we developed a species identification algorithm that utilizes microscopic images of elytra fragments. The elytra, or hardened forewings, occupy a large portion of the body and contain distinctive patterns. In addition, elytra fragments are more commonly recovered from processed food products than other body parts due to their hardness. As a preliminary effort, we chose 15 storage product beetle species frequently detected in food inspection. The elytra were then separated from the specimens and imaged under a microscope. Both global and local characteristics were quantified and used as feature inputs to artificial neural networks for species classification. With leave-one-out cross-validation, we achieved an overall accuracy of 80% with the proposed global and local features, which indicates that our proposed features can differentiate these species. Through examining the overall and per-species accuracies, we further demonstrated that the local features are better suited than the global features for species identification. Future work will include robust testing with more beetle species and algorithm refinement for higher accuracy.


pone.0157940.g006: An example of elytra hair-like feature detection.

Mentions: The equation implies that a largely blurred image is subtracted from a less blurred one and vice versa; thus, two DoG-filtered images are used. Through the DoG filter, we approximate the higher- and lower-intensity objects/regions. The approximated objects (i.e., regions of feature elements such as hairs and holes) are then detected [26] and binarized; a dynamic threshold for binarizing the DoG-filtered image is calculated by averaging the Otsu thresholds [27] computed from the gray-scale original image and the DoG-filtered image. Fig 6 shows a sample fragment image (Species 2), the DoG-filtered image, and the binary segmented image. Finally, from the binary image, the color distribution, number and area density, and average and median size of the objects are computed. Specifically, the color distribution is represented with 30 bins (10 bins per channel) of an RGB color histogram computed only over the segmented hair/hole/line objects. Number and area density are obtained by dividing the total number and the total area of the segmented objects by the total subimage area, respectively. The final feature vector includes 68 elements: the 30-dimensional color descriptor plus number and area density and mean and median object size for each of the two DoG-filtered images. Notably, along with the color distribution, the density features identify and quantify the amount of local key objects, which are not explicitly measured by spatial/spectral texture or color descriptors. Since these features are computed over the whole captured area but take in only local key regions, we categorize this feature set as another type of global feature, "global feature 2", in the Results section, which lets us explore the impact of key objects.
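The segmentation and feature steps described above can be sketched in a few functions. This is a minimal illustration under stated assumptions, not the authors' code: the sigma values, the helper names (`dog_segment`, `object_features`, `color_histogram`), and the histogram range are invented for the sketch, and Otsu's method is implemented directly with NumPy so the example is self-contained.

```python
import numpy as np
from scipy import ndimage


def otsu_threshold(img, nbins=256):
    """Otsu's threshold computed directly from the image histogram."""
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2
    w1 = np.cumsum(hist)              # pixel count of class 1 (bins <= k)
    w2 = np.cumsum(hist[::-1])[::-1]  # pixel count of class 2 (bins > k)
    m1 = np.cumsum(hist * centers) / np.maximum(w1, 1)
    m2 = (np.cumsum((hist * centers)[::-1]) / np.maximum(w2[::-1], 1))[::-1]
    # between-class variance for every candidate split k; pick the maximizer
    var_between = w1[:-1] * w2[1:] * (m1[:-1] - m2[1:]) ** 2
    return centers[np.argmax(var_between)]


def dog_segment(gray, sigma_small=1.0, sigma_large=4.0):
    """DoG-filter a grayscale image and binarize it with the dynamic threshold."""
    dog = (ndimage.gaussian_filter(gray, sigma_small)
           - ndimage.gaussian_filter(gray, sigma_large))
    # Dynamic threshold: average of the Otsu thresholds from the original
    # grayscale image and the DoG-filtered image, as described in the text.
    t = 0.5 * (otsu_threshold(gray) + otsu_threshold(dog))
    return dog > t


def object_features(binary):
    """Number density, area density, and mean/median object size from a binary mask."""
    labels, n = ndimage.label(binary)
    sizes = (ndimage.sum(binary, labels, index=np.arange(1, n + 1))
             if n else np.array([]))
    total = binary.size
    return {
        "number_density": n / total,
        "area_density": float(binary.sum()) / total,
        "mean_size": float(sizes.mean()) if n else 0.0,
        "median_size": float(np.median(sizes)) if n else 0.0,
    }


def color_histogram(rgb, mask, bins_per_channel=10):
    """30-bin RGB histogram (10 bins per channel) over segmented object pixels only."""
    feats = []
    for c in range(3):
        h, _ = np.histogram(rgb[..., c][mask], bins=bins_per_channel,
                            range=(0, 256))
        feats.append(h / max(int(mask.sum()), 1))
    return np.concatenate(feats)
```

A per-fragment feature vector would then concatenate the color histogram with the density and size statistics from each of the two DoG-filtered images before feeding the result to the neural network classifier.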

