The present invention relates to methods and apparatus for analyzing and evaluating plant samples and in particular to methods for analyzing and evaluating maize kernels on the cob.
Determining the number of kernels per ear of maize is useful in estimating yield. Pre-harvest yield prediction methods, such as the yield component method, estimate yield from estimates of the components that make up grain yield, including the number of ears per acre, the number of kernels per ear (which may comprise the number of rows per ear and the number of kernels per row), and the weight per kernel.
In one exemplary method of counting kernels, the number of kernels on a sample ear of maize is manually counted. In another exemplary method, kernels from one or more sample ears are separated from the cob before being manually or mechanically counted. These methods may be laborious and time consuming.
In another method, such as that described in U.S. Pat. No. 8,073,235 to Hausmann, et al., the number of kernels per ear is estimated based on the number of kernels visible from a single side of the ear. In this method, the number of kernels in an image of one side of the ear is counted, and the total number of kernels per ear is estimated based on an empirical correlation between the number of kernels visible in an image and the number of kernels on an ear. Because the estimate relies on an image of a single side of the ear, the resulting estimate assumes little variation between rows on the ear, including little variation between rows in the tip area around the circumference of the cob.
In an exemplary embodiment of the present disclosure, an apparatus for determining the number of kernels on a sample cob is provided. The apparatus includes at least one reflective surface, an imaging system positioned to capture an image of the sample cob, the image including a front region of the cob and a back region displayed in the at least one reflective surface, and an image processor that receives the image from the imaging system, identifies the presence of kernels in the image, and determines the number of kernels based on the identified presence of kernels in the image of the sample cob.
In another exemplary embodiment of the present disclosure, a method for determining the number of kernels on a sample cob having a circumference is provided. The method includes positioning the sample cob between an imaging system and at least one reflective surface, the sample cob having a front region oriented towards the imaging system and a back region oriented away from the imaging system; capturing an image of the sample cob, the image including greater than 180° of the circumference of the cob; identifying a presence of kernels in the image of the sample cob; and calculating the number of kernels on the sample cob based on the identified presence of kernels in the image of the sample cob. In another embodiment, the calculating step is further based on an identified presence of an exposed area of the sample cob.
The above mentioned and other features of the invention, and the manner of attaining them, will become more apparent and the invention itself will be better understood by reference to the following description of embodiments of the invention taken in conjunction with the accompanying drawings.
The embodiments disclosed below are not intended to be exhaustive or to limit the invention to the precise forms disclosed in the following detailed description. Rather, the embodiments are chosen and described so that others skilled in the art may utilize their teachings. While the present disclosure is primarily directed to the analysis of kernels on an ear of maize, it should be understood that the features disclosed herein may have application to the analysis of other samples.
Referring first to the drawings, an exemplary imaging system 30 for imaging a sample 32, such as an ear of maize, is shown.
In one exemplary embodiment, imaging system 30 includes an image capture device 34. Image capture device 34 is a device capable of capturing an image. Exemplary image capture devices include cameras, CCD cameras, and other suitable image capture devices. Illustrated image capture device 34 includes aperture 33. In the illustrated embodiment, image capture device 34 captures an image 76 of sample 32, as described further below.
In the illustrated embodiment, a light source 35 is also provided. In one embodiment, light source 35 is provided as a part of image capture device 34. In another embodiment, light source 35 is independent of image capture device 34. Although illustrated as attached to image capture device 34, light source 35 may be positioned apart from image capture device 34. In still another embodiment, imaging system 30 does not include a light source, but may use light provided from the environment.
Imaging system 30 also includes first reflective surface 36 and second reflective surface 38. Exemplary reflective surfaces include mirrors and other suitable reflective surfaces. Line A indicates a line perpendicular to a line extending perpendicular to an image plane of the image capture device 34. First reflective surface 36 intersects line A at an angle A1. Second reflective surface 38 intersects line A at an angle A2. In one embodiment, A1 is equal to A2. In another embodiment, A1 is a different angle than A2. In still another embodiment, A1 and A2 are about 120°. In yet still another embodiment, first reflective surface 36 and second reflective surface 38 are positioned about sample 32 such that image capture device 34 is provided a reflected view of first back region 44 in first reflective surface 36 and a reflected view of second back region 46 in second reflective surface 38.
In the illustrated embodiment, imaging system 30 is at least partially enclosed in container 40. In an exemplary embodiment, container 40 reduces or eliminates stray light for image capture device 34. In another exemplary embodiment, container 40 reduces or eliminates wind or particulates from interfering with imaging system 30. In another embodiment, imaging system 30 does not include a container 40.
In the embodiment illustrated in the drawings, sample 32 is positioned between image capture device 34 and first and second reflective surfaces 36, 38.
Referring next to the drawings, an exemplary image 76 captured by image capture device 34 is shown. Image 76 includes a front region 42 of sample 32, a first back region 44 reflected in first reflective surface 36, and a second back region 46 reflected in second reflective surface 38.
In one embodiment, at least a portion of the front region 42, first back region 44 reflected in first reflective surface 36, and second back region 46 reflected in second reflective surface 38 show overlapping portions of sample 32. In another embodiment, not all of sample 32 is visible in front region 42, first back region 44 reflected in first reflective surface 36, and second back region 46 reflected in second reflective surface 38.
Referring next to the drawings, sample 32 is shown supported by a sample holder 28.
In one exemplary embodiment, sample 32 is attached to sample holder 28. In the illustrated embodiment, sample holder 28 positions sample 32 such that the longitudinal axis of sample 32 is oriented substantially vertically. In another embodiment, sample holder 28 positions sample 32 in a substantially horizontal orientation. Other suitable orientations may also be used. In the illustrated embodiment, sample holder 28 positions sample 32 by gripping an external surface of sample 32. In another exemplary embodiment, a portion of sample holder 28 is inserted into a portion of sample 32 to position sample 32. In still another exemplary embodiment, sample 32 is an ear of maize and a portion of sample holder 28 is inserted into the cob of the ear of maize to position the ear.
Referring next to the drawings, another exemplary imaging system 60 having a single reflective surface is shown.
A sample 32′ to be imaged is shown positioned in imaging system 60. In one exemplary embodiment, imaging system 60 includes an image capture device 64. In the illustrated embodiment, a light source 65 and container 70 are also provided. Imaging system 60 also includes reflective surface 66. Line B indicates a line perpendicular to a line extending perpendicular to an image plane of the image capture device 64. Reflective surface 66 intersects line B at an angle B1. In one embodiment, B1 is from about 120° to about 180°. In another embodiment, B1 is from about 90° to about 120°. In still another embodiment, reflective surface 66 is positioned about sample 32′ such that image capture device 64 is provided a reflected view of back region 74 in reflective surface 66.
In one embodiment, at least a portion of the front region 72 and back region 74 reflected in reflective surface 66 show overlapping portions of sample 32′. In another embodiment, not all of sample 32′ is visible in front region 72 and back region 74 reflected in reflective surface 66. In still another embodiment, front region 72 and back region 74 comprise more than 180° of the circumference of sample 32′.
Referring again to the drawings, although exemplary systems with one reflective surface, such as imaging system 60, and two reflective surfaces, such as imaging system 30, are illustrated, greater numbers of reflective surfaces may also be used. In addition, additional optical elements, including lenses, fiber optics, reflective elements with optical power, and other suitable devices for forming an image of the sample 32, may be included.
Image processor 80 includes a memory 84. In one embodiment, memory 84 further includes operating system software 86, such as the LINUX operating system or the WINDOWS operating system available from Microsoft Corporation of Redmond, Washington. Memory 84 further includes communications software if the computer system has access to a network, such as a local area network, a public switched network, a CAN network, or any other type of wired or wireless network. An exemplary public switched network is the Internet. Exemplary communications software includes e-mail software and internet browser software. Other suitable software which permits image processor 80 to communicate with other devices across a network may also be used.
In another exemplary embodiment, image processor 80 further includes a user interface 92 having one or more I/O modules which provide an interface between an operator and image processor 80. Exemplary I/O modules include user input 96 and display 94. Exemplary user inputs 96 include buttons, switches, keys, a touch display, a keyboard, a mouse, and other suitable devices for providing information to image processor 80. Exemplary displays 94 include output devices such as lights, a display (such as a touch screen), a printer, a speaker, and other suitable visual, audio, or tactile devices for presenting information to an operator.
In one exemplary embodiment, image 76 is provided to image processor 80 and stored in memory 84. In the embodiment illustrated in the drawings, image processor 80 receives image 76 from image capture device 34 of imaging system 30.
Memory 84 may also include image analysis software 90, as described below. Image analysis software 90 may include image processing software 88. In one embodiment, image processor 80 stores in memory 84 a processed image 78 of image 76 that has been processed with image processing software 88, as described below.
In block 108 of processing sequence 100, processed image 78 is stored in memory 84.
In blocks 110 to 120, image analysis software 90 is used to analyze processed image 78. In one exemplary embodiment, processing sequence 100 includes one or more of blocks 110 to 120. In another embodiment, processing sequence 100 does not include one or more of blocks 110 to 120. Which of blocks 110 to 120 are included depends on the outputs to be determined, such as the outputs in block 122 that are displayed to an operator on display 94, or the outputs in block 124 that are stored in memory 84.
In block 110, image analysis software 90 identifies kernels. In one exemplary embodiment, image analysis software 90 uses a pattern recognition routine to identify kernels in processed image 78. Other suitable means for identifying kernels 48 in processed image 78 may also be used.
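The disclosure does not specify a particular pattern recognition routine for block 110. The following is a minimal illustrative sketch only, assuming an 8-bit grayscale processed image 78 in which kernels appear brighter than the exposed cob and background; Otsu thresholding and connected-component labeling (OpenCV) serve as a stand-in for image analysis software 90, and the function and parameter names are not from the patent.

```python
import cv2
import numpy as np

def identify_kernels(gray_image: np.ndarray, min_area: int = 50):
    """Return (centroid, pixel_area) pairs for candidate kernels in a grayscale image."""
    # Assumption: kernels are brighter than exposed cob and background, so
    # Otsu's global threshold separates them into a binary mask.
    _, mask = cv2.threshold(gray_image, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Label connected bright regions; each sufficiently large blob is treated
    # as one kernel candidate.
    n_labels, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    kernels = []
    for i in range(1, n_labels):              # label 0 is the background
        area = int(stats[i, cv2.CC_STAT_AREA])
        if area >= min_area:                  # discard specks smaller than a plausible kernel
            kernels.append((tuple(centroids[i]), area))
    return kernels
```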
In block 112, image analysis software 90 determines if rows repeat. In one exemplary embodiment, image analysis software 90 identifies repeated rows by kernel patterns or repeated individual kernel characteristics in the kernels identified in block 110. In one embodiment, the rows extend along a longitudinal extent of the cob.
In block 114, image analysis software 90 determines the number of kernels. In one embodiment, this comprises counting the kernels identified in block 110. In another embodiment, this involves counting the kernels identified in block 110 and subtracting the number of kernels in the repeated rows identified in block 112. In still another embodiment, this involves counting the kernels identified in block 110 and adding an estimate of kernels not visible in the photographic images.
In one exemplary embodiment, the number of kernels is determined by counting the number of kernels in one or more rows on the ear, determining the number of rows on the ear, and subtracting a number of kernels corresponding to the exposed cob area in processed image 78.
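As a worked illustration of the two counting strategies just described, the sketch below shows both the blob-count-minus-repeated-rows approach of blocks 110 to 114 and the rows-times-kernels-per-row estimate adjusted for exposed cob area. The variable names are assumptions for illustration, not the patent's notation.

```python
def count_by_blobs(n_identified: int, n_repeated_row_kernels: int) -> int:
    """Blocks 110-114: count every kernel identified in the image, then remove
    kernels counted twice because their row appears in both the front region
    and a reflected back region."""
    return n_identified - n_repeated_row_kernels

def count_by_rows(kernels_per_row: float, n_rows: int,
                  exposed_cob_kernel_equivalent: int) -> int:
    """Row-based estimate: kernels per row times number of rows, less the
    kernels 'missing' where bare cob is exposed."""
    return round(kernels_per_row * n_rows) - exposed_cob_kernel_equivalent

# Example: 520 blobs with 40 in repeated rows -> 480 kernels;
# 16 rows of about 30 kernels with a 12-kernel exposed patch -> 468 kernels.
```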
In block 116, image analysis software 90 identifies a tip area 98 in each of the reflected regions. In one embodiment, tip area 98 is defined as a predetermined percentage at the top of the sample 32. In another embodiment, tip area 98 is defined as the area above the lowest exposed cob area 49.
In block 118, image analysis software 90 determines fill percentages. An exemplary total fill percentage is determined by dividing the total area identified as kernels on the ear in block 110 by the total area of kernels and exposed cob in processed image 78. An exemplary tip fill percentage is determined by dividing the total area identified as kernels in the tip area 98 in block 116 by the total area of kernels and exposed cob in the tip area 98 in processed image 78.
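A minimal sketch of the block 118 calculations, assuming the kernel and exposed-cob pixel areas have already been measured for the whole ear and for tip area 98; the function and parameter names are illustrative, not from the disclosure.

```python
def fill_percentages(kernel_area: float, exposed_cob_area: float,
                     tip_kernel_area: float, tip_exposed_cob_area: float):
    """Block 118: total and tip fill percentages from measured pixel areas."""
    total = kernel_area + exposed_cob_area
    tip_total = tip_kernel_area + tip_exposed_cob_area
    total_fill = 100.0 * kernel_area / total if total else 0.0
    tip_fill = 100.0 * tip_kernel_area / tip_total if tip_total else 0.0
    return total_fill, tip_fill
```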
In block 120, image analysis software 90 determines kernel sizes. In one exemplary embodiment, image analysis software 90 determines the average size of kernels on sample 32 by averaging the size of each kernel identified in block 110. In another exemplary embodiment, image analysis software 90 determines a size distribution of kernels on sample 32 by categorizing each kernel identified in block 110 based on kernel size.
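An illustrative sketch of block 120, assuming each kernel identified in block 110 carries a pixel-area measurement (as in the identification sketch above); the bin width used for the size distribution is an arbitrary assumption.

```python
from collections import Counter

def kernel_size_stats(kernel_areas, bin_width: int = 25):
    """Block 120: average kernel size and a coarse size distribution,
    with sizes expressed as pixel areas."""
    if not kernel_areas:
        return 0.0, {}
    average = sum(kernel_areas) / len(kernel_areas)
    # Bucket each kernel by area so a size distribution can be reported.
    distribution = Counter((area // bin_width) * bin_width for area in kernel_areas)
    return average, dict(sorted(distribution.items()))
```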
In one exemplary embodiment, in block 122, outputs determined in blocks 112 to 120 are displayed for an operator on display 94. In another exemplary embodiment, in block 124, outputs determined in blocks 112 to 120 are stored in memory 84. In still another exemplary embodiment, an operator provides additional data, such as but not limited to kernel weight, ears per stalk, and stalks per acre, and processing sequence 100 determines the estimated yield. Exemplary yield units include bushels per acre and tons per acre.
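A hedged example of the yield estimate described above, combining the kernel count from block 114 with the operator-supplied kernel weight, ears per stalk, and stalks per acre. The 56 lb-per-bushel conversion is the standard test weight for shelled maize and is an assumption of this sketch, not a value taken from the disclosure.

```python
def estimated_yield_bu_per_acre(kernels_per_ear: float,
                                kernel_weight_lb: float,
                                ears_per_stalk: float,
                                stalks_per_acre: float) -> float:
    """Combine the yield components into bushels per acre."""
    ears_per_acre = ears_per_stalk * stalks_per_acre
    grain_lb_per_acre = ears_per_acre * kernels_per_ear * kernel_weight_lb
    return grain_lb_per_acre / 56.0  # assumed 56 lb of shelled maize per bushel

# Example: 480 kernels/ear, 0.00077 lb/kernel (~0.35 g), 1 ear/stalk,
# 30,000 stalks/acre -> roughly 198 bu/acre.
```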
While this invention has been described relative to exemplary designs, the present invention may be further modified within the spirit and scope of this disclosure. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains.
This application claims the benefit of U.S. Provisional Application Ser. No. 61/674,602, filed Jul. 23, 2012, titled KERNEL COUNTER, docket DAS-P0259-01-US, the disclosure of which is expressly incorporated by reference herein.