1. Field of the Invention
The present invention relates to a method, a system and a computer program product for detecting lost objects. More particularly, the present invention relates to a method, a device and a computer program product for detecting the location of lost objects in environments where the color of the lost object does not naturally occur.
2. Description of the Prior Art
There are many circumstances where an object is lost and determining its location is difficult due to the characteristics of the environment in which it has been lost. One such circumstance occurs during the playing of the sport of golf. Typically, the sport of golf is played on terrain having a variety of characteristics, such as grass, sand, trees, water, a specified distance, etc. It is not uncommon for a golf ball to become lost while playing golf due to the characteristics of the environment in which it is played. Once a golf ball is lost, a substantial amount of time can be spent trying to find it. This results in an increase in playing time for the player who lost the ball, as well as for other players playing behind or with the player. In cases where the golf ball cannot be located, the player who lost the ball is assessed a penalty stroke, increasing the player's final score.
Accordingly, there is a need for a device that detects the location of an object in an environment having a variety of characteristics. There is a further need for the device to be mobile. There is a further need for the device to detect the location of an object over long distances. There is a need for the device to be operable in a variety of lighting conditions. There is a need for the device to reduce glare and related image artifacts. There is a need for the device to reduce multiple reflections and shadowing in the detection of the object. There is a need for the device to decrease the amount of time required to locate an object.
According to embodiments of the present invention, a method, a device and a computer program product for detecting the location of an object in an environment are provided. The method receives an optical image and converts the optical image of the object into a color digital image comprising charge signals, where each charge signal is generated by a pixel in an array of a Charge-Coupled Device (CCD) by photoelectric conversion. The color digital image depicts one or more similar lost objects in a particular environment. Software performs an analysis of the color digital image to detect the location of the one or more objects in the environment by using color and shape characteristics of the one or more objects. The software uses a range of the visible portion of the color space uniquely identified for the type of object in that environment. The range of the color space is based at least in part on the color spaces identified for the object type under various lighting conditions in the environment where the type of object would be lost. The color spaces for the object are identified by analyzing the color spaces of color digital images of the object type under the various lighting conditions in a training mode and storing the color spaces identified in association with the object type. The analysis includes comparing the color space of each pixel in the color digital image with each of the color spaces in the range of color spaces to determine if there is a match. Once a match is determined, the location of that pixel is recorded. Recorded pixels are analyzed to determine whether there are clusters of pixels with the sought features. If pixel clusters are identified, the size of the cluster of pixels is compared to the size characteristics of the object type to determine the likelihood of the pixel cluster being the lost object. The image may be filtered using polarization to eliminate glare.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The above described features and advantages of the present invention will be more fully appreciated with reference to the detailed description and appended figures in which:
FIGS. 4a–4d depict exemplary color space diagrams of an object shown in a color digital image.
The present invention is now described more fully hereinafter with reference to the accompanying drawings that show embodiments of the present invention. The present invention, however, may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the present invention.
According to embodiments of the present invention, a method, an image taking device and a computer program product for detecting the location of an object in an environment are provided. The method receives an optical image and converts the optical image of the object into a color digital image comprising charge signals, where each charge signal is generated by a pixel in an array of a Charge-Coupled Device (CCD) by photoelectric conversion.
The color digital image depicts one or more similar objects in a particular environment where the objects may be lost. Software performs an analysis of the color digital image to detect the location of the one or more objects in the environment by using color and shape characteristics of the one or more objects. The software uses a range of the visible portion of the color space uniquely identified for the type of object in that environment. The range of the color space is based at least in part on the color spaces identified for the object type under various lighting conditions in the environment where the type of object could be lost. The color spaces for the object are identified by analyzing the color spaces of color digital images of the object type under the various lighting conditions in a training mode and storing the color spaces identified in association with the object type. The analysis includes comparing the color space of each pixel in the color digital image with each of the color spaces in the range of color spaces to determine if there is a match. Once a match is determined, the location of that pixel is recorded. Recorded pixels are analyzed to determine whether there are clusters of pixels. If pixel clusters are identified, the size of the cluster of pixels is compared to the size characteristics of the object type to determine the likelihood of the pixel cluster being the lost object. The image may be filtered using polarization to eliminate glare.
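By way of illustration only, the processing described in the preceding paragraph may be sketched in Python roughly as follows. The array layout, function names and tolerance value are assumptions made for the sake of the example and are not part of the claimed implementation; the color digital image is assumed to be available as an H x W x 3 NumPy array of RGB values, and the stored range of color spaces as a list of per-channel lower and upper bounds.

```python
# Illustrative sketch only; names, array layout and tolerance are assumptions.
import numpy as np

def find_candidate_pixels(image, color_ranges):
    """Compare each pixel's color against the stored range of color spaces
    and record the locations of matching pixels."""
    matches = np.zeros(image.shape[:2], dtype=bool)
    for low, high in color_ranges:                      # bounds from training mode
        matches |= np.all((image >= low) & (image <= high), axis=2)
    return np.argwhere(matches)                         # recorded (row, col) locations

def cluster_is_likely_object(cluster, expected_diameter_px, tolerance=0.5):
    """Compare a pixel cluster's extent with the size characteristics of the
    object type to estimate whether the cluster is the lost object."""
    rows, cols = cluster[:, 0], cluster[:, 1]
    extent = max(rows.max() - rows.min(), cols.max() - cols.min()) + 1
    return abs(extent - expected_diameter_px) <= tolerance * expected_diameter_px
```

Clustering of the recorded locations and the intensity screening are illustrated with further sketches in connection with the decision statistics described below.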
In the
The input system 104 is coupled to circuitry 106 and provides an analog image signal to the circuitry 106. The circuitry 106 samples the analog image signal and extracts the voltage that is proportional to the amount of light which fell on each pixel of the charge-coupled device sensor of the input system 104 using color components R (red), G (green) and B (blue). A programmable gain amplifier (PGA) 108 is coupled to the circuitry 106, amplifies the voltages to the proper range and provides the voltages as input to an analog-to-digital converter (ADC) 110. The ADC 110 is coupled to the CPU 102 and converts each voltage to a digital code suitable for further digital signal processing by the CPU 102. The CPU 102 is a microprocessor, such as an INTEL PENTIUM® or AMD® processor, but can be any processor that executes program instructions in order to carry out the functions of the present invention.
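Purely as a sketch of the sampling and quantization stage just described, and not of any particular converter, the following Python fragment illustrates how an amplified pixel voltage could map to a digital code; the gain, reference voltage and bit depth are hypothetical values chosen for the example.

```python
# Minimal sketch of the PGA/ADC stage; the gain, reference voltage and bit
# depth are illustrative assumptions, not values from the specification.
def to_digital_code(pixel_voltage, gain=2.0, v_ref=3.3, bits=10):
    """Amplify a sampled CCD voltage and quantize it to an N-bit code."""
    amplified = min(pixel_voltage * gain, v_ref)        # PGA scales into range
    return round(amplified / v_ref * (2 ** bits - 1))   # ADC output code

# Example: a 1.2 V sample of the R component maps to code 744.
print(to_digital_code(1.2))
```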
In the
In the
An exemplary flow diagram of an embodiment for detecting the location of an object in a particular environment is shown in
In defining a target color space, color shifts caused by the specific lighting conditions under which the particular type of object may be found must be considered and included in the target color space for that type of object. Accordingly, the color shifts of the type of object must be determined. These include color shifts caused by “global” lighting, such as sunny versus cloudy weather, as well as by “local” lighting, such as in grass or under a bush. For purposes of the present invention, “white” is defined as the color of a typical golf ball.
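As one possible illustration of the training mode referred to above, the following Python sketch accumulates per-channel color bounds for the object under each lighting condition; the sample arrays, function name and the choice of simple min/max bounds are assumptions for the example only.

```python
# Sketch of a possible training mode; the sample data and the use of simple
# per-channel min/max bounds are illustrative assumptions.
import numpy as np

def train_target_color_space(samples):
    """samples: list of (lighting_condition, N x 3 array of RGB values taken
    from the object under that condition). Returns one (low, high) bound per
    condition; together the bounds form the stored range of color spaces."""
    return {condition: (rgb.min(axis=0), rgb.max(axis=0))
            for condition, rgb in samples}

# Example: nominally "white" golf-ball pixels shift under global/local lighting.
sunny = np.array([[250, 250, 245], [255, 255, 250]])
shade = np.array([[180, 185, 190], [200, 205, 210]])
bounds = train_target_color_space([("sunny", sunny), ("shade", shade)])
```

The bounds produced by such a table could, for instance, feed the color-range comparison sketched earlier.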
Turning here briefly to
Returning here to
In step 306, a decision statistic is defined that represents the likely characteristics of the type of object. In an embodiment of the present invention, the intensity of the background can be used as a decision statistic. The intensity of the background can be determined by processing the color digital image a second time. With an image-specific histogram of the background intensity, a lower-bound threshold for the expected target intensity can be defined, such as at the 90th, 95th or 99th percentile of the background intensity. The pixels whose locations are stored can be screened using this criterion, with those pixels not meeting the intensity specification removed.
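A minimal sketch of such an intensity screen, assuming a simple RGB-average intensity and a hypothetical 95th-percentile cut-off, might look as follows.

```python
# Sketch of the background-intensity screen; the grayscale weighting and the
# 95th-percentile cut-off are illustrative assumptions.
import numpy as np

def screen_by_intensity(image, candidate_locations, percentile=95):
    """Keep only candidate pixels at least as bright as the chosen
    percentile of the image's intensity histogram."""
    intensity = image.mean(axis=2)                    # simple per-pixel average
    threshold = np.percentile(intensity, percentile)  # lower bound on target intensity
    rows, cols = candidate_locations[:, 0], candidate_locations[:, 1]
    return candidate_locations[intensity[rows, cols] >= threshold]
```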
In an embodiment of the present invention, the size of the type of object can be used as a decision statistic. The size of the type of object can be used to identify the object by determining its expected diameter measured in pixels, such as the diameter of a golf ball. This value can serve as a cluster distance. The pixels whose locations are stored can be screened using this criterion by collecting into groups, or clusters, those pixels that are within a cluster distance of each other.
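The grouping step could be illustrated, for example, by the greedy single-pass clustering below; the representation of locations as (row, column) pairs and the greedy strategy itself are assumptions of the sketch, not the claimed algorithm.

```python
# Illustrative greedy grouping of recorded pixel locations; not the claimed
# algorithm, merely one way to form clusters using a cluster distance.
import numpy as np

def cluster_pixels(locations, cluster_distance):
    """Greedily group (row, col) locations: a point joins the first cluster
    that already contains a member within the cluster distance of it."""
    clusters = []                               # each cluster is a list of points
    for point in locations:
        for cluster in clusters:
            if any(np.hypot(point[0] - p[0], point[1] - p[1]) <= cluster_distance
                   for p in cluster):
                cluster.append(point)
                break
        else:                                   # no existing cluster was close enough
            clusters.append([point])
    return clusters
```

Each resulting cluster could then be compared against the expected diameter of the object type, as in the size check sketched earlier.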
In step 308, it is determined whether the object is identified in the environment based on one or more statistics. A statistic includes color space information, and may also include intensity information and/or cluster information. A statistic may also include weighting values from any reference images collected. The preferred approach is to define one statistic, but multiple statistics could be defined and used with this method. In step 310, the object is reported if identified, such as by the display 118.
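As an illustration of how several such statistics might be weighted into a single decision, the following sketch combines color, intensity and size indications; the weights and threshold are hypothetical.

```python
# Hypothetical weighting of the individual statistics into one decision.
def object_identified(color_matched, passes_intensity, size_plausible,
                      weights=(0.4, 0.3, 0.3), threshold=0.7):
    """Return True if the weighted evidence meets the decision threshold."""
    score = (weights[0] * color_matched
             + weights[1] * passes_intensity
             + weights[2] * size_plausible)
    return score >= threshold

# Example: report the object (e.g. on display 118) when the decision is positive.
if object_identified(True, True, True):
    print("Object located")
```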
While specific embodiments of the present invention have been illustrated and described, it will be understood by those having ordinary skill in the art that changes can be made to those embodiments without departing from the spirit and scope of the invention. For example, while the present invention concentrates on analysis of a single color digital image and a stationary lost object, it is understood that information from a series of images, a moving object or a specific object might advantageously be used as well. Also, while the application to golf balls described herein is discussed in terms of UV and visible light, the method is not dependent on this choice.