(A) Field of the Invention
This invention relates to an electronic apparatus, and more particularly to an electronic apparatus that identifies the surrounding environment by means of image processing and outputs the result for use by blind people.
(B) Description of Related Art
The welfare of handicapped people has become a major focus in modern society. For example, many apparatuses and facilities for blind people, such as guide sticks, guide dogs, and acoustic alarms installed at busy intersections, have been invented and have made it much easier for blind people to walk around. However, each of the above-mentioned apparatuses and facilities has its drawbacks. A conventional guide stick can only detect the ground condition immediately in front of the user and cannot provide information over a range. A guide dog cannot “tell” the user what is happening ahead so that he or she can prepare in advance; moreover, the cost of training and maintaining a guide dog is so high that many people cannot afford one. As for preset landmarks, such as acoustic alarms and speech guides, installing them at only a few intersections is of little use, and rebuilding every intersection to include them would incur high cost.
Because of advances in electronic technology, many patents focus on improving conventional blind-guiding apparatuses. For example, Taiwan Patent Publication No. 563525, entitled “Traffic Guiding Apparatus for Blind People,” discloses magnetizable tactile tiles for guiding blind people at intersections; the magnetism is “activated” when the traffic light is red so as to attract special guide sticks and shoes and thereby notify the user of the red signal ahead. However, besides requiring all blind people to wear special shoes or use special guide sticks, this approach still has the “landmark” disadvantage mentioned above.
Taiwan Patent Publication No. 518965, entitled “Speech Guide Glasses,” discloses glasses comprising a sensor and a speech earphone. The sensor has two functions: first, to sense the color of the traffic light in front of the user and notify the user; second, to detect obstacles ahead by receiving a reflected IR beam emitted by the sensor. The information is then output through the earphone. After a brief review, however, this invention appears to have several shortcomings.
The inventor of the present application is familiar with pattern recognition and computer vision. Based on this specialty, the present invention provides a practical solution to the disadvantages of the prior art mentioned above.
The apparatus of the present invention comprises three parts as shown in
The input device mainly comprises the said CMOS or CCD photographing means, which captures color images and transmits them to the processing device for pattern recognition. The input device may further comprise active detecting means, such as IR or radar detecting means, to improve the functionality of the system.
The processing device first transforms the color images transmitted from the input device into HSV (hue/saturation/value) format so as to adapt to varying ambient light conditions. In the transformed images, contiguous areas with hue values close to those of traffic lights are marked. The transformation itself is prior art; for instance, the well-known commercial software “Photoshop” contains similar functions.
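As an illustration of this step, the RGB-to-HSV transformation can be sketched with Python's standard `colorsys` module. The function name `to_hue_plane` and the list-of-tuples image format are assumptions made for this sketch, not part of the disclosed apparatus:

```python
import colorsys

def to_hue_plane(rgb_image):
    """Convert an image (rows of 0-255 (r, g, b) tuples) into a grid of
    normalized hues in [0, 1].  This is the HSV step that makes the later
    color tests robust to changes in ambient brightness, since hue is
    largely independent of the value (brightness) channel."""
    return [[colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)[0]
             for r, g, b in row]
            for row in rgb_image]
```

Marking contiguous areas whose hue falls near a traffic-light color then reduces to thresholding this hue plane and grouping neighboring marked pixels.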
After the above-mentioned transformation, further judging conditions are considered: Do the colors change within a reasonable time interval and within a limited area? Is any auxiliary signal present (such as the red standing figure indicating “stop”)? Is the area round (or another possible shape of a traffic light; this kind of pattern recognition is prior art)? Following this procedure, areas of traffic lights can be marked.
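The shape condition among these judging conditions can be sketched as a rough circularity test on a binary mask of a marked area: a filled disc occupies about π/4 (≈0.785) of its bounding box and has a near-square bounding box. The tolerance values below are illustrative assumptions, not values from the disclosure:

```python
def looks_round(mask):
    """Rough circularity test on a 0/1 mask.  A filled disc fills about
    pi/4 of its bounding box and the box is roughly square; accept a
    tolerance band around both properties (thresholds are illustrative)."""
    ys = [y for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    xs = [x for row in mask for x, v in enumerate(row) if v]
    if not ys:
        return False
    h = max(ys) - min(ys) + 1
    w = max(xs) - min(xs) + 1
    area = sum(v for row in mask for v in row)
    fill = area / (h * w)          # ~0.785 for an ideal disc
    aspect = min(h, w) / max(h, w)  # ~1.0 for an ideal disc
    return aspect > 0.7 and 0.55 < fill < 0.95
```

A real implementation would combine this with the temporal and auxiliary-sign checks described above before accepting a candidate area.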
Besides determining the existence of traffic lights, the processing device can determine the existence of traffic markings from the images captured by the input device. Traffic markings are regular patterns, so their length, width, and direction can easily be recognized by utilizing hue and geometry detecting techniques.
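As a sketch of how this regularity makes markings easy to recognize, the following checks whether a binary scanline shows the alternating, roughly equal-width bands of a crosswalk. The function names, the scanline representation, and the thresholds are assumptions for illustration:

```python
def run_lengths(column):
    """Collapse a 0/1 scanline into runs of [value, length]."""
    runs = []
    for v in column:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

def is_striped(column, min_bands=4, tolerance=2):
    """Heuristic: a scanline crossing a crosswalk shows several
    alternating bands of roughly equal width.  Ignore the first and
    last runs, which are usually clipped at the image border."""
    runs = run_lengths(column)
    if len(runs) < min_bands:
        return False
    widths = [n for _, n in runs[1:-1]]
    return max(widths) - min(widths) <= tolerance
```

Applying the same test along the marking's axis yields its length, width, and direction, matching the geometry detection described above.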
In addition to the basic functions mentioned above, fixed or moving obstacles in the images taken by the input device can be detected by binocular vision analysis or image differential analysis.
Moreover, texture analysis, frequency analysis, or data taken by an active sensor included in the input device can provide more information about the surrounding environment. The technique of analyzing obstacles in images taken by a moving platform by utilizing image differential analysis has been disclosed in EP 1,391,845, entitled “Image Based Object Detection Apparatus and Method.”
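A minimal sketch of image differential analysis on two grayscale frames follows; the frames are plain grids of 0-255 values, and the threshold is an illustrative assumption. For a camera on a moving platform, the differencing would be applied only after compensating for camera motion, which is the problem the cited EP publication addresses:

```python
def frame_difference(prev, curr, threshold=30):
    """Mark pixels whose gray level changed by more than `threshold`
    between two frames; large connected changed regions suggest a
    moving obstacle."""
    return [[1 if abs(a - b) > threshold else 0
             for a, b in zip(row_prev, row_curr)]
            for row_prev, row_curr in zip(prev, curr)]

def changed_fraction(diff):
    """Fraction of pixels flagged as changed -- a crude alarm signal."""
    total = sum(len(row) for row in diff)
    return sum(sum(row) for row in diff) / total
```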
The processing device can further determine the horizon by analyzing hue and line-geometry information in the captured images. After the horizon is determined, the processing device can either actuate the input device to adjust the viewing angle or notify the user through the output device to adjust it.
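A minimal stand-in for this horizon determination: on a grayscale image, the sky/ground boundary often coincides with the sharpest drop in mean row brightness. This is a crude substitute for the hue-and-line analysis described above, and the function is an assumption for illustration only:

```python
def estimate_horizon_row(gray):
    """Pick the row index with the largest drop in mean brightness
    between consecutive rows of a grayscale grid -- a rough sky/ground
    split usable to decide whether the camera angle needs adjusting."""
    means = [sum(row) / len(row) for row in gray]
    best_row, best_drop = 0, 0.0
    for y in range(1, len(means)):
        drop = means[y - 1] - means[y]
        if drop > best_drop:
            best_row, best_drop = y, drop
    return best_row
```

If the estimated horizon row sits too high or too low in the frame, the system can re-aim the camera or prompt the user accordingly.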
Finally, the processing device may include a system setup mode. In this mode, the user can adjust system settings as well as “educate” the system to improve recognition accuracy. The setup mode may be entered by physical access to the processing device (through a keyboard, for instance) or, more conveniently, by inputting a predetermined signal into the input device, for instance a specially designed gesture made in front of it.
The output device outputs a speech or tactile notification to the user in accordance with the environmental situation determined by the processing device.
In view of the above, the advantages of the present invention are:
Please refer to
The embodiment includes a micro camera 1 composed of CCD or CMOS means, a micro processor system 2 (such as a laptop computer or PDA), and a speech earphone 3. The micro camera is worn at a steady location on the user's body, such as the chest or head, and the micro processor system is carried by the user, as shown in
Please refer to
H1 = cos−1[0.5(2R − G − B)/√((R − G)² + (R − B)(G − B))]
H = H1, if B ≦ G
H = 360° − H1, if B > G
Then the areas with hue values close to those of traffic lights are marked:
If 0 < H < 0.125 (with H normalized to the range [0, 1]), the area is recognized as a red traffic light; if 0.125 < H < 0.22, as a yellow traffic light; and if 0.25 < H < 0.45, as a green traffic light.
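These windows translate directly into a classification function. The sketch below assumes the normalized-hue interpretation of the thresholds stated above; the function name is an assumption:

```python
def classify_hue(h):
    """Map a normalized hue value in [0, 1] to a traffic-light color
    using the windows from the embodiment; return None when the hue
    falls outside every window."""
    if 0 < h < 0.125:
        return "red"
    if 0.125 < h < 0.22:
        return "yellow"
    if 0.25 < h < 0.45:
        return "green"
    return None
```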
In the meantime, the micro processor system 2 continuously checks whether the micro camera 1 is at the right angle of elevation and adjusts the angle accordingly.
Subsequently, the micro processor system 2 marks the possible areas of traffic lights and then combines other auxiliary information, such as the geometric shape of the traffic light, the existence of auxiliary signs (for instance, the red standing figure indicating “stop”), and the existence of crosswalk markings (including their length, width, direction, etc.), to determine whether the marked areas are actually traffic lights. If so, the result is output to the user through the speech earphone 3. The output includes the real-time accessibility of the intersection (i.e., the color of the traffic light); if the intersection is accessible, information about the crosswalk is also provided. Moreover, the apparatus of the embodiment can determine whether obstacles are present by utilizing algorithms such as image differential analysis and provide the result to the user through the speech earphone 3.
Number | Date | Country | Kind |
---|---|---|---
93116990 A | Jun 2004 | TW | national |
Number | Name | Date | Kind |
---|---|---|---
3369228 | Foster | Feb 1968 | A |
5803740 | Gesink et al. | Sep 1998 | A |
6055048 | Langevin et al. | Apr 2000 | A |
6115482 | Sears et al. | Sep 2000 | A |
6504942 | Hong et al. | Jan 2003 | B1 |
6608941 | Suzuki et al. | Aug 2003 | B1 |
6828918 | Bowman et al. | Dec 2004 | B2 |
6885771 | Takahashi | Apr 2005 | B2 |
6901163 | Pearce et al. | May 2005 | B1 |
6944331 | Schmidt et al. | Sep 2005 | B2 |
7035461 | Luo et al. | Apr 2006 | B2 |
7068814 | You et al. | Jun 2006 | B2 |
7127108 | Kinjo et al. | Oct 2006 | B2 |
20030012435 | Forde | Jan 2003 | A1 |
20030026461 | Hunter | Feb 2003 | A1
20030048928 | Yavitz | Mar 2003 | A1 |
20030095140 | Keaton et al. | May 2003 | A1 |
20040086153 | Tsai et al. | May 2004 | A1 |
20050007449 | Ikado | Jan 2005 | A1 |
20050232481 | Wu | Oct 2005 | A1 |
20060011718 | Kurzweil et al. | Jan 2006 | A1 |
20060098089 | Sofer | May 2006 | A1 |
Number | Date | Country |
---|---|---
1 391 845 | Feb 2004 | EP |
518965 | Jan 2003 | TW |
563525 | Nov 2003 | TW |
Number | Date | Country
---|---|---
20050275718 A1 | Dec 2005 | US |