Field of the Invention
The present invention relates to a surrounding environment recognition device for detecting traffic signal lights using a peripheral image.
Description of the Related Art
In Japanese Laid-Open Patent Publication No. 2012-168592 (hereinafter referred to as “JP 2012-168592A”), a red-light signal Lr, etc., of a traffic signal S is detected based on an image T captured by an image capturing means 2, and an arrow signal A, an image of which is captured within a search region Rs set based on the position of the detected red-light signal Lr, etc. in the image T, is extracted (abstract).
In JP 2012-168592A, a stereo matching process is carried out in which two images acquired by a stereo camera (a reference image T from a main camera 2a and a comparison image Tc from a sub-camera 2b) are combined (paragraphs [0040], [0045], [0046]). From this process, a distance image Tz is calculated, in which a parallax value dp is assigned to each pixel of the reference image T (paragraph [0048]). The red-light signal Lr or the like is then detected using the distance image Tz (paragraphs [0074], [0075]), and the arrow signal A is extracted based on the position of the detected red-light signal Lr or the like (see FIG. 15). Further, JP 2012-168592A discloses that only one image T, as in the case of a monocular camera, may be used (see paragraph [0056]).
The inventors of the present invention have discovered that, when a monocular camera (a single camera) is used, cases occur in which the recognition device cannot recognize a red-light signal Lr and an arrow signal A at the same time, even though both are illuminated simultaneously. An investigation into the cause revealed that the reason lies in the use of multiple light emitting diode (LED) lamps in the light emitting portions of the traffic signal. More specifically, such LED lamps flash at a specific period that cannot be perceived by the naked eye. Therefore, in images of frames captured at timings when the LED lamps are momentarily turned off, the LED lamps cannot be recognized as being in an illuminated state. This type of problem is not limited to LED lamps, but applies similarly to other types of lamps that flash on and off at a specific period.
In JP 2012-168592A, regardless of whether a stereo camera (the main camera 2a and the sub-camera 2b) or a monocular camera is used, it can be assumed that the red-light signal Lr and the arrow signal A are recognized based on a single frame image. In the case of a stereo camera, it can be assumed that the reference image T and the comparison image Tc are acquired while the main camera 2a and the sub-camera 2b are synchronized. For this reason, with either the stereo camera or the monocular camera, there is a concern that the lamps of the traffic signal cannot be recognized with sufficient accuracy.
The present invention has been devised taking into consideration the aforementioned problems, and has the object of providing a surrounding environment recognition device which is capable of improving detection accuracy.
A surrounding environment recognition device according to the present invention includes an image capturing unit that captures a peripheral image, and a traffic signal recognizing unit that recognizes a traffic signal from within the peripheral image. The image capturing unit captures a plurality of images of frames, and the traffic signal recognizing unit recognizes the traffic signal based on a combination of the plurality of images of frames.
According to the present invention, the traffic signal is recognized by a combination of the plurality of images of frames. Therefore, for example, even in the event that the traffic signal is difficult to recognize with a single frame, as in the case of an LED traffic signal, the traffic signal can be recognized accurately.
The surrounding environment recognition device may include a storage unit in which a light emitting pattern of a plurality of frames is stored as teacher data. Further, the traffic signal recognizing unit may recognize the traffic signal by comparing a light emitting pattern of the plurality of frames captured by the image capturing unit and the teacher data. By this feature, since the transition of the light emitting state of an LED traffic signal, etc., is stored as a light emitting pattern and is compared, the LED traffic signal, etc., can be recognized accurately.
The traffic signal recognizing unit may confirm light emitting lamps that are included in one of the plurality of frames that has a greatest number of light emitting lamps therein, as being the light emitting lamps. In accordance with this feature, a plurality of signal lamps (for example, a red-light lamp and an arrow lamp), which are illuminated simultaneously, can be recognized more accurately.
If one of a red-light signal and an arrow signal is recognized in a certain frame, the traffic signal recognizing unit may make it easier for the other of the red-light signal and the arrow signal to be recognized in a next frame thereafter or in a previous frame therebefore. Further, if one of a red-light signal and an arrow signal is recognized in a certain frame, the traffic signal recognizing unit may make it easier for the one of the red-light signal and the arrow signal to be recognized in a next frame thereafter or in a previous frame therebefore. In accordance with this feature, it becomes easier for a plurality of light emitting lamps, which are recognized as being illuminated simultaneously by the naked eye, to be recognized accurately.
The traffic signal recognizing unit may confirm a light emitting lamp whose recognition count in the plurality of frames has exceeded a recognition count threshold, as being the light emitting lamp. By this feature, the illuminated state of a traffic signal can be judged more accurately, so that a light emitting lamp, which would be mistakenly detected in a single frame, is not confirmed as being the light emitting lamp.
If there are plural light emitting lamps whose respective recognition counts have exceeded the recognition count threshold, and a mutual difference in the recognition count between the light emitting lamps is greater than or equal to a difference threshold, then the traffic signal recognizing unit may confirm only the light emitting lamp having a larger recognition count, as being the light emitting lamp. In accordance with this feature, it is possible to improve the accuracy with which light emitting lamps are recognized by a relationship between the light emitting lamps themselves.
The above and other objects, features, and advantages of the present invention will become more apparent from the following description when taken in conjunction with the accompanying drawings, in which a preferred embodiment of the present invention is shown by way of illustrative example.
The sensor unit 12 acquires the sensor information Is that is used in the recognition device 14 for detecting the traffic signal 300. As shown in
The camera 20 is an image capturing unit that captures a peripheral image 100 around the vehicle 10 (see
The vehicle velocity sensor 22 detects a velocity V [km/h] of the vehicle 10. The yaw rate sensor 24 detects a yaw rate Yr [deg/sec] of the vehicle 10.
The map information supplying device 26 supplies map information Im as information (peripheral information) relating to the surrounding area of the vehicle 10. The map information supplying device 26 includes a current position detector 30 and a map information database 32 (hereinafter referred to as a “map DB 32”). The current position detector 30 detects a current position Pc of the vehicle 10. The map DB 32 stores map information Im including positions of traffic signals 300 therein. Such positions can be defined comparatively roughly, so as to indicate which intersection has a traffic signal 300, for example. Alternatively, each of the positions Ps of the traffic signals 300 may be defined with comparatively high detail, including a front and back location in the intersection, a height H, and a left and right (lateral) location, etc. Furthermore, the map information Im may also include the shape (vertically elongate, horizontally elongate, etc.) of a light emitting section 304 (see
The map information supplying device 26 calculates a distance Lsmap [m] from the vehicle 10 (camera 20) to the traffic signal 300 based on the current position Pc and the position Ps of the traffic signal 300, and supplies the same as distance information Ilmap to the recognition device 14. The distance information Ilmap makes up a portion of the map information Im.
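The distance calculation itself can be illustrated with a minimal sketch in Python. This is not part of the disclosure; it assumes, for illustration only, that the current position Pc and the signal position Ps are available as planar (x, y) map coordinates in meters, and the function name is hypothetical.

    import math

    def distance_to_signal(pc, ps):
        """Illustrative sketch: distance Lsmap [m] from the vehicle 10
        (camera 20) to the traffic signal 300, assuming Pc and Ps are
        planar (x, y) map coordinates in meters."""
        return math.hypot(ps[0] - pc[0], ps[1] - pc[1])

    # Example: a signal 30 m ahead and 40 m to the side is 50 m away.
    assert distance_to_signal((0.0, 0.0), (30.0, 40.0)) == 50.0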
The map information supplying device 26 can be configured as a navigation device, for example. Alternatively, the map information supplying device 26 may be a device that supplies the map information Im to the recognition device 14 without performing route guidance for the benefit of the driver.
The surrounding environment recognition device 14 detects a traffic signal 300 that is present in the direction of travel of the vehicle 10. As shown in
The computation unit 52 serves to control the recognition device 14 as a whole, and operates by executing programs that are stored in the storage unit 54. The programs may be supplied externally through a non-illustrated wireless communications device (a portable telephone, a smartphone, or the like). A portion of such programs can be constituted as hardware (circuit components).
The computation unit 52 includes a lane detecting unit 60 and a traffic signal detecting unit 62 (traffic signal recognizing unit). The lane detecting unit 60 detects or recognizes lanes 210l, 210r (see
The storage unit 54 is constituted by a random access memory (RAM) for temporarily storing data, etc., which is subjected to various computational processes, and a read only memory (ROM) in which executable programs, tables, maps, etc., are stored. The storage unit 54 of the present embodiment stores, as teacher data, light emitting patterns Pl (or illumination patterns) for facilitating detection of the traffic signals 300.
The driving assistance unit 16 performs driving assistance for the vehicle 10 using the calculation results of the recognition device 14. The driving assistance unit 16 includes a brake device 70 and a warning device 72. The brake device 70 serves to control a braking force of the vehicle 10, and includes a hydraulic mechanism 80 and a brake electronic control unit 82 (hereinafter referred to as a “brake ECU 82”). The brake ECU 82 controls the hydraulic mechanism 80 based on the traffic signal information Isig from the recognition device 14. The brake in this case is assumed to be a frictional brake in which the hydraulic mechanism 80 is used. However, in addition to or in place of frictional braking, a system may be provided in which one or both of engine braking and regenerative braking are controlled.
The warning device 72 notifies the driver of an illuminated state of the traffic signal 300, in particular, a red light signal (i.e., a state in which a red-light lamp 314 of the traffic signal 300 is illuminated). The warning device 72 includes a display device 90 and a warning electronic control unit 92 (hereinafter referred to as a “warning ECU 92”). The warning ECU 92 controls the display of the display device 90 based on the traffic signal information Isig from the recognition device 14.
With the vehicle 10 of the present embodiment, a traffic signal 300 is detected (or recognized) using the surrounding environment recognition device 14. In addition, driving assistance for the vehicle 10 is carried out based on the information of the detected traffic signal 300. The driving assistance may include, for example, automatic braking in the case that the vehicle 10 approaches too closely to a traffic signal 300 illuminated with a red-light signal, and a notification of the approach to such a traffic signal 300.
Hereinbelow, the control process by which the surrounding environment recognition device 14 detects traffic signals 300 is referred to as a “traffic signal detection control process”. Further, the control process by which the driving assistance unit 16 carries out driving assistance is referred to as a “driving assistance control process”.
The arrow lamp 316a is a lamp that indicates permission to make a left turn, and hereinafter also is referred to as a “left turn permission lamp 316a”. The arrow lamp 316b is a lamp that indicates permission to travel straight forward, and hereinafter also is referred to as a “straight forward permission lamp 316b”. The arrow lamp 316c is a lamp that indicates permission to make a right turn, and hereinafter also is referred to as a “right turn permission lamp 316c”. Below, the arrow lamps 316a, 316b, 316c will be referred to collectively as “arrow lamps 316”.
Further, as shown in
More specifically, in frame F1 of
In each of the frames F1 to F5, the red-light lamp 314, the left turn permission lamp 316a, and the straight forward permission lamp 316b are actually flashing. However, to the naked eye, the red-light lamp 314, the left turn permission lamp 316a, and the straight forward permission lamp 316b are seen as being illuminated continuously.
In the case that the red-light lamp 314, the left turn permission lamp 316a, and the straight forward permission lamp 316b are flashing, if only an image 100 of a single frame F is used, there is a concern that the lamps that are emitting light (hereinafter referred to as “light emitting lamps Ll”) will be mistakenly recognized. Thus, in the traffic signal detection control process of the present embodiment, the traffic signal 300 (or the light emitting lamps Ll thereof) is recognized by combining the images 100 of a plurality of frames F.
In step S2, the computation unit 52 controls the search window 320 to scan (or move over) the image 100 for one frame. Consequently, the computation unit 52 can detect the light emitting lamp Ll. Moreover, as will be described in detail later, the computation unit 52 can change the search region 322 based on the vehicle velocity V, the yaw rate Yr, and the map information Im, etc.
In relation to scanning by the search window 320, for example, while the search window 320 scans the search region 322 from the left side to the right side, the traffic signal detecting unit 62 determines whether or not certain characteristics (e.g., shape, color, brightness, etc.) of the light emitting section 304 or the respective lamps 310, 312, 314, and 316a to 316c of the traffic signal 300 exist inside of the search window 320. Next, while the search window 320 scans the search region 322 from the left side to the right side at a position lowered by a predetermined distance, the computation unit 52 determines whether or not such characteristics (e.g., shape, color, brightness, etc.) of the traffic signal 300 exist inside of the search window 320. By repeating the above steps, the search window 320 scans over the entirety of the search region 322.
Further, during scanning by the search window 320, the current position of the search window 320 is set so as to overlap with the previous position of the search window 320 at which a judgment was made as to the existence of characteristics of the traffic signal 300. Stated otherwise, the offset amount from the previous search window 320 to the current search window 320 is shorter than the width of the search window 320 (for example, about one-half of the width thereof). Owing thereto, even in the case that only a portion of the characteristics of the traffic signal 300 appears within the previous search window 320 so that the traffic signal 300 cannot be detected, all of the characteristics of the traffic signal 300 appear within the current search window 320, whereby it is possible to enhance the accuracy with which the traffic signal 300 is detected. Further, overlapping of the previous position and the current position is performed not only in the widthwise direction, but also in the vertical direction.
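The raster scan of step S2, including the half-window overlap described above, can be sketched as follows. This is an illustrative Python sketch, not the disclosed implementation; matches_signal_features() is a hypothetical stand-in for the shape/color/brightness check.

    def scan_search_region(image, region, win_w, win_h, matches_signal_features):
        """Raster-scan the search window 320 over the search region 322.
        `region` is (x0, y0, x1, y1) in pixels; successive window positions
        are offset by about half the window size, so a traffic signal that
        only partially entered one window falls wholly inside a neighbor."""
        hits = []
        step_x, step_y = max(1, win_w // 2), max(1, win_h // 2)
        y = region[1]
        while y + win_h <= region[3]:
            x = region[0]
            while x + win_w <= region[2]:
                if matches_signal_features(image, x, y, win_w, win_h):
                    hits.append((x, y))
                x += step_x
            y += step_y
        return hits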
In step S3, the computation unit 52 determines whether or not light emitting lamps Ll of any type have been detected. As the light emitting lamps Ll, there can be included the green-light lamp 310, the yellow-light lamp 312, the red-light lamp 314, the left turn permission lamp 316a, the straight forward permission lamp 316b, and the right turn permission lamp 316c. Types of lamps apart from those listed above may be included. In the case that one or a plurality of light emitting lamps Ll are detected (step S3: YES), then in step S4, the computation unit 52 changes the count values CNT from 0 to 1 respectively for the detected light emitting lamps Ll.
In step S5, the computation unit 52 judges whether or not the red-light lamp 314 is included in the detected light emitting lamps Ll. If the red-light lamp 314 is included in the detected light emitting lamps Ll (step S5: YES), the process proceeds to step S6. If the red-light lamp 314 is not included in the detected light emitting lamps Ll (step S5: NO), the process proceeds to step S7. In step S6, for the following three frames F thereafter, the computation unit 52 lowers a brightness threshold THb for the arrow lamps 316a, 316b, 316c. Consequently, in the following three frames F, it becomes easier for the arrow lamps 316a, 316b, 316c to be detected. The brightness threshold THb is a threshold value for brightness, which is used at the time that the respective lamps 310, 312, 314, and 316a to 316c are detected in step S2.
In step S7, the computation unit 52 judges whether or not any of the arrow lamps 316a, 316b, 316c are included in the detected light emitting lamps Ll. If any of the arrow lamps 316a, 316b, 316c are included in the detected light emitting lamps Ll (step S7: YES), the process proceeds to step S8. If no arrow lamps 316a, 316b, 316c are included in the detected light emitting lamps Ll (step S7: NO), the process proceeds to step S9. In step S8, for the following three frames F thereafter, the computation unit 52 lowers a brightness threshold THb for the red-light lamp 314. Consequently, in the following three frames F, it becomes easier for the red-light lamp 314 to be detected.
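Steps S5 to S8 amount to a mutual relaxation of the detection thresholds between the red-light lamp 314 and the arrow lamps 316a to 316c. A minimal sketch follows; the lamp identifiers and threshold values are assumptions, and only the three-frame duration is taken from the description above.

    RED = "red"
    ARROWS = ("arrow_left", "arrow_straight", "arrow_right")

    def update_brightness_thresholds(detected, default_thb, lowered_thb, boost):
        """Steps S5 to S8 (illustrative): call once per frame after detection;
        returns the per-lamp brightness thresholds THb for the next frame.
        `boost` maps a lamp name to the number of remaining frames for which
        its threshold stays lowered."""
        if RED in detected:                           # steps S5 -> S6
            for lamp in ARROWS:
                boost[lamp] = 3
        if any(lamp in detected for lamp in ARROWS):  # steps S7 -> S8
            boost[RED] = 3
        thb = {}
        for lamp in (RED,) + ARROWS:
            thb[lamp] = lowered_thb if boost.get(lamp, 0) > 0 else default_thb
            boost[lamp] = max(0, boost.get(lamp, 0) - 1)
        return thb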
In step S9, the computation unit 52 determines whether or not data of a predetermined number of frames Nf have been acquired. The predetermined number of frames Nf can be from four to ten, for example; in the present embodiment, it is four. The data in this case is data relating to the light emitting patterns Pl, and is defined by the count values CNT of the respective lamps 310, 312, 314, and 316a to 316c in each of the frames F (details thereof will be described later with reference to
In step S10, the computation unit 52 compares the acquired data of the predetermined number of frames Nf with teacher data to thereby confirm the presence of the light emitting lamps Ll.
As shown in
Further, as shown in
In addition, by comparing the characteristic vectors Vc stored in the storage unit 54 with the characteristic vectors Vc (count values CNT) of the four frames F1 to F4 that have actually been detected, the computation unit 52 determines which of the light emitting patterns Pl the traffic signal corresponds to. Furthermore, the computation unit 52 specifies the light emitting lamps Ll based on the determined light emitting pattern Pl.
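The comparison of step S10 can be illustrated as a nearest-pattern search over the per-frame characteristic vectors Vc. The teacher data below is purely hypothetical example data chosen for illustration; the disclosure does not specify concrete patterns.

    LAMPS = ("green", "yellow", "red", "arrow_left", "arrow_straight", "arrow_right")

    # Hypothetical teacher data: one light emitting pattern Pl is a sequence
    # of Nf = 4 characteristic vectors Vc, each holding a count value CNT
    # (0 or 1) per lamp, in the order given by LAMPS.
    TEACHER_PATTERNS = {
        "red_with_left_and_straight_arrows": [
            (0, 0, 1, 1, 0, 0),
            (0, 0, 0, 1, 1, 0),
            (0, 0, 1, 0, 1, 0),
            (0, 0, 1, 1, 1, 0),
        ],
    }

    def match_light_emitting_pattern(observed):
        """Return the name of the stored light emitting pattern Pl whose
        vectors are closest (smallest Hamming distance) to the observed
        four-frame sequence of characteristic vectors."""
        def dist(pattern):
            return sum(a != b
                       for va, vb in zip(observed, pattern)
                       for a, b in zip(va, vb))
        return min(TEACHER_PATTERNS, key=lambda name: dist(TEACHER_PATTERNS[name]))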
The computation unit 52 performs the process of
As noted above, according to the present embodiment, the search region 322 of the search window 320 is corrected using the sensor information Is (e.g., the vehicle velocity V, the yaw rate Yr, and the map information Im).
In general, the traffic signal 300 exists to the side of or above the traveling lane 200 and/or the opposing lane 202. For this reason, there is a low possibility for the traffic signal 300 to exist at a position separated or distanced from the traveling lane 200 and the opposing lane 202. Thus, according to the present embodiment, the position in the widthwise direction of the search region 322 is set to match the trajectory of the lanes 210l, 210r. In this case, the length in the widthwise direction of the search region 322 becomes shorter than in the initial settings. Accordingly, the range over which the search window 320 is made to move (or scan) within the search region 322 becomes narrower.
If the vehicle velocity V is high, there is a greater necessity to notify the driver concerning the illuminated state of a traffic signal 300 that is comparatively far away, whereas if the vehicle velocity V is low, there is less of a need to notify the driver concerning the illuminated state of a traffic signal 300 that is comparatively far away. Thus, according to the present embodiment, the position and size of the search region 322 is changed depending on the vehicle velocity V. More specifically, if the vehicle velocity V is high, the search region 322 is widened to cover a region at which the distance L from the camera 20 is relatively long. On the other hand, if the vehicle velocity V is low, the search region 322 is narrowed to cover a region at which the distance L from the camera 20 is relatively short. Owing to this feature, the traffic signal 300 can be detected using a search region 322 that corresponds to the vehicle velocity V.
The trajectory of the lanes 210l, 210r is calculated based on the current peripheral image 100. For example, if the absolute value of a left-leaning yaw rate Yr is relatively large, there is a high necessity to know the illuminated state of a traffic signal 300 located on the left side of the trajectory of the lanes 210l, 210r. Similarly, if the absolute value of a right-leaning yaw rate Yr is relatively large, there is a high necessity to know the illuminated state of a traffic signal 300 located on the right side of the trajectory. Thus, according to the present embodiment, the position in the widthwise direction of the search region 322 is modified depending on the yaw rate Yr. For example, the search region 322 is shifted further toward the left side as the absolute value of the left-leaning yaw rate Yr increases.
Within the map information Im, the distance information Ilmap representing the distance to the traffic signal 300 is utilized to determine where the search window 320 and the search region 322 should be set. For example, if the next traffic signal 300 is located at a position relatively far from the vehicle 10, the computation unit 52 does not set the search region 322 on the upper side of the image 100. Conversely, if the next traffic signal 300 is located at a position relatively near to the vehicle 10, the computation unit 52 does not set the search region 322 on the lower side of the image 100.
Information of the height H (height information Ihmap) of the traffic signal 300 within the map information Im is combined with the lane information Il or the distance information Ilmap, whereby the range of the search region 322 in the Y-axis direction (height direction) is limited.
If information of the shape (shape information) of the traffic signal 300 is included in the map information Im, the range of the search region 322 is changed in the X-axis direction (horizontal direction) and the Y-axis direction (vertical direction) by combining the shape information with the lane information Il or the distance information Ilmap. For example, compared with a case in which the shape of the light emitting section 304 is horizontally elongate, in the case in which the shape is vertically elongate, the search region 322 is made shorter in the X-axis direction and longer in the Y-axis direction. By this feature, the scope (and the position) of the search region 322 can be set corresponding to the shape of the light emitting section 304.
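Taken together, the corrections described above can be summarized in a single illustrative adjustment of the search region. All scale factors and thresholds below are assumptions made for the sketch; only the directions of the adjustments follow the present description.

    def adjust_search_region(base, v_kmh, yaw_rate, signal_is_far, vertically_elongate):
        """Illustrative only: adjust the search region 322, given as
        (x0, y0, x1, y1) with image y increasing downward. A positive
        yaw_rate is taken here to mean a left-leaning yaw (an assumption)."""
        x0, y0, x1, y1 = base
        # Vehicle velocity V: widen the region at high V, narrow it at low V.
        scale = 1.25 if v_kmh >= 60.0 else 0.8
        cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
        w, h = (x1 - x0) * scale, (y1 - y0) * scale
        x0, x1 = cx - w / 2.0, cx + w / 2.0
        y0, y1 = cy - h / 2.0, cy + h / 2.0
        # Yaw rate Yr: shift the region toward the direction of the turn.
        x0, x1 = x0 - 2.0 * yaw_rate, x1 - 2.0 * yaw_rate
        # Distance Ilmap: a far signal does not appear at the top of the
        # image, so the upper side is excluded; a near signal does not
        # appear at the bottom, so the lower side is excluded.
        if signal_is_far:
            y0 += h / 4.0
        else:
            y1 -= h / 4.0
        # Shape information: a vertically elongate light emitting section 304
        # calls for a region shorter in the X direction and longer in the Y.
        if vertically_elongate:
            x0, x1 = x0 + w / 8.0, x1 - w / 8.0
            y0, y1 = y0 - h / 8.0, y1 + h / 8.0
        return int(x0), int(y0), int(x1), int(y1)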
The driving assistance unit 16 performs driving assistance for the vehicle 10 based on the recognition result of the recognition device 14 (i.e., the presence or absence of the traffic signal 300 and the light emitting state of the light emitting section 304), the sensor information Is, etc. More specifically, the brake ECU 82 specifies the illuminated state of the traffic signal 300 and the distance to the traffic signal 300 based on the traffic signal information Isig from the recognition device 14, etc. For example, in the case that the vehicle 10 is not decelerated in front of the traffic signal 300 despite the fact that the traffic signal 300 is a red-light signal, the brake ECU 82 actuates an automatic braking action by the hydraulic mechanism 80.
Further, the warning ECU 92 specifies the illuminated state of the traffic signal 300 and the distance to the traffic signal 300 based on the traffic signal information Isig from the recognition device 14, etc. For example, in the case that the vehicle 10 is not decelerated in front of the traffic signal 300 despite the fact that the traffic signal 300 is a red-light signal, the warning ECU 92 displays a warning message on the display device 90.
As has been described above, according to the present embodiment, the traffic signal 300 is recognized by a combination of the plurality of the images 100 of frames F (see
In the present embodiment, the recognition device 14 includes the storage unit 54 in which the light emitting patterns Pl of a plurality of frames F are stored as teacher data (see,
According to the present embodiment, if one of a red-light signal or an arrow signal is recognized in a certain frame F, the traffic signal detecting unit 62 (traffic signal recognizing unit) makes it easier for the other of the red-light signal or the arrow signal to be recognized in a next frame F thereafter (steps S5 to S8 of
The present invention is not limited to the above embodiment, but various alternative or additional arrangements may be adopted therein based on the disclosed content of the present specification. For example, the following arrangements may be adopted.
In the above embodiments, the recognition device 14 is incorporated in a vehicle 10. However, the invention is not limited to this feature, and the recognition device 14 may be incorporated in other types of objects. For example, the recognition device 14 may be used in mobile objects such as ships or aircraft, etc. Further, such objects are not limited to mobile objects, and insofar as an apparatus or system is provided that detects the presence of traffic signals 300, the recognition device 14 may be incorporated in such other apparatus or systems.
The sensor unit 12 of the above embodiment includes the camera 20, the vehicle velocity sensor 22, the yaw rate sensor 24, and the map information supplying device 26 (see,
Alternatively, other sensors can be used in addition to or in place of one or more of the vehicle velocity sensor 22, the yaw rate sensor 24, and the map information supplying device 26. As an example of such a sensor, an inclination sensor that detects an inclination A [deg] of the vehicle 10 (vehicle body) can be used. In this case, the computation unit 52 can correct the positions in the Y direction (vertical direction) of the search window 320 and the search region 322 corresponding to the inclination A.
In the above embodiment, the camera 20 is assumed to be fixedly attached to the vehicle 10. However, for example, from the standpoint of acquiring a peripheral image 100 in the direction of travel of the vehicle 10 (or mobile object), the invention is not necessarily limited to this feature. For example, the camera 20 may be incorporated in a mobile information terminal possessed by a pedestrian who is passing outside of the vehicle 10.
The camera 20 of the above embodiment is premised on being attached to the vehicle 10, and having fixed specifications including magnification, angle of view, etc. However, for example, from the standpoint of acquiring a peripheral image 100 in the direction of travel of the vehicle 10 (or mobile object), the invention is not limited to this feature. For example, the camera 20 may have variable specifications.
The camera 20 of the above embodiment is premised on being a single camera (monocular camera). However, for example, from the standpoint of acquiring a peripheral image 100 in the direction of travel of the vehicle 10 (or mobile object), a stereo camera can also be used.
In the above embodiment, the map DB 32 of the map information supplying device 26 is arranged inside the vehicle 10 (see,
According to the above embodiment, the computation unit 52 includes the lane detecting unit 60 and the traffic signal detecting unit 62 (see,
The driving assistance unit 16 of the above embodiment includes the brake device 70 and the warning device 72 (see,
Alternatively, other driving assistance devices can be provided in addition to or in place of the brake device 70 and/or the warning device 72. As an example of such other driving assistance devices, there can be included a device (high efficiency driving support device) that carries out notifications with the aim of improving energy efficiency (fuel consumption, etc.). The high efficiency driving support device can assist in high efficiency driving by prompting the driver to control the vehicle velocity V so as not to have to stop the vehicle 10 at traffic signals 300.
The warning device 72 of the above embodiment serves to provide notification of the existence of the traffic signal 300 by means of a display on the display device 90 (see
In the above embodiment, the traffic signal 300 has been described by way of example as having the green-light lamp 310, the yellow-light lamp 312, the red-light lamp 314, the left turn permission lamp 316a, the straight forward permission lamp 316b, and the right turn permission lamp 316c (see,
According to the above embodiment, the search region of the search window 320 is set using the image information Ii, the vehicle velocity V, the yaw rate Yr, and the map information Im (step S2 of
According to the above embodiment, the region occupied by the search window 320 was assumed to include a plurality of pixels. However, for example, from the standpoint of detecting any of the light emitting lamps Ll, the invention is not limited to this feature. For example, the region of the search window 320 may be one pixel, and an emission color may be detected on a per-pixel basis. In addition, if the computation unit 52 detects an emission color corresponding to a light emitting lamp Ll, the presence of the light emitting lamp Ll can be identified by pattern matching around the periphery of the detected emission color.
According to the above embodiment, in one frame image 100, if one of the red-light lamp 314 or the arrow lamps 316a to 316c is detected, for the following three frames F thereafter, the brightness threshold THb for the arrow lamps 316a to 316c or the red-light lamp 314 is lowered (steps S5 to S8 of
Further, according to the above embodiment, in the frame image 100 that is the current object of calculation, if one of the red-light lamp 314 and the arrow lamps 316a to 316c is detected, for the subsequent frames F, the brightness threshold THb for the arrow lamps 316a to 316c or the red-light lamp 314 is lowered (steps S5 to S8 of
For example, concerning a frame image 100 that is the current calculation target, suppose that a pixel or a pixel group is detected whose brightness falls slightly short of the brightness threshold THb for determining the red-light lamp 314 or the arrow lamps 316a to 316c. If the arrow lamps 316a to 316c or the red-light lamp 314 was already detected in the frame image 100 that was the previous calculation target, then the red-light lamp 314 or the arrow lamps 316a to 316c can be determined. Alternatively, in the same situation, if the arrow lamps 316a to 316c or the red-light lamp 314 is detected in the frame image 100 that is the next calculation target, then the red-light lamp 314 or the arrow lamps 316a to 316c may be determined in the frame image 100 that is the current calculation target (which, at the time of this determination, has already become the previous calculation target).
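A minimal sketch of this modification, assuming a hypothetical margin below the brightness threshold THb within which a borderline candidate remains eligible:

    def confirm_borderline_lamp(brightness, thb, margin, other_lamp_seen_adjacent):
        """Illustrative: a pixel group slightly below the brightness threshold
        THb is still determined to be the lamp if the complementary lamp
        (red-light vs. arrow) was detected in the previous frame or is
        detected in the next frame. `margin` is an assumption."""
        if brightness >= thb:
            return True
        return brightness >= (thb - margin) and other_lamp_seen_adjacent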
According to the above embodiment, in one frame image 100, if one of the red-light lamp 314 and the arrow lamps 316a to 316c is detected, the brightness threshold THb for the arrow lamps 316a to 316c or the red-light lamp 314 is lowered (steps S5 to S8 of
In the above embodiment, determination of the arrow lamps 316a to 316c or the red-light lamp 314 using the brightness threshold THb has mainly been described (steps S3, S5 to S8 of
According to the above embodiment, the light emitting lamps Ll are identified by comparing the acquired data with teacher data (step S10 of
Steps S21 to S29 of
According to the first modification, the traffic signal detecting unit 62 (traffic signal recognizing unit) confirms the light emitting lamps Ll that are included in the one of the frames F that has the greatest number Nll of light emitting lamps Ll, as being the light emitting lamps Ll (step S30 of
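Under the first modification, the confirmation step reduces to selecting the frame having the greatest number Nll of detected lamps. A sketch follows, modeling each frame's detection result as a set of hypothetical lamp identifiers.

    def confirm_from_richest_frame(frame_results):
        """First modification (illustrative): among the detection results of
        the last Nf frames, confirm as light emitting lamps Ll exactly those
        lamps contained in the frame with the greatest number Nll of lamps."""
        return max(frame_results, key=len)

    # Example: with flashing LED lamps, only frame F4 catches all three.
    frames = [{"red"}, {"red", "arrow_left"}, {"arrow_left", "arrow_straight"},
              {"red", "arrow_left", "arrow_straight"}]
    assert confirm_from_richest_frame(frames) == {"red", "arrow_left", "arrow_straight"}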
Step S41 of
For example, among the frames F1 to F4 shown in
In step S42, the computation unit 52 extracts light emitting lamps Ll whose respective count values CNT are greater than or equal to a count threshold THcnt. The count threshold THcnt is a threshold value for specifying the light emitting lamps Ll, and in the example of
In step S43, the computation unit 52 determines whether or not the number of light emitting lamps Ll extracted in step S42 is zero. If no light emitting lamps Ll were extracted (step S43: YES), it is determined that there are no light emitting lamps Ll in the current calculation cycle. Therefore, the current process is terminated, and after elapse of a predetermined time period, the process is repeated from step S41.
If one or more light emitting lamps Ll were extracted (step S43: NO), then in step S44, the computation unit 52 judges whether or not only one light emitting lamp Ll was extracted. If only one light emitting lamp Ll was extracted (step S44: YES), then in step S45, the computation unit 52 confirms that the extracted light emitting lamp Ll is emitting light.
If more than one light emitting lamp Ll was extracted (step S44: NO), then in step S46, the computation unit 52 determines whether or not each mutual difference ΔC between the count values CNT of the plurality of extracted light emitting lamps Ll is greater than or equal to a predetermined threshold value THΔc. Although the threshold value THΔc in the example of
If the difference ΔC is greater than or equal to the threshold value THΔc (step S46: YES), one light emitting lamp Ll whose count value CNT is smaller can be presumed to be of low reliability. Thus, in step S47, the computation unit 52 confirms only the other light emitting lamp Ll whose count value CNT is larger, as being the light emitting lamp Ll.
If the difference ΔC is less than the threshold value THΔc (step S46: NO), then each of the light emitting lamps Ll can be presumed to be of high reliability. Thus, in step S48, the computation unit 52 confirms that the respective light emitting lamps Ll are emitting light.
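Steps S41 to S48 of the second modification can be sketched as follows. One reading of the comparison in steps S46 to S48 is adopted here, namely keeping every extracted lamp whose count value lies within THΔc of the largest count; the threshold values are parameters of the sketch, not values taken from the figures.

    def confirm_by_recognition_count(counts, th_cnt, th_dc):
        """Second modification (illustrative). `counts` maps each lamp to its
        count value CNT over the last Nf frames. Lamps reaching the count
        threshold THcnt are extracted (step S42); a lamp whose count trails
        another's by THdc or more is dropped as unreliable (steps S46, S47);
        otherwise all extracted lamps are confirmed (step S48)."""
        extracted = {lamp: c for lamp, c in counts.items() if c >= th_cnt}
        if not extracted:                     # step S43: none this cycle
            return set()
        if len(extracted) == 1:               # steps S44, S45
            return set(extracted)
        top = max(extracted.values())         # steps S46 to S48
        return {lamp for lamp, c in extracted.items() if top - c < th_dc}

    # Example: CNT(red)=4, CNT(arrow_left)=3, CNT(green)=1, THcnt=2, THdc=2.
    # green is not extracted; red and arrow_left differ by only one frame,
    # so both are confirmed as light emitting lamps.
    assert confirm_by_recognition_count(
        {"red": 4, "arrow_left": 3, "green": 1}, 2, 2) == {"red", "arrow_left"}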
According to the second modification, the traffic signal detecting unit 62 (traffic signal recognizing unit) confirms a light emitting lamp Ll whose count value CNT (recognition count) in a plurality of frames F has exceeded the count threshold THcnt (recognition count threshold), as being the light emitting lamp Ll (steps S45, S47 and S48 of
Further, according to the second modification, if there are plural light emitting lamps Ll whose respective count values CNT (recognition count) have exceeded the count threshold THcnt (step S44 of