Through the abovementioned prior applications, this application claims the priority under 35 USC 119 of German Patent Application 10 2004 025 541.5 filed on May 25, 2004, and German Patent Application 10 2004 026 591.7 filed on Jun. 1, 2004. The entire disclosures of the foreign priority applications are incorporated herein by reference.
The present invention relates to a monitoring unit or system for monitoring, recording or imaging the outside or exterior environment of a motor vehicle, for example in a motor vehicle's direction of travel, including at least one camera system having an image-recording sensor.
Intelligent Advanced Driver Assistance Systems (ADAS) will play an increasingly important role in motor vehicles of the future. Future vehicles will contain, for example, monitoring units such as camera systems having, for instance, digital CMOS (Complementary Metal-Oxide Semiconductor) or CCD (Charge-Coupled Device) image sensors as aids that monitor, record or image the outside environment, for example in the motor vehicle's direction of travel.
In connection with such image processing systems used in the automotive sector for detecting the driving environment, a pure black/white (B/W), that is to say monochrome, image recording is more advantageous than color image recording for most tasks. However, applications do exist where color information obtained from, for example, the three RGB primary colors red (R), green (G), and blue (B), and/or other colors such as, for instance, yellow (Y) etc., can be important for attaining a higher confidence level of the output vector generated by an image processing system.
An instance thereof is traffic sign recognition, wherein it is possible to recognize, by way of the individual color information (R and/or G and/or B and/or other colors such as, for instance, Y etc.) where available, whether the sign concerned is a prohibition sign, a sign giving orders, or one that purely provides information.
Another function is the recognition of colored lane markings, for example, in roadwork or construction areas. Color information is helpful here too, and is necessary to be able, for example, to distinguish between the normal white markings that are no longer valid and the additional yellow lane markings that pertain within the roadwork or construction area.
Conversely, purely monochrome (B/W) image recording is sufficient for a recognition of objects such as obstacles, other vehicles, bicyclists or other persons, because color information (R, G, B, Y etc.) will as a rule not provide better recognition quality in this context.
In view of the above, it is an object of one or more embodiments of the invention to provide an improved monitoring unit or imaging system for monitoring, recording or imaging the exterior environment outside of a motor vehicle, especially in the motor vehicle's direction of travel, which monitoring unit includes at least one camera system having an image-recording sensor. It is another object of one or more embodiments of the invention to provide an improved driver assistance system for a motor vehicle, in particular for traffic-sign and/or traffic-lane detection. The invention further aims to avoid or overcome the disadvantages of the prior art, and to achieve additional advantages, as apparent from the present specification. The attainment of these objects is, however, not a required limitation of the claimed invention.
Embodiments of the invention further develop generic monitoring units or imaging systems for imaging the outside environment of a motor vehicle in the direction of travel, in that the monitoring unit includes at least one camera system having an image-recording sensor having color coding (R, G, B, Y, . . .) in partial areas but otherwise monochrome coding (B/W). Embodiments of the present invention thus proceed from an image-recording sensor that is substantially embodied or coded as monochrome (B/W), and additionally has color coding (R and/or G and/or B and/or other colors such as, for instance, Y etc.) in partial areas thereof.
For the purpose of recognizing or assigning specific colors of traffic signs, it is proposed to provide a color coding (R and/or G and/or B and/or other colors such as, for instance, Y etc.) of vertical stripes and/or areas on the right-hand and/or left-hand image edge. Because traffic signs move from the center of the image outward from the perspective of a camera mounted in the front area of a motor vehicle traveling in a straight line, the color of the sign and the basic information associated therewith (prohibition, orders, other information) can be advantageously determined when the sign is located within the color-coded stripes or, as the case may be, areas.
For the purpose of recognizing the e.g. yellow or white color of a lane marking, it is proposed to provide a color coding (R and/or G and/or B and/or other colors such as, for instance, Y etc.) of horizontal stripes and/or areas on the sensor's bottom image edge, preferably in the area where the camera has a view onto the road directly over the hood, particularly in the case of a customary passenger automobile. The camera can, of course, be arranged analogously when the inventive monitoring unit is employed in a truck or van etc. This area is not absolutely essential for image evaluation with regard to object detection or lane detection, and can thus advantageously be used for color-recognition purposes. Because traffic lanes can be seen from the center/top of the central image area down to the bottom left-hand or right-hand image area when the camera is mounted on the front of a vehicle traveling in a straight line, the color of the traffic lane can advantageously be determined, in addition to its position, and made available to an image-processing system, e.g. an image processor.
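Purely by way of illustration, and not as part of the original disclosure, the following sketch shows one conceivable way of representing such a partially color-coded sensor as a per-pixel coding map: monochrome (B/W) by default, with red-coded vertical stripes on the left-hand and right-hand image edges and a yellow-coded horizontal stripe on the bottom image edge. The resolution, stripe widths and the particular choice of filter colors are assumptions made only for this sketch.

```python
# Illustrative sketch only, not part of the original disclosure: a per-pixel coding
# map for a sensor that is monochrome (B/W) everywhere except for color-coded stripes.
# The resolution, stripe widths and filter colors are assumptions.
import numpy as np

MONO, RED, YELLOW = 0, 1, 2   # coding labels: unfiltered (B/W), red filter, yellow filter

def build_coding_mask(height=480, width=640, edge_stripe_px=40, bottom_stripe_px=60):
    """Return a (height, width) array labelling the filter coding of each pixel."""
    mask = np.full((height, width), MONO, dtype=np.uint8)
    # Vertical color-coded stripes on the left- and right-hand image edges,
    # intended for traffic-sign color recognition.
    mask[:, :edge_stripe_px] = RED
    mask[:, width - edge_stripe_px:] = RED
    # Horizontal color-coded stripe on the bottom image edge (view over the hood),
    # intended for lane-marking color recognition.
    mask[height - bottom_stripe_px:, :] = YELLOW
    return mask

if __name__ == "__main__":
    mask = build_coding_mask()
    print(mask.shape, np.unique(mask))   # (480, 640) [0 1 2]
```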
In order to obtain the desired color information (R and/or G and/or B and/or other colors such as, for instance, Y etc.), it is proposed to attach a tiny color filter, where applicable specially adapted to the application, in front of each individual cell (pixel) of the pertinent stripes or areas.
In a first embodiment, the color-coded stripes and/or areas are embodied as, for example, a single color (R; Y; etc. . . . ). The vertical stripes and/or areas expediently have, for example, a red (R) color coding, and the horizontal stripes and/or areas preferably have a yellow (Y) color coding.
Alternatively or additionally, the color-coded horizontal and vertical stripes and/or areas can be embodied as a combination of two colors (R, G). In particular, vertical stripes and/or areas having red (R) and green (G) color coding have proved useful for increasing the contrast of signs that give orders and are placed in front of trees.
In a further embodiment, for obtaining the desired color information (R, G, B), it is proposed to arrange a tiny color filter in one of the three RGB primary colors red (R), green (G), and blue (B) in front of each individual cell (pixel) of the pertinent stripes or areas, whereby the filters are preferably arranged in the so-called “Bayer pattern”.
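As an illustration of this further embodiment only (the region coordinates and the single-character labels are assumptions, not taken from the disclosure), the following sketch lays a standard RGGB Bayer mosaic over a rectangular color-coded region of an otherwise monochrome coding map:

```python
# Illustrative sketch (assumption, not the original disclosure): a standard RGGB
# "Bayer pattern" laid over a rectangular color-coded region of an otherwise
# monochrome sensor. 'N' marks a non-coded (monochrome) pixel.
import numpy as np

MONO, R, G, B = "N", "R", "G", "B"

def bayer_region(height, width, top, left, region_h, region_w):
    """Return a (height, width) array of coding labels with one Bayer-coded region."""
    mask = np.full((height, width), MONO, dtype="<U1")
    for y in range(top, top + region_h):
        for x in range(left, left + region_w):
            if y % 2 == 0:
                mask[y, x] = R if x % 2 == 0 else G   # even rows: R G R G ...
            else:
                mask[y, x] = G if x % 2 == 0 else B   # odd rows:  G B G B ...
    return mask

if __name__ == "__main__":
    m = bayer_region(8, 8, top=0, left=4, region_h=8, region_w=4)
    print("\n".join(" ".join(row) for row in m))
```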
Another embodiment of the invention provides an assistance system having a monitoring unit of the aforementioned type. The monitoring unit's advantages will in this way also come to bear within the scope of an overall system, in particular for traffic-sign and/or traffic-lane detection. The ratio of monochrome coding (B/W) to partial color coding is therein preferably 80:20, i.e. about 20% of the sensor surface is color-coded. Depending on the focus of the specific application, the partial color-coded areas can also occupy 25% or up to 40% of the sensor surface.
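These proportions follow directly from the stripe geometry. As a back-of-the-envelope check only (the sensor resolution and stripe dimensions below are illustrative assumptions, not values from the disclosure), two 40-pixel-wide edge stripes and a 60-pixel-high bottom stripe on a 640x480 sensor amount to roughly 23% of the surface, and widening the stripes pushes the share toward 40%:

```python
# Back-of-the-envelope check; the sensor resolution and stripe dimensions are
# illustrative assumptions, not values taken from the disclosure.
def color_coded_share(width=640, height=480, edge_px=40, bottom_px=60):
    edge_area = 2 * edge_px * height                   # two vertical edge stripes
    bottom_area = bottom_px * (width - 2 * edge_px)    # bottom stripe, excluding the corners
    return (edge_area + bottom_area) / float(width * height)

print("%.0f%% color-coded" % (100 * color_coded_share()))                           # ~23%
print("%.0f%% color-coded" % (100 * color_coded_share(edge_px=80, bottom_px=100)))  # ~41%
```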
The main advantage of a monitoring unit according to an embodiment of the invention for imaging the outside environment in a motor vehicle's direction of travel or, as the case may be, of an assistance system for motor vehicles including such a monitoring unit, having a partially color-coded (R and/or G and/or B and/or other colors such as, for instance, Y etc.) camera is that all relevant data for imaging or detecting driving environments can for the first time be obtained using just one camera. Owing to the camera's substantially monochrome (B/W) image recorder, there will be no constraints on sensitivity so that reliable evaluation will be ensured even in poor light conditions. The color coding (R and/or G and/or B and/or other colors such as, for instance, Y etc.) in the sensor's edge area will not compromise those applications for which the purely monochrome (B/W) image is more advantageous.
By contrast, the color coding (R and/or G and/or B and/or other colors such as, for instance, Y etc.) having vertical stripes and/or areas on the right-hand and/or left-hand edge of the sensor's image field will provide reliable information about the color (R, G, B, Y, . . . ) of traffic signs. The color coding (R, G, B, Y, . . . ) in the bottom image area will provide reliable information about the color of lane markings.
A single camera can thus be used for all relevant applications, and that will advantageously save costs and mounting or installation space.
In order that the invention may be clearly understood, it will now be described in connection with example embodiments thereof, with reference to the accompanying drawings.
Such color information (R, G, B, Y, . . . ) can be important for attaining a higher confidence level of the output vector generated by an image processing system, in particular for the recognition of traffic signs, wherein by way of the color information (R and/or G and/or B and/or other colors such as, for instance, Y etc.) it can be recognized whether the sign is a prohibition sign or an affirmative requirement sign giving orders, as in the case of the speed-limit sign on the right of the image section shown.
A further function is the recognition of colored lane markings associated with roadworks or construction areas. In this regard, it is helpful and necessary to distinguish between the normal white markings, which are no longer valid, and the additional yellow lane markings that apply in the roadwork or construction area.
Conceivable solutions featuring an exclusively color-coded (R, G, B, Y, . . . ) image recorder or sensor are not only more computationally intensive and thus more costly, but also have the disadvantage that monochrome (B/W) images are more favorable or advantageous for a number of exterior scene evaluation applications, in particular for the detection of objects such as obstacles, other vehicles, bicyclists, persons and the like, or for night-time applications.
To resolve this conflict of opposed requirements, one or more embodiments of the present invention propose the use, in a monitoring unit, of for example a specially embodied CCD sensor 10 having color coding (R, G, B, Y, . . .) in partial areas 11, 12, 13, but otherwise having monochrome coding (B/W). It should be understood that “monochrome” actually means that the sensor cells are sensitive to all colors of light and only brightness levels of the light are registered, so that such monochrome cells without color coding cannot distinguish between different colors of light, and color filters are arranged over cells that are to be color coded. Thus, an embodiment of the present invention proceeds from a sensor 10 that is embodied or coded substantially as monochrome (B/W), and that additionally has color coding (R and/or G and/or B and/or other colors such as e.g. Y, etc.) in partial areas. That could be achieved by, for instance, a color coding (R, G, B, Y, . . . ) of vertical stripes (not shown) or areas on the right-hand image edge 11 and left-hand image edge 12, and would be helpful, for example, for assigning the correct color (R, G, B, Y, . . . ) to traffic signs. Because the traffic signs move from the center of the image outward, from the perspective of a camera mounted in the front area of a motor vehicle traveling in a straight line, the color (R, G, B, Y, . . . ) of the sign can be determined at the time when the sign is located within the color-coded (R, G, B, Y, . . . ) stripes or areas 11, 12.
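Purely as an illustrative sketch, and not as the disclosed method, the following code outlines how a downstream image-processing stage might exploit such edge stripes: once a detected sign's bounding box lies within the left-hand or right-hand color-coded area 11, 12, its dominant color is estimated from the color samples available there and mapped to a coarse sign class. The sketch assumes the edge stripes deliver full RGB information (e.g. Bayer-coded, as in one embodiment); all function names, stripe widths, thresholds and the red/blue heuristic are assumptions.

```python
# Illustrative sketch (assumptions throughout; not the original disclosure):
# once a tracked sign enters a color-coded edge stripe, classify its dominant
# color from the RGB samples available there.
import numpy as np

def box_inside_stripes(box, image_width, stripe_px=40):
    """box = (x0, y0, x1, y1) in pixels; True if it lies fully within the left- or right-hand edge stripe."""
    x0, _, x1, _ = box
    return x1 <= stripe_px or x0 >= image_width - stripe_px

def classify_sign_color(rgb_patch):
    """Coarse rule of thumb: red-dominated -> prohibition sign, blue-dominated -> sign giving orders."""
    r, g, b = np.asarray(rgb_patch, dtype=float).reshape(-1, 3).mean(axis=0)
    if r > 1.3 * g and r > 1.3 * b:
        return "prohibition (red)"
    if b > 1.3 * r and b > 1.3 * g:
        return "order (blue)"
    return "information / other"

if __name__ == "__main__":
    patch = np.tile([200, 60, 50], (16, 16, 1))          # a mostly red image patch
    if box_inside_stripes((600, 100, 636, 140), image_width=640):
        print(classify_sign_color(patch))                # -> prohibition (red)
```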
An expedient approach for recognizing the color of the traffic lane (yellow or white, for example) is also to provide color coding, in particular yellow (Y) coding, in the bottom area 13 of the sensor 10, preferably in the area where the camera has a view onto the road directly over the hood in the case of a customary passenger automobile. This area is not absolutely essential for an image evaluation with regard to object detection or lane detection, and can thus be used for color recognition (Y). Because traffic lanes can be seen from the center/top of the central image area down to the bottom left-hand and right-hand image area when the camera is mounted on the front of a vehicle traveling in a straight line, the color of the traffic lane, in addition to its position, can be determined and made available to the image-processing system. This feature is of course not restricted to motor vehicles having a hood, but rather can be realized analogously when the inventive monitoring unit is employed in particular in a truck or van etc.
Especially in the case of night-vision applications, the highest possible sensitivity is necessary across the entire, which is to say unfiltered, wavelength range, including the near infrared. Because, however, a much smaller detection angle generally needs to be covered for such applications, it suffices to provide a smaller central area without color coding.
Color information useful in the context of road traffic can be obtained within the stripes or areas 11, 12, 13 of the sensor 10 that are coded as a Bayer pattern, by means of such color interpolations, and can thus be made available to an image-processing system.
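One commonly used form of such a color interpolation is bilinear demosaicing over the Bayer-coded region. The following sketch shows a minimal version of that idea; it is offered as an assumption of one possible interpolation scheme, not as the algorithm used by the disclosed system, and all names and dimensions are illustrative.

```python
# Minimal bilinear demosaic sketch (one possible interpolation, assumed for
# illustration; not the disclosed algorithm) for an RGGB Bayer-coded stripe.
import numpy as np

def demosaic_bilinear(raw):
    """raw: 2-D array of an RGGB Bayer-coded region; returns an (H, W, 3) RGB estimate."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    # Masks marking which pixels actually carry each channel in an RGGB mosaic.
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True            # R on even rows, even columns
    masks[0::2, 1::2, 1] = True            # G on even rows, odd columns
    masks[1::2, 0::2, 1] = True            # G on odd rows, even columns
    masks[1::2, 1::2, 2] = True            # B on odd rows, odd columns
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])
    for c in range(3):
        samples = np.where(masks[..., c], raw, 0.0)
        weight = masks[..., c].astype(float)
        num = _convolve_same(samples, kernel)
        den = _convolve_same(weight, kernel)
        rgb[..., c] = num / np.maximum(den, 1e-6)   # normalized (bilinear) interpolation
    return rgb

def _convolve_same(a, k):
    """Tiny 'same'-size 3x3 convolution with zero padding (avoids a SciPy dependency)."""
    h, w = a.shape
    pad = np.pad(a, 1)
    out = np.zeros_like(a, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * pad[dy:dy + h, dx:dx + w]
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print(demosaic_bilinear(rng.random((8, 8))).shape)   # (8, 8, 3)
```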
A corresponding exemplary image is shown in the accompanying drawings.
The color information does not necessarily have to consist of the three primary colors. It is also possible to use color filters of only a single color such as, for instance, red (R) for recognizing signs giving orders or yellow (Y) for recognizing lane markings associated with, for example, roadworks or construction areas. Combinations of red (R) and green (G) color filters or other color filters specially adapted to the application have also proved useful for increasing the contrast of signs that give orders and are placed in front of trees. Such color filters can advantageously also be arranged spaced apart from one another, being located, for example, on every other pixel in a row and/or column; namely in any desired combination with non-coded (N) pixels or pixels coded in another color or where applicable specially adapted color filters.
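By way of illustration only (the layout parameters and labels are assumptions), the following short sketch generates such a spaced-apart arrangement, placing a single-color filter (here red, 'R') on every other pixel of every other row and leaving the remaining pixels non-coded ('N'):

```python
# Illustrative sketch (layout choices are assumptions): a color-coded area in which a
# single-color filter ('R') is spaced apart, interleaved with non-coded pixels ('N').
def spaced_filter_pattern(rows, cols, color="R", row_step=2, col_step=2):
    pattern = [["N"] * cols for _ in range(rows)]
    for y in range(0, rows, row_step):
        for x in range(0, cols, col_step):
            pattern[y][x] = color
    return pattern

for row in spaced_filter_pattern(4, 8):
    print(" ".join(row))
# R N R N R N R N
# N N N N N N N N
# R N R N R N R N
# N N N N N N N N
```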
Arrangements that offer this type of advantage, as well as further examples of color coding that can be used instead of the “Bayer pattern”, are shown in the accompanying drawings.
The main advantage of a partially color-coded camera is that all relevant data for imaging or recording driving environments can be obtained using just one camera. Owing to the camera's substantially monochrome (B/W) image sensor there is no constraint or limitation on the sensitivity so that a reliable evaluation will be ensured even in poor light conditions. The color coding (R, G, B, Y, . . . ) in defined edge areas 11, 12, 13 of the sensor 10 will not compromise such applications for which the purely monochrome (B/W) image is more favorable. By contrast, the color coding (R, G, B, Y, . . . ) having vertical stripes on the left-hand image edge 11 and the right-hand image edge 12 of the sensor's image field will provide reliable information about the color (R, G, B, Y, . . . ) of traffic signs, and the color coding (R, G, B, Y, . . . ) in the bottom image area 13 will provide reliable information about the color of traffic lanes.
Thus, a single camera can be used for all applications, and that will advantageously save costs and mounting or installation space.
The present invention is thus especially suitable for implementation in an assistance system for motor vehicles, in particular for traffic-sign and/or traffic-lane detection. It will advantageously increase road-traffic safety, especially in combination with existing assistance systems for motor vehicles such as blind-spot detection, lane departure warning (LDW), lane monitoring, night vision, etc.
Although the invention has been described with reference to specific example embodiments, it will be appreciated that it is intended to cover all modifications and equivalents within the scope of the appended claims. It should also be understood that the present disclosure includes all possible combinations of any individual features recited in any of the appended claims. The abstract of the disclosure does not define or limit the claimed invention, but rather merely abstracts certain features disclosed in the application.
Number | Date | Country | Kind
---|---|---|---
10 2004 025 541 | May 2004 | DE | national
10 2004 026 591 | Jun 2004 | DE | national
This application is a Continuation under 35 USC 120 of copending U.S. patent application Ser. No. 14/636,773 filed on Mar. 3, 2015, which was a Continuation under 35 USC 120 of U.S. patent application Ser. No. 10/593,840 filed on Sep. 22, 2006 as the U.S. National Stage under 35 USC 371 of PCT International Application PCT/EP2005/052080 filed on May 6, 2005 and issued as U.S. Pat. No. 9,524,439 on Dec. 20, 2016. The entire disclosures of the prior applications are incorporated herein by reference. This application is also related to U.S. patent application Ser. No. 14/636,617 filed on Mar. 3, 2015 and issued as U.S. Pat. No. 9,704,048 on Jul. 11, 2017.