The present invention relates to vehicular cameras and more particularly to rearview vehicular cameras that display overlays onto the camera image.
Vehicular cameras are used for a variety of purposes, such as to assist a driver in avoiding obstacles behind a vehicle when backing up. Some cameras add overlays onto the camera image to assist the driver in determining distances to obstacles behind the vehicle, the vehicle trajectory and other useful information. The overlays may be static or dynamic. A dynamic overlay is an overlay that is changed by the camera based on certain inputs; for example, some cameras display a predicted vehicle trajectory based on factors such as steering wheel angle. The overlays, whether static or dynamic, will change depending on the mounting angle of the camera, the height of the camera off the ground, the horizontal distance from the camera to the rear axle of the vehicle, the steering gear ratio of the vehicle, and possibly other factors. As a result, cameras for different vehicles have until now had different programming and thus different part numbers associated with them, resulting in a potentially large number of part numbers and a correspondingly large inventory. A particular vehicle family, such as a particular truck, may have numerous vehicle configurations that will impact the overlays displayed by the rearview camera. Such configurations would include, for example, regular cab with short bed, regular cab with long bed, extended cab with short bed and extended cab with long bed.
It would be desirable to reduce the number of separate part numbers that are associated with variations on programming for essentially the same camera.
In one aspect, the invention is directed to a vehicular camera including a housing, a lens, an image sensor positioned for receiving images from the lens, a processor, and a memory. The memory contains a plurality of overlays. The processor is programmed to receive first input data from a vehicle in which the camera is to be mounted, wherein the first input data correspond to the configuration of the vehicle, and select a particular overlay to display based at least in part on the input received.
The present invention will now be described by way of example only with reference to the attached drawings in which:
Reference is made to
The overlays 22-1, 22-2, 22-3, 22-4, 22-5, 22-6, 22-7, 22-8, 22-9 . . . 22-17 (
As shown in
The input data may further include second input data which corresponds to the angle of the steering wheel in the vehicle 11. The steering wheel is shown at 30 in
The processor 24 uses the first and second input data to identify which overlay 22 to use on the images 20. The processor 24 may achieve this in any suitable way. One such way is by using the first and second input data as input parameters for a lookup table shown at 32 that is stored in the memory 21.
The lookup table 32 is shown in more detail in
It can be seen that the lookup table 32 does not require a substantial amount of the memory 21. Furthermore, it can be seen that the total number of overlays 22 that needs to be stored in the memory 21 is no more than would need to be stored for the vehicle configuration 11a. It will be noted that, for the 4 vehicle configurations shown in the lookup table 32, 13 of the overlays 22 (i.e., overlays 22-1 to 22-13) are common to all of the vehicle configurations, a further one overlay (22-14) is common to 3 of them, a further 2 overlays (22-15 and 22-16) are common to 2 of them, and only 2 overlays (22-17 and 22-18) are unique to one of them. Accordingly, the amount of memory consumed by providing the capability of handling 4 different vehicle configurations is not substantially more than the amount of memory already provided on such image processing boards when handling a single vehicle configuration. Additionally, the use of a lookup table is not computationally stressful for the processor 24.
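The selection scheme described above can be sketched as follows. This is a minimal illustration only: the configuration names, the steering-angle band width, and the overlay identifiers are assumptions for the sake of example and do not appear in the source.

```python
# Sketch of a lookup table keyed by (vehicle configuration, steering
# angle band). All names and values below are illustrative assumptions.
ANGLE_BAND_WIDTH = 5.0  # degrees of steering wheel angle per band (assumed)

OVERLAY_TABLE = {
    # (vehicle_config, angle_band) -> identifier of a premade overlay
    ("regular_cab_short_bed", 0): "overlay-22-1",
    ("regular_cab_short_bed", 1): "overlay-22-2",
    ("regular_cab_long_bed", 0): "overlay-22-1",   # common overlay
    ("regular_cab_long_bed", 1): "overlay-22-14",  # configurations diverge here
}

def select_overlay(vehicle_config: str, steering_angle_deg: float) -> str:
    """Return the identifier of the stored overlay to blend onto the image."""
    # Quantize the steering wheel angle into a band index, then look up
    # the overlay for this configuration/band pair.
    band = int(abs(steering_angle_deg) // ANGLE_BAND_WIDTH)
    return OVERLAY_TABLE[(vehicle_config, band)]
```

Because many overlays are shared between configurations, entries for different configurations can point at the same stored overlay, which is what keeps the memory cost low.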
However, it is alternatively possible that, instead of using a lookup table to determine which overlay 22 to use, the processor 24 could use the steering wheel angle data and the vehicle configuration data to calculate the projected vehicle trajectory and then select a suitable overlay 22. As another alternative, the overlays could be mathematically generated by the processor 24 based on the steering wheel angle data and the vehicle configuration data. In other words, the processor 24 could, using the steering wheel angle data and the vehicle configuration data, calculate the curve on which to draw an overlay 22 instead of retrieving a premade overlay 22 from memory. In such an embodiment, the processor 24 could calculate an entirely new overlay each time it samples the steering wheel angle input, or it could calculate an adjustment to the previously drawn overlay each time it samples the steering wheel angle input. In either case, the processor 24 would be capable of drawing a continuous range of overlays 22, as compared to embodiments wherein a premade overlay 22 is pulled from memory and used over a range of steering wheel angles. In such an embodiment, the vehicle configuration data can be used to modify the formulas used by the processor 24 to determine the appropriate curve of the overlay 22. These modifications to the formulas (e.g., values for certain constants in the formulas) may be stored in an array or a lookup table in the memory 21 and accessed by the processor 24 based on the vehicle configuration data. The lookup table approach described above is, however, the preferred approach.
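One way such a curve could be computed is with a simple kinematic model. The sketch below assumes a basic bicycle model in which the road-wheel angle is the steering wheel angle divided by the steering gear ratio and the path is a circular arc; the function name, parameters and model choice are assumptions for illustration, not the source's stated method.

```python
import math

def trajectory_points(steering_wheel_deg, gear_ratio, wheelbase_m,
                      arc_length_m=5.0, n_points=20):
    """Sample (x, y) points along a predicted path in the vehicle frame
    (x along the direction of travel, y lateral), assuming a simple
    kinematic bicycle model."""
    # Road-wheel angle follows from the steering wheel angle and gear ratio.
    road_angle = math.radians(steering_wheel_deg / gear_ratio)
    if abs(road_angle) < 1e-6:
        # Wheels essentially straight: the path is a straight line.
        return [(arc_length_m * i / (n_points - 1), 0.0)
                for i in range(n_points)]
    # Signed turning radius of the circular arc.
    radius = wheelbase_m / math.tan(road_angle)
    return [
        (radius * math.sin(s / radius), radius * (1.0 - math.cos(s / radius)))
        for s in (arc_length_m * i / (n_points - 1) for i in range(n_points))
    ]
```

The vehicle configuration data would then supply the constants (wheelbase, gear ratio, camera position) that parameterize such a formula, which matches the description of configuration-dependent constants stored in the memory 21.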
It will be noted that, in part, many of the overlays 22 are common to the different vehicle configurations because the vehicle configurations are part of the same vehicle family. As such, many of the parameters that would impact the appearance of the overlays would be the same for all members of the vehicle family. Such parameters would include, for example, the lateral distance of the camera from the edge of the vehicle, the height of the camera from the ground and the angle of the camera relative to horizontal.
Reference is made to
In some cases the particular steering gear mechanism 36 used on the vehicle 11 may not be reflected in the vehicle configuration data (i.e., the first input data) that is transmitted to the camera 10. It will be understood, of course, that without knowing which steering gear mechanism (more particularly, which gear ratio) is used, the camera 10 does not have enough information based solely on the vehicle configuration and the steering wheel angle to determine the projected vehicle trajectory. In the particular exemplary case shown in
In order to determine which of the two steering gear mechanisms 36a or 36b is used on the vehicle 11, the camera 10 is activated and notified when the steering wheel 30 (
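The discrimination step could be sketched as below. The candidate gear ratios and the road-wheel angle at full lock are invented values for illustration; the idea is simply that each candidate mechanism predicts a different steering wheel angle at full lock, so the observed angle identifies the mechanism.

```python
# Hypothetical candidates for the two steering gear mechanisms 36a and
# 36b. The ratios and the road-wheel lock angle are assumed values.
ROAD_WHEEL_LOCK_DEG = 35.0
CANDIDATE_RATIOS = {"36a": 14.0, "36b": 18.0}

def infer_gear_mechanism(observed_lock_wheel_deg: float) -> str:
    """Pick the mechanism whose predicted full-lock steering wheel angle
    (gear ratio x road-wheel lock angle) is closest to the steering wheel
    angle observed when the wheel is turned to its lock."""
    return min(
        CANDIDATE_RATIOS,
        key=lambda name: abs(
            CANDIDATE_RATIOS[name] * ROAD_WHEEL_LOCK_DEG
            - abs(observed_lock_wheel_deg)
        ),
    )
```

Once the mechanism is identified in this way, the camera has the gear ratio it needs to map steering wheel angle to projected trajectory for the remaining inputs.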
While the example overlays shown and described herein relate to the predicted vehicle trajectory, it will be understood that other overlays relating to other vehicle properties could be displayed. Additionally, it will be understood that the overlays 22 shown and described may not be the only overlays shown on the images 20. Other dynamic and/or static overlays could also be shown on the images by the camera.
The processor 24 and memory 21 have been shown in
Similarly, the memory 21 may alternatively reside on a board to which the image sensor 12 is integrally attached, or on a board that is separate from the board to which the image sensor 12 is attached. Alternatively, the memory 21 may reside in part on the board to which the image sensor 12 is attached and in part on a board that is separate from the board to which the image sensor 12 is attached, in which case the two portions of the memory would collectively be referred to as the memory 21. In yet another alternative, it is possible for the memory 21 to comprise an external memory that is outside the housing 14 of the camera 10 and that cooperates with one or more memories that are contained within the housing 14. In such an embodiment, the external memory may be positioned anywhere within the vehicle.
While the above description constitutes a plurality of embodiments of the present invention, it will be appreciated that the present invention is susceptible to further modification and change without departing from the fair meaning of the accompanying claims.
The present application is a continuation of U.S. patent application Ser. No. 16/786,094, filed Feb. 10, 2020, now U.S. Pat. No. 10,957,114, which is a continuation of U.S. patent application Ser. No. 16/234,760, filed Dec. 28, 2018, now U.S. Pat. No. 10,559,134, which is a continuation of U.S. patent application Ser. No. 16/029,750, filed Jul. 9, 2018, now U.S. Pat. No. 10,169,926, which is a continuation of U.S. patent application Ser. No. 14/117,759, filed Nov. 14, 2013, now U.S. Pat. No. 10,019,841, which is a 371 national phase filing of PCT Application No. PCT/US2011/036967, filed May 18, 2011.
Number | Name | Date | Kind |
---|---|---|---|
5289321 | Secor | Feb 1994 | A |
5359363 | Kuban et al. | Oct 1994 | A |
5410346 | Saneyoshi et al. | Apr 1995 | A |
5414461 | Kishi et al. | May 1995 | A |
5444478 | Lelong et al. | Aug 1995 | A |
5574443 | Hsieh | Nov 1996 | A |
5793308 | Rosinski et al. | Aug 1998 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6155377 | Tokunaga et al. | Dec 2000 | A |
6256561 | Asanuma | Jul 2001 | B1 |
6578017 | Ebersole et al. | Jun 2003 | B1 |
6631994 | Suzuki et al. | Oct 2003 | B2 |
7295227 | Asahi et al. | Nov 2007 | B1 |
7843451 | Lafon | Nov 2010 | B2 |
7930160 | Hosagrahara et al. | Apr 2011 | B1 |
8405724 | Jeon et al. | Mar 2013 | B2 |
8451107 | Lu et al. | May 2013 | B2 |
10019841 | Gibson et al. | Jul 2018 | B2 |
10169926 | Gibson et al. | Jan 2019 | B2 |
10559134 | Gibson et al. | Feb 2020 | B2 |
10957114 | Gibson et al. | Mar 2021 | B2 |
20020120371 | Leivian et al. | Aug 2002 | A1 |
20020128754 | Sakiyama et al. | Sep 2002 | A1 |
20060287826 | Shimizu et al. | Dec 2006 | A1 |
20070038422 | Wang et al. | Feb 2007 | A1 |
20070120657 | Schofield et al. | May 2007 | A1 |
20070194899 | Lipman | Aug 2007 | A1 |
20080077882 | Kramer | Mar 2008 | A1 |
20080129539 | Kumon | Jun 2008 | A1 |
20080231701 | Greenwood et al. | Sep 2008 | A1 |
20080266541 | Yung et al. | Oct 2008 | A1 |
20080300745 | Goossen et al. | Dec 2008 | A1 |
20090079828 | Lee | Mar 2009 | A1 |
20090096937 | Bauer | Apr 2009 | A1 |
20090179916 | Williams et al. | Jul 2009 | A1 |
20120316779 | Kanno et al. | Dec 2012 | A1 |
20140032184 | Carrasco et al. | Jan 2014 | A1 |
20140290468 | Arnold | Oct 2014 | A1 |
20200134396 | Porta | Apr 2020 | A1 |
Number | Date | Country |
---|---|---|
59114139 | Jul 1984 | JP |
05133336 | Sep 1984 | JP |
6080953 | May 1985 | JP |
6079889 | Oct 1986 | JP |
6272245 | May 1987 | JP |
62122487 | Jun 1987 | JP |
6414700 | Jan 1989 | JP |
04114587 | Apr 1992 | JP |
0550883 | Mar 1993 | JP |
0577657 | Mar 1993 | JP |
5213113 | Aug 1993 | JP |
06227318 | Aug 1994 | JP |
074170 | Jan 1995 | JP |
07105496 | Apr 1995 | JP |
2630604 | Jul 1997 | JP |
Entry |
---|
Ballard, Dana H. et al., “Computer Vision”, 1982, p. 88-89, sect. 3.4.1. |
International Search Report and Written Opinion dated Sep. 9, 2011 for PCT Application No. PCT/US2011/036967. |
Tokumaru et al., “Car Rear-View TV System with CCD Camera,” National Technical Report vol. 34, No. 3, pp. 329-336, Jun. 1988 (Japan). |
Wang et al., CMOS Video Cameras, article, 1991, 4 pages, University of Edinburgh, UK. |
“Edmunds”, combined screen shots from http://www.edmunds.com/bmw/5-series/2010/features-specs.html?sub=sedan&style=101166700 (last visited Apr. 20, 2015) and http:/web.archive.org/web/20110223063738/http://www.edmunds.com/bmw/5-series/2010/features-specs.html (last visited Apr. 20, 2015) (Edmunds last saved Feb. 23, 2011). |
Kannan, Saravanan et al., “An Intelligent Driver Assistance System (I-DAS) for Vehicle Safety Modelling Using Ontology Approach.” International Journal of Ubicomp (IJU) vol. 1, No. 3 (Jul. 2010), pp. 15-29. |
Number | Date | Country | |
---|---|---|---|
20210209861 A1 | Jul 2021 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16786094 | Feb 2020 | US |
Child | 17301004 | US | |
Parent | 16234760 | Dec 2018 | US |
Child | 16786094 | US | |
Parent | 16029750 | Jul 2018 | US |
Child | 16234760 | US | |
Parent | 14117759 | US | |
Child | 16029750 | US |