The present invention relates generally to a control system for a vehicle and, more particularly, to a control system that includes cabin monitoring capabilities.
Cabin monitoring systems are known that monitor an interior cabin of the vehicle to determine presence of an occupant in the vehicle. Examples of such systems are described in U.S. Pat. Nos. 8,258,932; 6,485,081 and 6,166,625, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver assistance system or control system that utilizes a thermal imaging system for in-cabin monitoring and that uses the system (such as in a highly or fully autonomous vehicle) to control a climate control system of the vehicle so that no occupant of the vehicle experiences a cabin environment that is too cold or too hot. The system may control the vehicle climate control system as a whole or may individually control separate climate control zones of the vehicle responsive to determination of the presence of one or more occupants in the vehicle. The system may control the climate control system of the vehicle to a preselected temperature that is selected by the particular occupant and is part of a stored profile of that occupant.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an interior cabin monitoring system 12 that includes at least one interior viewing sensor, such as an imaging sensor or camera and such as a thermal sensing camera, which monitors the cabin 10a of the vehicle.
The present invention uses in-cabin sensors or cameras (such as, for example, thermal sensing cameras) to measure and monitor the temperatures of passengers in the vehicle (such as, for example, the skin temperature of passengers). For a group of people detected in the vehicle, different zones (climate zones) of the vehicle cabin may be taken into account to provide temperature control for each zone.
Once the individual books or enters a vehicle (such as a fully autonomous vehicle or robo taxi or the like, or a typical taxi or Uber vehicle or the like), the in-cabin thermal sensing and climate control can be adapted to his or her personally selected pre-settings. The system can identify if the passenger's environment is too warm or too cold and can control the heating/climate control based on the measurement data, also taking personal settings (if available) into account.
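As a non-limiting illustration, the per-zone control just described may be sketched as follows (the comfort band values, zone names and function names are hypothetical assumptions for illustration only and are not part of the described system). The skin temperature measured for an occupied zone is compared against a comfort band, which may be shifted by the occupant's stored pre-setting, to decide whether that zone should be heated, cooled or held:

```python
def climate_command(skin_temp_c, occupied,
                    comfort_low_c=32.5, comfort_high_c=34.5,
                    personal_offset_c=0.0):
    """Decide the heating/cooling action for one climate zone.

    skin_temp_c: occupant skin temperature measured by the thermal camera.
    occupied: whether the cabin monitoring system detected a person in the zone.
    personal_offset_c: shift of the comfort band taken from the occupant's
    stored pre-settings (positive = the occupant prefers it warmer).
    """
    if not occupied:
        return "off"          # unoccupied zones are not conditioned
    low = comfort_low_c + personal_offset_c
    high = comfort_high_c + personal_offset_c
    if skin_temp_c < low:
        return "heat"         # occupant too cold
    if skin_temp_c > high:
        return "cool"         # occupant too warm
    return "hold"

# One command per climate zone of the cabin:
readings = {"front_left": (33.0, True), "rear_right": (31.0, True),
            "rear_left": (33.0, False)}
commands = {zone: climate_command(t, occ) for zone, (t, occ) in readings.items()}
```

In this sketch, each zone is evaluated independently, so an unoccupied zone is simply left unconditioned while an occupied zone is driven toward its occupant's comfort band.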
Thus, the in-cabin monitoring system acts as a sensor and control to personalize the interior of the vehicle (or only a portion or zone of the interior of the vehicle where the person is sitting). Optionally, the system may control more than temperature/climate. For example, the system may control or adjust sound settings or lighting in the vehicle in accordance with the detected passenger's personal preferences or settings. Optionally, for example, if the interior sensor system identifies that the person is sleepy, the system may change the settings of the interior music, lights and the like, optionally responsive to customized or selected pre-settings that pertain to the particular identified individual, to ensure individual preferences are considered.
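By way of illustration, resolving the settings for one zone from a stored profile may be sketched as follows (a simplified sketch; the setting names, default values and the drowsiness override chosen here are illustrative assumptions, not part of the described system):

```python
def cabin_settings(profile, zone, drowsy=False):
    """Resolve the settings to apply to one zone of the cabin.

    profile: stored per-zone preference settings for the identified occupant
    (may be empty if no profile is available).
    drowsy: flag from the interior monitoring system; when set, lighting and
    audio are overridden to counteract sleepiness (an illustrative choice).
    """
    defaults = {"temp_c": 22.0, "lighting": "neutral", "audio": "off"}
    # Stored preferences for this zone override the neutral defaults:
    settings = {**defaults, **profile.get(zone, {})}
    if drowsy:
        settings["lighting"] = "bright"
        settings["audio"] = "alert"
    return settings
```

A zone with no stored preferences simply falls back to the neutral defaults, so the same routine serves identified and unidentified occupants alike.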
Optionally, a user of the autonomous vehicle transportation service may, such as by using his or her smartphone, set up a user profile that includes various personal preference settings, such as preferred temperature settings, lighting settings, audio settings and/or the like. Then, responsive to that particular user being identified in the vehicle (such as via the cabin monitoring system or via a signal from that person's smartphone or the like), the system may adjust the temperature, lighting and/or audio according to the user's selected preferences.
Optionally, the system may, when cloud services are available, communicate with a remote server or system to obtain the personalized profile or database for a particular user and can then adjust the cabin temperature (or lighting or audio) according to the user's profile. This approach is beneficial for autonomous public transportation systems (such as robo taxis or the like), whereby the control system of the vehicle can adjust the settings to the desired or selected levels or characteristics when the user has booked or called for the autonomous vehicle and before the user enters the vehicle. The system may be operable to only control a zone of the vehicle at which the user is seated, while similarly controlling other zones of the vehicle in accordance with the user preferences of the occupant(s) in the other zone(s). Optionally, the system may store multiple profiles of potential passengers (such as family members or coworkers that use the same vehicle) and may control the climate control system (and/or lighting and/or audio system) of the vehicle responsive to identification (such as via image processing of image data captured by a vehicle camera) of the particular passenger's face or other biometric identification (e.g., retinal scan, fingerprints or the like).
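The selection of a stored profile by biometric identification may, for example, be sketched as follows (a minimal sketch assuming that face recognition yields numeric embedding vectors; the cosine-similarity measure, the threshold value and the data layout are illustrative assumptions only):

```python
import math

def match_profile(face_embedding, profiles, threshold=0.8):
    """Return the stored profile whose enrolled face embedding best matches
    the embedding captured by the cabin camera, or None if no stored profile
    clears the similarity threshold."""
    def cosine(a, b):
        # Cosine similarity between two embedding vectors.
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    best, best_score = None, threshold
    for profile in profiles:
        score = cosine(face_embedding, profile["embedding"])
        if score >= best_score:
            best, best_score = profile, score
    return best
```

Once a profile is matched, its stored temperature, lighting and audio preferences can be applied to the zone in which that occupant is seated; a lookup at a remote server would simply populate the list of candidate profiles before the user enters the vehicle.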
The system may also or otherwise utilize aspects of other cabin monitoring systems, such as those described in U.S. Pat. Nos. 8,258,932; 6,485,081 and 6,166,625, which are hereby incorporated herein by reference in their entireties. The system may utilize aspects of various cabin or passenger or driver monitoring systems, such as head and face direction and position tracking systems and/or eye tracking systems and/or gesture recognition systems. Such head and face direction and/or position tracking systems and/or eye tracking systems and/or gesture recognition systems may utilize aspects of the systems described in U.S. Publication Nos. US-2016-0137126; US-2015-0352953; US-2015-0296135; US-2015-0294169; US-2015-0232030; US-2015-0022664; US-2015-0015710; US-2015-0009010 and/or US-2014-0336876, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is a continuation of U.S. patent application Ser. No. 15/888,774, filed Feb. 5, 2018, which claims the filing benefits of U.S. provisional application Ser. No. 62/455,111, filed Feb. 6, 2017, which is hereby incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
62455111 | Feb 2017 | US

 | Number | Date | Country
---|---|---|---
Parent | 15888774 | Feb 2018 | US
Child | 17031958 | | US