Modern automotive vehicles can include cabin monitoring systems to detect and classify in-vehicle occupants. These systems can include one or more components, including cameras, radar sensors, microphones, speakers, and/or display consoles. Although these systems can display a rear-seat occupant (e.g., a baby in a car seat) on a console display in response to driver input, a driver may still be distracted as they try to determine whether the occupant is distressed or as they soothe a fussy child.
This document describes one or more aspects of an in-vehicle occupant monitoring and calming system. In one example, the system includes a processor that receives occupant data from occupancy-monitoring sensors (e.g., microphones, cameras, radar sensors, ultrasonic sensors) of a vehicle. Based on the occupant data, the processor can determine whether the occupant is distressed and provide an image or video of the occupant to the driver. The processor can also display driver-selectable options to calm the distressed occupant. The options can include playing an audio or video file for the occupant, adjusting the ambient lighting of the vehicle, or rocking or gently vibrating the occupant's seat. In this way, the described system can monitor vehicle occupants and automatically display a video of the distressed occupant to the driver. In addition, the driver can calm the distressed occupant without removing their attention from driving.
This document also describes methods performed by the above-summarized system and other configurations of the system set forth herein and means for performing these methods.
This Summary introduces simplified concepts related to an in-vehicle occupant monitoring and calming system, which are further described in the Detailed Description and Drawings. This Summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter.
The details of one or more aspects of an in-vehicle occupant monitoring and calming system are described in this document with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:
As described above, some modern vehicles include cabin monitoring systems to detect in-vehicle occupants. These systems can include one or more components, including cameras, radar sensors, microphones, speakers, and/or display consoles. Although these systems can display a rear-seat occupant (e.g., a baby in a car seat), a driver may still be distracted as they try to determine whether the occupant is distressed (e.g., a fussy child) or as they try to soothe the occupant.
The techniques of this disclosure relate to an in-vehicle occupant monitoring and calming system. An example system can include a processor that receives occupant data from occupancy-monitoring sensors of a vehicle. The occupancy-monitoring sensors can include audio sensors that detect sounds made by an occupant, cameras that track the movement of the occupant, and other sensors that monitor the occupant's biometric data. Based on the occupant data, the processor can determine whether the occupant is distressed and provide an image or video of the occupant to the driver. The processor can also display driver-selectable options to calm the distressed occupant. The options can include playing an audio or video file for the occupant, adjusting the ambient lighting of the vehicle, or rocking or gently vibrating the occupant's seat. In this way, the described system can monitor vehicle occupants and automatically display an image or video of the distressed occupant to the driver without requiring the driver to request the image or video. In addition, the driver can soothe the distressed occupant without removing their attention from the operation of the vehicle.
This is just one example of the described in-vehicle occupant monitoring and calming system. This document describes other examples and implementations.
The vehicle 102 includes one or more interior sensors 104, an occupant calming system 106, and a display 114. The interior sensors 104 are mounted to, or integrated within, an interior portion of the vehicle 102 to detect aspects of the occupant 118 and other passengers. The interior sensors 104 can include one or more cameras, microphones, radar sensors, ultrasonic sensors, and/or infrared cameras that monitor the occupant 118. In particular, the interior sensors 104 can be positioned to have a field of view that includes one or more occupants 118.
The occupant calming system 106 can include an occupant monitoring system 108, a display controller 110, and a feedback controller 112. Using data from one or more interior sensors 104, the occupant monitoring system 108 can monitor the occupant 118. In particular, the occupant monitoring system 108 can determine whether the occupant 118 is distressed or may soon become distressed. Although the term “distressed” is used in this description, it should be understood that the disclosed techniques and apparatuses can monitor for and address any forms of discomfort, stress, agitation, fussiness, sadness, or pain.
As described in greater detail below, the occupant monitoring system 108 can use camera data to track key body points of the occupant 118 and determine whether the occupant 118 is distressed or will soon become distressed (e.g., be awakened). The occupant monitoring system 108 can, for example, provide the camera data to a machine-learned model (e.g., a deep neural network) to recognize actions of the occupant 118 that indicate current or imminent distress. For example, the occupant monitoring system 108 can monitor the camera data for certain facial expressions or strong or sudden arm or leg movements.
As another example, the occupant monitoring system 108 can use audio data from one or more microphones to monitor the occupant 118. A machine-learned model can be trained to recognize typical sounds associated with a baby or young child that is distressed or will soon be distressed. The occupant monitoring system 108 can also use the audio data to determine from which seat the distress sounds are coming. The occupant monitoring system 108 can also use thermographic camera data to monitor the body temperature of the occupant 118.
The display controller 110 can control data and user interface options provided to the driver 116 via the display 114. For example, the display controller 110 can process camera data to provide a cropped and/or enhanced image or video of the occupant 118 on the display 114. The image or video can be automatically provided to the driver in response to the occupant monitoring system 108 determining that the occupant 118 is distressed. The display controller 110 can also determine one or more user-selectable options to provide to the driver 116 via the display 114. For example, the display controller 110 can present one or more touchscreen buttons to the driver 116 to soothe or calm the occupant 118.
The feedback controller 112 can control feedback operations to calm or soothe the occupant 118. For example, the feedback controller 112 can adjust the interior lighting of the vehicle 102 by dimming or slightly illuminating the vehicle's interior. The feedback controller 112 can also adjust the color or hue of the interior lights to provide a more calming effect. Similarly, the feedback controller 112 can play music (e.g., white noise, nursery songs) to calm the occupant 118. The music can be loaded from the driver's smartphone, an infotainment system of the vehicle 102, or a remote computer system. The feedback controller 112 can also control vibration motors and/or other motors to provide a soothing motion or rocking sensation to the seat occupied by the occupant 118.
The display 114 provides the driver 116 with information regarding the occupant 118 and feedback control options. For example, the display 114 can be integrated into a center console or dashboard of the vehicle 102. As described above, the display 114 can provide an image or video of the occupant 118 when the occupant monitoring system 108 determines that the occupant 118 is distressed or may soon become distressed. The display 114 can also provide options for the driver 116 to soothe the occupant 118 without having to remove their attention from the roadway. In this way, the driver 116 can be alerted to the occupant's distress and calm the occupant 118 without removing their attention from the operation of the vehicle 102.
The interior sensors 104 can include one or more cameras 202, microphones 204, infrared (IR) cameras 206, and radar sensors 208. The cameras 202 can capture an image or video of the vehicle cabin. For example, the cameras 202 can provide color images of the vehicle cabin for presentation on the display 114. The IR cameras 206 can detect whether the occupant 118 is a living being. The IR cameras 206 can use an infrared light source (e.g., a vertical-cavity surface-emitting laser (VCSEL) or IR light-emitting diode (LED)) to provide lighting for interior sensing during low ambient light conditions. The IR cameras 206 can provide monochrome IR images that the occupant monitoring system 108 can process. The vehicle 102 can position the cameras 202 and the IR cameras 206 to have fields of view for regions of interest that correspond to the seating positions within the vehicle cabin. The vehicle 102 can include both the cameras 202 (e.g., red-green-blue (RGB) cameras) and the IR cameras 206 capturing the same seating position(s). In other implementations, an RGB camera and an IR camera can be integrated into a single camera device. The cameras 202 can also include three-dimensional time-of-flight cameras that measure the time for light pulses to leave the camera and reflect back to the camera's imaging array.
The microphones 204 can detect audio content in the vehicle 102, including sounds associated with an infant waking up or becoming agitated. The vehicle 102 can also include thermographic cameras or thermal cameras that detect a temperature of the occupant 118 based on a thermal image. The radar sensor 208 or another sensor (e.g., an ultrasonic sensor, a time-of-flight camera) can detect the heart rate of the occupant 118 by monitoring radio waves, including monitoring for an irregular heart rate of the occupant 118.
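As an illustrative sketch of how such heart-rate monitoring could flag irregularity, variability across successive inter-beat intervals can be compared to a threshold. The function name, interval representation, and threshold below are assumptions for illustration, not part of the described system:

```python
def irregular_heart_rate(beat_intervals_s, cv_threshold=0.15):
    """Flag an irregular heart rate from successive inter-beat intervals (seconds).

    A radar or time-of-flight sensor could supply the intervals; the coefficient
    of variation (standard deviation / mean) is a simple irregularity measure.
    The 0.15 threshold is an illustrative assumption.
    """
    mean = sum(beat_intervals_s) / len(beat_intervals_s)
    variance = sum((x - mean) ** 2 for x in beat_intervals_s) / len(beat_intervals_s)
    return (variance ** 0.5) / mean > cv_threshold
```

A steady rhythm (near-constant intervals) yields a coefficient of variation near zero, while widely varying intervals exceed the threshold.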
The processor 210 (e.g., an electronic control unit or control circuit) can be a microprocessor or a system-on-chip configured to perform the techniques described in this disclosure. The processor 210 can include one or more application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or one or more general-purpose hardware processors programmed to perform the techniques described herein. The processor 210 can execute computer-executable instructions stored in the CRM 212. The CRM 212 can be a memory or storage media, including non-volatile memory, such as electrically erasable programmable read-only memory (EEPROM) for storing one or more computer-executable instructions, routines, thresholds, and captured data. The CRM 212 may include other examples of non-volatile memory, such as flash memory, read-only memory (ROM), programmable read-only memory (PROM), and erasable PROM (EPROM). The processor 210 or the CRM 212 may also include volatile memory (e.g., dynamic random-access memory (DRAM), static random-access memory (SRAM)).
The processor 210 can process sensor data from the interior sensors 104 and determine whether the occupant 118 is distressed or about to become distressed. In response to driver input, the processor 210 can control the operation of the interior lights 214, the speakers 216, the rear entertainment system 218, and/or the seat motors 220 to calm the occupant 118.
The occupant calming system 106 can be stored in the CRM 212. As described for
The occupant monitoring system 108 can similarly use data from the microphones 204, the IR cameras 206, and the radar sensors 208 to determine occupant distress. The audio data from the microphones 204 can be used to monitor for sounds associated with a fussy child or a child about to become fussy. The thermographic camera data can indicate rising body temperatures or irregular body temperatures. Likewise, the occupant monitoring system 108 can identify irregular or raised heart rates or breathing that indicate occupant distress. As with the camera data, the occupant monitoring system 108 can use a machine-learned model to analyze the data from the microphones 204, the IR cameras 206, the radar sensors 208, and/or other sensors to detect occupant distress.
In response to the occupant monitoring system 108 determining that the occupant 118 is distressed or about to become distressed, the display controller 110 can control the display 114 to provide an image or video of the occupant 118 to the driver 116. The display controller 110 can also control the interior lights 214 to improve the visibility of the occupant 118 in the captured image or video. For example, the display controller 110 can slowly or partially illuminate the interior lights above or near the occupant 118 to improve the clarity of the captured image or video. The interior lights 214 can include cabin lighting in the ceiling, door panels, or other areas of the vehicle 102. The display controller 110 can also provide selectable options for the driver 116 to calm the occupant 118. The selectable options can include changing the interior lighting, playing music, playing a video, or rocking the occupant's seat. The driver 116 can select a particular option via a touchscreen of the display 114, audio input, a button near the display 114, or similar input means.
In response to the driver 116 selecting one of the selectable options, the feedback controller 112 can perform certain functions to try to calm the occupant 118. For example, the feedback controller 112 can control the interior lights 214 to dim or brighten the ambient lighting. The feedback controller 112 can also control a central entertainment system or the rear entertainment system 218 to play music or multimedia files. The rear entertainment system 218 can include the speakers 216 and/or a display dedicated to the occupant's seat or the rear portion of the vehicle 102. The music can include white noise, ocean noises, nursery songs, or any other audio file. The multimedia files can include a movie, content from a streaming service, a live video of the driver 116, or any other video data.
The feedback controller 112 can also control the seat motors 220 (e.g., actuators) in the occupant's seat to try to calm the occupant 118. The seat motors 220 can be integrated into the bottom and/or backrest of the seat. The feedback controller 112 can activate the seat motors 220 to mimic a rocking movement, vibrations, or other soothing motions to calm the occupant 118.
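A minimal sketch of a soothing seat-motion profile is shown below; the sinusoidal waveform, sample rate, and intensity cap are illustrative assumptions rather than the disclosed control scheme:

```python
import math

def rocking_pattern(duration_s=10.0, period_s=2.0, max_intensity=0.4, rate_hz=10):
    """Generate motor intensity samples (0..1) approximating a gentle rocking motion.

    All parameter names and defaults are illustrative, not from the disclosure.
    The smooth sinusoidal sway never exceeds max_intensity.
    """
    samples = []
    for i in range(int(duration_s * rate_hz)):
        t = i / rate_hz
        samples.append(max_intensity * 0.5 * (1 + math.sin(2 * math.pi * t / period_s)))
    return samples
```

Such a profile could, under these assumptions, be streamed to the seat motors 220 at the chosen sample rate so the intensity ramps up and down smoothly rather than pulsing abruptly.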
At 304, the occupant monitoring system 108 obtains audio data 302 from the microphones 204 in the vehicle 102. The occupant monitoring system 108 or a machine-learned model (e.g., a deep neural network) thereof can use the audio data 302 to detect sounds associated with the occupant 118 becoming or being distressed. For example, the occupant monitoring system 108 can determine that a child is awakening from a change in their breathing, soft or intermittent crying sounds, or similar sounds. The distress sounds can also include soft speaking or statements directed to the driver (e.g., “mom”) from a toddler or young child. The occupant monitoring system 108 can also utilize the audio data 302 to determine from which seat the distress sounds are coming. In response to detecting distress sounds, the occupant monitoring system 108 can forward the audio data 302 or the distress sounds to another machine-learned model or component thereof.
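One simple way to localize distress sounds to a seat is sketched here, under the assumption that each rear seat has an associated microphone reporting a normalized sound level; the seat labels and threshold are hypothetical:

```python
def locate_distress(mic_levels, threshold=0.2):
    """Return the seat whose microphone reports the strongest above-threshold level.

    mic_levels maps a seat label (hypothetical naming) to a normalized RMS sound
    level in 0..1. Returns None when no microphone exceeds the threshold.
    """
    seat, level = max(mic_levels.items(), key=lambda kv: kv[1])
    return seat if level >= threshold else None
```

A real system would more likely compare time-of-arrival or beamformed energy across microphones, but the loudest-channel heuristic conveys the idea of mapping audio data to a seat position.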
At 310, the occupant monitoring system 108 obtains camera data 308 (e.g., images or video) from the cameras 202 in the vehicle 102. The occupant monitoring system 108 or a machine-learned model thereof can use the camera data 308 to track key body points of the occupant 118. The key body points can include arms, shoulders, hips, head, eyes, or other body locations. For example, the occupant monitoring system 108 can analyze the camera data 308 to determine if the occupant 118 has extensive or strong body movements that indicate distress.
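A minimal sketch of the key-body-point movement check follows, assuming normalized (x, y) coordinates per tracked point and an illustrative movement threshold:

```python
import math

def movement_exceeds_threshold(prev_points, curr_points, threshold=0.15):
    """Return True if any tracked key body point moved farther than the threshold.

    Each points dict maps a body-point name (e.g., "head", "left_arm") to (x, y)
    in normalized image coordinates. Names and the 0.15 threshold are illustrative.
    """
    for name, (x0, y0) in prev_points.items():
        if name in curr_points:
            x1, y1 = curr_points[name]
            if math.hypot(x1 - x0, y1 - y0) > threshold:
                return True
    return False
```

In practice the key-point positions would come from a pose-estimation model applied to the camera data 308, with the threshold tuned to distinguish restless movement from ordinary shifting.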
At 312, the occupant monitoring system 108 obtains the audio data 302 or the distress sounds identified in operation 304, along with the tracking of key body points and biometric data 306. The biometric data 306 can include the body temperature of the occupant 118 from a thermographic camera or heart rate data from the radar sensor 208 or a time-of-flight camera.
The occupant monitoring system 108 or a machine-learned model can fuse the input data (or a subset thereof) and analyze it to detect whether the occupant 118 is distressed or becoming distressed. For example, the machine-learned model can be trained to associate certain movements of one or more key body points with the occupant 118 becoming or being distressed. Certain facial expressions or rapid arm or leg movements can be associated with distress or rising distress. Similarly, the biometric data 306 can be used to identify or complement a determination that the occupant 118 is distressed. For example, changes in heart rate or body temperature can be monitored to improve the occupant monitoring. To reduce false detections, the occupant monitoring system 108 or the machine-learned model can determine a probability that the occupant 118 is distressed or about to become distressed and compare the probability to a configurable threshold (e.g., seventy-five percent) before determining that the occupant 118 is distressed. The occupant monitoring system 108 or the machine-learned model can also use the input data to locate the occupant 118 in the vehicle 102.
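The fusion-and-threshold logic described above can be sketched as a weighted average of per-modality distress probabilities compared against a configurable threshold. The weighting scheme and names are assumptions; the 0.75 default mirrors the seventy-five-percent example:

```python
def is_distressed(scores, weights=None, threshold=0.75):
    """Fuse per-modality distress probabilities and compare to a threshold.

    scores maps a modality name (e.g., "audio", "camera") to that modality's
    estimated probability of distress. Unspecified weights default to 1.0.
    Returns (decision, fused_probability).
    """
    weights = weights or {}
    total = sum(weights.get(m, 1.0) for m in scores)
    fused = sum(p * weights.get(m, 1.0) for m, p in scores.items()) / total
    return fused >= threshold, fused
```

A learned fusion model could replace the weighted average; the threshold comparison to reduce false detections is the part carried over from the description above.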
The input data to operation 312, including the audio data 302, the biometric data 306, and the camera data 308, can be periodically updated by the occupant monitoring system 108 to ensure accurate and timely determinations of occupant distress. For example, the occupant monitoring system 108 can update the input data at fifteen-second intervals to account for temporary changes in occupant distress.
At 314, the occupant monitoring system 108 or the display controller 110 captures video with a region of interest (ROI) associated with the seat in which the occupant 118 is located. In particular, the display controller 110 can crop the camera data 308 to focus on the distressed occupant. For example, the audio data 302 can be used to determine the region of interest and which occupant has become distressed to assist with the cropping. In this way, the irrelevant background is removed from an ROI video 318 displayed to the driver 116.
At 316, the display controller 110 can also adjust the interior lights 214 in the vehicle 102 to improve the clarity of the video by improving the image exposure. The display controller 110 can also perform other image processing (e.g., enhancement, contrast adjustment) to improve the ROI video 318 of the occupant 118.
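A minimal sketch of the cropping and exposure-improvement steps, assuming a frame represented as a two-dimensional list of pixel intensities (the ROI tuple layout and gain value are illustrative):

```python
def crop_and_brighten(frame, roi, gain=1.2):
    """Crop a frame to a region of interest and apply a simple brightness gain.

    frame is a 2-D list of pixel intensities in 0..255; roi is an assumed
    (top, left, height, width) tuple. Brightened values are clamped to 255.
    """
    top, left, h, w = roi
    return [[min(255, int(px * gain)) for px in row[left:left + w]]
            for row in frame[top:top + h]]
```

A production implementation would operate on real image buffers (e.g., NumPy arrays) and apply proper contrast or exposure correction, but the crop-then-enhance order matches the processing described for the ROI video 318.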
The display controller 110 can then output the ROI video 318 to the display 114. For example, the ROI video 318 can be shown on a center console display. The display controller 110 can also cause one or more calming options 320 to be displayed with the ROI video 318 on the display 114. The calming options 320 are driver-selectable user interface (UI) options corresponding to actions that the feedback controller 112 can perform to try to calm the occupant 118. For example, the calming options 320 can include playing an audio or multimedia file for the occupant 118 using the speakers 216 and/or the rear entertainment system 218. The audio or multimedia file can be obtained from a local memory in the vehicle, a remote computer system (e.g., an online music service), or the driver's smartphone or another electronic device (e.g., a music app installed in an entertainment system of the vehicle 102). The audio file can be nursery music, white noise, or a favorite song of the occupant. The multimedia file can be a movie, television show episode, or other multimedia content.
The calming options 320 can also include displaying a live video of the driver's face to the rear entertainment system 218 so the driver can talk, sing, or otherwise calm the occupant 118. The calming options 320 can also include adjusting the ambient lighting by adjusting the interior lights 214 or raising or lowering automatic sunshades for the rear windows of the vehicle 102. Further, the calming options 320 can include applying a soothing motion pattern to the seat occupied by the occupant 118. For example, the feedback controller 112 can activate actuators or motors in or attached to the seat to induce a rocking or vibration sensation to the occupant 118. Similarly, the feedback controller 112 can activate motors to move the seat forward and backward to gently rock the occupant 118.
At 402, occupant data from one or more occupancy-monitoring sensors of a vehicle is obtained. For example, the occupant monitoring system 108 can obtain the audio data 302 from the microphones 204, the camera data 308 from the cameras 202 or the IR cameras 206, or the biometric data 306. The vehicle 102 can include additional interior sensors 104 to obtain occupant data for the rear passengers.
The occupant data can include the audio data 302 from the rear seat(s) of the vehicle or the camera data 308 of the rear seat(s) of the vehicle. The biometric data 306 can be obtained from the IR cameras 206, radar sensor 208, a thermographic camera, an ultrasonic sensor, or a time-of-flight camera. The biometric data 306 can include a body temperature, a breathing pattern, or the heart rate of the occupant 118.
At 404, it is determined whether an occupant of the vehicle is distressed based on the occupant data. The occupant is seated in a rear seat of the vehicle and can be an infant, toddler, or young child. For example, the occupant monitoring system 108 can determine whether the occupant 118 is distressed based on the occupant data. The occupant monitoring system 108 can provide the occupant data as an input to a machine-learned model. The machine-learned model can then determine whether changes in the occupant data are associated with distress. For example, the distress determination can be based on the audio data 302 and whether sounds or changes in breathing from the occupant are associated with distress. It can also include tracking, using the camera data 308, positions of one or more key body points of the occupant 118 and determining whether changes in the positions of the key body points are above a movement threshold. The key body points can include the head, shoulders, eyes, arms, legs, or hips of the occupant 118. Similarly, the distress determination can be based on whether changes in the biometric data 306 associated with the occupant 118 are associated with distress.
The machine-learned model can also track changes in the occupant data and determine a probability that the occupant is distressed based on those changes or a current data cycle of occupant data. The distress probability can then be compared to a distress threshold, which can be a configurable confidence value. The machine-learned model can determine that the occupant 118 is distressed based on a determination that the distress probability is greater than the distress threshold.
At 406, in response to determining that the occupant is distressed, an image or video of the occupant is displayed on a display located in a field of view of a driver of the vehicle. For example, the display controller 110 can display the ROI video 318 on the display 114 in response to a determination by the occupant monitoring system 108 that the occupant 118 is distressed. The display controller 110 can apply a processing action to the ROI video 318 before displaying it to improve its clarity. For example, the processing action can include cropping the ROI video 318 to focus on the occupant 118, adjusting the brightness or contrast of the ROI video 318, and/or adjusting the ambient lighting of the vehicle 102.
At 408, one or more selectable options for the driver to select are displayed on the display to calm or soothe the occupant. For example, the display controller 110 can provide selectable calming options 320 on the display 114 for calming the occupant 118. The calming options 320 can include playing an audio or multimedia file on the rear entertainment system 218 of the vehicle. Similarly, the calming options 320 can also include displaying an image or video of the driver 116 to the occupant via the rear entertainment system 218. The driver 116 can also be given the option to adjust the ambient lighting of the vehicle 102 by dimming or brightening the interior lights of the vehicle or raising or lowering sunshades of the vehicle 102. The calming options 320 can further include activating actuators or motors in or attached to the rear seat to introduce a vibration or motion pattern to the occupant 118. Similarly, the feedback controller 112 can activate motors in or attached to the rear seat to move the rear seat forward and backward to introduce a rocking motion.
As described with respect to
The display controller 110 can also display one or more calming options 320 on the display 114. In the illustrated environment 500, four calming options 320 are presented to the driver 116. In other implementations, fewer or additional calming options 320 can be presented. The types of calming options 320 are described with respect to
In response to the driver 116 selecting one or more of the calming options 320, the feedback controller 112 can perform the calming action associated with the selected calming option 320. For example, the feedback controller 112 can cause a real-time video of the driver 116 to be displayed on the rear entertainment system 218 of the vehicle 102 so the occupant 118 can see the driver's face. As another example, the feedback controller 112 can cause a particular multimedia file to be played on the speakers 216 or the rear entertainment system 218. In this way, the occupant calming system 106 can assist the driver 116 in monitoring and calming the distress of the occupant 118 with minimal distraction from operation of the vehicle 102.
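The selection-to-action flow can be sketched as a simple dispatch table mapping each calming option 320 to a callable; the option labels and handler shapes are assumptions for illustration:

```python
def handle_selection(option, handlers):
    """Dispatch a driver-selected calming option to its handler.

    handlers maps option labels (hypothetical names such as "play_music" or
    "driver_video") to zero-argument callables; unrecognized options are
    ignored and return None.
    """
    action = handlers.get(option)
    return action() if action is not None else None
```

Under this sketch, a handler for a "driver_video" option might route the cabin camera feed to the rear entertainment system 218, while a "rock_seat" handler might activate the seat motors 220.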
In the following section, examples are provided.
Example 1. A method comprising: obtaining occupant data from one or more occupancy-monitoring sensors of a vehicle; determining, based on the occupant data, whether an occupant of the vehicle is distressed, the occupant being seated in a rear seat of the vehicle; and in response to determining that the occupant is distressed, displaying, on a display located in a field of view of a driver of the vehicle, an image or video of the occupant.
Example 2. The method of Example 1, wherein: the one or more occupancy-monitoring sensors comprise one or more microphones; and the occupant data comprise audio data from the rear seat of the vehicle.
Example 3. The method of Example 2, wherein determining whether the occupant is distressed comprises determining, based on the audio data, whether sounds or changes in breathing from the occupant are associated with distress.
Example 4. The method of any one of the preceding Examples, wherein: the one or more occupancy-monitoring sensors comprise one or more cameras; and the occupant data comprise camera data of the rear seat of the vehicle.
Example 5. The method of Example 4, wherein: the method further comprises tracking, using the camera data, positions of one or more key body points of the occupant; and determining whether the occupant is distressed comprises determining whether changes in the positions of one or more of the key body points are above a movement threshold.
Example 6. The method of Example 5, wherein the key body points include at least one of a head, shoulders, eyes, arms, legs, or hips of the occupant.
Example 7. The method of any one of the preceding Examples, wherein: the one or more occupancy-monitoring sensors comprise at least one of an infrared camera, a radar sensor, a time-of-flight camera, a thermographic camera, or an ultrasonic sensor; the occupant data comprise biometric data associated with the occupant, the biometric data including at least one of a body temperature, a breathing pattern, or a heart rate of the occupant; and determining whether the occupant is distressed comprises determining, based on the biometric data, whether changes in the biometric data associated with the occupant are associated with distress.
Example 8. The method of any one of the preceding Examples, wherein determining whether the occupant is distressed comprises: providing the occupant data as an input to a machine-learned model, the occupant data comprising at least one of camera data, audio data, or biometric data associated with the occupant; and determining, by the machine-learned model, whether changes in the occupant data are associated with distress.
Example 9. The method of Example 8, wherein determining whether the changes in the occupant data are associated with distress comprises: determining, based on the changes in the occupant data and using the machine-learned model, a probability that the occupant is distressed; and determining whether the probability that the occupant is distressed is greater than a distress threshold, the distress threshold being a configurable confidence value.
Example 10. The method of any one of the preceding Examples, further comprising: before displaying the image or video of the occupant, applying a processing action to the image or video of the occupant, the processing action comprising at least one of: cropping the image or video to focus on the occupant; adjusting a brightness or contrast of the image or video; or adjusting ambient lighting of the vehicle.
Example 11. The method of any one of the preceding Examples, further comprising: displaying, on the display, one or more selectable options for the driver to select to calm or soothe the occupant.
Example 12. The method of Example 11, wherein the one or more selectable options comprise: playing an audio file in the vehicle; playing a multimedia file on a rear entertainment system of the vehicle; displaying an image or video of the driver to the occupant via the rear entertainment system; adjusting ambient lighting of the vehicle by dimming or brightening interior lights of the vehicle; raising or lowering sunshades of the vehicle; activating actuators or motors in or attached to the rear seat to introduce a vibration or motion pattern; or activating additional motors in or attached to the rear seat to move the rear seat forward and backward to introduce a rocking motion.
Example 13. The method of any one of the preceding Examples, wherein the occupant comprises an infant, a toddler, or a young child.
Example 14. A system, comprising: a processor configured to perform the method of any one of Examples 1 through 13.
Example 15. A non-transitory computer-readable medium that stores computer-executable instructions that, when executed by a processor of a vehicle, cause the processor to perform the method of any one of Examples 1 through 13.
While various embodiments of the disclosure are described in the foregoing description and shown in the drawings, it is to be understood that this disclosure is not limited thereto but may be variously embodied to practice within the scope of the following claims. From the foregoing description, it will be apparent that various changes may be made without departing from the spirit and scope of the disclosure as defined by the following claims.
The use of “or” and grammatically related terms indicates non-exclusive alternatives without limitation unless the context clearly dictates otherwise. As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiples of the same element (e.g., a-a, a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b, b-b-b, b-b-c, c-c, and c-c-c or any other ordering of a, b, and c).
This application claims the benefit under 35 U.S.C. 119(e) of U.S. Provisional Application No. 63/363,291 filed Apr. 20, 2022, the disclosure of which is hereby incorporated by reference in its entirety herein.