The present invention relates to a system and method for use of a sensor matrix in a vehicle interior with use cases for application.
The present invention relates to a system and method for use of a sensor matrix in a vehicle interior in which data is used to provide an output signal comprising an enhancement of the input signal and/or an augmentation of the input signal.
It is known to provide a system and method for use of sensor/detectors in a vehicle interior.
It would be advantageous to provide an improved system and method for use of a sensor matrix in a vehicle interior.
It would be advantageous to provide an improved system and method for use of a sensor matrix in a vehicle interior with interaction with at least one vehicle occupant.
It would be advantageous to provide an improved system and method for use of a sensor matrix in a vehicle interior to create a “smart cabin” environment with interactive connectivity to vehicle systems and/or networks for a vehicle occupant.
It would be advantageous to provide an improved system and method for use of a sensor matrix in a vehicle interior to provide for interactive connectivity to vehicle systems and/or networks in a wide variety of situations and/or use cases.
It would be advantageous to provide an improved system and method for use of a sensor matrix in a vehicle interior to provide for interactive connectivity to vehicle systems and/or networks in a wide variety of situations and/or use cases to provide for needs of vehicle owners and/or occupants and/or passengers/riders.
It would be advantageous to provide an improved system and method for use of a sensor matrix in a vehicle interior to provide for interactive connectivity to vehicle systems and/or networks in a wide variety of situations and/or use cases for vehicles such as personal vehicles or commercial vehicles or autonomous vehicles.
It would be advantageous to provide an improved system and method for use of a sensor matrix in a vehicle interior to provide for interactive connectivity to vehicle systems and/or networks in a wide variety of situations and/or use cases for vehicles such as personal vehicles or commercial vehicles or autonomous vehicles using data to provide an output signal comprising an enhancement of the input signal and/or an augmentation of the input signal.
It would be advantageous to provide an improved system and method for use of a sensor matrix in a vehicle interior to use data to provide an output signal comprising an enhancement of the input signal and/or an augmentation of the input signal in which the output signal may comprise a signal based on application of artificial intelligence and/or the output signal may comprise a signal based on application of augmented reality.
It would be advantageous to provide an improved system and method for use of a distributed sensor matrix configured to obtain data from within the vehicle; at least one data source may comprise the sensor matrix and/or a vehicle system and/or data storage and/or a network.
The present invention relates to a system for an interior of a vehicle comprising a vehicle system configured to use data from at least one data source and to provide a user interface configured for interaction with an occupant comprising a sensor arrangement comprising at least one sensor configured to obtain an input signal; and a computing system configured to process the input signal from the sensor to facilitate an operation and to provide an output signal. The output signal may be sent to at least one of (a) the vehicle system and/or (b) the user interface. The computing system may be configured to use data to provide an output signal comprising an enhancement of the input signal and/or an augmentation of the input signal. The output signal may comprise a signal based on application of artificial intelligence. The output signal may comprise a signal based on application of augmented reality. The sensor arrangement may comprise a distributed sensor matrix configured to obtain data from within the vehicle; data may comprise the input signal. The at least one data source may comprise the sensor matrix and/or a vehicle system and/or data storage and/or a network. The system may comprise a component configured to present the user interface; the component may comprise a user interface system configured to present the user interface; the user interface system may comprise an input device and/or an output device; the user interface system may comprise an input device and an output device; the user interface system may comprise an input device configured to obtain an input signal and an output device configured to present an output signal. The user interface system may comprise at least one sensor; the user interface system may comprise an input device; the input device may comprise at least one sensor. The user interface system may comprise an output device; the output device may comprise an information display. 
The sensor arrangement may comprise a set of sensors; the set of sensors may be configured to provide a sensor matrix; the sensor matrix may comprise a sensor field. The sensor matrix may comprise a distributed sensor matrix within the interior of the vehicle. The computing system may comprise a computing device with a processor. The processor may be configured to use data to provide an output signal comprising an enhancement of the input signal. The processor may be configured to operate with a data enhancement module configured with artificial intelligence to use data to provide an output signal comprising an enhancement of the input signal; the output signal may comprise information provided at a display. The processor may be configured to use data to operate with a data augmentation module configured for augmented reality to provide an output signal comprising an augmentation of the input signal; augmentation may comprise at least one of augmented audio and/or augmented video. The data source may comprise at least one of data storage and/or a network; the network may comprise the internet; the data source may comprise a language model for data enhancement comprising machine learning for artificial intelligence. The data source may comprise an augmented reality model for data augmentation; the input device may be configured to obtain data based on reality and the output device may be configured to present data based on augmented reality; the output signal may comprise a signal based on augmented reality. The input signal may comprise at least one of (a) proximity; (b) action; (c) motion; (d) sound; (e) condition; (f) characteristic; (g) presence/absence of occupant; (h) position of occupant; (i) interaction with occupant; (j) detected input from occupant; (k) directed input from occupant; (l) detected event; (m) detected condition; (n) vehicle interior control/activity; (o) smart cabin control/activity; (p) data. 
The operation may comprise at least one of (a) application of artificial intelligence; (b) application of augmented reality; (c) operation of vehicle system; (d) network communication; (e) instruction for vehicle system; (f) operation of vehicle systems; (g) interaction with external systems; (h) data storage; (i) data analytics/analysis; (j) comparison of data to threshold values; (k) monitoring; (l) providing a report based on the input signal; (m) vehicle interior control; (n) smart cabin control; (o) smart cabin activity. The output signal may comprise at least one of (a) audio-visual signal; (b) visual signal; (c) visual display; (d) audible signal; (e) haptic signal; (f) notice/alert; (g) communication to network; (h) communication to a device; (i) communication to a vehicle system; (j) data transmission; (k) data storage; (l) vehicle location; (m) vehicle status; (n) occupant status; (o) communication over a network; (p) report over a network; (q) information. The at least one sensor may comprise at least one of (a) a transducer; (b) a capacitive sensor; (c) an odor detector; (d) an accelerometer; (e) a microphone; (f) a camera; (g) a light detector; (h) an ultraviolet light detector; (i) a thermal detector; (j) an RFID detector; (k) a tag detector; (l) a sensor array; (m) a sensor matrix; (n) a multi-function sensor; (o) a composite sensor; (p) an audio-visual sensor; (q) a video recorder; (r) an audio recorder; (s) a thermal sensor; (t) a bio-metric sensor. The sensor/sensor arrangement may comprise a sensor matrix; the sensor matrix may comprise at least one of (a) a field; (b) multiple sensors; (c) multiple fields; (d) a field and at least one sensor; the input signal from the sensor may comprise a signal detected (a) by a sensor and/or (b) from within a field; the field may be provided in the interior of the vehicle. 
The user interface may comprise at least one of (a) a control for the occupant; (b) a display; (c) an audio system; (d) an audio-visual system; (e) an infotainment system; (f) a haptic interface; the vehicle system may comprise at least one of (a) a climate control system; (b) a navigation system; (c) a network; (d) a vehicle network; (e) a seating system; (f) a window control system; (g) a lighting system; (h) a communication system; (i) an autonomous vehicle operation system; (j) a steering system; (k) a drive system; (l) a powertrain system; (m) a starter system; (n) an audio system; (o) a display system; (p) an HVAC system; (q) an emergency warning system; (r) a safety system; (s) an occupant comfort system; (t) an occupant/sensory system; (u) a vehicle environment system; (v) a network/external communication system.
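By way of illustration only (the module names, field names, and routing rule below are hypothetical, not part of the claimed system), the processing path described above — an input signal from the sensor arrangement, optional enhancement and/or augmentation using data from a data source, and an output signal sent to the vehicle system or the user interface — may be sketched as:

```python
from dataclasses import dataclass, field

@dataclass
class InputSignal:
    kind: str                       # e.g. "sound", "motion", "presence/absence of occupant"
    payload: dict = field(default_factory=dict)

@dataclass
class OutputSignal:
    destination: str                # "vehicle_system" or "user_interface"
    content: dict = field(default_factory=dict)

def enhance(signal, data_source):
    """Data enhancement: combine the input signal with data from a data source."""
    return {**signal.payload, "context": data_source.get(signal.kind), "enhanced": True}

def augment(content, overlay):
    """Data augmentation: overlay augmented audio/video on the enhanced content."""
    return {**content, "overlay": overlay, "augmented": True}

def process(signal, data_source, augmented_reality=False):
    """Process the input signal to facilitate an operation and provide an output signal."""
    content = enhance(signal, data_source)
    if augmented_reality:
        content = augment(content, overlay="augmented_" + signal.kind)
    # Illustrative routing: augmented content is presented at the user
    # interface; other output signals are sent to a vehicle system.
    destination = "user_interface" if augmented_reality else "vehicle_system"
    return OutputSignal(destination, content)
```

For example, `process(InputSignal("sound", {"db": 62}), {"sound": "cabin"}, augmented_reality=True)` yields an `OutputSignal` destined for the user interface with both enhancement and augmentation applied.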
The present invention relates to a method for using a sensor matrix in an interior of a vehicle comprising a vehicle system using data from at least one data source and providing a user interface configured for interaction with an occupant comprising the steps of obtaining an input signal from the sensor matrix; processing the input signal into an output signal at a processor; and performing an operation relating to the input signal. The input signal may comprise at least one of (a) proximity; (b) action; (c) motion; (d) sound; (e) condition; (f) characteristic; (g) presence/absence of occupant; (h) position of occupant; (i) interaction with occupant; (j) detected input from occupant; (k) directed input from occupant; (l) detected event; (m) detected condition; (n) vehicle interior control/activity; (o) smart cabin control/activity; (p) data. The operation may comprise at least one of (a) application of artificial intelligence; (b) application of augmented reality; (c) operation of vehicle system; (d) network communication; (e) instruction for vehicle system; (f) operation of vehicle systems; (g) interaction with external systems; (h) data storage; (i) data analytics/analysis; (j) comparison of data to threshold values; (k) monitoring; (l) providing a report based on the input signal; (m) vehicle interior control; (n) smart cabin control; (o) smart cabin activity. The output signal may comprise at least one of (a) audio-visual signal; (b) visual signal; (c) visual display; (d) audible signal; (e) haptic signal; (f) notice/alert; (g) communication to network; (h) communication to a device; (i) communication to a vehicle system; (j) data transmission; (k) data storage; (l) vehicle location; (m) vehicle status; (n) occupant status; (o) communication over a network; (p) report over a network; (q) information. The processor may be operated by at least one of (a) a control program; (b) a software program. 
The at least one data source may comprise the sensor matrix and/or a vehicle system and/or data storage and/or a network.
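A minimal sketch of the three recited steps — obtaining an input signal from the sensor matrix, processing it into an output signal at a processor, and performing an operation relating to the input signal — follows; the data shapes (a list of readings with a `strength` field, a report dictionary, a log used for data storage) are invented purely for illustration:

```python
def obtain_input(sensor_matrix):
    """Step 1: obtain an input signal from the sensor matrix (strongest reading wins)."""
    return max(sensor_matrix, key=lambda reading: reading["strength"])

def process_input(input_signal):
    """Step 2: process the input signal into an output signal at a processor."""
    return {"report": input_signal["kind"], "value": input_signal["strength"]}

def perform_operation(output_signal, log):
    """Step 3: perform an operation relating to the input signal (here: data storage)."""
    log.append(output_signal)
    return log

# The three steps chained together on a toy sensor matrix.
matrix = [{"kind": "proximity", "strength": 0.2},
          {"kind": "sound", "strength": 0.9}]
log = perform_operation(process_input(obtain_input(matrix)), [])
```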
The present invention relates to a system for an interior of a vehicle comprising a vehicle system configured to use data from at least one data source and to provide a user interface configured for interaction with an occupant comprising a sensor arrangement comprising at least one sensor configured to obtain an input signal; and a computing system configured to process the input signal from the sensor to facilitate an operation and to provide an output signal. The output signal may be sent to at least one of (a) the vehicle system and/or (b) the user interface. The computing system may be configured to use data to provide an output signal comprising an enhancement of the input signal and/or an augmentation of the input signal. The output signal may comprise a signal based on application of artificial intelligence. The output signal may comprise a signal based on application of augmented reality. The sensor arrangement may comprise a distributed sensor matrix configured to obtain data from within the vehicle. Data may comprise the input signal. At least one data source may comprise the sensor matrix and/or a vehicle system and/or data storage and/or a network. The system may comprise a component configured to present the user interface. The component may comprise a user interface system configured to present the user interface. The user interface system may comprise an input device and/or an output device. The user interface system may comprise an input device and an output device. The user interface system may comprise an input device configured to obtain an input signal and an output device configured to present an output signal. The user interface system may comprise at least one sensor; the user interface system may comprise an input device; the input device may comprise at least one sensor. The user interface system may comprise an output device; the output device may comprise an information display. 
The sensor arrangement may comprise a set of sensors; the set of sensors may be configured to provide a sensor field. The sensor arrangement may comprise a sensor matrix; the sensor matrix may comprise a sensor field. The sensor matrix may comprise a distributed sensor matrix within the interior of the vehicle. The computing system may comprise a computing device with a processor. The processor may be configured to use data to provide an output signal comprising an enhancement of the input signal. The processor may be configured to operate with a data enhancement module configured for machine learning to use data to provide an output signal comprising an enhancement of the input signal. The processor may be configured to operate with a data enhancement module configured with artificial intelligence to use data to provide an output signal comprising an enhancement of the input signal. The processor may be configured to operate with a data enhancement module configured with generative artificial intelligence to use data to provide an output signal comprising an enhancement of the input signal; the input signal may comprise a prompt and the output signal may comprise information generated by the data enhancement module from the prompt. The output signal may comprise information provided at a display. The processor may be configured to use data to provide an output signal comprising an augmentation of the input signal. The processor may be configured to use data to operate with a data augmentation module configured for augmented reality to provide an output signal comprising an augmentation of the input signal. The processor may be configured to use data to operate with a data augmentation module configured for augmented reality to provide an output signal comprising an augmentation of the input signal; augmentation may comprise application of data from a data source. 
The processor may be configured to use data to operate with a data augmentation module configured for augmented reality to provide an output signal comprising an augmentation of the input signal; augmentation may comprise at least one of augmented audio and/or augmented video. The output signal may comprise information provided at a display. The data source may comprise at least one of data storage and/or a network; the network may comprise the internet. The data source may comprise a language model for data enhancement comprising machine learning for artificial intelligence. The data source may comprise an augmented reality model for data augmentation. The input device may be configured to obtain data based on reality and the output device may be configured to present data based on augmented reality. The output signal may comprise a signal based on augmented reality. The input signal may comprise at least one of (a) proximity; (b) action; (c) motion; (d) sound; (e) condition; (f) characteristic; (g) presence/absence of occupant; (h) position of occupant; (i) interaction with occupant; (j) detected input from occupant; (k) directed input from occupant; (l) detected event; (m) detected condition; (n) vehicle interior control/activity; (o) smart cabin control/activity; (p) data. The operation may comprise at least one of (a) operation of vehicle system; (b) network communication; (c) instruction for vehicle system; (d) operation of vehicle systems; (e) interaction with external systems; (f) data storage; (g) data analytics/analysis; (h) comparison of data to threshold values; (i) monitoring; (j) providing a report based on the input signal; (k) vehicle interior control; (l) smart cabin control; (m) smart cabin activity. The operation may comprise at least one of (a) application of artificial intelligence; (b) application of augmented reality. 
The output signal may comprise at least one of (a) audio-visual signal; (b) visual signal; (c) visual display; (d) audible signal; (e) haptic signal; (f) notice/alert; (g) communication to network; (h) communication to a device; (i) communication to a vehicle system; (j) data transmission; (k) data storage; (l) vehicle location; (m) vehicle status; (n) occupant status; (o) communication over a network; (p) report over a network; (q) information. The at least one sensor may comprise at least one of (a) a transducer; (b) a capacitive sensor; (c) an odor detector; (d) an accelerometer; (e) a microphone; (f) a camera; (g) a light detector; (h) an ultraviolet light detector; (i) a thermal detector; (j) an RFID detector; (k) a tag detector. The at least one sensor may comprise at least one of (a) a sensor array; (b) a sensor matrix; (c) a multi-function sensor; (d) a composite sensor; (e) an audio-visual sensor; (f) a video recorder; (g) an audio recorder; (h) a thermal sensor; (i) a bio-metric sensor. The sensor may comprise a sensor matrix; the sensor matrix may comprise at least one of (a) a field; (b) multiple sensors; (c) multiple fields; (d) a field and at least one sensor; the input signal from the sensor may comprise a signal detected (a) by a sensor and/or (b) from within a field; the field may be provided in the interior of the vehicle. The input device may comprise a control for the occupant. The user interface may comprise at least one of (a) a display; (b) an audio system; (c) an audio-visual system; (d) an infotainment system; (e) a haptic interface. The processor may be operated by at least one of (a) a control program; (b) a software program. 
The vehicle system may comprise at least one of (a) a climate control system; (b) a navigation system; (c) a network; (d) a vehicle network; (e) a seating system; (f) a window control system; (g) a lighting system; (h) a communication system; (i) an autonomous vehicle operation system; (j) a steering system; (k) a drive system; (l) a powertrain system; (m) a starter system; (n) an audio system; (o) a display system; (p) an HVAC system; (q) an emergency warning system; (r) a safety system; (s) an occupant comfort system; (t) an occupant/sensory system; (u) a vehicle environment system; (v) a network/external communication system. The operation may comprise operation of the vehicle and the output signal may comprise at least one of (a) an alert signal; (b) an emergency communication; (c) vital signs of the occupant.
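As one non-limiting illustration of the preceding sentence, an output signal carrying an alert signal, an emergency communication, and the occupant's vital signs might be assembled as follows; the parameter names and threshold ranges are invented for the example:

```python
def emergency_output(vital_signs, normal_ranges):
    """Compare occupant vital signs to threshold ranges and build the output signal.

    vital_signs:   mapping of parameter name -> measured value
    normal_ranges: mapping of parameter name -> (low, high) threshold values
    """
    out_of_range = {name: value for name, value in vital_signs.items()
                    if not (normal_ranges[name][0] <= value <= normal_ranges[name][1])}
    if not out_of_range:
        return {"alert_signal": False}
    # Any out-of-range parameter triggers an emergency communication that
    # includes the full set of vital signs.
    return {"alert_signal": True,
            "emergency_communication": {"vital_signs": vital_signs,
                                        "out_of_range": out_of_range}}
```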
The present invention relates to a method for using a sensor matrix in an interior of a vehicle comprising a vehicle system using data from at least one data source and providing a user interface configured for interaction with an occupant comprising the steps of obtaining an input signal from the sensor matrix; processing the input signal into an output signal; and performing an operation relating to the input signal. Processing the input signal into an output signal may comprise using data to provide the output signal comprising an enhancement of the input signal and/or an augmentation of the input signal. The input signal may comprise at least one of (a) proximity; (b) action; (c) motion; (d) sound; (e) condition; (f) characteristic; (g) presence/absence of occupant; (h) position of occupant; (i) interaction with occupant; (j) detected input from occupant; (k) directed input from occupant; (l) detected event; (m) detected condition; (n) vehicle interior control/activity; (o) smart cabin control/activity; (p) data. The operation may comprise at least one of (a) operation of vehicle system; (b) network communication; (c) instruction for vehicle system; (d) operation of vehicle systems; (e) interaction with external systems; (f) data storage; (g) data analytics/analysis; (h) comparison of data to threshold values; (i) monitoring; (j) providing a report based on the input signal; (k) vehicle interior control; (l) smart cabin control; (m) smart cabin activity. The output signal may comprise at least one of (a) audio-visual signal; (b) visual signal; (c) visual display; (d) audible signal; (e) haptic signal; (f) notice/alert; (g) communication to network; (h) communication to a device; (i) communication to a vehicle system; (j) data transmission; (k) data storage; (l) vehicle location; (m) vehicle status; (n) occupant status; (o) communication over a network; (p) report over a network; (q) information. 
At least one data source may comprise the sensor matrix and/or a vehicle system and/or data storage and/or a network.
The present invention relates to a method for using a sensor matrix in an interior of a vehicle comprising a vehicle system using data from at least one data source and providing a user interface configured for interaction with an occupant comprising the steps of obtaining an input signal from the sensor matrix; processing the input signal into an output signal; and performing an operation relating to the input signal. The input signal may comprise at least one of (a) proximity; (b) action; (c) motion; (d) sound; (e) condition; (f) characteristic; (g) presence/absence of occupant; (h) position of occupant; (i) interaction with occupant; (j) detected input from occupant; (k) directed input from occupant; (l) detected event; (m) detected condition; (n) vehicle interior control/activity; (o) smart cabin control/activity; (p) data. The operation may comprise at least one of (a) operation of vehicle system; (b) network communication; (c) instruction for vehicle system; (d) operation of vehicle systems; (e) interaction with external systems; (f) data storage; (g) data analytics/analysis; (h) comparison of data to threshold values; (i) monitoring; (j) providing a report based on the input signal; (k) vehicle interior control; (l) smart cabin control; (m) smart cabin activity. The output signal may comprise at least one of (a) audio-visual signal; (b) visual signal; (c) visual display; (d) audible signal; (e) haptic signal; (f) notice/alert; (g) communication to network; (h) communication to a device; (i) communication to a vehicle system; (j) data transmission; (k) data storage; (l) vehicle location; (m) vehicle status; (n) occupant status; (o) communication over a network; (p) report over a network; (q) information. The step of processing the input signal into an output signal may comprise use of data to provide the output signal comprising an enhancement of the input signal and/or an augmentation of the input signal. 
At least one data source may comprise the sensor matrix and/or a vehicle system and/or data storage and/or a network.
The present invention relates to a system for using a sensor configured to obtain an input signal in an interior of a vehicle comprising a vehicle system and providing a user interface configured for interaction with an occupant comprising a processor configured to process an input signal from the sensor to facilitate an operation and to provide an output signal. The output signal may be sent to at least one of (a) the vehicle system and/or (b) the user interface. The input signal may comprise at least one of (a) proximity, (b) action, (c) motion, (d) sound; (e) condition, (f) characteristic, (g) presence/absence of occupant, (h) position of occupant, (i) interaction with occupant, (j) detected input from occupant, (k) directed input from occupant, (l) detected event; (m) detected condition; (n) vehicle interior control/activity, (o) smart cabin control/activity. The operation may comprise at least one of (a) operation of vehicle system, (b) network communication, (c) instruction for vehicle system, (d) operation of vehicle systems, (e) interaction with external systems, (f) data storage, (g) data analytics/analysis, (h) comparison of data to threshold values; (i) monitoring; (j) providing a report based on the input signal; (k) vehicle interior control, (l) smart cabin control, (m) smart cabin activity. The output signal may comprise at least one of (a) audio-visual signal, (b) visual signal, (c) visual display, (d) audible signal, (e) haptic signal, (f) notice/alert, (g) communication to network, (h) communication to device, (i) communication to vehicle system; (j) data transmission, (k) data storage; (l) vehicle location; (m) vehicle status; (n) occupant status; (o) communication over a network; (p) report over a network. The sensor may comprise a sensor array. The sensor may comprise a sensor matrix. The sensor may comprise at least one sensor. 
The at least one sensor may comprise at least one of (a) a transducer; (b) a capacitive sensor; (c) an odor detector; (d) an accelerometer; (e) a microphone; (f) a camera; (g) a light detector; (h) an ultraviolet light detector; (i) a thermal detector; (j) an RFID detector; (k) a tag detector. The at least one sensor may comprise at least one of (a) a sensor array; (b) a sensor matrix; (c) a multi-function sensor; (d) a composite sensor; (e) an audio-visual sensor; (f) a video recorder; (g) an audio recorder; (h) a thermal sensor; (i) a bio-metric sensor. The sensor may comprise a sensor matrix. The sensor matrix may comprise a field. The sensor matrix may comprise multiple sensors. The sensor matrix may comprise multiple fields. The sensor matrix may comprise a field and at least one sensor. The input signal from the sensor matrix may comprise a signal detected (a) by a sensor and/or (b) from within a field. The field may be provided in the interior of the vehicle. The sensor matrix may be installed and/or operated with a vehicle interior component. The user interface may comprise an input device and an output device. The input device may comprise a control for the occupant. The input device may comprise a virtual switch; the virtual switch may be configured to be operated by at least one of (a) gesture detection and/or (b) movement by the occupant. The output device may comprise a display. The user interface may be configured to obtain the input signal. The user interface may be configured to present the output signal. The user interface may comprise at least one of (a) a display; (b) an audio system; (c) an audio-visual system; (d) an infotainment system. The user interface may comprise a haptic interface. The vehicle interior component may comprise at least one of (a) a trim panel; (b) an instrument panel; (c) a door panel; (d) a console; (e) a floor console; (f) an overhead console; (g) an overhead system; (h) a seat; (i) a steering wheel; (j) a door pillar. 
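The virtual switch operated by gesture detection described above may be sketched as follows; the swipe classifier, the normalized 0–1 coordinate scale, and the travel threshold are illustrative assumptions rather than a claimed algorithm:

```python
def classify_gesture(x_positions, min_travel=0.15):
    """Classify a swipe from a sequence of normalized hand x-positions (0..1)."""
    travel = x_positions[-1] - x_positions[0]
    if travel >= min_travel:
        return "swipe_right"
    if travel <= -min_travel:
        return "swipe_left"
    return None   # movement too small to count as a directed input

class VirtualSwitch:
    """A switch with no physical control surface, toggled by occupant movement."""
    def __init__(self):
        self.on = False

    def handle(self, x_positions):
        gesture = classify_gesture(x_positions)
        if gesture == "swipe_right":
            self.on = True
        elif gesture == "swipe_left":
            self.on = False
        return self.on
```

A swipe to the right turns the switch on, a swipe to the left turns it off, and small jitter leaves the state unchanged.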
The processor may be operated by a control program. The control program may comprise a software program. The vehicle system may comprise at least one of (a) a climate control system; (b) a navigation system; (c) a network; (d) a vehicle network; (e) a seating system; (f) a window control system; (g) a lighting system; (h) a communication system; (i) an autonomous vehicle operation system; (j) a steering system; (k) a drive system; (l) a powertrain system; (m) a starter system; (n) an audio system; (o) a display system; (p) an HVAC system; (q) an emergency warning system; (r) a safety system; (s) an occupant comfort system; (t) an occupant/sensory system; (u) a vehicle environment system; (v) a network/external communication system. The vehicle may comprise an autonomous vehicle. The sensor may comprise a sensor matrix configured to obtain an input signal from within the vehicle interior. The input signal may comprise vital signs of the occupant. The operation may comprise operation of a vehicle system. The output signal may comprise a report. The operation may comprise operation of the vehicle and the output signal may comprise an alert signal. The alert signal may comprise an emergency communication. The operation may comprise taking autonomous control of the vehicle. The operation may comprise parking the vehicle. The emergency communication may comprise vital signs of the occupant. The report may comprise vehicle location. The input signal may comprise detection of an event in the vehicle. The operation may comprise determination of the event and the output signal may comprise a report. The event may comprise a potential medical concern for the occupant. The potential medical concern may be an illness of the occupant. The operation may comprise taking autonomous control of the vehicle. The operation may comprise remediation of the event for the vehicle. 
Remediation of the event for the vehicle may comprise sanitizing the vehicle. The report may comprise a communication of vehicle location. The input signal may comprise incapacitation of the occupant. The operation may comprise determination of the condition and the output signal may comprise a report relating to the condition. The incapacitation may be of the operator of the vehicle, and the operation may comprise taking control of the vehicle. The report may comprise an emergency communication. The output signal may comprise activating an emergency signal for the vehicle. The report may comprise the location of the vehicle. The input signal may be based on a physical parameter of the occupant. The operation may comprise comparison of the physical parameter of the occupant to a threshold value for the physical parameter. The input signal may comprise detection relating to a status of an item. The operation may comprise determination of the status of the item and the output signal may comprise a report based on the status of the item. The input signal may comprise the status of the item; the status of the item may comprise either (a) detected or (b) not detected. If the status of the item is not detected, the operation may comprise monitoring for the item and the report may comprise an alert that the item is not detected. If the status of the item is detected, the report may comprise a communication to a contact point; the contact point may be in communication over a network. The report may comprise an electronic message. The input signal may be based on a tag associated with the item. The tag may comprise an RFID tag. The input signal may comprise detection of an incident and the operation may comprise determination of the incident and the output signal may comprise a report on the incident. The incident may comprise a vehicle collision. The operation may comprise monitoring of the status of the vehicle occupant. 
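The item-status branch described above — determining whether a tagged item is detected or not detected, monitoring for a missing item, and sending a report to a contact point over a network — might look like this in outline; the tag identifiers, contact-point address, and message formats are invented for the sketch:

```python
def item_status(detected_tags, item_tag):
    """Determine the status of an item from the tags currently read in the cabin."""
    return "detected" if item_tag in detected_tags else "not detected"

def item_report(status, contact_point):
    """Build the operation and report for the determined item status."""
    if status == "not detected":
        # Keep monitoring and alert that the item is absent.
        return {"operation": "monitor for the item",
                "report": {"alert": "item is not detected"}}
    # Item found: send an electronic message to the contact point.
    return {"operation": "communicate to contact point",
            "report": {"to": contact_point,
                       "electronic_message": "item detected in vehicle"}}
```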
The report may comprise a communication with emergency responders. The input signal may comprise data relating to the incident. The operation may comprise analysis of the incident. The report may comprise data relating to the incident. The report may comprise analysis of the incident. The input signal may comprise detection of tampering with the vehicle. The operation may comprise verification of tampering with the vehicle and the output signal may comprise a report relating to tampering with the vehicle. The report may be a communication to a law enforcement agency. The input signal may comprise a video recording of the tampering. The report may comprise the video recording of the tampering. The operation may comprise determination of damage to the vehicle. The report may comprise determination of damage to the vehicle. The input signal may comprise detection of attention of an operator of the vehicle. The operation may comprise assessment of attention of the operator and the output signal may comprise a report based on attention of the operator. If the attention of the operator is below a threshold value, the report may comprise an alert. The threshold value may comprise an awake state. The threshold value may comprise a distracted state. The report may comprise a signal at the user interface as an alert to the operator. The operation may comprise haptic feedback for the operator. The output signal may comprise a sound at the user interface. The input signal may comprise detection of a potential hazard. The operation may comprise determination of the potential hazard and the output signal may comprise a report relating to the potential hazard. The operation may comprise determination of status of the occupant; the report may comprise status of the occupant. Determination of the potential hazard may comprise classification of the potential hazard. Determination of the potential hazard may comprise a traffic-related hazard and the report may comprise a warning. 
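The attention-assessment branch described above reduces to a threshold comparison; the 0–1 attention score, the default threshold, and the alert channels are assumptions made only for illustration:

```python
def attention_report(attention_score, threshold=0.5):
    """Assess operator attention against a threshold and build the output signal.

    A score below the threshold (e.g. a drowsy or distracted state) produces an
    alert at the user interface: a sound plus haptic feedback for the operator.
    """
    if attention_score < threshold:
        return {"alert": True, "user_interface": ["sound", "haptic feedback"]}
    return {"alert": False, "user_interface": []}
```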
The input signal may comprise detection of a vehicle condition. The operation may comprise determination of the vehicle condition and the output signal may comprise a report relating to the vehicle condition. The vehicle condition may be reported to the occupant of the vehicle. The user interface may comprise a gesture-operated interface for the occupant. The vehicle condition may relate to status of a safety system of the vehicle reported to the occupant. The input signal may comprise detection of a vehicle condition. The operation may comprise determination of the vehicle condition and the output signal may comprise a report relating to the vehicle condition. The vehicle condition may comprise autonomous operation of the vehicle. The input signal may comprise detection of readiness of an occupant to operate the vehicle. Readiness may comprise status of the occupant. Status may comprise position of the occupant. The report may comprise an alert to the occupant. The alert may comprise a communication to take control of the vehicle.
The present invention relates to a method for using a sensor matrix in an interior of a vehicle comprising a vehicle system and providing a user interface configured for interaction with an occupant comprising the steps of obtaining an input signal from the sensor matrix, processing the input signal into an output signal, and performing an operation relating to the input signal. The input signal may comprise at least one of (a) proximity, (b) action, (c) motion, (d) sound; (e) condition, (f) characteristic, (g) presence/absence of occupant, (h) position of occupant, (i) interaction with occupant, (j) detected input from occupant, (k) directed input from occupant, (l) detected event; (m) detected condition; (n) vehicle interior control/activity, (o) smart cabin control/activity. The operation may comprise at least one of (a) operation of vehicle system, (b) network communication, (c) instruction for vehicle system, (d) operation of vehicle systems, (e) interaction with external systems, (f) data storage, (g) data analytics/analysis, (h) comparison of data to threshold values; (i) monitoring; (j) providing a report based on the input signal; (k) vehicle interior control, (l) smart cabin control, (m) smart cabin activity. The output signal may comprise at least one of (a) audio-visual signal, (b) visual signal, (c) visual display, (d) audible signal, (e) haptic signal, (f) notice/alert, (g) communication to network, (h) communication to device, (i) communication to vehicle system; (j) data transmission, (k) data storage; (l) vehicle location; (m) vehicle status; (n) occupant status; (o) communication over a network; (p) report over a network. The method may comprise the step of performing an operation based on the output signal. The step of performing an operation relating to the input signal may comprise performing an operation based on the output signal. The step of processing the input signal into an output signal may comprise use of a database. 
The step of performing an operation may comprise providing a communication. The communication may comprise a report based on the input signal. The output signal may comprise the report. A system comprising the sensor matrix may be provided in the interior of the vehicle. The method may comprise the step of activating a field for the sensor matrix. The method may comprise the step of actuating a field for the sensor matrix. The step of obtaining an input signal may comprise detecting the input signal. The input signal may comprise a signal representative of at least one of (a) motion by the occupant; (b) action by the occupant; (c) a condition in the interior; (d) a condition of the occupant; (e) a characteristic of the occupant. The method may comprise the step of conditioning the input signal. The step of conditioning the input signal may comprise filtering the input signal. Filtering the input signal may comprise at least one of (a) separating noise and/or (b) calibration. The step of processing the input signal may comprise calibration. The step of performing an operation may comprise performing an operation at the vehicle system. The method may comprise the step of providing an output at the user interface based on the output signal. The method may comprise the step of interaction at the user interface. The sensor matrix may comprise a sensor. The sensor matrix may comprise at least one sensor. The at least one sensor may comprise at least one of (a) a transducer; (b) a capacitive sensor; (c) an odor detector; (d) an accelerometer; (e) a microphone; (f) a camera; (g) a light detector; (h) an ultraviolet light detector; (i) a thermal detector; (j) RFID detector; (k) tag detector. The at least one sensor may comprise at least one of (a) a sensor array; (b) a sensor matrix; (c) a multi-function sensor; (d) a composite sensor; (e) an audio-visual sensor; (f) a video recorder; (g) an audio recorder; (h) a thermal sensor; (i) a bio-metric sensor. 
The sensor matrix may comprise a field. The sensor matrix may comprise multiple sensors. The sensor matrix may comprise multiple fields. The sensor matrix may comprise a field and at least one sensor. The input signal from the sensor matrix may comprise a signal detected (a) by a sensor and/or (b) from within a field. Obtaining the input signal from the sensor matrix may comprise detecting (a) by a sensor and/or (b) from within a field. The method may comprise the step of providing a field in the interior of the vehicle. The step of providing the field may be performed by the sensor matrix. The user interface may comprise at least one of (a) a display; (b) an audio system; (c) an audio-visual system; (d) an infotainment system. The step of processing the input signal may be performed on a processor. The processor may be operated by a control program. The step of processing the input signal may be performed by a control program. The step of processing the input signal may be performed on a controller. The vehicle system may comprise at least one of (a) a climate control system; (b) a navigation system; (c) a network; (d) a vehicle network; (e) a seating system; (f) a window control system; (g) a lighting system; (h) a communication system; (i) an autonomous vehicle operation system; (j) a steering system; (k) a drive system; (l) a powertrain system; (m) a starter system; (n) an audio system; (o) a display system; (p) an HVAC system; (q) an emergency warning system; (r) a safety system; (s) an occupant comfort system; (t) an occupant/sensory system; (u) a vehicle environment system; (v) a network/external communication system. The method may comprise the step of communication of an output based on the output signal to a network. The network may comprise a telecommunication network. The network may comprise access through a mobile device. The output may comprise a telecommunication signal. The telecommunication signal may comprise an emergency signal. 
The output may comprise a network communication. The input signal may comprise incapacitation of the occupant. The operation may comprise determination of the condition and the output signal may comprise a report relating to the condition. The incapacitation may be of the operator of the vehicle and the operation may comprise taking control of the vehicle. The input signal may comprise detection relating to a status of an item. The operation may comprise determination of the status of the item and the output signal may comprise a report based on the status of the item. The input signal may comprise the status of the item; the status of the item may comprise either (a) detected or (b) not detected. If the status of the item is not detected the operation may comprise monitoring for the item and the report may comprise an alert that the item is not detected. The input signal may comprise detection of an incident. The operation may comprise determination of the incident and the output signal may comprise a report on the incident. The incident may comprise a vehicle collision. The input signal may comprise detection of tampering with the vehicle. The operation may comprise verification of tampering with the vehicle and the output signal may comprise a report relating to tampering with the vehicle. The operation may comprise determination of damage to the vehicle. The report may comprise determination of damage to the vehicle. The input signal may comprise detection of attention of an operator of the vehicle. The operation may comprise assessment of attention of the operator and the output signal may comprise a report based on attention of the operator. If the attention of the operator is below a threshold value the report may comprise an alert. The threshold value may comprise an awake state. The threshold value may comprise a distracted state. The output signal may comprise a sound at the user interface. The input signal may comprise detection of a potential hazard. 
The operation may comprise determination of the potential hazard and the output signal may comprise a report relating to the potential hazard. The operation may comprise determination of status of the occupant; the report may comprise status of the occupant. The input signal may comprise detection of a vehicle condition. The operation may comprise determination of the vehicle condition and the output signal may comprise a report relating to the vehicle condition.
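The claimed sequence of steps (obtaining an input signal from the sensor matrix, conditioning it, processing it into an output signal, and performing an operation based on the output signal) may be sketched as follows. All names, values, and the particular threshold comparison are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch of the disclosed method: obtain an input signal from a
# sensor matrix, condition/filter it, process it into an output signal, and
# perform an operation. All names and values here are hypothetical.

def condition_signal(raw, noise_floor=0.05):
    """Condition the input signal by filtering out values below a noise floor."""
    return [v for v in raw if abs(v) >= noise_floor]

def process_signal(samples, threshold):
    """Compare the conditioned signal against a threshold value and produce
    an output signal (here, a simple report dictionary)."""
    peak = max(samples) if samples else 0.0
    return {"peak": peak, "alert": peak > threshold}

def perform_operation(output_signal):
    """Perform an operation based on the output signal (here, build a report)."""
    if output_signal["alert"]:
        return "ALERT: value {:.2f} exceeds threshold".format(output_signal["peak"])
    return "OK"

raw = [0.01, 0.4, 0.9, 0.02]  # hypothetical input signal from the sensor matrix
out = process_signal(condition_signal(raw), threshold=0.8)
report = perform_operation(out)
```

The same pipeline shape fits each use case below: only the sensor modality, the threshold logic, and the operation differ.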
The present invention relates to a system for using a sensor matrix in an interior of a vehicle comprising a vehicle system and providing a user interface configured for interaction with an occupant comprising (a) the sensor matrix configured to obtain an input signal from within the vehicle interior and (b) a processor configured to process the input signal and to provide an output signal. The output signal may be sent to at least one of (a) the vehicle system and/or (b) the user interface. The input signal may comprise at least one of (a) proximity, (b) action, (c) motion, (d) condition, (e) characteristic, (f) presence/absence of occupant, (g) position, (h) interaction with occupant, (i) detected input from occupant, (j) directed input from occupant, (k) vehicle interior control/activity, (l) smart cabin control/activity. The operation may comprise at least one of (a) operation of vehicle system, (b) network communication, (c) instruction for vehicle system, (d) operation of vehicle systems, (e) interaction with external systems, (f) data storage, (g) data analytics/analysis, (h) smart cabin control. The output signal may comprise at least one of (a) audio-visual signal, (b) visual signal, (c) visual display, (d) audible signal, (e) haptic signal, (f) notice/alert, (g) communication to network, (h) communication to device, (i) communication to vehicle system; (j) data transmission, (k) data storage. The sensor matrix may comprise at least one sensor. The sensor may comprise at least one of (a) a transducer; (b) a capacitive sensor; (c) an odor detector; (d) an accelerometer; (e) a microphone; (f) a camera; (g) a light detector; (h) an ultraviolet light detector. The sensor matrix may comprise a field. The sensor matrix may comprise multiple sensors. The sensor matrix may comprise multiple fields. The sensor matrix may comprise a field and at least one sensor. 
The input signal from the sensor matrix may comprise a signal detected (a) by a sensor and/or (b) from within a field. The field may be provided in the interior of the vehicle. The sensor matrix may be installed and/or operated with a vehicle interior component. The vehicle interior component may comprise at least one of (a) a trim panel; (b) an instrument panel; (c) a door panel; (d) a console; (e) a floor console; (f) an overhead console; (g) an overhead system; (h) a seat; (i) a steering wheel. The user interface may comprise at least one of (a) a display; (b) an audio system; (c) an audio-visual system; (d) an infotainment system. The user interface may comprise a haptic interface. The processor may be operated by a control program. The control program may comprise a software program. The vehicle system may comprise at least one of (a) a climate control system; (b) a navigation system; (c) a network; (d) a vehicle network; (e) a seating system; (f) a window control system; (g) a lighting system; (h) a communication system; (i) an autonomous vehicle operation system; (j) a steering system; (k) a drive system; (l) a powertrain system; (m) a starter system; (n) an audio system; (o) a display system; (p) an HVAC system; (q) an emergency warning system; (r) a safety system; (s) an occupant comfort system; (t) an occupant/sensory system; (u) a vehicle environment system; (v) a network/external communication system; (w) location/navigation system; (x) system for mobile device interactivity/connectivity.
The present invention relates to a method for using a sensor matrix in an interior of a vehicle comprising a vehicle system and providing a user interface configured for interaction with an occupant comprising the steps of obtaining an input signal from the sensor matrix, processing the input signal into an output signal and performing an operation based on the output signal. The input signal may comprise at least one of (a) proximity, (b) action, (c) motion, (d) condition, (e) characteristic, (f) presence/absence of occupant, (g) position, (h) interaction with occupant, (i) detected input from occupant, (j) directed input from occupant, (k) vehicle interior control/activity, (l) smart cabin control/activity. The operation may comprise at least one of (a) operation of vehicle system, (b) network communication, (c) instruction for vehicle system, (d) operation of vehicle systems, (e) interaction with external systems, (f) data storage, (g) data analytics/analysis, (h) smart cabin control. The output signal may comprise at least one of (a) audio-visual signal, (b) visual signal, (c) visual display, (d) audible signal, (e) haptic signal, (f) notice/alert, (g) communication to network, (h) communication to device, (i) communication to vehicle system; (j) data transmission, (k) data storage. A system comprising the sensor matrix may be provided in the interior of the vehicle. The method may comprise the step of actuating a field for the sensor matrix. The step of obtaining an input signal may comprise detecting the input signal. The input signal may comprise a signal representative of at least one of (a) motion by the occupant; (b) action by the occupant; (c) a condition in the interior; (d) a condition of the occupant; (e) a characteristic of the occupant. The method may comprise the step of conditioning the input signal. 
The step of conditioning the input signal may comprise filtering the input signal; filtering the input signal may comprise at least one of (a) separating noise and/or (b) calibration. The step of processing the input signal may comprise calibration. The step of performing an operation may comprise performing an operation at the vehicle system. The system and method may further comprise the step of providing an output at the user interface based on the output signal. The system and method may further comprise the step of interaction at the user interface. The sensor matrix may comprise at least one sensor. The sensor may comprise at least one of (a) a transducer; (b) a capacitive sensor; (c) an odor detector; (d) an accelerometer; (e) a microphone; (f) a camera; (g) a light detector; (h) an ultraviolet light detector. The sensor matrix may comprise a field. The sensor matrix may comprise multiple sensors. The sensor matrix may comprise multiple fields. The sensor matrix may comprise a field and at least one sensor. The input signal from the sensor matrix may comprise a signal detected (a) by a sensor and/or (b) from within a field. Obtaining the input signal from the sensor matrix may comprise detecting (a) by a sensor and/or (b) from within a field. The method may comprise the step of providing a field in the interior of the vehicle. The step of providing the field may be performed by the sensor matrix. The user interface may comprise at least one of (a) a display; (b) an audio system; (c) an audio-visual system; (d) an infotainment system. The step of processing the input signal may be performed on a processor. The processor may be operated by a control program. The step of processing the input signal may be performed by a control program. The step of processing the input signal may be performed on a controller. 
The vehicle system may comprise at least one of (a) a climate control system; (b) a navigation system; (c) a network; (d) a vehicle network; (e) a seating system; (f) a window control system; (g) a lighting system; (h) a communication system; (i) an autonomous vehicle operation system; (j) a steering system; (k) a drive system; (l) a powertrain system; (m) a starter system; (n) an audio system; (o) a display system; (p) an HVAC system; (q) an emergency warning system; (r) a safety system; (s) an occupant comfort system; (t) an occupant/sensory system; (u) a vehicle environment system; (v) a network/external communication system. The method may comprise the step of communication of an output based on the output signal to a network. The network may comprise a telecommunication network. The network may comprise access through a mobile device. The output may comprise a telecommunication signal. The telecommunication signal may comprise an emergency signal.
The present invention relates to a system for using a sensor matrix in an interior of a vehicle comprising a vehicle system configured for interaction with an occupant and/or device/object comprising (a) the sensor matrix configured to obtain an input signal from within the vehicle interior and (b) a processor configured to process the input signal and to provide an output signal. The output signal may be sent to at least one of (a) the vehicle system and/or (b) the user interface. The system and method may provide for an improved interaction with at least one vehicle occupant and/or with vehicle systems and/or with networks/communications and/or with a user interface/device.
Referring to
Referring to
As indicated schematically according to an exemplary embodiment in
According to an exemplary embodiment as shown schematically in
As indicated schematically in
As indicated schematically in
As indicated schematically in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
According to an exemplary embodiment as shown schematically in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
Input/signal (sensor matrix)—Signal from ongoing monitoring/detection by sensor matrix of occupant health/vital signs (e.g. heart rate/rhythm, respiration rate, oxygenation, blood pressure, etc.); sensor matrix may obtain additional information; information (such as also from vehicle systems) may include GPS/location information as well as routing.
Operation (vehicle systems/network)—Comparison of signal with threshold values (individual, personalized for occupant, aggregated, etc.); inquiry with occupant if abnormal/dangerous level indicated; interact with occupant and/or provide command to vehicle systems (e.g. actuate autonomous operation of vehicle, direct vehicle to safe location/side of road, assume braking/steering, etc.); provide alert output signal.
Output/signal (network/user interface)—Signal providing report of vital signs; report of status of vehicle systems; vehicle route and/or location; identification of occupant/vehicle; signal communicating alert to contacts/family members and to emergency responders, police, dispatchers, third-party services, etc. according to protocol.
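The comparison of monitored vital signs with threshold values described in this use case may be sketched as follows; the threshold ranges, signal names, and report fields are illustrative assumptions rather than values from the specification.

```python
# Hypothetical sketch: compare monitored occupant vital signs against
# per-signal threshold ranges and build an alert report including vehicle
# location, as described in the use case. All ranges are illustrative.

THRESHOLDS = {                     # assumed personalized (min, max) ranges
    "heart_rate": (50, 110),      # beats per minute
    "respiration": (10, 24),      # breaths per minute
    "oxygenation": (92, 100),     # percent SpO2
}

def check_vitals(vitals):
    """Return the list of vital signs outside their threshold range."""
    abnormal = []
    for name, value in vitals.items():
        lo, hi = THRESHOLDS[name]
        if not (lo <= value <= hi):
            abnormal.append(name)
    return abnormal

def build_report(vitals, location):
    """Output signal: report of vital signs, vehicle location, alert flag."""
    abnormal = check_vitals(vitals)
    return {"vitals": vitals, "location": location,
            "alert": bool(abnormal), "abnormal": abnormal}

report = build_report({"heart_rate": 130, "respiration": 18, "oxygenation": 96},
                      location=(48.137, 11.575))
```

In an implementation the alert report could then drive the described protocol: inquiry with the occupant, commands to vehicle systems, and communication to contacts and emergency responders.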
Input/signal (sensor matrix)—Signal from vehicle interior indicating abnormal/unusual condition or event such as physical illness (e.g. vomiting, incontinence, etc.). Detection by sensor matrix (individual or aggregate sensor/signal such as odor, visual/camera, sound, moisture, etc.) (e.g. detecting/observing physical effect and/or bio-effect such as detecting/smelling the vomit chemistry/smell).
Operation (vehicle systems/network)—Comparison of signal with threshold values; inquiry with occupant if abnormal/dangerous level indicated; interact with occupant and/or provide command to vehicle systems (e.g. actuate autonomous operation of vehicle, direct vehicle to safe location/side of road, assume braking/steering, etc.); provide alert output signal; if an autonomous/taxi vehicle, remove from service for cleaning/sanitizing (e.g. at service center/location). Allocation of cost for service/cleaning to occupant/responsible party.
Output/signal (network/user interface)—Signal providing report of vital signs; report of status of vehicle systems; vehicle route and/or location; signal communicating alert to contacts/family members and to emergency responders, police, dispatchers, third-party services, etc. according to protocol.
Input/signal (sensor matrix)—Signal from vehicle interior indicating child or pet within vehicle and/or person in distress/incapacitated (e.g. by health, medication, intoxication, other condition, etc.) in vehicle (abnormal condition). Detection by sensor matrix (individual or aggregate sensor/signal such as visual/camera, sound, moisture, odor, etc.) (e.g. detecting/observing physical effect and/or bio-effect).
Operation (vehicle systems/network)—Comparison of signal with threshold values; inquiry with occupant if abnormal/dangerous condition indicated; interact with occupant and/or provide command to vehicle systems (e.g. actuate autonomous operation of vehicle, direct vehicle to safe location, control ventilation/temperature for occupants, including fresh air and thermal management). Coordination with time of day/day of week and/or schedule information for calibration and screening of situation for reporting and commands to vehicle systems.
Output/signal (network/user interface)—Signal providing report of vital signs; report of status of vehicle systems; vehicle route and/or location; signal communicating alert to contacts/family members and to emergency responders, police, dispatchers, third-party services, etc. according to protocol; specific output signal to report detail of situation (child, pet, incapacitated person, etc.). Report if child or pet/dog left behind (monitoring of system, seating, interior, etc.), including contact person/authority (e.g. by telephone/call, text, e-mail, messaging, etc.). Driver may be notified and if no response then emergency services are notified. Alarm/alert from vehicle such as light-flash and audible/horn on exterior of vehicle after alert to operator/owner.
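The escalation described above (notify the driver first; if there is no response, notify emergency services and trigger an exterior alarm) may be sketched as follows. The action names are illustrative assumptions.

```python
# Hypothetical sketch of the escalation for a child/pet-left-behind event:
# driver notification first, then emergency services plus an exterior
# light-flash and horn alarm if the driver does not respond.

def escalate(occupant_detected, driver_responded):
    """Return the ordered list of actions for a left-behind-occupant event."""
    if not occupant_detected:
        return []
    actions = ["notify_driver"]          # e.g. call, text, e-mail per protocol
    if not driver_responded:
        actions += ["notify_emergency_services",
                    "flash_exterior_lights", "sound_horn"]
    return actions
```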
Input/signal (sensor matrix)—Signal from vehicle interior indicating item left in vehicle. Detection of item/object by sensor matrix (including potential interaction with item and/or mobile device, RFID tag, other sensor/tag on item, etc.); object/item may be registered/paired with system for identification by sensor matrix.
Operation (vehicle systems/network)—Vehicle system may provide an alert to driver/occupant at moment of exiting vehicle (audible signal, etc.); ongoing monitoring of vehicle interior to provide information (e.g. watching valuables, wallet, purse, electronic device, phone, etc.). Passenger alert for item/valuable left behind in an autonomous vehicle/taxi; vehicle may remain at location (with notification or indication/signal such as flashing lights, etc.) until alert is cleared by passenger/rider.
Output/signal (network/user interface)—Operator or passenger/rider and/or authority/contact informed of condition through call/messaging (e.g. by phone, text, e-mail, etc.); vehicle route and/or location.
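The item-left-behind use case above, in which items are registered/paired with the system (e.g. by RFID tag) and checked when the occupant exits, may be sketched as follows. The tag identifiers and registry are illustrative assumptions.

```python
# Hypothetical sketch: registered items (e.g. paired by RFID tag) still
# detected in the cabin after the occupant exits produce a left-behind alert.

REGISTERED = {"tag:wallet": "wallet", "tag:phone": "phone"}  # assumed pairing

def items_left_behind(detected_tags):
    """Return names of registered items still detected after occupant exit."""
    return sorted(REGISTERED[t] for t in detected_tags if t in REGISTERED)

def exit_alert(detected_tags):
    """Build the alert message issued at the moment of exiting the vehicle."""
    left = items_left_behind(detected_tags)
    if left:
        return "Items left in vehicle: " + ", ".join(left)
    return None
```

For an autonomous taxi, the same alert could keep the vehicle at the drop-off location (with flashing lights) until the passenger clears it, as described above.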
Input/signal (sensor matrix)—Incident/accident detected by sensor matrix (e.g. including accelerometer, camera, microphone, other vehicle systems such as airbag deployment, etc.). For example, an incident where an airbag deploys; sensor senses the number of airbags that deployed and the location of the deploying airbag in the vehicle.
Operation (vehicle systems/network)—Detection that vehicle has been in an accident is communicated with vehicle systems and network. Alerts and a call for assistance are initiated (after attempt/inquiry with vehicle occupant) with information content (such as the number of airbags that deployed and the location within the vehicle and whether the occupant is out of position/occupant seat classification, movement/activity or exit from vehicle/ejected) to enhance preparedness of emergency rescue (allowing responders to act more effectively/quickly on arrival at the scene as well as to coordinate response such as fire/police and the particular nature of emergency response needed).
Output/signal (network/user interface)—Vehicle/occupant status reported (including state of health/vital signs); signal may be routed to authorities/others (e.g. video, audio, communications, etc.); vehicle route and/or location. Communications with authorities/emergency rescue/dispatch with information content (identification of occupants) to assist preparation; contact by messaging (phone/text etc.) to family/others.
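The incident report described above (number and location of deployed airbags, occupant position status, vehicle location) may be sketched as follows; all field names and values are illustrative assumptions.

```python
# Hypothetical sketch of the incident report: count deployed airbags, record
# their locations, and flag whether the occupant is out of position so that
# emergency responders can prepare before arriving at the scene.

def incident_report(airbags, occupant_in_position, location):
    """airbags maps an airbag location (e.g. 'driver') to a deployed flag."""
    deployed = [loc for loc, fired in airbags.items() if fired]
    return {
        "airbags_deployed": len(deployed),
        "airbag_locations": deployed,
        "occupant_out_of_position": not occupant_in_position,
        "vehicle_location": location,
    }

report = incident_report({"driver": True, "passenger": True, "rear": False},
                         occupant_in_position=False,
                         location=(40.7128, -74.0060))
```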
Input/signal (sensor matrix)—Potential hazard (e.g. bio-hazard, toxic condition, potentially harmful substance, etc.) detected or sensed in vehicle interior (e.g. within cabin/air) by sensor matrix.
Operation (vehicle systems/network)—Detection of potential danger/dangerous chemical (as compared to threshold) communicated as alert; inquiry for operator/driver or occupant; vehicle systems commanded to take appropriate measures to ensure occupant safety such as lowering windows or keeping windows up or turning on/off ventilation system.
Output/signal (network/user interface)—Vehicle/occupant status reported (including state of health/vital signs, level of danger/hazard, etc.); alert communicated to/through vehicle systems and user interface; signal may be routed to authorities/others (e.g. video, audio, communications, etc.); vehicle route and/or location; messaging to/from contacts/persons for vehicle.
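The hazard response described above (threshold comparison followed by commands such as lowering windows or controlling the ventilation system) may be sketched as follows; the threshold value, command names, and the moving/stationary distinction are illustrative assumptions.

```python
# Hypothetical sketch: compare a sensed contaminant level against a threshold
# and select occupant-safety commands (alert, report, venting measures).

HAZARD_THRESHOLD = 0.7  # assumed normalized contaminant level

def hazard_response(level, vehicle_moving):
    """Return the commands issued when a potential hazard is detected."""
    if level <= HAZARD_THRESHOLD:
        return []
    commands = ["alert_occupant", "report_to_network"]
    # The choice between venting and sealing the cabin could depend on the
    # hazard type and vehicle state; this split is purely illustrative.
    commands.append("lower_windows" if not vehicle_moving else "ventilation_on")
    return commands
```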
Input/signal (sensor matrix)—Vehicle system status monitored including with connectivity to vehicle systems and/or with sensor matrix.
Operation (vehicle systems/network)—Operator/driver and/or occupant is provided an alert for state of health of vehicle and/or vehicle system (e.g., low tire pressure, low oil, battery state of charge, engine/motor issues, etc.).
Output/signal (network/user interface)—Vehicle status reported; alert communicated to/through vehicle systems and user interface; signal may be routed to authorities/others (e.g. video, audio, communications, etc.); vehicle route and/or location; messaging to/from contacts/persons.
Input/signal (sensor matrix)—Sensor matrix detects possible tampering/theft attempt from vehicle (including by interaction with other vehicle systems such as door locks, exterior sensors/cameras, etc.).
Operation (vehicle systems/network)—Enhanced/improved tampering and theft protection by interaction of sensor matrix with a more comprehensive detection system throughout the vehicle interior. Alarm alerts to inform vehicle owner and/or authorities of potential break-in and of status detected (e.g. broken window or other) and/or presence of intruder (e.g. identity detection and/or verification with database or by owner); audio/camera image may be transmitted to owner/authorities (e.g. police/security); interaction with other alert systems. Theft of vehicle may be monitored and reported.
Output/signal (network/user interface)—Vehicle/occupant status reported; signal may be routed to authorities/others (e.g. video, audio, communications, etc.); vehicle route and/or location. Communications with authorities/emergency rescue/dispatch with information content (identification of violator) to assist preparation; contact by messaging (phone/text etc.) to family/others.
Input/signal (sensor matrix)—Driver status and/or attention monitoring and management by sensor matrix (e.g. detection of signals such as audio, video, touch, position, movement, vital signs, etc.); possible loss of acuity/attention by driver may be indicated/predicted and reported.
Operation (vehicle systems/network)—Alert to potential driver attention loss and/or drowsiness and distraction prediction; command to vehicle systems responsive to drowsiness (e.g. control/systems for keeping driver awake and not-distracted); alert (e.g. audible/audiovisual); operation may include interaction at steering wheel and/or seating or other components (e.g. alert, haptic feedback, etc.).
Output/signal (network/user interface)—Vehicle/occupant status reported (including state of health/vital signs, attentiveness, drowsiness, etc.); alert communicated to/through vehicle systems and user interface; signal may be routed to authorities/others (e.g. video, audio, communications, etc.); report as to vehicle route and/or location (and/or remaining distance of travel); messaging to/from contacts/persons.
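The attention assessment above (signals from the sensor matrix compared against a threshold, with audible and haptic alerts at the steering wheel and seat) may be sketched as follows; the indicator weights and threshold are illustrative assumptions, not values from the specification.

```python
# Hypothetical sketch: fuse simple binary attention indicators into a score,
# compare against a threshold, and select the alerts described (audible plus
# haptic feedback at steering wheel and seat).

ATTENTION_THRESHOLD = 0.6  # assumed minimum acceptable attention score

def attention_score(eyes_on_road, hands_on_wheel, head_upright):
    """Fuse binary indicators (0 or 1) into a weighted 0..1 attention score."""
    return 0.5 * eyes_on_road + 0.3 * hands_on_wheel + 0.2 * head_upright

def attention_alerts(score):
    """Return the alerts issued when attention falls below the threshold."""
    if score >= ATTENTION_THRESHOLD:
        return []
    return ["audible_alert", "haptic_steering_wheel", "haptic_seat"]
```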
Input/signal (sensor matrix)—Sensor matrix signals (from one sensor or multiple sensors, including vital signal monitoring, posture/position, operation of vehicle systems/responsiveness, etc.).
Operation (vehicle systems/network)—Driver Monitoring Systems (DMS) conforming to legal rules/compliance including privacy (e.g. no sensitive data is to be stored but instead deleted immediately after processing). Driver Drowsiness and Attention Warning (system that assesses the driver alertness through sensors including potential input from vehicle systems and performs analysis to alert/warn the driver if needed). Advanced Driver Distraction Warning (system intended to enhance driver attention and sustained attention such as in high-traffic situation with alerts/warning when distraction/inattention is detected).
Output/signal (network/user interface)—Alerts and reports and other communications to enhance driver/operator performance and safety; monitor and provide recommendations and other alerts/communications and messaging.
Input/signal (sensor matrix)—Sensor matrix detection and input signals (from one sensor or multiple sensors, including vital signal monitoring, posture/position, operation of vehicle systems/responsiveness, etc.).
Operation (vehicle systems/network)—System for semi-automated vehicles and/or autonomous vehicles (that may also be operated by driver) to be equipped with a Driver Availability Monitoring System (system to assess whether the driver is in a position to take over the driving function from an automated vehicle in particular situations, where appropriate).
Output/signal (network/user interface)—Alerts and reports as to availability or lack of availability; other communications to enhance vehicle and/or driver/operator performance and safety; monitor and provide recommendations for vehicle operation/use and other alerts/communications and messaging (e.g. report for vehicle/fleet manager).
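The Driver Availability Monitoring System described above assesses whether the driver is in a position to take over from an automated driving function. A minimal sketch of that readiness check follows; the particular cues checked (hands on wheel, eyes on road, upright posture) are assumed examples of sensor-matrix inputs, not a list from the specification:

```python
def driver_available(hands_on_wheel, eyes_on_road, seated_upright):
    """Assess takeover readiness from assumed sensor-matrix cues and
    return an availability report for the vehicle network / fleet manager."""
    available = hands_on_wheel and eyes_on_road and seated_upright
    return {
        "available": available,
        # alert emitted only when the driver cannot take over
        "alert": None if available else "takeover_not_possible",
    }
```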
Input/signal (sensor matrix)—Sensor matrix configured to provide a virtual switch for operator/occupant; may be operated at user interface and/or on panel and/or device.
Operation (vehicle systems/network)—Sensor matrix input functions as a switch (e.g. based on detecting the presence of a human gesture by hand or finger); switch function may be used for any application in the vehicle or for operation of selected vehicle systems.
Output/signal (network/user interface)—Feedback may be provided by sensor matrix/user interface; execution of switch function indicated for operation (e.g. successful gesture recognition by sensor matrix).
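The virtual-switch behavior above (gesture detected by the sensor matrix, switch executed, feedback confirming recognition) can be sketched as a debounce-style rule: the switch fires only when the gesture signal stays above a detection threshold for several consecutive samples. Threshold and hold-count values are illustrative assumptions:

```python
def virtual_switch(samples, threshold=0.7, hold_frames=3):
    """Fire the switch when the gesture signal (e.g. a normalized capacitive
    reading) stays at/above threshold for hold_frames consecutive samples;
    return a feedback flag so the user interface can confirm recognition."""
    run = 0
    for s in samples:
        run = run + 1 if s >= threshold else 0  # reset on any weak sample
        if run >= hold_frames:
            return {"switch": True, "feedback": "gesture_recognized"}
    return {"switch": False, "feedback": None}
```

Requiring a sustained signal rather than a single sample is one way to avoid accidental actuation from brief contact or noise.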
Input/signal (sensor matrix)—Sensor matrix configured to provide input signal for empathetic/empathic system to measure human state/emotion for occupants in vehicle interior; multi-sensor integration for sensor matrix (may be calibrated for optimization of input signal/detection of state).
Operation (vehicle systems/network)—System to monitor and optimize experience of vehicle operator/occupants for safety, protection, enhanced user experience (e.g. monitor potential bias, prevent safety violations such as abuse and exploitation, etc.); use multiple data streams (multi-mode input from sensor matrix); include location and position/posture of occupant, sound/vibrations, biometrics (body temperature, vital signals, etc.), chemical reactions (such as spills), bio-effects (such as particulates, pollutants, chemicals, etc. in the air); capabilities that can be integrated with data from the sensor matrix (e.g. with sensor technology such as radar-on-chip, resistive, capacitive, MEMS, optical sensors, cameras, thermal imaging, audio, sensors for acceleration and deceleration of the vehicle) and any other data stream available in the vehicle. Computation may employ algorithm/control program and/or embedded libraries (updated to build and evolve/refine human models for state detection/monitoring of empathic condition). System may integrate “smart cabin” processing in real time of all/multiple data streams from the sensor matrix to enhance accuracy of the empathic system, including camera-based sensors/systems (e.g. using identifiable faces/expressions) and detection of voice/tone of voice (expressions and word usage, etc.); system may be configured to process signals/data in real time to identify/anticipate occupant behaviors and sense when something is different or wrong (e.g. 
odor imaging sensor to smell alcohol on breath combined with a camera and image processing as well as thermal imaging of body temperature, etc.); by machine learning, system may build a dataset of occupant behaviors for enhancement of experience in the vehicle, including user preference implementation and development (as well as environment/interior conditions/settings to anticipate occupant needs/wants and to facilitate useful actions/behaviors, including to enhance positive emotion/empathic monitoring of emotion/mood changes). Composite data collection from sensor matrix and other data sources; system may update database/threshold value development for detection and classification of conditions, events, etc.
Output/signal (network/user interface)—Report on empathic state for vehicle operator and occupants (in real time); integrate with vehicle systems in response to empathic state as detected; provide report to communication system/network outside of vehicle (including if assistance is appropriate, with sensor data such as audio-visual output).
Input/signal (sensor matrix)—Empathic “Smart Cabin” system to obtain and process multiple data streams in real-time to identify condition/emotional state of vehicle occupants; sensor matrix comprising multiple sensors (e.g. multi-modal input) to provide data for real-time/synchronized analysis.
Operation (vehicle systems/network)—System is configured to obtain and condition/process signals from the sensor matrix and to use database/data sets (e.g. libraries, artificial intelligence, networks) and computation to ascertain human emotional state/mood for a vehicle occupant (e.g. selecting the specific data streams/inputs identified as more accurate/useful); the intent is to identify the emotional state (e.g. positive such as satisfaction/happiness and/or neutral such as focused and/or negative such as unhappy or in need of protection from bias, abuse and exploitation, etc.) of occupants using input/sensor data and database and/or computation (including over network/cloud or in vehicle).
Output/signal (network/user interface)—Report on status and/or identification of emotional state/mood/affect (e.g. positive satisfaction/happiness and/or neutral focused and/or negative such as unhappy, in need of protection from bias, abuse and exploitation, etc.) for occupants in vehicle; data and output can be privacy-managed (e.g. only select non-sensitive data transferred to network/cloud).
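The multi-modal fusion step described above (several sensor-matrix data streams combined into a coarse positive/neutral/negative state label, with only the non-sensitive label leaving the vehicle) can be sketched as a weighted combination. The modality names, weights, and cut-off values are illustrative assumptions, not parameters from the specification:

```python
def empathic_state(face_valence, voice_valence, hr_stress):
    """Fuse per-modality scores into a coarse emotional-state label.

    face_valence, voice_valence: assumed valence estimates in [-1, 1]
    hr_stress: assumed stress estimate in [0, 1] from vital signs.
    Only the returned label (non-sensitive data) would be reported
    to the network/cloud; raw signals stay in the vehicle.
    """
    fused = 0.5 * face_valence + 0.3 * voice_valence - 0.2 * hr_stress
    if fused > 0.2:
        return "positive"
    if fused < -0.2:
        return "negative"
    return "neutral"
```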
Input/signal (sensor matrix)—Vehicle occupant seat/status classification to be detected using signal from sensor matrix (enhanced multi-sensor integration or single-sensor).
Operation (vehicle systems/network)—System may be configured to determine status of seating/protection such as seat restraint/belt for occupants in vehicle; alert provided and/or integration with other vehicle systems (in addition to existing alarm/alert for vehicle seat).
Output/signal (network/user interface)—Report/alert as to seating status such as seat belt status may be set/selected and/or differentiated according to vehicle operator and/or occupant preferences; may connect to other vehicle systems to enhance efficacy of alert (disable privileges, etc.).
Input/signal (sensor matrix)—Sensor matrix configuration to provide enhanced precision Occupant Classification System (OCS) for vehicle seat (e.g. enhancement with current standards such as a transducer/mat); sensor matrix can combine multi-sensor inputs, including standard sensors with cameras and enhanced digital processing/signal integration.
Operation (vehicle systems/network)—Occupant Classification System (OCS) can be enhanced and/or integrated with other systems for Child Presence Detection (CPD) and Low Risk Deployment (LRD) (of airbag system). Occupant classification can use the signal from the sensor matrix to classify the occupant: rear-facing infant seat, child, adult, and empty seat. System would implement automatic airbag deactivation for child seats (e.g. distinguishing occupant presence such as between a one-year-old, a three- to six-year-old and the fifth-percentile female size); monitor vehicle passenger seat for occupancy and classification of occupant (if any); if a child is in a child seat or the seat is unoccupied, system may disable the airbag/provide an alert; for an adult occupant, ensure the airbag is deployed in the event of an accident (e.g. deployment adjustment for position of occupant). Sensor system/matrix input signal can determine seat belt status and provide alert/reminder function; detection can determine the child seat type, child age/size, child position and child seat belt status. System can use data/database for determination of proper values for occupants.
Output/signal (network/user interface)—Output signal can report classification status to operator (allow user/operator override if required); data available for other vehicle systems including for communication to network/for messaging if collision event or other reportable incident.
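The OCS logic described above (classify the seat occupant, then enable or disable the airbag accordingly) can be sketched in two steps. The weight thresholds are rough illustrative assumptions standing in for the calibrated transducer/mat and multi-sensor data the specification describes:

```python
def classify_occupant(weight_kg, child_seat_detected):
    """Assign one of the four classes named in the description:
    empty seat, rear-facing infant seat, child, adult.
    Thresholds are assumptions for the sketch, not calibrated values."""
    if weight_kg < 1:
        return "empty"
    if child_seat_detected:
        return "rear_facing_infant_seat"
    if weight_kg < 35:   # roughly below small-adult (5th-percentile) mass
        return "child"
    return "adult"


def airbag_decision(occupant_class):
    """Disable the airbag for child seats / child / empty seat, with an
    alert when a disabled airbag protects an occupied seat; deploy for
    an adult occupant."""
    if occupant_class in ("rear_facing_infant_seat", "child", "empty"):
        return {"airbag_enabled": False,
                "alert": None if occupant_class == "empty" else "airbag_off"}
    return {"airbag_enabled": True, "alert": None}
```

As the output description notes, a production system would also report the classification to the operator and allow an override where required.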
Vehicle Data—Obtain vehicle data including from network and other interface (e.g. diagnostic port/connector, such as engine status, gear changes, braking, acceleration, temperature, tire pressure, etc.).
Audio/Sound Data—Microphone/audio input (e.g. detect vehicle interior condition, state of being of occupants, operator monitoring, including states/moods such as excitement, fear, happy, sad, stress, and occupant characteristics such as age, gender, etc.).
Biometric Sensor Data—State of health/status of operator and occupants (e.g. such as from vital sensing, monitoring skin temperature, heart rate, breathing/respiratory rate, ECG, galvanic skin response, etc.).
Facial Data—Identification of operator/occupant and state/status (e.g. identity, status, age, gender, mood (e.g. angry, happy, sad, surprised, etc.), etc.).
Camera Data Stream—Facial Analysis, Full Body Analysis, Full Cabin/Occupant Analysis, Object Analysis with vision processing (correlate image to features to be identified) and machine learning, image identification and annotation (e.g. using camera/software enhancements, etc.).
Sensor Field (Creation/detection)—Contactless sensors using field technology/calibration.
Sensor Devices—Configuration for use of any of a variety of types of available/future sensor devices including resistance sensors, basic capacitive touch sensors, localized ultrasound touch sensing, force detection (e.g. transducer, MEMS strain sensor, piezoelectric, etc.), audio (microphone or vibration sensor by piezoelectric device), integrated/single-chip sensors, accelerometers (measure the acceleration and deceleration of the vehicle), collision warning systems, active sensors (data from lidar, radar or sonar), odor detection/imaging sensor (e.g. capturing odor information, which a human may or may not perceive, converting odor to a visual pattern, mimicking the olfactory system, detecting bio-hazards, etc.), silicon photodiodes (e.g. detectors, photo IC used in light/radiation detection of solar radiation to adjust the airflow or cooling of automatic climate control), other known/conventional sensor technology, multi-mode/multi-sensor integration for sensor matrix effect/configuration.
Privacy Management—System data signals (including input and output) to be privacy-managed (e.g. only select non-sensitive data transferred to network/cloud for storage/access).
Data Interface/Sources—Sensor matrix configuration to optimize sensors and data connectivity/streams in a particular use case/application in context and integrated into the vehicle, vehicle interior and with vehicle systems; data storage/use to be managed in legal compliance/ethical standards (e.g. privacy, personal/sensitive information protection). Multi-sensor/multi-modal system may include hardware and software with different types of data streams (e.g. from sensor matrix) from multiple sources (e.g. transducers, cameras, contactless human radar, biometrics, chemical reactions (spills), bio-effects, microphones, sensed vibration, accelerometers, thermal imaging/measurement, collision warning sensors, etc.) using machines and interfaces configured to process multiple data streams in real time with on-board diagnostics.
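The privacy-management requirement stated above (only select non-sensitive data transferred to the network/cloud; sensitive data deleted after local processing) can be sketched as a filter applied to each record before transmission. The field names are assumed examples of sensitive and non-sensitive data, not a list from the specification:

```python
# Assumed examples of sensitive fields that must stay in the vehicle.
SENSITIVE_KEYS = {"face_image", "audio_raw", "ecg_trace"}

def privacy_filter(record):
    """Return a copy of the record with sensitive fields removed, so
    only non-sensitive data is transferred to the network/cloud.
    Sensitive fields are processed locally and then discarded."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_KEYS}
```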
Referring to
As indicated schematically according to an exemplary embodiment in
According to an exemplary embodiment as shown schematically in
As indicated schematically in
According to an exemplary embodiment as shown schematically in
As indicated schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as shown schematically in
According to an exemplary embodiment as indicated schematically in the FIGURES/TABLES, the improved system and method may be configured to be implemented using known/conventional components/systems and technology and other future-developed components/systems and technology including but not limited to sensor/detector technology, computing technology, computing systems/devices (including processors/microprocessors, systems, computers, controllers, control programs, programming, etc.), software development/technology, machine learning technology, artificial intelligence technology, language/data models, augmentation technology, augmented reality technology, data sources, data analytics, data storage, data acquisition/aggregation, network technology, etc.
As indicated schematically in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in FIGURES 16A-16H, 17A-17B, 18A-18B, 19A-19B, 20A-20B, 21A-21B, 22A-22C, 23A-23B and 24, a computing system may comprise a computing device with a processor. The processor may be configured to use data to provide an output signal comprising an enhancement of the input signal. The processor may be configured to operate with a data enhancement module configured for machine learning to use data to provide an output signal comprising an enhancement of the input signal. The processor may be configured to operate with a data enhancement module configured with artificial intelligence to use data to provide an output signal comprising an enhancement of the input signal. The processor may be configured to operate with a data enhancement module configured with generative artificial intelligence to use data to provide an output signal comprising an enhancement of the input signal; the input signal may comprise a prompt and the output signal may comprise information generated by the data enhancement module from the prompt. The output signal may comprise information provided at a display. The processor may be configured to use data to provide an output signal comprising an augmentation of the input signal. The processor may be configured to use data to operate with a data augmentation module configured for augmented reality to provide an output signal comprising an augmentation of the input signal. The processor may be configured to use data to operate with a data augmentation module configured for augmented reality to provide an output signal comprising an augmentation of the input signal; augmentation may comprise application of data from a data source. 
The processor may be configured to use data to operate with a data augmentation module configured for augmented reality to provide an output signal comprising an augmentation of the input signal; augmentation may comprise at least one of augmented audio and/or augmented video. The output signal may comprise information provided at a display.
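The enhancement/augmentation processing described above can be sketched as a dispatch between two modules. Both "modules" here are stand-in functions illustrating the data flow (prompt in, generated information out; input plus data-source overlay out), not a real machine-learning or augmented-reality implementation:

```python
def process_signal(input_signal, mode, data_source=None):
    """Route the input signal through an assumed enhancement or
    augmentation module and return information for a display."""
    if mode == "enhance":
        # stand-in for a generative module expanding a prompt into information
        return {"display": f"generated from prompt: {input_signal}"}
    if mode == "augment":
        # stand-in for an augmented-reality overlay of data-source content
        return {"display": f"{input_signal} + overlay({data_source})"}
    raise ValueError("unknown mode")
```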
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in FIGURES 2A-2D, 3A-3T, 15A-15E, 16A-16H, 17A-17B, 18A-18B, 19A-19B, 20A-20B, 21A-21B, 22A-22C, 23A-23B and 24, a method for using a sensor matrix in an interior of a vehicle comprising a vehicle system using data from at least one data source and providing a user interface configured for interaction with an occupant comprising the steps of obtaining an input signal from the sensor matrix; processing the input signal into an output signal; and performing an operation relating to the input signal. The input signal may comprise at least one of (a) proximity; (b) action; (c) motion; (d) sound; (e) condition; (f) characteristic; (g) presence/absence of occupant; (h) position of occupant; (i) interaction with occupant; (j) detected input from occupant; (k) directed input from occupant; (l) detected event; (m) detected condition; (n) vehicle interior control/activity; (o) smart cabin control/activity; (p) data. See also TABLES A-1 and A-2. The operation may comprise at least one of (a) operation of vehicle system; (b) network communication; (c) instruction for vehicle system; (d) operation of vehicle systems; (e) interaction with external systems; (f) data storage; (g) data analytics/analysis; (h) comparison of data to threshold values; (i) monitoring; (j) providing a report based on the input signal; (k) vehicle interior control; (l) smart cabin control; (m) smart cabin activity. See also TABLES A-1 and A-2. The output signal may comprise at least one of (a) audio-visual signal; (b) visual signal; (c) visual display; (d) audible signal; (e) haptic signal; (f) notice/alert; (g) communication to network; (h) communication to a device; (i) communication to a vehicle system; (j) data transmission; (k) data storage; (l) vehicle location; (m) vehicle status; (n) occupant status; (o) communication over a network; (p) report over a network; (q) information. See also TABLES A-1 and A-2. 
The step of processing the input signal into an output signal may comprise use of data to provide the output signal comprising an enhancement of the input signal and/or an augmentation of the input signal. See also TABLES A-1 and A-2. At least one data source may comprise the sensor matrix and/or a vehicle system and/or data storage and/or a network.
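The three method steps above (obtain an input signal from the sensor matrix, process it into an output signal, perform the related operation) can be sketched as one generic cycle. The callables passed in are hypothetical placeholders for whichever sensor, processing module, and vehicle-system operation a given use case supplies:

```python
def run_sensor_matrix_cycle(read_sensor, process, operate):
    """One cycle of the described method: obtain the input signal,
    process it into an output signal, and perform the related operation
    (e.g. alert a vehicle system or report over a network)."""
    input_signal = read_sensor()          # step 1: obtain input signal
    output_signal = process(input_signal) # step 2: process into output signal
    operate(output_signal)                # step 3: perform related operation
    return output_signal
```

For example, `read_sensor` might return a seat-pressure reading, `process` might classify it, and `operate` might send the classification to the restraint system.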
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in
As indicated schematically according to an exemplary embodiment in FIGURES 2A-2D, 3A-3T, 15A-15E, 16A-16H, 17A-17B, 18A-18B, 19A-19B, 20A-20B, 21A-21B, 22A-22C, 23A-23B and 24, a method for using a sensor matrix in an interior of a vehicle comprising a vehicle system using data from at least one data source and providing a user interface configured for interaction with an occupant comprising the steps of obtaining an input signal from the sensor matrix; processing the input signal into an output signal; and performing an operation relating to the input signal. See also TABLES A-1 and A-2. Processing the input signal into an output signal may comprise using data to provide the output signal comprising an enhancement of the input signal and/or an augmentation of the input signal. See
As indicated schematically according to an exemplary embodiment in
It is important to note that the present inventions (e.g. inventive concepts, etc.) have been described in the specification and/or illustrated in the FIGURES of the present patent document according to exemplary embodiments; the embodiments of the present inventions are presented by way of example only and are not intended as a limitation on the scope of the present inventions. The construction and/or arrangement of the elements of the inventive concepts embodied in the present inventions as described in the specification and/or illustrated in the FIGURES is illustrative only. Although exemplary embodiments of the present inventions have been described in detail in the present patent document, a person of ordinary skill in the art will readily appreciate that equivalents, modifications, variations, etc. of the subject matter of the exemplary embodiments and alternative embodiments are possible and contemplated as being within the scope of the present inventions; all such subject matter (e.g. modifications, variations, embodiments, combinations, equivalents, etc.) is intended to be included within the scope of the present inventions. It should also be noted that various/other modifications, variations, substitutions, equivalents, changes, omissions, etc. may be made in the configuration and/or arrangement of the exemplary embodiments (e.g. in concept, design, structure, apparatus, form, assembly, construction, means, function, system, process/method, steps, sequence of process/method steps, operation, operating conditions, performance, materials, composition, combination, etc.) without departing from the scope of the present inventions; all such subject matter (e.g. modifications, variations, embodiments, combinations, equivalents, etc.) is intended to be included within the scope of the present inventions. The scope of the present inventions is not intended to be limited to the subject matter (e.g. 
details, structure, functions, materials, acts, steps, sequence, system, result, etc.) described in the specification and/or illustrated in the FIGURES of the present patent document. It is contemplated that the claims of the present patent document will be construed properly to cover the complete scope of the subject matter of the present inventions (e.g. including any and all such modifications, variations, embodiments, combinations, equivalents, etc.); it is to be understood that the terminology used in the present patent document is for the purpose of providing a description of the subject matter of the exemplary embodiments rather than as a limitation on the scope of the present inventions.
It is also important to note that according to exemplary embodiments the present inventions may comprise conventional technology (e.g. as implemented and/or integrated in exemplary embodiments, modifications, variations, combinations, equivalents, etc.) or may comprise any other applicable technology (present and/or future) with suitability and/or capability to perform the functions and processes/operations described in the specification and/or illustrated in the FIGURES. All such technology (e.g. as implemented in embodiments, modifications, variations, combinations, equivalents, etc.) is considered to be within the scope of the present inventions of the present patent document.
The present application is a continuation in part of PCT/International Patent Application No. PCT/US2022/74851 titled “SYSTEM AND METHOD FOR USE OF A SENSOR MATRIX IN A VEHICLE INTERIOR” filed Aug. 11, 2022, which claims the benefit of U.S. Provisional Patent Application No. 63/232,489 titled “SYSTEM AND METHOD FOR USE OF A SENSOR MATRIX IN A VEHICLE INTERIOR (WITH USE CASES)” filed Aug. 12, 2021. The present application claims priority to and incorporates by reference in full the following patent applications: (a) U.S. Provisional Patent Application No. 63/232,489 titled “SYSTEM AND METHOD FOR USE OF A SENSOR MATRIX IN A VEHICLE INTERIOR (WITH USE CASES)” filed Aug. 12, 2021; (b) PCT/International Patent Application No. PCT/US2022/74851 titled “SYSTEM AND METHOD FOR USE OF A SENSOR MATRIX IN A VEHICLE INTERIOR” filed Aug. 11, 2022.
Number | Date | Country
---|---|---
63232489 | Aug 2021 | US
 | Number | Date | Country
---|---|---|---
Parent | PCT/US22/74851 | Aug 2022 | WO
Child | 18438592 | | US