Seat system

Information

  • Patent Grant
  • Patent Number
    11,059,490
  • Date Filed
    Tuesday, March 17, 2020
  • Date Issued
    Tuesday, July 13, 2021
Abstract
A seat system includes a seat, a sensor assembly, and/or an electronic control unit (ECU) connected with the sensor assembly. The sensor assembly may be configured to sense a breathing pattern of a user associated with the seat. The ECU may be configured to communicate with a remote server, and at least one of the ECU and the remote server may be configured to determine a medical state of said user according, at least in part, to the breathing pattern.
Description
TECHNICAL FIELD

The present disclosure generally relates to seat systems, including seat systems that may be used in connection with vehicles, such as automobiles.


BACKGROUND

This background description is set forth below for the purpose of providing context only. Therefore, any aspect of this background description, to the extent that it does not otherwise qualify as prior art, is neither expressly nor impliedly admitted as prior art against the instant disclosure.


Some seat systems may not be configured to monitor medical states of users/occupants. For example, some seat systems may not be configured to assess the medical state of injured occupants following a detected collision and/or prioritize users/occupants for emergency medical assistance.


There is a desire for solutions/options that minimize or eliminate one or more challenges or shortcomings of seat systems. The foregoing discussion is intended only to illustrate examples of the present field and is not a disavowal of scope.


SUMMARY

In embodiments, a seat system may include a seat, a sensor assembly, and/or an electronic control unit (ECU) connected with the sensor assembly. The sensor assembly may be configured to sense a breathing pattern of a user associated with the seat. The ECU may be configured to communicate with a remote server, and at least one of the ECU and the remote server may be configured to determine a medical state of said user according, at least in part, to the breathing pattern.


With embodiments, a vehicle seat system may include a seat assembly including a first seat and a second seat, a sensor assembly configured to obtain first breathing pattern information associated with a first user associated with the first seat and second breathing pattern information associated with a second user associated with the second seat, and/or an ECU connected with the sensor assembly. The ECU may be configured to communicate with a remote server. At least one of the ECU and the remote server may be configured to determine (i) a first medical state of said first user according, at least in part, to the first breathing pattern information, and (ii) a second medical state of said second user according, at least in part, to the second breathing pattern information.


In embodiments, a method of operating a seat system may include providing a sensor assembly connected with the seat and the ECU, sensing biomedical information of a user of the seat via the sensor assembly, determining a medical state of said user according, at least in part, to the biomedical information, and/or transmitting at least one of the biomedical information and the medical state to a remote server.


The foregoing and other potential aspects, features, details, utilities, and/or advantages of examples/embodiments of the present disclosure will be apparent from reading the following description, and from reviewing the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

While the claims are not limited to a specific illustration, an appreciation of various aspects may be gained through a discussion of various examples. The drawings are not necessarily to scale, and certain features may be exaggerated or hidden to better illustrate and explain an innovative aspect of an example. Further, the exemplary illustrations described herein are not exhaustive or otherwise limiting, and are not restricted to the precise form and configuration shown in the drawings or disclosed in the following detailed description. Exemplary illustrations are described in detail by referring to the drawings as follows:



FIG. 1 is a side view generally illustrating an embodiment of a seat system according to teachings of the present disclosure.



FIG. 2 is a front view generally illustrating an embodiment of a seat system according to teachings of the present disclosure.



FIG. 3 is a top view generally illustrating an embodiment of a seat system according to teachings of the present disclosure.



FIG. 4 is a schematic generally illustrating an embodiment of a seat system according to teachings of the present disclosure.



FIG. 5 is a schematic generally illustrating an embodiment of a seat system according to teachings of the present disclosure.



FIG. 6 is a flow-chart generally illustrating an embodiment of a method of operating a seat system according to teachings of the present disclosure.





DETAILED DESCRIPTION

Reference will now be made in detail to embodiments of the present disclosure, examples of which are described herein and illustrated in the accompanying drawings. While the present disclosure will be described in conjunction with embodiments and/or examples, it will be understood that they do not limit the present disclosure to these embodiments and/or examples. On the contrary, the present disclosure covers alternatives, modifications, and equivalents.


In embodiments, such as generally illustrated in FIG. 1, a seat system 20 may include a seat assembly 30, a track assembly 40, and/or an electronic control unit (ECU) 50. The ECU 50 may be connected (e.g., electrically) with the seat assembly 30. The ECU 50 may be connected with one or more sensor assemblies 60N. The sensor assemblies 60N may be disposed within, on, and/or proximate one or more seats 32N of the seat assembly 30 such that the ECU 50 may be configured to identify the medical states of one or more users (e.g., occupants). The ECU 50 may be configured to determine a condition of a user and/or transmit information regarding the medical condition of the user. The seat system 20 may, for example and without limitation, be used in connection with a vehicle 22 and/or other modes of transportation.


In embodiments, such as generally illustrated in FIGS. 1, 2 and 3, a seat assembly 30 may be connected with a track assembly 40. The seat assembly 30 may include one or more seats 32N. For example and without limitation, the seat assembly 30 may include a first seat 321, a second seat 322, a third seat 323, and/or a fourth seat 324. The first seat 321, the second seat 322, the third seat 323, and/or the fourth seat 324 may include seat backs 341, 342, 343, 344, and/or seat bases 361, 362, 363, 364, respectively. The seats 32N may be selectively connected (e.g., electrically and/or mechanically) to the track assembly 40. For example and without limitation, the seats 32N may be configured to be removed from the track assembly 40. The ECU 50 may be electrically connected to the seats 32N, such as via the track assembly 40. The ECU 50 may be configured to at least partially control operation of the first seat 321, the second seat 322, the third seat 323, and/or the fourth seat 324. The seats 32N may be connected with the track assembly 40 via respective support members 38N (e.g., support members 381, 382, 383, 384). The support members 38N may be selectively connected with the track assembly 40. For example and without limitation, the support members 38N may be configured to be inserted vertically and/or horizontally into the track assembly 40. The support members 38N may be configured to be removed vertically and/or horizontally from the track assembly 40. The support members 38N may be configured to move along the track assembly 40 (e.g., in the X-direction). For example and without limitation, the support members 38N may be selectively locked in a plurality of positions along the track assembly 40.


With embodiments, such as generally illustrated in FIGS. 1, 2, and 3, a track assembly 40 may be disposed on a mounting surface 24 (e.g., a vehicle floor). The track assembly 40 may be configured to receive the seats 32N substantially in the X-direction and/or the Z-direction. The seats 32N and/or the support members 38N may be configured to be selectively inserted into and/or selectively removed from the track assembly 40 in a plurality of locations along the track assembly 40. The track assembly 40 may, for example and without limitation, include one or more of a variety of shapes, sizes, and/or configurations, and/or may be configured such that seats 32N may move in at least two directions with respect to the track assembly 40 (e.g., within a vehicle 22). The track assembly 40 may include one or more track portions 42, 44 that may be configured to facilitate movement of the seats 32N in at least one direction. A first track portion 42 and/or a second track portion 44 may extend substantially parallel to each other and/or may extend generally in the X-direction. The seats 32N may be selectively connected to the track assembly 40 via the support members 38N. For example and without limitation, the first seat 321 and/or the third seat 323 may be connected with the first track portion 42, and/or the second seat 322 and/or the fourth seat 324 may be connected with the second track portion 44.


In embodiments, such as generally illustrated in FIG. 3, the track assembly 40 may include a third track portion 46 and/or a fourth track portion 48 that may extend generally in the Y-direction (e.g., substantially perpendicular to the first track portion 42 and/or the second track portion 44). The seats 321, 322, 323, 324 may be configured to move between the first track portion 42 and/or the second track portion 44 via the third track portion 46 and/or the fourth track portion 48. The ECU 50 may be configured to move the seats 321, 322, 323, 324 along the first track portion 42, the second track portion 44, the third track portion 46, and/or the fourth track portion 48.


With embodiments, some or all of the track portions 42, 44, 46, 48 may include first and second sets of fixed and movable tracks. The fixed tracks may be fixed to the mounting surface 24. The movable tracks may be connected to support members 38N and may be configured to move (e.g., slide) along the fixed tracks.


In embodiments, such as generally illustrated in FIGS. 1, 2, 3, and 4, the seat system 20 may include an ECU 50. For example and without limitation, the first seat 321, the second seat 322, the third seat 323, and/or the fourth seat 324 may be electrically connected with the ECU 50 (e.g., via a wired and/or wireless connection). The ECU 50 may be configured to control movement and/or positions of the first seat 321, the second seat 322, the third seat 323, and/or the fourth seat 324.


With embodiments, such as generally illustrated in FIGS. 1, 2, and 3, the seat system 20 may include and/or be connected to one or more vehicle sensors 54. The vehicle sensors 54 may, for example and without limitation, be configured to obtain crash information and/or detect a crash situation (e.g., if the vehicle 22 has experienced a collision). The vehicle sensors 54 may be connected with the ECU 50 such that the ECU 50 may receive information from the vehicle sensors 54 to determine whether a collision occurred. The vehicle sensors 54 may be configured to measure a change in direction and/or a sudden force (e.g., via force sensors, gyro-sensors, acceleration sensors, etc.). The vehicle sensors 54 may, for example, be disposed in one or more of a variety of locations in and/or around a vehicle 22, such as at or about outer portions of a vehicle 22.
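For illustration only (not part of the claimed subject matter), a crash-detection check of the kind described above might, in one simplified form, flag a collision when a sensed deceleration magnitude exceeds a threshold. The threshold value and function names below are hypothetical:

```python
# Hypothetical collision detection from acceleration samples: flag a crash
# when any sample's magnitude exceeds a threshold. The 40 m/s^2 value is
# purely illustrative, not a value from the disclosure.

CRASH_DECEL_THRESHOLD_M_S2 = 40.0  # hypothetical threshold

def collision_detected(accel_samples_m_s2: list) -> bool:
    """Return True if any acceleration sample exceeds the crash threshold."""
    return any(abs(a) > CRASH_DECEL_THRESHOLD_M_S2 for a in accel_samples_m_s2)
```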


In embodiments, such as generally illustrated in FIGS. 2, 3, and 4, one or more seats 32N of the seat system 20 may include and/or be connected to a respective sensor assembly 60N. For example and without limitation, the first seat 321 may include a first sensor assembly 601, the second seat 322 may include a second sensor assembly 602, the third seat 323 may include a third sensor assembly 603, and/or the fourth seat 324 may include a fourth sensor assembly 604. The sensor assemblies 60N may include one or more of a variety of shapes, sizes, and/or configurations. The sensor assemblies 60N may include one or more of a variety of different types of sensors, such as contact sensors and/or contactless sensors. For example and without limitation, the sensor assemblies 60N may include, but are not limited to, occupancy sensors, pressure sensors, weight sensors, piezoelectric sensors, electrodermal sensors, photoplethysmographic sensors, seat bladders (e.g., fluid/air bladders), steering wheel sensors, biomedical sensors (e.g., configured to sense biomedical, biometric, and/or physiological information of a user), contact sensors, contactless sensors, cameras, vision devices, and/or radar sensors. The sensor assemblies 60N may be electrically connected with the ECU 50 (e.g., via a wired and/or wireless connection). The ECU 50 may be configured to receive user/occupant information from the first sensor assembly 601, the second sensor assembly 602, the third sensor assembly 603, and/or the fourth sensor assembly 604 corresponding with the first seat 321, the second seat 322, the third seat 323, and/or the fourth seat 324, respectively. The ECU 50 may be configured to analyze and/or interpret the user/occupant information, and/or the ECU 50 may be configured to transmit the user/occupant information for analyzing and/or interpretation, such as to a remote server 70. 
A remote server 70 may, for example and without limitation, include a computing device disposed outside of a vehicle 22.


With embodiments, such as generally illustrated in FIGS. 2, 3, and 4, one or more sensor assemblies 60N may include an occupancy sensor 62N. For example and without limitation, the first sensor assembly 601 may include a first occupancy sensor 621, the second sensor assembly 602 may include a second occupancy sensor 622, the third sensor assembly 603 may include a third occupancy sensor 623, and/or the fourth sensor assembly 604 may include a fourth occupancy sensor 624. The occupancy sensors 62N may include one or more of a variety of sensors. For example and without limitation, the occupancy sensors 62N may be a single sensor or any combination of sensors, such as pressure sensors, weight sensors, piezoelectric sensors, radar sensors, and/or proximity sensors. The occupancy sensors 62N may be disposed proximate a surface of the seat backs 34N, and/or the seat bases 36N. The ECU 50 may receive occupancy information from the occupancy sensors 62N, and/or the ECU 50 may be configured to determine the occupancy of the first seat 321, the second seat 322, the third seat 323, and/or the fourth seat 324 via the sensor assemblies 601, 602, 603, 604 (e.g., from the occupancy information).


In embodiments, such as generally illustrated in FIGS. 1, 2, 3, and 4, the ECU 50 may be connected with a detection system 64, which may, for example, include a radar sensor/system, a Lidar sensor/system, and/or one or more cameras, among others. The detection system 64 may be disposed within the vehicle 22 such that one or more of the seats 32N are within a range or a field of view of the detection system 64 or a component thereof. For example and without limitation, the detection system 64 may be disposed substantially towards a front of the vehicle 22 and/or substantially on a ceiling/roof 26 of a vehicle 22. The detection system 64 may be configured to obtain information about whether a seat 32N is occupied, such as by an occupant and/or by cargo. The detection system 64 may be connected (e.g., wired and/or wirelessly) with the ECU 50, and/or may be configured to transmit information to the ECU 50 regarding the state of occupancy for the seats 32N. The ECU 50 may determine the occupancy of the seats 32N based on the information from the detection system 64. The ECU 50 and/or the detection system 64 may be configured for image processing and/or facial identification.


With embodiments, the ECU 50 and/or the detection system 64 may be configured to obtain reference occupancy information that may correspond to the seats 32N being unoccupied, and may compare current information to the reference information to determine if a seat 32N is occupied. Additionally or alternatively, the ECU 50 and/or the detection system 64 may be configured to determine whether a seat 32N is occupied by an occupant or by cargo. For example and without limitation, if the ECU 50 and/or the detection system 64 determines that a seat 32N is occupied and that whatever is occupying the seat 32N is moving (e.g., fidgeting, talking, adjusting a seat belt, etc.), the ECU 50 and/or the detection system 64 may determine that the seat 32N is occupied by an occupant and not just by cargo.
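The occupancy logic described above (compare against an unoccupied-seat reference, then use detected motion to separate occupants from cargo) can be sketched as follows. This is a minimal illustration; the reference value, threshold, and function names are hypothetical:

```python
# Illustrative occupancy classification: compare a current weight reading
# against a stored "seat empty" reference, then use detected motion
# (fidgeting, talking, seat-belt adjustment) to distinguish a living
# occupant from inert cargo. All values are hypothetical.

EMPTY_REFERENCE_KG = 0.0       # reference reading for an unoccupied seat
OCCUPIED_THRESHOLD_KG = 5.0    # minimum deviation treated as "something present"

def classify_seat(weight_reading_kg: float, motion_detected: bool) -> str:
    """Return 'empty', 'cargo', or 'occupant' for one seat."""
    if abs(weight_reading_kg - EMPTY_REFERENCE_KG) < OCCUPIED_THRESHOLD_KG:
        return "empty"
    # Something is on the seat; movement suggests an occupant, not cargo.
    return "occupant" if motion_detected else "cargo"
```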


In embodiments, such as generally illustrated in FIGS. 1, 2, 3, and 4, one or more sensor assemblies 60N may include a biomedical sensor 66N. For example and without limitation, the first sensor assembly 601 may include a first biomedical sensor 661, the second sensor assembly 602 may include a second biomedical sensor 662, the third sensor assembly 603 may include a third biomedical sensor 663, and/or the fourth sensor assembly 604 may include a fourth biomedical sensor 664. The biomedical sensors 66N may be disposed at least partially within the seat backs 34N and/or the seat bases 36N. The biomedical sensors 66N may be disposed substantially proximate the user (e.g., a seat occupant) such as to increase the accuracy of the biomedical information sensed. The biomedical sensors 66N may include one or more of a variety of sensors. For example and without limitation, the biomedical sensors 66N may include sensors (e.g., piezoelectric sensors) configured to measure/read biomedical information (e.g., heart rate, breathing rate, blood pressure, fidget movements, etc.) of a user associated with (e.g., occupying, on, near, etc.) a seat 32N of the seat assembly 30. The ECU 50 may be electrically connected (e.g., wired and/or wirelessly) with the biomedical sensors 66N, and/or may be configured to receive biomedical information from the biomedical sensors 66N. The ECU 50 may be configured to analyze, interpret, and/or transmit the biomedical information. For example and without limitation, the ECU 50 may be configured to assess the medical state of a user/occupant from the biomedical information, and/or to identify a user/occupant.


With embodiments, such as generally illustrated in FIGS. 2 and 3, a biomedical sensor 66N may include a first sensor portion 66AN (e.g., first sensor portions 66A1, 66A2, 66A3, 66A4) and/or a second sensor portion 66BN (e.g., second sensor portions 66B1, 66B2, 66B3, 66B4). The first sensor portion 66AN and/or the second sensor portion 66BN may be disposed substantially within the seat backs 34N. The first sensor portion 66AN and/or the second sensor portion 66BN may be disposed substantially proximate an outer surface of the seat back 34N, such as to be sufficiently close to a user/occupant when collecting/reading/sensing information. The first sensor portion 66AN and/or the second sensor portion 66BN may extend substantially in a Z-direction and/or in a direction substantially parallel to the seat back 34N (e.g., such as to align along a back of a user/occupant). Such as generally shown in FIG. 2, the first sensor portion 66AN may be disposed substantially at a first side of the seat back 34N (e.g., a left side), and/or the second sensor portion 66BN may be disposed substantially on a second side of the seat back 34N (e.g., a right side and/or opposite the first side). The first sensor portion 66AN may be disposed to be substantially proximate a first lung of a user/occupant, and/or the second sensor portion 66BN may be disposed to be substantially proximate a second lung of a user/occupant. The first sensor portion 66AN and/or the second sensor portion 66BN may be configured to monitor respiratory information of the first lung and/or the second lung, respectively. For example and without limitation, the first sensor portion 66AN and/or the second sensor portion 66BN may be configured to individually measure a breathing rate, a respiratory pattern, and/or a variety of other respiration information for each corresponding lung.


In embodiments, the ECU 50 may be configured to monitor one or more physiological parameters of a user/occupant (e.g., heart rate, heart rate variability, fidgets, sneezing, drowsiness symptoms, diabetes, kidney function, etc.) via the biomedical sensors 66N. Movement of a user associated with breathing may impair monitoring physiological parameters (e.g., effectively act as noise), and the ECU 50 may filter out noise created by breathing motions to more accurately identify physiological parameters of a user (e.g., by removing/ignoring the measured breathing pattern of a user and/or motion detected by the detection system 64, such as via motion artifact correction). Additionally or alternatively, the ECU 50 may be configured to detect whether a user is speaking, which may alter the breathing pattern of the user, and the ECU 50 may be configured to compensate for such breathing pattern alterations (e.g., ignore/remove the alterations and/or estimates thereof).
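One very simple form of the noise filtering described above is to subtract a slow-moving baseline (the breathing component) from the raw sensor signal, leaving faster components such as heartbeats. The sketch below is illustrative only; the window size is hypothetical and real motion-artifact correction would be considerably more sophisticated:

```python
# Hypothetical motion-artifact correction: remove a slow breathing trend
# from a raw sensor signal by subtracting a centered moving average,
# leaving the faster components. Window size is illustrative only.

def remove_breathing_component(signal: list, window: int = 5) -> list:
    """Subtract a centered moving average (the slow breathing trend)."""
    half = window // 2
    corrected = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        baseline = sum(signal[lo:hi]) / (hi - lo)
        corrected.append(signal[i] - baseline)
    return corrected
```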


With embodiments, such as generally illustrated in FIGS. 1, 2, 3, 4, and 5, the ECU 50 may be connected with a remote server 70. The ECU 50 may be configured to transmit information to the remote server 70 and/or receive information from the remote server 70. The remote server 70 may be configured to transmit information to the ECU 50 and/or receive information from the ECU 50. For example and without limitation, the remote server 70 may transmit a user profile to the ECU 50, which may be disposed within a vehicle 22. The user profile may include biometric profile information (e.g., for identifying a user) and/or medical/biomedical profile information (e.g., representing the medical history of the user). The ECU 50 and/or the remote server 70 may be configured to interpret/analyze the biometric profile information and/or the medical profile information. For example and without limitation, the remote server 70 may be connected with one or more of a variety of additional devices/locations (e.g., servers, medical databases, emergency medical services, etc.) that may be capable of transmitting information and/or analyzing sensed information from the seat assembly 30.


In embodiments, the ECU 50 may receive biometric profile information and/or the ECU 50 may be configured to compare biometric profile information with sensed biometric/biomedical information to identify a user and the corresponding seat 32N. If the biometric profile information is substantially consistent with and/or similar to the sensed biometric/biomedical information, the ECU 50 may determine that the correct user is seated in the corresponding seat 32N. If the biometric profile information is not substantially consistent with and/or similar to the sensed biometric/biomedical information, the ECU 50 may determine that an incorrect user is seated in the respective seat 32N.
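The identity check described above (sensed values substantially consistent with the stored profile) might, in one simplified form, compare each biometric field within a relative tolerance. Field names and the tolerance value below are hypothetical:

```python
# Illustrative identity check: declare a match if every biometric field in
# the stored profile agrees with the sensed value within a relative
# tolerance. Field names and the 15% tolerance are hypothetical.

def matches_profile(sensed: dict, profile: dict, tolerance: float = 0.15) -> bool:
    """True if every profile field is present and within `tolerance`."""
    for key, expected in profile.items():
        if key not in sensed:
            return False
        if expected == 0:
            if abs(sensed[key]) > tolerance:
                return False
        elif abs(sensed[key] - expected) / abs(expected) > tolerance:
            return False
    return True
```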


In embodiments, such as generally illustrated in FIGS. 4 and 5, the ECU 50 may be configured to monitor a physical condition (e.g., a medical state) of a user occupying a seat 32N of the seat assembly 30. The seat system 20 (e.g., the ECU 50) may be configured to sense a breathing pattern of a user via a sensor assembly 60N. The ECU 50 may be configured to generate a breathing pattern waveform 80 for one or more occupied seats 32N. The ECU 50 may receive sensed biomedical information from the sensor assemblies 60N, and/or the ECU 50 may plot/graph the sensed biomedical information into a breathing pattern waveform 80 to analyze the physical condition of a user. The ECU 50 may be configured to identify a user, receive information from the biomedical sensors 66N, generate a breathing pattern waveform 80 corresponding to the user, and/or store the biomedical information for transmission and/or analysis.
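As a simplified illustration of processing such sensed biomedical information, one basic property an ECU might extract from a breathing waveform is a breathing rate, e.g., by counting rising zero crossings. The approach and parameters below are hypothetical, not the method of the disclosure:

```python
# Sketch of deriving a breathing rate from a sensed chest-movement signal
# by counting rising zero crossings. Sample rate and signal are
# illustrative only.

def breathing_rate_bpm(samples: list, sample_rate_hz: float) -> float:
    """Estimate breaths per minute from rising zero crossings."""
    mean = sum(samples) / len(samples)
    centered = [s - mean for s in samples]
    # Count transitions from below the mean to at/above the mean.
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a < 0 <= b)
    duration_min = len(samples) / sample_rate_hz / 60.0
    return crossings / duration_min
```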


In embodiments, a remote server 70 may include and/or be connected to a medical server 90. While the remote server 70 and the medical server 90 are shown as separate for illustrative purposes in the embodiments of FIGS. 4 and 5, a remote server 70 and a medical server 90 may be incorporated together, at least in part, and/or may share one or more components (e.g., processors, memory, storage, etc.). The medical server 90 may include medical records and/or a medical database. The ECU 50 may transmit the breathing pattern waveform 80 to the remote server 70 (e.g., the medical server 90) such that the remote server 70 may compare the breathing pattern waveform 80 to information or data, for example one or more breathing pattern samples 92 stored in the remote server 70 (and/or a storage device connected thereto). For example and without limitation, the remote server 70 may be configured to access a variety of breathing pattern samples 92 (e.g., pathological respiration signatures) corresponding with a variety of respiratory medical conditions. The breathing pattern samples 92 may include, but are not limited to, respiration waveforms associated with Biot's respiration (sample 94), Kussmaul breathing (sample 96), and/or Cheyne-Stokes respiration (sample 98), such as generally illustrated in FIG. 5.


With embodiments, a user profile and/or a remote server 70 may include biometric profile information corresponding to a typical/normal breathing pattern for a user. The ECU 50 and/or the remote server 70 may analyze the breathing pattern waveform 80 to assess the physical condition of the user. For example and without limitation, if (i) the ECU 50 and/or the remote server 70 matches the breathing pattern waveform 80 with a breathing pattern sample 94, 96, 98, and/or (ii) the breathing pattern waveform 80 is substantially different from the typical breathing pattern for a user (as indicated by the user profile), the ECU 50 and/or the remote server 70 may connect/transmit corresponding information (e.g., matching breathing pattern sample 94, 96, 98, breathing pattern abnormality, etc.) to one or more other devices, such as to a medical server 90, which may be associated with a medical and/or emergency services provider. The medical server 90 may conduct further analysis of the breathing pattern waveform 80, such as to assess and/or confirm a physical condition (e.g., a medical state) of one or more users of the seats 32N.
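The waveform-matching step described above could, in one simplified form, use normalized correlation against each stored pathological sample. The sample data, threshold, and function names below are hypothetical stand-ins:

```python
# Illustrative matching of a breathing waveform against stored pathological
# samples using normalized correlation. Sample data and the 0.9 threshold
# are hypothetical, not values from the disclosure.

def _normalize(xs):
    """Mean-center and unit-scale a sequence of samples."""
    mean = sum(xs) / len(xs)
    centered = [x - mean for x in xs]
    norm = sum(c * c for c in centered) ** 0.5 or 1.0
    return [c / norm for c in centered]

def best_match(waveform, samples, threshold=0.9):
    """Return the name of the closest stored sample, or None if no match."""
    w = _normalize(waveform)
    best_name, best_score = None, threshold
    for name, sample in samples.items():
        s = _normalize(sample)
        score = sum(a * b for a, b in zip(w, s))
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```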


In embodiments, a typical/normal breathing pattern for a particular user may be an abnormal breathing pattern (e.g., somewhat-irregular breathing pattern) compared to an average user. The ECU 50 and/or the remote server 70 may use the abnormal breathing pattern as a baseline (e.g., the typical/normal pattern for the particular user) for comparison with the generated breathing pattern waveform 80.


In embodiments, an ECU 50 and/or a remote server 70 may determine whether a collision has occurred and/or may transmit crash information (e.g., as sensed by the vehicle sensors 54) to another location/device, such as to the remote server 70 and/or the medical server 90. The medical server 90 may utilize the crash information and the breathing pattern waveform 80 in assessing a physical condition/medical state of users. For example and without limitation, the ECU 50 and/or the remote server 70 may determine the direction/location of a vehicle collision and/or may prioritize the assessment of the seat(s) 32N and the users therein substantially proximate the direction/location of the collision, as users closer to the impact zone of the collision may require more immediate medical attention. Once the medical server 90 determines that a collision has occurred and/or that users are in need of medical attention, the medical server 90 may contact (e.g., dispatch) an emergency vehicle 130 (e.g., ambulance, fire truck, police car, etc.) to travel to the location of the vehicle 22.
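Prioritizing assessment by proximity to the impact zone, as described above, could be sketched as a simple distance ordering of occupied seats. Coordinates and seat identifiers below are hypothetical:

```python
# Hypothetical prioritization of occupied seats by distance to a detected
# impact location; seat coordinates are illustrative only.

import math

def prioritize_seats(seat_positions, impact_xy):
    """Return seat IDs ordered nearest-to-farthest from the impact point."""
    def distance(seat_id):
        x, y = seat_positions[seat_id]
        return math.hypot(x - impact_xy[0], y - impact_xy[1])
    return sorted(seat_positions, key=distance)
```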


With embodiments, an ECU 50, a remote server 70, and/or a medical server 90 may be configured to transmit user physical condition information to the emergency vehicle 130 and/or the operators thereof, such as via an electronic device 132 that may be associated with the emergency vehicle 130 and/or the operators. For example and without limitation, the user physical condition information may indicate a priority of assistance for the users of the vehicle 22. The medical server 90 may analyze the breathing pattern waveform 80 for each user and/or the medical server 90 may determine which users need medical assistance more urgently than other users within the vehicle 22. Determining the urgency for medical assistance may include evaluating pre-existing medical conditions and/or prior injuries, which may increase the urgency for medical assistance (e.g., high blood pressure, use of blood thinners, impaired immune system, etc.) or decrease the urgency for medical assistance (e.g., if a detected issue existed before the collision and was already being treated). Urgency information may be included in the user physical condition information, which may be provided to the emergency vehicle 130, and/or the user physical condition information may include a recommended priority of assistance for the users of the vehicle 22. Upon an emergency vehicle 130 arriving at the vehicle 22 after a collision, emergency vehicle operators/medical professionals may, for example and without limitation, assist users in the priority order as indicated by the user physical condition information. Medical personnel may consider the user physical condition information when preparing for and/or while assisting users.


With embodiments, an ECU 50, a remote server 70, and/or a medical server 90 may determine whether any users have been ejected from a respective seat 32N, such as following a collision, or if any users have exited the vehicle 22 (e.g., via the occupancy sensors 62N). For example and without limitation, the ECU 50 may determine which seats 32N were occupied (e.g., by a living being) prior to the collision and which seats 32N are unoccupied after the collision (e.g., immediately after, before users may deliberately exit the seats 32N). If any seats 32N that were occupied prior to the collision are unoccupied after the collision, the ECU 50 may provide an indication that a user and/or a number of users may have been ejected from a seat 32N and/or a vehicle 22, such as to a remote server 70 and/or a medical server 90. The ECU 50 may be configured to exclude seats 32N that are unoccupied after the collision if a restraint system of the seat 32N has been deactivated by a user (e.g., if a vehicle sensor 54 and/or a sensor assembly 60N detects that a user unbuckled a seat belt).
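The ejection check described above (occupied before, empty immediately after, restraint not deliberately released) reduces to a small set comparison. The sketch below uses hypothetical seat identifiers:

```python
# Illustrative ejection check: a seat occupied before the collision but
# empty immediately after is flagged, unless its seat belt was deliberately
# unbuckled. All identifiers are hypothetical.

def ejected_seats(before, after, unbuckled):
    """Seat IDs possibly ejected: occupied -> empty, belt not released."""
    return [
        seat for seat, was_occupied in before.items()
        if was_occupied and not after.get(seat, False) and seat not in unbuckled
    ]
```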


In embodiments, a medical server 90 may combine the information from one or more vehicle sensors 54 (e.g., accelerometer, speedometer, compass, GPS, etc.) and one or more occupancy sensors 62N to determine an expected location for a user that may have been ejected from the vehicle 22 as a result of the collision. For example and without limitation, the ECU 50 may provide a vehicle speed and/or deceleration at the time of the collision to the medical server 90, which may estimate a potential distance from the vehicle 22 that a user may be according, at least in part, to the speed/deceleration.
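A very rough kinematic version of that distance estimate would treat an ejected occupant as a projectile leaving at the vehicle's pre-impact speed from a given height. This is a drastically simplified illustration, not the estimation method of the disclosure:

```python
# Rough kinematic sketch: horizontal range of a body launched horizontally
# at the vehicle's pre-impact speed from a given height. Air resistance,
# tumbling, and deceleration effects are ignored; illustrative only.

import math

G = 9.81  # gravitational acceleration, m/s^2

def estimated_ejection_distance_m(speed_m_s, height_m=1.0):
    """Horizontal distance traveled before reaching the ground."""
    time_to_ground = math.sqrt(2 * height_m / G)
    return speed_m_s * time_to_ground
```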


With embodiments, an ECU 50, a remote server 70, and/or a medical server 90 may be configured to obtain an authorization from a user to share health/medical information with appropriate third parties (e.g., medical professionals), such as to comply with Health Insurance Portability and Accountability Act (HIPAA) regulations. An ECU 50, a remote server 70, and/or a medical server 90 may not communicate any health or medical information about a user unless such an authorization has been obtained.


In embodiments, a method 100 of operating a seat system 20 may include providing a seat system 20, which may include an ECU 50 and a seat assembly 30 having one or more seats 32N (step 102). The method 100 may include the ECU 50 receiving a user profile from a remote server 70 (step 104). The user profile may include one or more of a variety of types of information. For example and without limitation, the user profile may include a typical breathing pattern of a user, a seat assignment, prior/current medical conditions, and/or biometric information. The method 100 may include sensing (e.g., via one or more biomedical sensors 66N, and/or one or more occupancy sensors 62N) biometric/biomedical information of a user occupying a seat 32, which may be used by the ECU 50 for identifying and/or confirming the identity of the user (step 106). The method 100 may include generating a breathing pattern waveform 80 for users seated in respective seats 32N of the seat system (step 108). The method 100 may include the ECU 50 comparing the breathing pattern waveform 80 with information from the corresponding user profile. The ECU 50 may determine whether the breathing pattern waveform 80 is substantially similar to the information from the user profile (e.g., an expected breathing pattern waveform for the user). If the breathing pattern waveform 80 is substantially similar to/consistent with the biomedical information, the ECU 50 may not take further diagnostic action. However, if the breathing pattern waveform 80 is materially different than the biomedical information from the user profile, the ECU 50 may transmit the information to a remote server 70 (step 110).
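The "substantially similar" versus "materially different" comparison in step 108/110 could be realized with any waveform-distance metric. The sketch below uses a normalized RMS deviation against a threshold; both the metric and the threshold value are assumptions, as the patent does not define the comparison.

```python
import math


def materially_different(waveform, baseline, threshold=0.25) -> bool:
    """Compare a sensed breathing waveform with the expected baseline
    from the user profile. Returns True when the RMS deviation exceeds
    `threshold` times the baseline RMS (metric and threshold are
    illustrative assumptions).
    """
    if len(waveform) != len(baseline):
        raise ValueError("waveforms must be sampled identically")
    n = len(baseline)
    rms_diff = math.sqrt(sum((w - b) ** 2
                             for w, b in zip(waveform, baseline)) / n)
    rms_base = math.sqrt(sum(b * b for b in baseline) / n)
    return rms_diff > threshold * rms_base
```

Under this sketch, the ECU would take no further diagnostic action when the function returns False, and transmit the waveform to the remote server (step 110) when it returns True.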


With embodiments, the remote server 70 may include a medical server 90 and/or a medical database. The method 100 may include the remote server 70 determining a physical condition/medical state of a user (step 112), which may include analyzing the breathing pattern waveform 80. Analyzing a breathing pattern waveform 80 may include comparing the breathing pattern waveform 80 with a variety of waveforms corresponding to medical conditions. If the remote server 70 determines that a user requires medical assistance, the remote server 70 may transmit user physical condition information to a medical server 90 (e.g., of a medical assistance provider) (step 114). The physical condition information may include a priority order for the users of a seat system 20. Medical personnel associated with the emergency service may use the physical condition information, including the priority order, to determine the order in which to provide medical assistance to the users. Users with more traumatic/urgent conditions may be treated before other users according to the determined priority order.
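Deriving a priority order from per-user medical states reduces to sorting users by urgency. The numeric severity scale below (higher = more urgent) is an assumption for illustration; the patent does not prescribe how medical states are scored.

```python
def priority_order(user_states: dict) -> list:
    """Return user IDs sorted most-urgent first.

    `user_states` maps user_id -> severity score (e.g., 0 = stable,
    3 = life-threatening); the scale is a hypothetical example.
    """
    return sorted(user_states, key=user_states.get, reverse=True)


# e.g., user "B" with the most urgent condition is treated first.
order = priority_order({"A": 1, "B": 3, "C": 2})
```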


In embodiments, one or more portions of a method of operating a seat system 20 (e.g., method 100) may be conducted while a vehicle 22 is in motion and/or when a vehicle 22 is stopped.


In embodiments, one or more activities that may be conducted by an ECU 50, a remote server 70, and/or a medical server 90 may, additionally or alternatively, be conducted by another one of the ECU 50, the remote server 70, and/or the medical server 90.


In examples, a computing device (e.g., ECU 50, remote server 70, medical server 90) may include an electronic controller and/or an electronic processor, such as a programmable microprocessor and/or microcontroller. In embodiments, a computing device may include, for example, an application specific integrated circuit (ASIC). A computing device may include a central processing unit (CPU), a memory (e.g., a non-transitory computer-readable storage medium), and/or an input/output (I/O) interface. A computing device may be configured to perform various functions, including those described in greater detail herein, with appropriate programming instructions and/or code embodied in software, hardware, and/or other medium. In embodiments, a computing device may include a plurality of controllers. In embodiments, a computing device may be connected to a display, such as a touchscreen display.


Various examples/embodiments are described herein for various apparatuses, systems, and/or methods. Numerous specific details are set forth to provide a thorough understanding of the overall structure, function, manufacture, and use of the examples/embodiments as described in the specification and illustrated in the accompanying drawings. It will be understood by those skilled in the art, however, that the examples/embodiments may be practiced without such specific details. In other instances, well-known operations, components, and elements have not been described in detail so as not to obscure the examples/embodiments described in the specification. Those of ordinary skill in the art will understand that the examples/embodiments described and illustrated herein are non-limiting examples, and thus it can be appreciated that the specific structural and functional details disclosed herein may be representative and do not necessarily limit the scope of the embodiments.


Reference throughout the specification to “examples,” “in examples,” “with examples,” “various embodiments,” “with embodiments,” “in embodiments,” or “an embodiment,” or the like, means that a particular feature, structure, or characteristic described in connection with the example/embodiment is included in at least one embodiment. Thus, appearances of the phrases “examples,” “in examples,” “with examples,” “in various embodiments,” “with embodiments,” “in embodiments,” or “an embodiment,” or the like, in places throughout the specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more examples/embodiments. Thus, the particular features, structures, or characteristics illustrated or described in connection with one embodiment/example may be combined, in whole or in part, with the features, structures, functions, and/or characteristics of one or more other embodiments/examples without limitation given that such combination is not illogical or non-functional. Moreover, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from the scope thereof.


It should be understood that references to a single element are not necessarily so limited and may include one or more of such element. Any directional references (e.g., plus, minus, upper, lower, upward, downward, left, right, leftward, rightward, top, bottom, above, below, vertical, horizontal, clockwise, and counterclockwise) are only used for identification purposes to aid the reader's understanding of the present disclosure, and do not create limitations, particularly as to the position, orientation, or use of examples/embodiments.


Joinder references (e.g., attached, coupled, connected, and the like) are to be construed broadly and may include intermediate members between a connection of elements and relative movement between elements. As such, joinder references do not necessarily imply that two elements are directly connected/coupled and in fixed relation to each other. The use of “e.g.” in the specification is to be construed broadly and is used to provide non-limiting examples of embodiments of the disclosure, and the disclosure is not limited to such examples. Uses of “and” and “or” are to be construed broadly (e.g., to be treated as “and/or”). For example and without limitation, uses of “and” do not necessarily require all elements or features listed, and uses of “or” are inclusive unless such a construction would be illogical.


While processes, systems, and methods may be described herein in connection with one or more steps in a particular sequence, it should be understood that such methods may be practiced with the steps in a different order, with certain steps performed simultaneously, with additional steps, and/or with certain described steps omitted.


All matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not limiting. Changes in detail or structure may be made without departing from the present disclosure.


It should be understood that a computing device (e.g., ECU 50, remote server 70, medical server 90), a system, and/or a processor as described herein may include a conventional processing apparatus known in the art, which may be capable of executing preprogrammed instructions stored in an associated memory, all performing in accordance with the functionality described herein. To the extent that the methods described herein are embodied in software, the resulting software can be stored in an associated memory and can also constitute means for performing such methods. Such a system or processor may further be of the type having ROM, RAM, RAM and ROM, and/or a combination of non-volatile and volatile memory so that any software may be stored and yet allow storage and processing of dynamically produced data and/or signals.


It should be further understood that an article of manufacture in accordance with this disclosure may include a non-transitory computer-readable storage medium having a computer program encoded thereon for implementing logic and other functionality described herein. The computer program may include code to perform one or more of the methods disclosed herein. Such embodiments may be configured to execute via one or more processors, such as multiple processors that are integrated into a single system or are distributed over and connected together through a communications network, and the communications network may be wired and/or wireless. Code for implementing one or more of the features described in connection with one or more embodiments may, when executed by a processor, cause a plurality of transistors to change from a first state to a second state. A specific pattern of change (e.g., which transistors change state and which transistors do not), may be dictated, at least partially, by the logic and/or code.

Claims
  • 1. A seat system, comprising: a seat; a sensor assembly; and an electronic control unit (ECU) connected with the sensor assembly; wherein the sensor assembly is configured to sense a breathing pattern of a user associated with the seat; the ECU is configured to communicate with a remote server; at least one of the ECU and the remote server is configured to determine a medical state of said user according, at least in part, to the breathing pattern; the sensor assembly includes a fluid bladder disposed to align, at least in part, with a lung of said user; and the ECU is configured to detect whether said user is speaking and compensate for breathing pattern alterations corresponding to speaking.
  • 2. The seat system of claim 1, wherein the sensor assembly includes an occupancy sensor; and the ECU is configured to obtain an occupancy status of the seat via the occupancy sensor.
  • 3. The seat system of claim 1, wherein the sensor assembly includes a biomedical sensor; the biomedical sensor includes a piezoelectric sensor and/or a radar sensor; and the ECU is configured to sense the breathing pattern of said user via the biomedical sensor.
  • 4. The seat system of claim 1, wherein the sensor assembly includes a camera or vision device.
  • 5. The seat system of claim 1, wherein the ECU is configured to generate a breathing pattern waveform associated with the breathing pattern of said user.
  • 6. The seat system of claim 5, wherein the remote server is connected with a medical database; the medical database includes sample breathing patterns; and at least one of the ECU and the remote server is configured to compare the breathing pattern waveform with the sample breathing patterns.
  • 7. The seat system of claim 1, wherein the remote server is configured to connect with a medical server; the remote server is configured to transmit the medical state of said user to the medical server; and at least one of the remote server and the medical server is configured to transmit the medical state to an emergency vehicle.
  • 8. A vehicle seat system, comprising: a seat assembly including a first seat and a second seat; a sensor assembly configured to obtain first breathing pattern information associated with a first user associated with the first seat and second breathing pattern information associated with a second user associated with the second seat; and an ECU connected with the sensor assembly; wherein the ECU is configured to communicate with a remote server; at least one of the ECU and the remote server is configured to determine (i) a first medical state of said first user according, at least in part, to the first breathing pattern information, and (ii) a second medical state of said second user according, at least in part, to the second breathing pattern information; and at least one of the ECU and said remote server is configured to determine a medical priority order of said first user and said second user according to the first medical state and the second medical state.
  • 9. The vehicle seat system of claim 8, wherein the sensor assembly includes a biomedical sensor connected to the first seat; the biomedical sensor includes a first portion and a second portion; the first portion is configured to sense first lung information corresponding to a first lung of said first user; and the second portion is separate from the first portion and is configured to sense second lung information corresponding to a second lung of said first user.
  • 10. The vehicle seat system of claim 8, wherein at least one of the ECU and said remote server is configured to determine whether any users have been ejected from the seat assembly.
  • 11. The vehicle seat system of claim 8, wherein determining the medical priority order includes evaluating pre-existing medical conditions of the first user and the second user.
  • 12. The vehicle seat system of claim 8, wherein said remote server is configured to transmit the medical priority order, the first breathing pattern information, and the second breathing pattern information to an electronic device associated with an emergency vehicle and/or a medical services professional to facilitate treatment of at least one of said first user and said second user.
  • 13. A method of operating a seat system including an ECU and a seat assembly having a seat, the method comprising: providing a sensor assembly connected with the seat and the ECU; sensing biomedical information of a user of the seat via the sensor assembly; determining a medical state of said user according, at least in part, to the biomedical information; transmitting at least one of the biomedical information and the medical state to a remote server; sensing, via an additional sensor connected to an additional seat, additional biomedical information of an additional user associated with the additional seat; and determining a medical priority order of said user and said additional user according to the biomedical information and the additional biomedical information.
  • 14. The method of claim 13, wherein sensing biomedical information includes generating a breathing pattern waveform corresponding to said user.
  • 15. The method of claim 14, including receiving a user profile associated with said user from the remote server, the user profile including biomedical information corresponding with said user; and comparing the breathing pattern waveform with the biomedical information of the user profile to determine whether the user requires medical attention.
  • 16. The method of claim 14, wherein determining the medical state of said user includes comparing the breathing pattern waveform with a medical database connected with the remote server.
  • 17. The method of claim 13, wherein the sensor assembly includes a biomedical sensor having a first portion and a second portion; sensing biomedical information includes the first portion sensing a first lung of said user; and sensing biomedical information includes the second portion sensing a second lung of said user.
  • 18. The method of claim 13, wherein the sensor assembly includes a fluid bladder aligned, at least in part, with a lung of said user.
  • 19. The method of claim 13, including transmitting the medical priority order to an electronic device associated with an emergency vehicle and/or a medical services professional.
  • 20. The vehicle seat system of claim 8, wherein the ECU is configured to detect whether said first user is speaking and compensate for breathing pattern alterations corresponding to speaking.