The present disclosure generally relates to seat systems, including seat systems that may be used in connection with vehicles, such as automobiles.
This background description is set forth below for the purpose of providing context only. Therefore, any aspect of this background description, to the extent that it does not otherwise qualify as prior art, is neither expressly nor impliedly admitted as prior art against the instant disclosure.
Some seat systems may not be configured to monitor medical states of users/occupants. For example, some seat systems may not be configured to assess the medical state of injured occupants following a detected collision and/or prioritize users/occupants for emergency medical assistance.
There is a desire for solutions/options that minimize or eliminate one or more challenges or shortcomings of seat systems. The foregoing discussion is intended only to illustrate examples of the present field and is not a disavowal of scope.
In embodiments, a seat system may include a seat, a sensor assembly, and/or an electronic control unit (ECU) connected with the sensor assembly. The sensor assembly may be configured to sense a breathing pattern of a user associated with the seat. The ECU may be configured to communicate with a remote server, and at least one of the ECU and the remote server may be configured to determine a medical state of said user according, at least in part, to the breathing pattern.
With embodiments, a vehicle seat system may include a seat assembly including a first seat and a second seat, a sensor assembly configured to obtain first breathing pattern information associated with a first user associated with the first seat and second breathing pattern information associated with a second user associated with the second seat, and/or an ECU connected with the sensor assembly. The ECU may be configured to communicate with a remote server. At least one of the ECU and the remote server may be configured to determine (i) a first medical state of said first user according, at least in part, to the first breathing pattern information, and (ii) a second medical state of said second user according, at least in part, to the second breathing pattern information.
In embodiments, a method of operating a seat system may include providing a sensor assembly connected with the seat and the ECU, sensing biomedical information of a user of the seat via the sensor assembly, determining a medical state of said user according, at least in part, to the biomedical information, and/or transmitting at least one of the biomedical information and the medical state to a remote server.
The foregoing and other potential aspects, features, details, utilities, and/or advantages of examples/embodiments of the present disclosure will be apparent from reading the following description, and from reviewing the accompanying drawings.
While the claims are not limited to a specific illustration, an appreciation of various aspects may be gained through a discussion of various examples. The drawings are not necessarily to scale, and certain features may be exaggerated or hidden to better illustrate and explain an innovative aspect of an example. Further, the exemplary illustrations described herein are not exhaustive or otherwise limiting, and are not restricted to the precise form and configuration shown in the drawings or disclosed in the following detailed description. Exemplary illustrations are described in detail by referring to the drawings as follows:
Reference will now be made in detail to embodiments of the present disclosure, examples of which are described herein and illustrated in the accompanying drawings. While the present disclosure will be described in conjunction with embodiments and/or examples, it will be understood that they do not limit the present disclosure to these embodiments and/or examples. On the contrary, the present disclosure covers alternatives, modifications, and equivalents.
In embodiments, such as generally illustrated in
In embodiments, such as generally illustrated in
With embodiments, such as generally illustrated in
In embodiments, such as generally illustrated in
With embodiments, some or all of the track portions 42, 44, 46, 48 may include first and second sets of fixed and movable tracks. The fixed tracks may be fixed to the mounting surface 24. The movable tracks may be connected to support members 38N and may be configured to move (e.g., slide) along the fixed tracks.
In embodiments, such as generally illustrated in
With embodiments, such as generally illustrated in
In embodiments, such as generally illustrated in
With embodiments, such as generally illustrated in
In embodiments, such as generally illustrated in
With embodiments, the ECU 50 and/or the detection system 64 may be configured to obtain reference occupancy information that may correspond to the seats 32N being unoccupied, and may compare current information to the reference information to determine if a seat 32N is occupied. Additionally or alternatively, the ECU 50 and/or the detection system 64 may be configured to determine whether a seat 32N is occupied by an occupant or by cargo. For example and without limitation, if the ECU 50 and/or the detection system 64 determines that a seat 32N is occupied and that whatever is occupying the seat 32N is moving (e.g., fidgeting, talking, adjusting a seat belt, etc.), the ECU 50 and/or the detection system 64 may determine that the seat 32N is occupied by an occupant and not just by cargo.
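The reference-comparison and motion-based classification described above can be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation: the function names, the single scalar sensor reading, and the threshold value are all assumptions introduced for illustration.

```python
# Hypothetical sketch: deciding whether a seat 32N is empty, holds cargo,
# or holds an occupant. Sensor representation and threshold are assumed.

def is_seat_occupied(current_reading: float, reference_unoccupied: float,
                     threshold: float = 5.0) -> bool:
    """Compare the current sensor reading against the unoccupied baseline."""
    return abs(current_reading - reference_unoccupied) > threshold

def classify_occupant(current_reading: float, reference_unoccupied: float,
                      motion_detected: bool) -> str:
    """Return 'occupant', 'cargo', or 'empty' for a seat."""
    if not is_seat_occupied(current_reading, reference_unoccupied):
        return "empty"
    # Movement (fidgeting, talking, adjusting a seat belt) suggests a person.
    return "occupant" if motion_detected else "cargo"
```

A seat whose reading departs from the unoccupied baseline but shows no movement would be treated as cargo under this sketch.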
In embodiments, such as generally illustrated in
With embodiments, such as generally illustrated in
In embodiments, the ECU 50 may be configured to monitor one or more physiological parameters of a user/occupant (e.g., heart rate, heart rate variability, fidgets, sneezing, drowsiness symptoms, diabetes, kidney function, etc.) via the biomedical sensors 66N. Movement of a user associated with breathing may impair monitoring physiological parameters (e.g., effectively act as noise), and the ECU 50 may filter out noise created by breathing motions to more accurately identify physiological parameters of a user (e.g., by removing/ignoring the measured breathing pattern of a user and/or motion detected by the detection system 64, such as via motion artifact correction). Additionally or alternatively, the ECU 50 may be configured to detect whether a user is speaking, which may alter the breathing pattern of the user, and the ECU 50 may be configured to compensate for such breathing pattern alterations (e.g., ignore/remove the alterations and/or estimates thereof).
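One simple way to realize the breathing-noise filtering described above is to subtract the component of the physiological signal that correlates with the measured breathing pattern. The least-squares approach below is an illustrative assumption, one of many possible motion-artifact corrections, not the method the disclosure specifies.

```python
# Illustrative sketch: remove breathing-motion "noise" from a sampled
# physiological signal by subtracting a least-squares-scaled copy of the
# measured breathing pattern. Function and variable names are assumed.

def remove_breathing_artifact(signal, breathing):
    """Subtract the component of `signal` correlated with `breathing`."""
    # Least-squares scale factor: a = (b . s) / (b . b)
    dot_bs = sum(b * s for b, s in zip(breathing, signal))
    dot_bb = sum(b * b for b in breathing)
    a = dot_bs / dot_bb if dot_bb else 0.0
    # Residual contains the physiological content not explained by breathing.
    return [s - a * b for s, b in zip(signal, breathing)]
```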
With embodiments, such as generally illustrated in
In embodiments, the ECU 50 may receive biometric profile information and/or the ECU 50 may be configured to compare biometric profile information with sensed biometric/biomedical information to identify a user and the corresponding seat 32N. If the biometric profile information is substantially consistent with and/or similar to the sensed biometric/biomedical information, the ECU 50 may determine that the correct user is seated in the corresponding seat 32N. If the biometric profile information is not substantially consistent with and/or similar to the sensed biometric/biomedical information, the ECU 50 may determine that an incorrect user is seated in the respective seat 32N.
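The "substantially consistent" comparison above could be sketched as a per-feature tolerance check. The feature names and the relative tolerance are illustrative assumptions.

```python
# Hypothetical sketch: compare stored biometric profile values with sensed
# values to confirm the expected user occupies the seat.

def matches_profile(profile: dict, sensed: dict, tolerance: float = 0.1) -> bool:
    """True if every profile feature is matched within a relative tolerance."""
    for key, expected in profile.items():
        measured = sensed.get(key)
        if measured is None:
            return False  # a missing measurement cannot confirm identity
        if abs(measured - expected) > tolerance * abs(expected):
            return False
    return True
```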
In embodiments, such as generally illustrated in
In embodiments, a remote server 70 may include and/or be connected to a medical server 90. While the remote server 70 and the medical server 90 are shown as separate for illustrative purposes in the embodiments of
With embodiments, a user profile and/or a remote server 70 may include biometric profile information corresponding to a typical/normal breathing pattern for a user. The ECU 50 and/or the remote server 70 may analyze the breathing pattern waveform 80 to assess the physical condition of the user. For example and without limitation, if (i) the ECU 50 and/or the remote server 70 matches the breathing pattern waveform 80 with a breathing pattern sample 94, 96, 98, and/or (ii) the breathing pattern waveform 80 is substantially different from the typical breathing pattern for a user (as indicated by the user profile), the ECU 50 and/or the remote server 70 may communicate/transmit corresponding information (e.g., matching breathing pattern sample 94, 96, 98, breathing pattern abnormality, etc.) to one or more other devices, such as to a medical server 90, which may be associated with a medical and/or emergency services provider. The medical server 90 may conduct further analysis of the breathing pattern waveform 80, such as to assess and/or confirm a physical condition (e.g., a medical state) of one or more users of the seats 32N.

In embodiments, a typical/normal breathing pattern for a particular user may be an abnormal breathing pattern (e.g., somewhat-irregular breathing pattern) compared to an average user. The ECU 50 and/or the remote server 70 may use the abnormal breathing pattern as a baseline (e.g., the typical/normal pattern for the particular user) for comparison with the generated breathing pattern waveform 80.
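The baseline comparison described above, flagging a waveform only when it departs from that particular user's own typical pattern, can be sketched with a simple deviation metric. The RMS metric and threshold below are illustrative assumptions, not the disclosed analysis.

```python
# Sketch: compare a generated breathing pattern waveform 80 against the
# user's own baseline (which may itself be "abnormal" versus an average
# user). Metric and threshold are assumed for illustration.

def rms_deviation(waveform, baseline):
    """Root-mean-square difference between waveform and baseline samples."""
    n = min(len(waveform), len(baseline))
    return (sum((waveform[i] - baseline[i]) ** 2 for i in range(n)) / n) ** 0.5

def is_breathing_abnormal(waveform, baseline, threshold=0.5):
    """Flag the waveform only if it departs from this user's own baseline."""
    return rms_deviation(waveform, baseline) > threshold
```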
In embodiments, an ECU 50 and/or a remote server 70 may determine whether a collision has occurred and/or may transmit crash information (e.g., as sensed by the vehicle sensors 54) to another location/device, such as to the remote server 70 and/or the medical server 90. The medical server 90 may utilize the crash information and the breathing pattern waveform 80 in assessing a physical condition/medical state of users. For example and without limitation, the ECU 50 and/or the remote server 70 may determine the direction/location of a vehicle collision and/or may prioritize the assessment of the seat(s) 32N and the users therein substantially proximate the direction/location of the collision as users closer to the impact zone of the collision may require more immediate medical attention. Once the medical server 90 determines that a collision has occurred and/or that users are in need of medical attention, the medical server 90 may contact (e.g., dispatch) an emergency vehicle 130 (e.g., ambulance, fire truck, police car, etc.) to travel to the location of the vehicle 22.
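The proximity-based prioritization described above can be sketched as ordering seats by distance from the detected impact location. Seat coordinates and the impact point are assumed inputs introduced for illustration.

```python
# Illustrative sketch: order seats 32N nearest-first to a detected impact
# location so that users closest to the impact zone are assessed first.
import math

def prioritize_seats(seat_positions: dict, impact_xy: tuple) -> list:
    """Return seat IDs ordered by increasing distance from the impact point."""
    def dist(seat_id):
        x, y = seat_positions[seat_id]
        return math.hypot(x - impact_xy[0], y - impact_xy[1])
    return sorted(seat_positions, key=dist)
```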
With embodiments, an ECU 50, a remote server 70, and/or a medical server 90 may be configured to transmit user physical condition information to the emergency vehicle 130 and/or the operators thereof, such as via an electronic device 132 that may be associated with the emergency vehicle 130 and/or the operators. For example and without limitation, the user physical condition information may indicate a priority of assistance for the users of the vehicle 22. The medical server 90 may analyze the breathing pattern waveform 80 for each user and/or the medical server 90 may determine which users need medical assistance more urgently than other users within the vehicle 22. Determining the urgency for medical assistance may include evaluating pre-existing medical conditions and/or prior injuries, which may increase the urgency for medical assistance (e.g., high blood pressure, use of blood thinners, impaired immune system, etc.) or decrease the urgency for medical assistance (e.g., if a detected issue existed before the collision and was already being treated). Urgency information may be included in the user physical condition information, which may be provided to the emergency vehicle 130, and/or the user physical condition information may include a recommended priority of assistance for the users of the vehicle 22. Upon an emergency vehicle 130 arriving at the vehicle 22 after a collision, emergency vehicle operators/medical professionals may, for example and without limitation, assist users in the priority order as indicated by the user physical condition information. Medical personnel may consider the user physical condition information when preparing for and/or while assisting users.
With embodiments, an ECU 50, a remote server 70, and/or a medical server 90 may determine whether any users have been ejected from a respective seat 32N, such as a result of a collision, or if any users have exited the vehicle 22 (e.g., via the occupancy sensors 62N). For example and without limitation, the ECU 50 may determine which seats 32N were occupied (e.g., by a living being) prior to the collision and which seats 32N are unoccupied after the collision (e.g., immediately after, before users may deliberately exit the seats 32N). If any seats 32N that were occupied prior to the collision are unoccupied after the collision, the ECU 50 may provide an indication that a user and/or a number of users may have been ejected from a seat 32N and/or a vehicle 22, such as to a remote server 70 and/or a medical server 90. The ECU 50 may be configured to exclude seats 32N that are unoccupied after the collision if a restraint system of the seat 32N has been deactivated by a user (e.g., if a vehicle sensor 54 and/or the sensor assembly 60 detects that a user unbuckled a seat belt).
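The before/after occupancy comparison above, including the exclusion of seats whose restraint was deliberately unbuckled, reduces to a set difference. The set-based representation below is an illustrative assumption.

```python
# Sketch: seats occupied before the collision but unoccupied after it may
# indicate ejection, unless the seat belt was unbuckled by the user.

def possibly_ejected(occupied_before: set, occupied_after: set,
                     unbuckled: set) -> set:
    """Seats occupied pre-collision, empty post-collision, belt still buckled."""
    return (occupied_before - occupied_after) - unbuckled
```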
In embodiments, a medical server 90 may combine the information from one or more vehicle sensors 54 (e.g., accelerometer, speedometer, compass, GPS, etc.) and one or more occupancy sensors 62N to determine an expected location for a user that may have been ejected from the vehicle 22 as a result of the collision. For example and without limitation, the ECU 50 may provide a vehicle speed and/or deceleration at the time of the collision to the medical server 90, which may estimate a potential distance from the vehicle 22 that a user may be according, at least in part, to the speed/deceleration.
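The distance estimate described above might, under simple kinematic assumptions, use the stopping-distance relation d = v^2 / (2a), treating the ejected occupant as decelerating from the vehicle's speed at impact. The deceleration input and the formula choice are illustrative assumptions, not values specified in the disclosure.

```python
# Rough kinematic sketch: estimate a search radius for an ejected occupant
# from vehicle speed at impact and an assumed deceleration, via d = v^2/(2a).

def estimated_search_radius(speed_mps: float, deceleration_mps2: float) -> float:
    """Estimate how far from the vehicle 22 an ejected user may have traveled."""
    if deceleration_mps2 <= 0:
        raise ValueError("deceleration must be positive")
    return speed_mps ** 2 / (2.0 * deceleration_mps2)
```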
With embodiments, an ECU 50, a remote server 70, and/or a medical server 90 may be configured to obtain an authorization from a user to share health/medical information with appropriate third parties (e.g., medical professionals), such as to comply with Health Insurance Portability and Accountability Act (HIPAA) regulations. An ECU 50, a remote server 70, and/or a medical server 90 may not communicate any health or medical information about a user unless such an authorization has been obtained.
In embodiments, a method 100 of operating a seat system 20 may include providing a seat system 20, which may include an ECU 50 and a seat assembly 30 having one or more seats 32N (step 102). The method 100 may include the ECU 50 receiving a user profile from a remote server 70 (step 104). The user profile may include one or more of a variety of types of information. For example and without limitation, the user profile may include a typical breathing pattern of a user, a seat assignment, prior/current medical conditions, and/or biometric information. The method 100 may include sensing (e.g., via one or more biomedical sensors 66N, and/or one or more occupancy sensors 62N) biometric/biomedical information of a user occupying a seat 32, which may be used by the ECU 50 for identifying and/or confirming the identity of the user (step 106). The method 100 may include generating a breathing pattern waveform 80 for users seated in respective seats 32N of the seat system (step 108). The method 100 may include the ECU 50 comparing the breathing pattern waveform 80 with information from the corresponding user profile. The ECU 50 may determine whether the breathing pattern waveform 80 is substantially similar to the information from the user profile (e.g., an expected breathing pattern waveform for the user). If the breathing pattern waveform 80 is substantially similar to/consistent with the biomedical information, the ECU 50 may not take further diagnostic action. However, if the breathing pattern waveform 80 is materially different than the biomedical information from the user profile, the ECU 50 may transmit the information to a remote server 70 (step 110).
With embodiments, the remote server 70 may include a medical server 90 and/or medical database. The method 100 may include the remote server 70 determining a physical condition/medical state of a user (step 112), which may include analyzing the breathing pattern waveform 80. Analyzing a breathing pattern waveform 80 may include comparing the breathing pattern waveform 80 with a variety of waveforms corresponding to medical conditions. If the remote server 70 determines that a user requires medical assistance, the remote server 70 may transmit user physical condition information to a medical server 90 (e.g., a medical assistance provider) (step 114). The physical condition information may include a priority order for the users of a seat system 20. Medical personnel associated with the emergency service may use the user physical condition information, which may include the priority order, to determine the order in which to provide medical assistance to the users. Users with more traumatic/urgent conditions may be treated before other users according to the determined priority order.
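The priority ordering described above can be sketched as sorting users by an urgency score, with pre-existing conditions adjusting each score upward or downward. The scores and modifiers are illustrative assumptions introduced for this sketch.

```python
# Sketch: assemble a recommended priority of assistance from per-user
# urgency scores, adjusted by condition-based modifiers (positive modifiers
# raise urgency, negative modifiers lower it).

def priority_order(urgency: dict, modifiers: dict) -> list:
    """Sort user IDs most-urgent-first after applying condition modifiers."""
    def adjusted(user_id):
        return urgency[user_id] + modifiers.get(user_id, 0.0)
    return sorted(urgency, key=adjusted, reverse=True)
```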
In embodiments, one or more portions of a method of operating a seat system 20 (e.g., method 100) may be conducted while a vehicle 22 is in motion and/or when a vehicle 22 is stopped.
In embodiments, one or more activities that may be conducted by an ECU 50, a remote server 70, and/or a medical server 90 may, additionally or alternatively, be conducted by another one of the ECU 50, the remote server 70, and/or the medical server 90.
In examples, a computing device (e.g., ECU 50, remote server 70, medical server 90) may include an electronic controller and/or include an electronic processor, such as a programmable microprocessor and/or microcontroller. In embodiments, a computing device may include, for example, an application specific integrated circuit (ASIC). A computing device may include a central processing unit (CPU), a memory (e.g., a non-transitory computer-readable storage medium), and/or an input/output (I/O) interface. A computing device may be configured to perform various functions, including those described in greater detail herein, with appropriate programming instructions and/or code embodied in software, hardware, and/or other medium. In embodiments, a computing device may include a plurality of controllers. In embodiments, a computing device may be connected to a display, such as a touchscreen display.
Various examples/embodiments are described herein for various apparatuses, systems, and/or methods. Numerous specific details are set forth to provide a thorough understanding of the overall structure, function, manufacture, and use of the examples/embodiments as described in the specification and illustrated in the accompanying drawings. It will be understood by those skilled in the art, however, that the examples/embodiments may be practiced without such specific details. In other instances, well-known operations, components, and elements have not been described in detail so as not to obscure the examples/embodiments described in the specification. Those of ordinary skill in the art will understand that the examples/embodiments described and illustrated herein are non-limiting examples, and thus it can be appreciated that the specific structural and functional details disclosed herein may be representative and do not necessarily limit the scope of the embodiments.
Reference throughout the specification to “examples,” “in examples,” “with examples,” “various embodiments,” “with embodiments,” “in embodiments,” or “an embodiment,” or the like, means that a particular feature, structure, or characteristic described in connection with the example/embodiment is included in at least one embodiment. Thus, appearances of the phrases “examples,” “in examples,” “with examples,” “in various embodiments,” “with embodiments,” “in embodiments,” or “an embodiment,” or the like, in places throughout the specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more examples/embodiments. Thus, the particular features, structures, or characteristics illustrated or described in connection with one embodiment/example may be combined, in whole or in part, with the features, structures, functions, and/or characteristics of one or more other embodiments/examples without limitation given that such combination is not illogical or non-functional. Moreover, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from the scope thereof.
It should be understood that references to a single element are not necessarily so limited and may include one or more of such element. Any directional references (e.g., plus, minus, upper, lower, upward, downward, left, right, leftward, rightward, top, bottom, above, below, vertical, horizontal, clockwise, and counterclockwise) are only used for identification purposes to aid the reader's understanding of the present disclosure, and do not create limitations, particularly as to the position, orientation, or use of examples/embodiments.
Joinder references (e.g., attached, coupled, connected, and the like) are to be construed broadly and may include intermediate members between a connection of elements and relative movement between elements. As such, joinder references do not necessarily imply that two elements are directly connected/coupled and in fixed relation to each other. The use of “e.g.” in the specification is to be construed broadly and is used to provide non-limiting examples of embodiments of the disclosure, and the disclosure is not limited to such examples. Uses of “and” and “or” are to be construed broadly (e.g., to be treated as “and/or”). For example and without limitation, uses of “and” do not necessarily require all elements or features listed, and uses of “or” are inclusive unless such a construction would be illogical.
While processes, systems, and methods may be described herein in connection with one or more steps in a particular sequence, it should be understood that such methods may be practiced with the steps in a different order, with certain steps performed simultaneously, with additional steps, and/or with certain described steps omitted.
All matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not limiting. Changes in detail or structure may be made without departing from the present disclosure.
It should be understood that a computing device (e.g., ECU 50, remote server 70, medical server 90), a system, and/or a processor as described herein may include a conventional processing apparatus known in the art, which may be capable of executing preprogrammed instructions stored in an associated memory, all performing in accordance with the functionality described herein. To the extent that the methods described herein are embodied in software, the resulting software can be stored in an associated memory and can also constitute means for performing such methods. Such a system or processor may further be of the type having ROM, RAM, RAM and ROM, and/or a combination of non-volatile and volatile memory so that any software may be stored and yet allow storage and processing of dynamically produced data and/or signals.
It should be further understood that an article of manufacture in accordance with this disclosure may include a non-transitory computer-readable storage medium having a computer program encoded thereon for implementing logic and other functionality described herein. The computer program may include code to perform one or more of the methods disclosed herein. Such embodiments may be configured to execute via one or more processors, such as multiple processors that are integrated into a single system or are distributed over and connected together through a communications network, and the communications network may be wired and/or wireless. Code for implementing one or more of the features described in connection with one or more embodiments may, when executed by a processor, cause a plurality of transistors to change from a first state to a second state. A specific pattern of change (e.g., which transistors change state and which transistors do not), may be dictated, at least partially, by the logic and/or code.