The present invention relates to systems and methods for detecting occupants in a vehicle and, more particularly, to an occupant detection system and method for a vehicle.
A vehicle, such as an automobile, truck, boat, and the like, typically includes one or more user interfaces accessible by occupants, including an operator and passengers, for displaying information. A user interface may also include one or more inputs that an occupant, or the vehicle itself, uses to sense and control a vehicle function or accessory such as an alarm, a radio, navigation, or phone use. A user interface may also be used to control portable accessories for the vehicle or occupants, such as a key fob, mobile phone, or a cloud-based service.
In various types of vehicles, a user interface, such as a center stack console, is accessible to the operator and front seat passengers. The center stack has user interfaces for many vehicle functions and may include switches, knobs, light indicators, displays including touch sensitive displays, and the like. Other areas of a vehicle that may have user interfaces for sensing, control, and/or information display include overhead consoles where sunroof and interior lighting controls may be placed. The particular type of user interface and its location may vary depending on the type of information displayed or accessory being controlled across a wide variety of applications.
Occupant detection sensors in vehicles are typically reserved for use by the air-bag deployment system of the vehicle. It is desirable to provide a new occupant detection system for a vehicle.
A system and method are disclosed for determining whether an occupant, such as a person or animal, is located within an interior of a vehicle.
An embodiment of an occupant detection system or detection method for a vehicle according to the present disclosure may include a user interface and at least one sensor to determine if there is an occupant in the vehicle. For example, one embodiment uses capacitive sensors embedded into a cushion and/or back of seats in a vehicle. The capacitive sensor sends a sense signal to a controller, and the controller recognizes various conditions such as an empty seat or a seat occupied by a person or animal. The controller can also determine whether a child seat is present in a position on a vehicle seat and whether there is a child in the child seat.
Accordingly, the present invention provides an occupant detection system for a vehicle including a user interface having a touch sensitive input device adapted to be fixedly mounted to a vehicle with a touch sensitive active surface. The occupant detection system also includes at least one first sensor located in a seat back of a seat assembly of the vehicle and at least one second sensor located in a seat cushion of the seat assembly. The at least one first sensor and the at least one second sensor are operable for detecting an occupant in the seat assembly. The occupant detection system further includes a controller in communication with the user interface and the at least one first sensor and the at least one second sensor to receive sensor signals from the at least one first sensor and the at least one second sensor and user interface signals from the user interface. The controller is operable for monitoring the at least one first sensor and the at least one second sensor and the user interface. The controller includes an artificial intelligence algorithm to process the sensor signals and the user interface signals and determines whether the seat assembly is unoccupied or occupied and characteristics of any occupant in the seat assembly if the seat assembly is occupied.
The present invention also provides a method for detecting an occupant in a vehicle including steps of providing an occupant detection system including a user interface having a touch sensitive input device adapted to be fixedly mounted to a vehicle with a touch sensitive active surface, at least one first sensor located in a seat back of a seat assembly of the vehicle and at least one second sensor located in a seat cushion of the seat assembly, and a controller in communication with the user interface and the at least one first sensor and the at least one second sensor. The method also includes steps of detecting by the at least one first sensor and the at least one second sensor an occupant in the seat assembly, and receiving, by the controller, sensor signals from the at least one first sensor and the at least one second sensor and user interface signals from the user interface. The method further includes steps of processing, by an artificial intelligence algorithm of the controller, the sensor signals and the user interface signals, and determining, by the controller, whether the seat assembly is occupied and characteristics of any occupant in the seat assembly if the seat assembly is occupied.
The above embodiments, features, and advantages as well as other embodiments, features, and advantages of a system or method according to the present disclosure will be readily apparent to one of ordinary skill in the art from the following detailed description taken in connection with the accompanying drawings.
As those of ordinary skill in the art will understand, various features illustrated and described with reference to any one of the Figures may be combined with features illustrated in one or more other Figures to produce embodiments that are not necessarily explicitly illustrated or described. The combinations of features illustrated provide representative embodiments for representative applications. However, various combinations and modifications of the features consistent with the teachings of this disclosure may be desired for particular applications or implementations.
Touch sensitive user interface devices according to the present invention may be used in a wide variety of applications. In vehicle applications, for example, touch sensitive user interface devices facilitate interaction with the vehicle by means of a touch screen display, by sensors embedded in a seat to detect if the seat is occupied, or by various vehicle trim components, internal or external, with specific touch areas to initiate, modify, or stop a function.
As used herein, a touch sensitive user interface refers to virtually any type of user interface that includes one or more regions used to sense user input associated with proximity to, or contact with, the touch sensitive region.
Referring to
Referring now to
The occupant detection sensors in vehicles are typically reserved for use by the air-bag deployment system of the vehicle. However, information from these sensors and/or other sensors embedded into the seat assembly 60 can also be used for determining if an occupant is present. Information from embedded sensors such as the occupant detection sensors 62, 63, and 63a of
Referring now to
If a force is applied to the outer surface of the conductive layer 87a, for example, the dielectric layers 92a and 92b will compress and the active sensing layer 89 will be deformed. Compressing the layers together brings the active sense layer 89 into closer proximity to the conductive layers 87a and 87b. As a result, the capacitance of the sensor 82 will increase according to the well-known formula C = ε0εrA/d, where ε0 is the permittivity of free space, εr is the dielectric constant of the compressible layer(s), A is the surface area of the sensor, and d is the distance between the electrically conductive layers. The sensor configuration in
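As a numeric illustration of the parallel-plate relation above, the following minimal sketch computes the sensor capacitance before and after compression. The layer dimensions and dielectric constant are assumptions chosen only to show how a smaller gap d raises C; they are not taken from the disclosure.

```python
# Illustrative calculation of C = eps0 * eps_r * A / d for a compressible seat sensor.
# All dimensions below are assumed example values, not values from the disclosure.

EPS_0 = 8.854e-12  # permittivity of free space, F/m

def capacitance(eps_r: float, area_m2: float, distance_m: float) -> float:
    """Capacitance of a parallel-plate sensor in farads."""
    return EPS_0 * eps_r * area_m2 / distance_m

area = 0.02    # 200 cm^2 sensing area (assumed)
eps_r = 3.0    # dielectric constant of the compressible foam (assumed)

c_rest = capacitance(eps_r, area, distance_m=0.004)      # uncompressed: 4 mm gap
c_pressed = capacitance(eps_r, area, distance_m=0.002)   # compressed: 2 mm gap

print(f"uncompressed: {c_rest*1e12:.1f} pF, compressed: {c_pressed*1e12:.1f} pF")
# Halving d doubles C, which is the signal change the controller can detect.
```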
As previously stated, the sensor 83 can sense an applied force due to the compression. The configuration of this sensor 83 provides an output if an infant or child is in proximity to the sensor 83, as well as if an infant/child seat is physically touching and compressing the sensor 83, as would be the case when the infant/child seat is secured in the seat assembly of the vehicle. The sensor 83 further provides for determining if an infant or child is in the infant/child seat, because having an infant or child in the infant/child seat would further compress the sensor 83 beyond where it would be if the infant/child seat were empty. If the reading for an empty infant/child seat is normalized, that is, zeroed so that the controller 34 knows the seat assembly carries only the empty infant/child seat, the further compression of the sensor 83 when an infant or child is in the infant/child seat allows the controller 34 to recognize that an infant or child is present and to estimate the weight of the infant or child. Further, if there is movement of an infant or child, the sensors in the seat assembly will provide a capacitance signal modulated by the movement. It should be appreciated that this modulated signal may be used to determine, or increase confidence, that there is an occupant in a seat.
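The following is a minimal sketch of the zeroing and modulation logic just described. The baseline value, thresholds, sample data, and function name are assumptions for illustration only.

```python
# Sketch of baseline ("zeroed") child-seat logic: extra compression above the empty-seat
# baseline, or modulation of the signal over time, suggests a child is present.

from statistics import pstdev

EMPTY_CHILD_SEAT_BASELINE_PF = 150.0   # capacitance with the empty child seat secured (assumed)
CHILD_PRESENT_DELTA_PF = 40.0          # extra compression expected from a child (assumed)
MOVEMENT_STD_PF = 2.0                  # modulation level that suggests movement (assumed)

def classify_child_seat(samples_pf: list[float]) -> str:
    """Classify a secured child seat as empty or occupied from recent capacitance samples."""
    mean_c = sum(samples_pf) / len(samples_pf)
    delta = mean_c - EMPTY_CHILD_SEAT_BASELINE_PF
    moving = pstdev(samples_pf) > MOVEMENT_STD_PF   # modulated signal implies movement
    if delta > CHILD_PRESENT_DELTA_PF or moving:
        return "child seat occupied"
    return "child seat empty"

print(classify_child_seat([193.0, 195.5, 191.2, 196.8]))  # -> "child seat occupied"
```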
Referring to
The signal 76 from the seat back sensor 62 is used to determine if the occupant of the seat assembly 70 is sitting back against the seat back 72. The seat sensor signals 78 and 80 from the seat bottom left and right sensors 64 and 66 are used to determine whether the occupant of the seat assembly 70 is sitting centered or more to the left or right in the seat assembly 70. The seat sensor signal 81 from the seat bottom front sensor 68 is used to determine whether the occupant of the seat assembly 70 is sitting toward the front of the seat assembly 70. For example, if the signals 78 and 80 are small and the signal 81 is large, it is an indication that an occupant is sitting forward in the seat assembly 70. While the controller 34 shown in
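A hedged sketch of how the controller might combine the four signals described above (76: seat back, 78/80: seat bottom left/right, 81: seat bottom front) into a rough seating-position description. The normalization and thresholds are assumptions, not values from the disclosure.

```python
# Infer a coarse seating position from four normalized (0..1) seat sensor signals.

def describe_position(back_76: float, left_78: float, right_80: float, front_81: float,
                      small: float = 0.2, large: float = 0.6) -> str:
    """Return a rough description of how the occupant is seated."""
    notes = []
    notes.append("sitting back" if back_76 > large else "leaning forward")
    if left_78 > right_80 + small:
        notes.append("shifted left")
    elif right_80 > left_78 + small:
        notes.append("shifted right")
    else:
        notes.append("centered")
    if front_81 > large and left_78 < small and right_80 < small:
        notes.append("perched toward the front of the seat")
    return ", ".join(notes)

print(describe_position(back_76=0.1, left_78=0.1, right_80=0.15, front_81=0.8))
# -> "leaning forward, centered, perched toward the front of the seat"
```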
The non-conductive or dielectric layers 92a and 92b may be an EPDM closed cell foam or the like. Each of the sensors 62, 64, 66, and 68 can be configured into different shapes for placement into the seat assembly 70 as the conductive plates 86, 88, and 90 are flexible and the non-conductive layers 92a and 92b are compressible. Different shapes, cut-outs, slits and the like may be employed in the sensors 62, 64, 66, and 68 to allow for sensor conformance to an occupant profile and/or seat shape so that a seat sensor does not pucker, bulge, and the like to cause a visible bump in the seating surface or to become uncomfortable to an occupant.
The controller 34 further monitors vehicle status such as an ignition signal, vehicle door activity, vehicle door locks, key fob proximity, and the like to determine when the operator is leaving or has left the vehicle. If the controller 34 senses the presence of persons or animals inside the vehicle after it has determined that the operator has left, or is leaving, the vehicle, the controller 34 sends a warning signal 45 to the operator indicating that the vehicle has been exited while occupants remain in the vehicle. This warning can be given to the operator by sounding an alarm 23 of the vehicle such as the vehicle horn, and/or by lighting a warning indicator such as a dedicated indicator or any other light, for example flashing of headlights and/or brake lights. The warning may also send a signal to a remote device 24 such as a key fob, cell phone, connected watch, and the like to get the attention of the operator. In another embodiment, the touch sensitive active surface of the user interface 14 includes an on/off switch acknowledgement of the seat assembly 70 being occupied. For example, the controller 34 sends a signal to the user interface 14, which then displays the on/off switch acknowledgement. In another embodiment, the touch sensitive active surface of the user interface 14 includes an indicator light or message indication of the seat assembly 70 being occupied. For example, the controller 34 sends a signal to the user interface 14 to light or turn ON the indicator light or to display a message that the seat assembly 70 is occupied.
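The decision described above can be sketched as follows, under assumed signal names: if the operator has left (or is leaving) while a seat is still occupied, the warning actions are issued. None of the helper names below come from the disclosure.

```python
# Minimal sketch of the "occupant left behind" warning decision.

from dataclasses import dataclass

@dataclass
class VehicleStatus:
    ignition_off: bool
    driver_door_opened: bool
    doors_locked: bool
    key_fob_nearby: bool

def operator_leaving(status: VehicleStatus) -> bool:
    """Heuristic: ignition off plus door activity, locking, or the fob moving away."""
    return status.ignition_off and (status.driver_door_opened or status.doors_locked
                                    or not status.key_fob_nearby)

def check_and_warn(status: VehicleStatus, seat_occupied: bool, actions: list[str]) -> None:
    if operator_leaving(status) and seat_occupied:
        actions.append("sound horn / flash lights")          # audible and visual warning
        actions.append("notify key fob or phone")            # warning signal 45 to remote device 24
        actions.append("show occupied indicator on UI 14")   # indicator light or message

actions: list[str] = []
check_and_warn(VehicleStatus(True, True, True, False), seat_occupied=True, actions=actions)
print(actions)
```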
As another level of protection, the controller 34 can perform vehicle functions such as a venting operation via a sunroof or window, or enabling a fan or an HVAC system to cool the cabin of the vehicle to help ensure the safety of an occupant left inside the vehicle while unattended. As described, the system 10 acts accordingly when the vehicle interior temperature is too hot, but it can also act accordingly if the interior temperature is too cold by enabling an HVAC system to heat the cabin of the vehicle. It should be appreciated that these venting and/or HVAC functions can be performed coincident with audible and visual warnings as previously described.
As a further level of protection, if no response to the warnings has been received within a determined amount of time, the controller 34 may initiate an emergency services call, such as 911 in the US, 119 in Japan, or 112 in the UK. In doing so, the vehicle can provide the detailed location of the vehicle as determined by GPS, cell phone towers, vehicle-to-infrastructure (V2I) or vehicle-to-everything (V2X) communications, and the like. It should be appreciated that communication to emergency services can be initiated at any time, including at the beginning of a venting and/or HVAC operation, after a venting and/or HVAC operation is determined to be insufficient, or at any other predetermined time or condition.
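A hedged sketch of the escalation sequence described in the two preceding paragraphs: vent or run HVAC when the cabin is too hot or too cold, then place an emergency call if no response arrives within a determined time. The temperature bounds and timeout are assumptions.

```python
# Escalation after an unanswered occupant-left-behind warning.

WARNING_RESPONSE_TIMEOUT_S = 120     # assumed time allowed for the operator to respond
HOT_CABIN_C = 35.0                   # assumed upper comfort bound
COLD_CABIN_C = 5.0                   # assumed lower comfort bound

def escalate(cabin_temp_c: float, seconds_since_warning: float, operator_responded: bool) -> list[str]:
    steps: list[str] = []
    if operator_responded:
        return steps                                  # nothing further to do
    if cabin_temp_c > HOT_CABIN_C:
        steps.append("vent sunroof/window and run HVAC cooling")
    elif cabin_temp_c < COLD_CABIN_C:
        steps.append("run HVAC heating")
    if seconds_since_warning > WARNING_RESPONSE_TIMEOUT_S:
        steps.append("call emergency services with GPS location")
    return steps

print(escalate(cabin_temp_c=41.0, seconds_since_warning=150, operator_responded=False))
# -> ['vent sunroof/window and run HVAC cooling', 'call emergency services with GPS location']
```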
In an alternate strategy, the occupant detection signal values are read from two or more seat sensors such as the sensors 62, 64, 66, and 68 of
In another alternate strategy, two or more seat sensors such as the sensors 62, 64, 66, and 68 of
It should be appreciated that this effectively makes the seat sensor more sensitive, allowing smaller and/or more distant occupants to be sensed.
In another embodiment, the system 10 may include sensing input from more than one seat assembly 60, 70. As an example, in the second-row seating of a vehicle, the seat sensor signals received from one seat assembly, say the driver side, may be compared with the seat sensor signals from another seat assembly such as the seat on the passenger side of the vehicle. Comparing seat sensor signals from one seat to another helps determine whether there is a fault with either seat sensor, or provides a baseline to compare each seat against. For example, if the driver side seat has high seat sensor signals and the passenger side seat has low seat sensor signals, the controller 34 can compare them to each other to aid in determining that there is an occupant on the seat with high sensor signals.
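A minimal sketch of the seat-to-seat comparison described above: averaged signals from the two second-row seats are compared to flag a likely occupant or a possible sensor fault. The ratio and fault-floor values are assumptions.

```python
# Compare two seat assemblies' sensor signals against each other.

def compare_seats(driver_side: list[float], passenger_side: list[float],
                  occupied_ratio: float = 2.0, fault_floor: float = 0.01) -> str:
    d = sum(driver_side) / len(driver_side)
    p = sum(passenger_side) / len(passenger_side)
    if d < fault_floor and p < fault_floor:
        return "both seats read near zero: possible wiring or sensor fault"
    if d > p * occupied_ratio:
        return "driver-side seat likely occupied (signals well above passenger side)"
    if p > d * occupied_ratio:
        return "passenger-side seat likely occupied (signals well above driver side)"
    return "seats read similarly: both empty or both occupied"

print(compare_seats([0.9, 0.85, 0.95], [0.1, 0.12, 0.08]))
# -> "driver-side seat likely occupied (signals well above passenger side)"
```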
Any combination of the proximity detection methods can be implemented to achieve the desired characteristics of proximity detection of an approaching or proximal occupant.
Other occupant detection sensors may be employed at appropriate locations in addition to, or in place of, seat sensors. Other sensor locations may include in a headliner, behind interior trim, or integrated into a door panel. Sensors located outside the seats may be of the capacitive sensing type or of any other non-capacitive sensing type.
Another embodiment uses at least one camera that sends signals directly to the system 10, and the system 10 recognizes various conditions such as an empty seat or a seat occupied by a child seat, a person, or an animal. Both the capacitive sensing and the camera also allow the system 10 to determine if there is a person or animal present and visible, including movement, in any of the seating positions. Image analysis may employ image subtraction, edge detection, or other methods to determine movement. A camera located in a center stack, a headliner position, or in a headrest of the vehicle may allow all seating positions to be viewed. If there are other cameras on the vehicle, such as those on a front or middle support structure like the A or B pillars, they may be used to view the inside of the cabin of the vehicle. Referring to
Another embodiment uses at least one radar system that sends signals directly to the system 10, and the system 10 recognizes various conditions such as an empty seat or a seat occupied by a child seat, a person, or an animal. Both the capacitive sensing and the radar system also allow the system 10 to determine if there is a person or animal present, and movement, in any of the seating positions.
Another embodiment uses at least one thermal imaging system that sends signals directly to the system 10, and the system 10 recognizes various conditions such as an empty seat or a seat occupied by a child seat, a person, or an animal. Both the capacitive sensing and the thermal imaging systems also allow the system 10 to determine if there is movement in any of the seating positions.
Another embodiment uses at least one microphone system, such as in a typical automotive configuration, that sends signals directly to the system 10, and the system 10 recognizes various conditions such as a quiet cabin or a cabin with noise, where noise could be talking, crying, or other audible sound. Both the capacitive sensing and the microphone systems also allow the system 10 to determine if there is a person or animal making noise and in which of various seating positions.
Yet another embodiment would use a gas sensor to measure the concentration of carbon dioxide in the vehicle cabin. It is medically established that a living person will produce approximately 18% more carbon dioxide when thermally stressed. Measuring an increase in carbon dioxide concentration in the cabin, especially a rapid increase, would indicate not only that there is an occupant in the vehicle, but that the occupant may be going into thermal distress. Gas sensing may also include hydrogen and methane as well as gases with a sulfur component such as hydrogen sulfide and methanethiol.
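A hedged sketch of the CO2-based approach described above: a steadily rising cabin concentration suggests an occupant is breathing in a closed cabin, and a rapid rise may indicate thermal distress. The sample data and rate thresholds below are illustrative assumptions, not medical limits.

```python
# Classify the cabin from the average rate of CO2 rise in ppm per minute.

def co2_assessment(ppm_samples: list[float], sample_interval_s: float = 60.0,
                   occupant_rate: float = 5.0, distress_rate: float = 20.0) -> str:
    deltas = [b - a for a, b in zip(ppm_samples, ppm_samples[1:])]
    ppm_per_min = (sum(deltas) / len(deltas)) * (60.0 / sample_interval_s)
    if ppm_per_min >= distress_rate:
        return "occupant present, possible thermal distress (rapid CO2 rise)"
    if ppm_per_min >= occupant_rate:
        return "occupant likely present (CO2 rising)"
    return "no CO2 evidence of an occupant"

print(co2_assessment([420, 450, 485, 520, 560]))
# -> "occupant present, possible thermal distress (rapid CO2 rise)"
```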
In yet another embodiment, a moisture sensor may be used in conjunction with a capacitive seat sensor. The moisture sensor will provide information to the controller 34 as to whether the seat sensor has become contaminated with water, coffee, urine, or other substance so that a warning may be given on a driver information display, other instrument panel display, or remote device such as a key fob or a mobile phone. The warning is twofold in that it may provide an indication that there is an occupant in the vehicle and it may also indicate that the capacitive sensor has been compromised and cannot be relied on for an accurate reading of an occupant.
Some applications may not have capacitive sensors embedded into seat cushions or seat backs. In these cases, the aforementioned sensing methods may be used singly or in combination to provide occupant sensing methods without the need for capacitive sensors in vehicle seats.
A camera located in the front of the vehicle can provide a visual image that can be analyzed for detection of an occupant. These cameras could be located in the center stack display area, the headliner, or the A pillars of the vehicle. A camera could be located more midrange and placed in the dome light area or on the back side of a front seat headrest or in a seat back such as in a rear seat entertainment system. The camera system employed may be a standard type or a thermal imaging type.
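Building on the image-subtraction method mentioned earlier, the following is a minimal sketch of detecting movement between two cabin camera frames. It uses plain NumPy to stay self-contained; the frame size and thresholds are assumptions.

```python
# Frame-differencing movement check on two grayscale cabin images.

import numpy as np

def movement_detected(prev_frame: np.ndarray, curr_frame: np.ndarray,
                      pixel_delta: int = 25, changed_fraction: float = 0.01) -> bool:
    """Return True if enough pixels changed between two grayscale frames."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_delta)
    return changed > changed_fraction * diff.size

rng = np.random.default_rng(0)
prev = rng.integers(0, 255, size=(240, 320), dtype=np.uint8)
curr = prev.copy()
curr[100:140, 150:200] += 60          # simulate a moving occupant in one region
print(movement_detected(prev, curr))  # -> True
```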
Radar systems of any desirable frequency and transmission method, employed in seat backs, head rests, headliners, center consoles, and the like, can be used to detect even the smallest of motions to provide an indication that an occupant is in the vehicle.
Vehicle microphones may be used to monitor for ambient noise when the vehicle is parked. This provides for detection of talking, crying, barking, and the like to provide an indication that there is an occupant in the vehicle. The microphones may be those installed by the vehicle manufacturer for making phone calls and similar functions, or additional microphones provided specifically for detecting sounds for occupancy detection.
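One simple way to realize the monitoring above is an RMS level check over a short audio window; a production system would also filter out road and wind noise. The threshold and sample data below are assumptions.

```python
# Flag cabin noise (talking, crying, barking) from normalized microphone samples.

import math

def cabin_noise_detected(samples: list[float], rms_threshold: float = 0.05) -> bool:
    """Return True if the RMS level of normalized samples (-1..1) exceeds the threshold."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return rms > rms_threshold

quiet = [0.001, -0.002, 0.0015, -0.001]
crying = [0.2, -0.35, 0.4, -0.28]
print(cabin_noise_detected(quiet), cabin_noise_detected(crying))  # -> False True
```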
Use of gas sensors to measure the concentration of relevant gases in the vehicle cabin, as previously mentioned, will provide an indication of a vehicle occupant and, in particular, of a vehicle occupant that is in physical distress.
Use of neuro-sensors provides real-time brain data without needing to contact the head of an occupant. Brain activity analysis is advancing quickly with correlation of brain signals and physiological status. Sensor locations can be in a headrest or a seat back cushion with a sensing distance of 3-30 cm.
All of the mentioned sensors, used individually or in combination, will provide a reasonable indication of the presence of a vehicle occupant. If all, or a subset, of the sensor data is provided to an electronic controller, a determination can be made for vehicle occupancy. This determination can be made by using signal threshold limits, averaging, envelope analysis, pattern recognition, edge detection techniques, and the like. All of these are standard techniques used in algorithm development. An enhanced approach would include using artificial intelligence to further improve the detection capabilities and accuracy.
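A hedged sketch of the simple fusion approach described above: each sensor contributes a thresholded vote, and occupancy is declared when enough sensors agree. The sensor names, thresholds, and vote count are assumptions for illustration.

```python
# Threshold-and-vote fusion of several occupancy sensors.

SENSOR_THRESHOLDS = {
    "seat_capacitance": 0.5,   # normalized 0..1
    "camera_motion": 0.3,
    "radar_motion": 0.3,
    "microphone_level": 0.4,
    "co2_rise": 0.5,
}

def occupancy_decision(readings: dict[str, float], votes_needed: int = 2) -> bool:
    """Declare occupancy when at least `votes_needed` sensors exceed their thresholds."""
    votes = sum(1 for name, value in readings.items()
                if value >= SENSOR_THRESHOLDS.get(name, 1.0))
    return votes >= votes_needed

readings = {"seat_capacitance": 0.7, "camera_motion": 0.1,
            "radar_motion": 0.45, "microphone_level": 0.2, "co2_rise": 0.1}
print(occupancy_decision(readings))  # -> True (seat and radar both vote)
```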
Artificial intelligence (AI) can play a crucial role in monitoring and analyzing sensor signals in vehicle occupant sensing systems. AI algorithms can process vast amounts of data from sensors to detect patterns and anomalies that signify the presence of an occupant as well as occupant health. By continuously monitoring sensor signals, electronic systems can provide real-time insights into the presence and health of vehicle occupants and identify potential issues before they escalate into life threatening situations.
AI has the capability to interpret electrical sensor signals and determine the presence of a human in a vehicle, leading to various applications in the automotive industry. One of the key sensors used for this purpose is the occupant detection sensor, which measures various parameters such as weight, pressure, and body heat. By employing AI algorithms, these sensor signals can be analyzed in real-time to accurately determine if a human is present in the vehicle.
The AI models trained on a diverse range of data can learn to recognize patterns and characteristics associated with human presence. For example, the AI models can identify a thermal signature unique to humans, allowing them to differentiate between a human and an inanimate object. This capability is crucial for ensuring the safety of occupants, as it can prevent accidents caused by leaving children or pets unattended in vehicles.
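As a hedged illustration of the learned-model idea above, the sketch below trains a small classifier on labeled feature vectors (seat pressure, thermal signature, motion) to separate "human or animal present" from "inanimate object". The toy training data is invented purely for illustration and does not represent any real dataset.

```python
# Tiny occupancy classifier trained on invented example features.

from sklearn.linear_model import LogisticRegression

# features: [seat_pressure (0..1), thermal_delta_C, motion_score (0..1)]
X_train = [
    [0.05, 0.1, 0.00],   # empty seat
    [0.60, 0.2, 0.02],   # heavy bag or empty child seat: load but no heat or motion
    [0.70, 6.0, 0.30],   # adult occupant
    [0.30, 5.0, 0.50],   # child occupant: lighter but warm and moving
    [0.55, 0.1, 0.01],   # groceries
    [0.80, 7.0, 0.10],   # sleeping adult
]
y_train = [0, 0, 1, 1, 0, 1]   # 1 = human (or animal) present

clf = LogisticRegression().fit(X_train, y_train)
print(clf.predict([[0.35, 4.5, 0.40]]))   # warm, moving, light load -> likely occupant
```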
In addition to safety considerations, AI can also contribute to energy efficiency in vehicles. By accurately detecting human presence, AI algorithms can optimize energy consumption by activating or deactivating specific systems or components based on occupancy. For example, the AI system can adjust the power mode, dim the interior lights, or switch off non-essential features when no occupants are detected, thereby conserving energy.
Overall, AI's ability to interpret electrical sensor signals and determine human presence in vehicles has transformative implications for the automotive industry. From enhancing safety by preventing accidents and enabling personalized features to optimizing energy consumption in autonomous and conventional vehicles, AI's integration in occupant detection systems brings numerous benefits to both vehicle occupants and the broader transportation ecosystem.
While embodiments of the present disclosure have been illustrated and described, it is not intended that these embodiments illustrate and describe all possible forms of the present disclosure. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the present disclosure.
The present application is a continuation-in-part of U.S. Ser. No. 17/203,031, filed Mar. 16, 2021, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/990,406, filed Mar. 16, 2020, and which is a continuation-in-part of U.S. Ser. No. 16/820,613, filed Mar. 16, 2020, which is a continuation of U.S. Ser. No. 14/028,941, filed Sep. 17, 2013 (now U.S. Pat. No. 10,592,092, issued Mar. 17, 2020), which is a continuation-in-part of U.S. Ser. No. 12/496,938, filed Jul. 2, 2009 (now U.S. Pat. No. 9,046,967, issued Jun. 2, 2015), the entire disclosures of which are hereby expressly incorporated by reference.