This relates generally to touch sensing and, more particularly, to various methodologies and applications of acoustic touch detection.
Many types of input devices are presently available for performing operations in a computing system, such as buttons or keys, mice, trackballs, joysticks, touch sensor panels, touch screens and the like. Touch screens, in particular, have become extremely popular because of their ease and versatility of operation as well as their declining price. Touch screens can include a touch sensor panel, which can be a clear panel with a touch-sensitive surface, and a display device such as a liquid crystal display (LCD) that can be positioned partially or fully behind the panel so that the touch-sensitive surface can cover at least a portion of the viewable area of the display device. Touch screens can allow a user to perform various functions by touching the touch sensor panel using a finger, stylus or other object at a location often dictated by a user interface (UI) being displayed by the display device. In general, touch screens can recognize a touch and the position of the touch on the touch sensor panel, and the computing system can then interpret the touch in accordance with the display appearing at the time of the touch, and thereafter can perform one or more actions based on the touch. In the case of some touch sensing systems, a physical touch on the display is not needed to detect a touch. For example, in some capacitive-type touch sensing systems, fringing electrical fields used to detect touch can extend beyond the surface of the display, and approaching objects may be detected near the surface without actually touching the surface. Capacitive-type touch sensing systems, however, can experience reduced performance due to electrically floating objects (e.g., water droplets) in contact with the touch-sensitive surface.
This relates to system architectures, apparatus and methods for acoustic touch detection (touch sensing) and exemplary applications of the system architectures, apparatus and methods. Position of an object touching a surface can be determined using time of flight (TOF) bounding box techniques, acoustic image reconstruction techniques, acoustic tomography techniques, attenuation of reflections from an array of barriers, or a two-dimensional piezoelectric receiving array, for example. Acoustic touch sensing can utilize transducers, such as piezoelectric transducers, to transmit ultrasonic waves along a surface and/or through the thickness of an electronic device to the surface. As the ultrasonic wave propagates, one or more objects (e.g., fingers, styli) in contact with the surface can interact with the transmitted wave causing attenuation, redirection and/or reflection of at least a portion of the transmitted wave. Portions of the transmitted wave energy after interaction with the one or more objects can be measured to determine the touch location(s) of the one or more objects on the surface of the device. For example, one or more transducers (e.g., acoustic transducers) coupled behind the display of a device can be configured to transmit an acoustic wave through the thickness of a device (e.g., through the display stack and/or glass surface) to the surface and can receive a portion of the wave reflected back when the acoustic wave encounters a finger or object touching the surface. The location of the object can be determined, for example, based on the amount of time elapsing between the transmission of the wave and the detection of the reflected wave (e.g., time-of-flight ranging) and/or changes in the amplitude of the reflected wave. Acoustic touch sensing can be used instead of, or in conjunction with, other touch sensing techniques, such as resistive and/or capacitive touch sensing. In some examples, the acoustic touch sensing techniques described herein can be integrated into a display. In some examples, the acoustic touch sensing techniques described herein can be used on a glass surface of a display or touch screen. In some examples, an acoustic touch sensing system can be configured to be insensitive to contact on the device surface by water, and thus acoustic touch sensing can be used for touch sensing in devices that are likely to become wet or fully submerged in water.
In the following description of various examples, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the various examples.
Acoustic sensors can be incorporated in the above described systems to add acoustic touch sensing capabilities to a surface of the system. For example, a touch screen (e.g., capacitive, resistive, etc.) can be augmented with acoustic sensors to provide a touch sensing capability for use in wet environments or under conditions where the device may get wet (e.g., exercise, swimming, rain, washing hands). In some examples, an otherwise non-touch-sensitive display screen can be augmented with acoustic sensors to provide a touch sensing capability. In such examples, a touch screen can be implemented without the stack-up required for a capacitive touch screen. In some examples, the acoustic sensors can be used to provide touch sensing capability for a non-display surface. For example, the acoustic sensors can be used to provide touch sensing capabilities for a track pad 146, a button, a scroll wheel, part or all of the housing, or any other surface of the device (e.g., on the front, rear or sides).
In some examples, transducers 204 can also be partially or completely disposed under or behind a display 208 (e.g., an organic light emitting diode (OLED) display) such that the transducers are not visible to a user. When electrical energy is applied to transducers 204, it can cause the transducers to vibrate; the display materials in contact with the transducers can also vibrate, and these vibrations can propagate as an acoustic wave through the display materials. In some examples, display 208 can be a touch screen (e.g., capacitive) and the transducers 204 can be partially or completely disposed on (or coupled to) a portion of the touch screen display 208. For example, the touch screen display 208 may comprise a glass panel (cover glass), and a display region of the touch screen may be surrounded by a non-display region (e.g., a black border region surrounding the periphery of the display region of the touch screen). In some examples, transducers 204 can be disposed partially or completely in the black mask region of the touch screen display 208 glass panel (e.g., on the back side of the glass panel behind the black mask) such that the transducers are not visible (or are only partially visible) to a user.
Device 200 can further comprise acoustic touch sensing circuitry 206, which can include circuitry for driving electrical signals to stimulate vibration of the transducers 204 (e.g., transmit circuitry), as well as circuitry for sensing electrical signals output by the transducers (e.g., receive circuitry) when the transducer is stimulated by received acoustic energy. In some examples, timing operations for the acoustic touch sensing circuitry 206 can optionally be provided by a separate acoustic touch sensing controller 210 that can control timing of acoustic touch sensing circuitry 206 operations. In some examples, touch sensing controller 210 can be coupled between acoustic touch sensing circuitry 206 and host processor 214. In some examples, controller functions can be integrated with the acoustic touch sensing circuitry 206 (e.g., on a single integrated circuit). Output data from acoustic touch sensing circuitry 206 can be output to a host processor 214 for further processing to determine a location of an object contacting the device as will be described in more detail below. In some examples, the processing for determining location of a contacting object can be performed by the acoustic touch sensing circuitry 206, controller 210 or a separate sub-processor of device 200 (not shown).
In addition to acoustic touch sensing, the device can include additional touch circuitry 212 and optionally a touch controller (not shown) that can be coupled to the touch screen display 208. In examples including a touch controller, the touch controller can be disposed between the touch circuitry 212 and the host processor 214. The touch circuitry 212 can, for example, be capacitive or resistive touch sensing circuitry, and can be used to detect contact and/or hovering of objects (e.g., fingers, styli) in contact with and/or in proximity to the touch screen display 208, particularly in the display region of the touch screen. Thus, device 200 can include multiple types of sensing circuitry (e.g., touch circuitry 212 and acoustic touch sensing circuitry 206) for detecting objects (and their positions) in different regions of the device and/or for different purposes, as will be described in more detail below. Although described herein as including a touch screen, it should be understood that touch circuitry 212 can be omitted and touch screen display 208 can be replaced by a display that would not be touch-sensitive but for the acoustic touch sensors.
Host processor 214 can receive acoustic or other touch outputs (e.g., capacitive) and perform actions based on the touch outputs. Host processor 214 can also be connected to program storage 216 and touch screen display 208. Host processor 214 can, for example, communicate with touch screen display 208 to generate an image on touch screen display 208, such as an image of a UI, and can use touch sensing circuitry 212 and/or acoustic touch sensing circuitry 206 (and, in some examples, their respective controllers) to detect a touch on or near touch screen display 208, such as a touch input to the displayed UI. The touch input can be used by computer programs stored in program storage 216 to perform actions that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, answering a telephone call, placing a telephone call, terminating a telephone call, changing the volume or audio settings, storing information related to telephone communications such as addresses, frequently dialed numbers, received calls, missed calls, logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like. Host processor 214 can also perform additional functions that may not be related to touch processing.
Note that one or more of the functions described herein can be performed by firmware stored in memory and executed by the touch circuitry 212 and/or acoustic touch sensing circuitry 206 (or their respective controllers), or stored in program storage 216 and executed by host processor 214. The firmware can also be stored and/or transported within any non-transitory computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “non-transitory computer-readable storage medium” can be any medium (excluding a signal) that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device. The non-transitory computer-readable storage medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, a portable computer diskette (magnetic), a random access memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as a CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, or flash memory such as compact flash cards, secure digital cards, USB memory devices, memory sticks, and the like.
The firmware can also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “transport medium” can be any medium that can communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic or infrared wired or wireless propagation medium.
It is to be understood that device 200 is not limited to the components and configuration of FIG. 2, but can include other or additional components in multiple configurations according to various examples.
At 304, returning acoustic energy can be received, and the acoustic energy can be converted to an electrical signal by one or more transducers 204. At 306, the acoustic sensing system can determine whether one or more objects are contacting the surface of the device, and can further detect the position of the one or more objects based on the received acoustic energy. In some examples, a distance of the object from the transmission source (e.g., transducers 204) can be determined from a TOF between transmission and reception of reflected energy, and a propagation rate of the ultrasonic wave through the material. In some examples, TOF measurements from multiple transducers at different positions can be used to triangulate or trilaterate the object position, as illustrated in the sketch below. In some examples, baseline reflected energy from one or more intentionally included discontinuities (e.g., barriers, ridges, grooves, etc.) can be compared to a measured value of reflected energy. The baseline reflected energy can be determined during a measurement when no object (e.g., finger) is in contact with the surface. In some examples, reflected energy can be measured at different positions relative to the surface (e.g., at touch pixels as will be described further below). A position of measured deviations of the reflected energy from the baseline can be correlated with a location of an object. Although method 300, as described above, generally refers to reflected waves received by the transducers that transmitted the waves, in some examples, the transmitter and receiver functions can be separated such that the transmission of acoustic energy at 302 and the receiving of acoustic energy at 304 may not occur at the same transducer. Exemplary device configurations and measurement timing examples that can be used to implement method 300 will be described in further detail below.
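As one illustration of the TOF approach, the sketch below converts round-trip TOF measurements into one-way distances and trilaterates an object position from three transducers. The transducer coordinates, wave speed, and TOF values are hypothetical placeholders rather than figures from this disclosure, and the linearized least-squares solve is one standard trilateration technique, not necessarily the one used in any particular implementation.

```python
# Minimal TOF trilateration sketch; all numeric values are assumptions.
import numpy as np

WAVE_SPEED_MM_PER_US = 3.0  # assumed shear-wave speed in glass (~3 mm/us)

def tof_to_distance(tof_us):
    """Convert a round-trip time of flight (us) to a one-way distance (mm)."""
    return WAVE_SPEED_MM_PER_US * tof_us / 2.0

def trilaterate(anchors: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Least-squares position from three or more anchor points and ranges.

    Subtracting the first circle equation from the others linearizes the
    system, which is then solved in the least-squares sense.
    """
    x1, y1 = anchors[0]
    d1 = distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        rows.append([2.0 * (xi - x1), 2.0 * (yi - y1)])
        rhs.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    pos, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return pos

# Hypothetical example: three transducers at the edges of a 100 mm x 60 mm surface.
anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 60.0]])
tofs_us = np.array([33.3, 44.7, 31.0])   # measured round-trip TOFs (us)
ranges = tof_to_distance(tofs_us)
print(trilaterate(anchors, ranges))      # estimated (x, y) position in mm
```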
When an object 410 is touching the cover glass 402 (e.g., a touch condition), the object 410 can absorb and/or reflect a portion of the transmitted shear wave. Furthermore, the object 410 can change the impedance seen by the transducer relative to when there is no object in contact with the cover glass 402. Accordingly, several different sensing techniques can be used to detect the presence and location of an object. In a first exemplary sensing technique, TOF can be used to determine the presence of object 410 (e.g., a touch event on the surface 401 of the cover glass 402). The TOF measurement can be carried out by measuring the time between transmission of the shear horizontal wave and detection of returned energy from the shear horizontal wave. If the shear horizontal wave interacts with an object (e.g., finger 410) on the surface 401 of the cover glass 402, a portion of the incident energy can reflect and return to the transducer 408 and/or other nearby transducers (or pixels, as described further below). The amount of time (e.g., TOF) and the speed of propagation of the wave can be used to determine the distance of the object from the origin point of the transmitted wave. The TOF of the reflected wave when no object 410 is touching the cover glass 402 can be used as a baseline for comparing the time of flight of reflected energy from an object 410. In a second exemplary sensing technique, absorption of a portion of the transmitted energy by an object 410 can be used to determine the presence of an object. In some examples where the reflected energy can be received at multiple locations (e.g., pixels) simultaneously, the location of the touch event can then be determined by triangulating or trilaterating the received signals at the different locations. In some examples, a baseline amplitude (or energy) of reflected acoustic waves from the surface 401 of cover glass 402 can be determined for a no-touch condition. If a portion of the transmitted wave is absorbed by an object 410 in contact with the cover glass 402, the change in amplitude (or energy) in the reflected wave can be used to detect the presence of the object. In a third exemplary detection technique, a baseline impedance of the stack-up 400 can be determined for a no-touch condition, such that changes in impedance caused by an object 410 in contact with the cover glass 402 can be measured.
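For the second technique, a minimal sketch of the baseline comparison is shown below: reflected amplitude at each receiving location is compared against a stored no-touch baseline, and locations where the amplitude drops by more than a threshold are flagged as touched. The array shapes and the 20% threshold are illustrative assumptions, not values from this disclosure.

```python
# Baseline-deviation touch detection sketch; the threshold is an assumption.
import numpy as np

def detect_touches(measured: np.ndarray,
                   baseline: np.ndarray,
                   drop_threshold: float = 0.2) -> np.ndarray:
    """Flag locations where reflected amplitude falls below the baseline.

    An absorbing object reduces reflected energy, so a fractional drop
    larger than drop_threshold is treated as a touch at that location.
    """
    drop = (baseline - measured) / np.maximum(baseline, 1e-12)
    return drop > drop_threshold

# Hypothetical 1-D example: no-touch baseline amplitudes versus a
# measurement in which the third location is partially absorbed.
baseline = np.array([1.00, 0.98, 1.02, 0.99])
measured = np.array([0.99, 0.97, 0.70, 0.98])
print(detect_touches(measured, baseline))  # [False False  True False]
```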
In some alternative examples, a transducer operating in a d15 mode can also be used to produce shear horizontal waves that propagate in the z-axis direction. In some examples, the d15 mode can be achieved by moving the electrodes 502 and 504 to the top and bottom sides of the transducer 500 with the same poling direction 506. In such a configuration, a shear horizontal mode wave can be created to propagate in the z-axis direction similar to the d24 mode described above. Unlike the d24 mode, in the case of the d15 mode, the frequency of operation is inversely proportional to the thickness “t” of the transducer 500. Thus, in order to reduce the frequency of a transmitted wave, the thickness “t” can be increased. For an exemplary frequency of 1 MHz, the thickness of the transducer can be about 1 mm, and for an exemplary frequency of 0.5 MHz, the thickness of the transducer can be about 2 mm. Thus, for low frequencies the thickness required to achieve the proper frequency can become prohibitively large. While thinner transducers are possible with the use of a higher frequency (e.g., 5 MHz), higher frequencies can experience greater attenuation, resulting in a reduction of signal strength from reflected waves received at the transducer 500.
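The inverse frequency-thickness relationship can be made concrete with a half-wavelength thickness-resonance model. The shear velocity value below is an assumption chosen to be consistent with the stated examples (1 MHz at about 1 mm, 0.5 MHz at about 2 mm), not a figure from this disclosure:

```latex
f \approx \frac{v_s}{2t}, \qquad v_s \approx 2000\ \text{m/s}:
\quad t = 1\ \text{mm} \;\Rightarrow\;
f \approx \frac{2000\ \text{m/s}}{2 \times 10^{-3}\ \text{m}} = 1\ \text{MHz},
\qquad t = 2\ \text{mm} \;\Rightarrow\; f \approx 0.5\ \text{MHz}
```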
Therefore, according to the above, some examples of the disclosure are directed to an apparatus comprising: a cover surface, a display panel coupled to the cover surface, a plurality of bar transducers coupled to the display panel, each bar transducer comprising: a plurality of electrodes coupled to the bar transducer and configured to divide the bar transducer into a plurality of touch sensing pixels, and control circuitry configured to: simultaneously stimulate a first plurality of electrodes coupled to a first bar transducer to produce a shear horizontal wave and measure a plurality of electrical signals received at each pixel of a first plurality of touch sensing pixels corresponding to the first plurality of electrodes coupled to the first bar transducer. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the control circuitry is further configured to determine a position of an object in contact with the cover surface based on measured values at one or more of the plurality of touch sensing pixels. Additionally or alternatively to one or more of the examples disclosed above, in some examples, determining the position of the object comprises: receiving a baseline measurement value for each of the first plurality of touch sensing pixels; and comparing each of the measured values at the one or more of the plurality of touch sensing pixels respectively to the baseline measurement value. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the plurality of bar transducers is configured to operate in a d24 mode. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the plurality of bar transducers is configured to operate in a d15 mode. Additionally or alternatively to one or more of the examples disclosed above, in some examples, determining the position of the object comprises capturing time of flight information based on the transmitted shear horizontal wave at multiple pixels and determining the position of the object based on the captured time of flight information at the multiple pixels. Additionally or alternatively to one or more of the examples disclosed above, in some examples, an adhesive layer disposed between the cover surface and the display panel provides an impedance match for a shear horizontal wave at a frequency of the transmitted shear horizontal wave.
Some examples of the disclosure are directed to a method comprising: stimulating a first bar transducer to transmit a shear horizontal wave through a display stack-up comprising a display panel and a cover surface coupled to the display panel, and measuring a plurality of electrical signals received at each pixel of a first plurality of touch sensing pixels corresponding to a first plurality of electrodes coupled to the first bar transducer.
Some examples of the disclosure are directed to a non-transitory computer readable storage medium having stored thereon a set of instructions that when executed by a processor causes the processor to: stimulate a first bar transducer to transmit a shear horizontal wave through a display stack-up comprising a display panel and a cover surface coupled to the display panel, and measure a plurality of electrical signals received at each pixel of a first plurality of touch sensing pixels corresponding to a first plurality of electrodes coupled to the first bar transducer.
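Read as software, the claimed sequence amounts to: drive all electrodes of one bar transducer at once, then record a waveform at each corresponding touch sensing pixel. The sketch below is a hypothetical, simplified rendering of that sequence; the hardware-access functions (stimulate_bar, sample_pixel) and the pixel and sample counts are stand-ins, not a real API.

```python
# Hypothetical sketch of the claimed measurement sequence; the hardware
# functions are stand-ins that a real driver would replace.
import numpy as np

N_PIXELS = 8      # electrodes dividing one bar transducer into pixels
N_SAMPLES = 256   # samples recorded per pixel after stimulation

def stimulate_bar(bar_index: int) -> None:
    """Stand-in for transmit circuitry driving all electrodes at once,
    launching a shear horizontal wave through the display stack-up."""
    pass

def sample_pixel(bar_index: int, pixel: int, n: int) -> np.ndarray:
    """Stand-in for receive circuitry; returns a synthetic waveform."""
    return np.random.default_rng(100 * bar_index + pixel).normal(size=n)

def measure_bar(bar_index: int) -> np.ndarray:
    """Stimulate one bar, then return an (N_PIXELS, N_SAMPLES) array of
    the electrical signals measured at each touch sensing pixel."""
    stimulate_bar(bar_index)
    return np.stack([sample_pixel(bar_index, p, N_SAMPLES)
                     for p in range(N_PIXELS)])

signals = measure_bar(0)
print(signals.shape)  # (8, 256)
```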
Although examples of this disclosure have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of examples of this disclosure as defined by the appended claims.
This application claims benefit of U.S. Provisional Patent Application No. 62/624,046, filed Jan. 30, 2018, the entire disclosure of which is incorporated herein by reference for all purposes.