The field of the present invention is light-based touch screens.
In computing, multi-touch refers to a touch-sensing surface's ability to recognize the presence of two or more points of contact with the surface. This plural-point awareness is often used to implement advanced functionality such as pinch-to-zoom or activating predefined programs (Wikipedia, “multi-touch”). The Windows 8 operating system from Microsoft Corporation requires touch screens that support a minimum of 5-point digitizers. WINDOWS® is a registered trademark of Microsoft Corporation.
The present invention relates to light-based touch sensitive surfaces. In light-based touch sensitive surfaces, light emitters and light detectors arranged along the borders of the surface create a grid of light beams above the surface. An object touching the surface blocks a corresponding portion of the beams.
Reference is made to
Light-based touch detection systems are unable to accurately recognize many instances of two or more points of contact with the surface. Reference is made to
There is further ambiguity when more than two objects touch the screen simultaneously. Reference is made to
Reference is made to
As shown in
Reference is made to
The present invention also relates to integrated casings for optical touch screen displays in which the outer casing, or housing, for the display is a light guide for guiding light from emitters mounted within the housing over and across the screen surface, and for guiding beams that have crossed the screen surface to light receivers mounted within the housing.
Embodiments of the present invention provide improved multi-touch detection methods that eliminate ghosting and unambiguously identify touch locations, based on blocked light beams. The methods of the present invention also detect moving multi-touch locations.
Embodiments of the present invention also provide an integrated display frame and light guide that wraps around the outside of the monitor or All-in-One (AiO) device to its front. In some embodiments the light guide is made of clear plastic, making the frame almost invisible at first glance.
There is thus provided in accordance with an embodiment of the present invention a touch screen assembly including a display screen, a plurality of infra-red LEDs operative to emit light when activated, a plurality of photo diodes operative to detect amounts of light received when activated, a transparent plastic frame including an exposed upper edge along the entire frame perimeter, vertically straight inner walls, extending from below the display screen to the exposed upper edge of the frame, along the entire frame perimeter, and internally reflective facets for directing light, emitted by the infra-red LEDs, along light paths that travel upward through one side of the frame along the height of the inner walls, over the display screen, downward through the opposite side of the frame along the height of the inner walls, and onto the photo diodes, and a processor coupled with the infra-red LEDs and the photo diodes, operative to selectively activate the infra-red LEDs and the photo diodes, to identify a location of an object touching the display screen from above, based on amounts of light detected by activated photo diodes when light emitted by activated infra-red LEDs is blocked along its light path by the object.
There is additionally provided in accordance with an embodiment of the present invention a touch screen assembly, including a curved display screen, a plurality of LEDs, mounted underneath the display screen, operative to emit light when activated, a plurality of photo diodes, mounted underneath the display screen, operative to detect amounts of light received when activated, a frame including internally reflective surfaces that guide light emitted by the LEDs along light paths that travel upwards, across the display screen in segments that follow the contour of the display screen, and downwards to the photo diodes, wherein the frame is oriented such that some of the light paths are incident upon and reflect off of the display screen while crossing said display screen, and a processor coupled with the LEDs and the photo diodes, operative to selectively activate the LEDs and the photo diodes, and to identify a location of an object touching the display screen, based on amounts of light detected by activated photo diodes when light emitted by activated LEDs is blocked by the object along its light path.
There is further provided in accordance with an embodiment of the present invention a touch screen assembly including a display screen, a plurality of infra-red LEDs operative to emit light when activated, a plurality of photo diodes operative to detect amounts of light received when activated, a transparent plastic frame surrounding the display screen on four sides that guides light emitted by the infra-red LEDs to the photo diodes along light paths that travel into the frame on one side, over the display screen, and into the frame on the opposite side, and a processor coupled with the infra-red LEDs and the photo diodes, operative to selectively activate the infra-red LEDs and the photo diodes, to identify a location of an object touching the display screen from above, based on amounts of light detected by activated photo diodes when light emitted by activated infra-red LEDs is blocked along its light path by the object, and to recognize the object touching the frame, based on amounts of light detected by activated photo diodes when light emitted by activated infra-red LEDs is absorbed along its light path by the object, thereby providing light-based touch sensitivity to the display screen and to the frame.
The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
Aspects of the present invention relate to light-based touch screens and light-based touch surfaces. Throughout this specification, the terms “touch screen” and “touch sensitive surface” include touch sensitive electronic displays and touch surfaces that do not include an electronic display, inter alia, a mouse touchpad as included in many laptop computers and the back cover of a handheld device. They also include airspace enclosed by the rectangular emitter-detector sensor frame provided by the present invention. They also include airspace bordered on only one edge by a linear emitter-detector array, whereby light projected into the airspace by the emitters is reflected by the touch object onto the detectors.
According to embodiments of the present invention, a light-based touch sensor includes a plurality of infra-red or near infra-red light-emitting diodes (LEDs) arranged along two adjacent edges of a rectangular touch sensitive surface, as defined above, and a plurality of photodiodes (PDs) arranged along the two remaining adjacent edges. When light projected by the LEDs is blocked by an inserted object, such as a finger or a stylus, the absence of expected light is detected by the PDs. The LEDs and PDs are controlled for selective activation and de-activation by a controller. Generally, each LED and PD has I/O connectors, and signals are transmitted to specify which LEDs and which PDs are activated.
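By way of illustration, a minimal scan-loop sketch is given below. The controller interface (activate_emitter, read_detector, deactivate_emitter), the one-to-one pairing of each LED with a facing PD, and the blocking threshold are assumptions made for this example, not the actual controller design.

```python
# Illustrative scan loop (assumed controller interface): activate one LED at a
# time together with its facing PD and record which beams are blocked.

def scan_blocked_beams(controller, num_beams, blocked_threshold=0.5):
    """Return indices of beams whose detected light falls below the threshold."""
    blocked = []
    for i in range(num_beams):
        controller.activate_emitter(i)        # selectively activate one LED
        level = controller.read_detector(i)   # read the PD paired with that LED
        controller.deactivate_emitter(i)
        if level < blocked_threshold:         # absence of expected light => blocked beam
            blocked.append(i)
    return blocked
```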
Reference is made to
Reference is made to
Reference is made to
Reference is made to
One advantage of grouping the beams this way is that beams within each group do not intersect. Therefore, when multiple objects are detected within a set of beams, the system can identify their placement relative to one another along the axis on which the emitters are situated. In contrast, when analyzing a many-to-many configuration of detection beams in which intersecting beams are analyzed together, it is often not possible to determine the relative positions of the detected objects. For example, when two objects a and b are detected by two intersecting beams, it is not known whether a is to the left of b or to the right, because it is unknown whether a and b are situated above or below the point of intersection between the two detecting beams. This advantage is elaborated below.
According to embodiments of the present invention, a plurality of light pulse emitters, E1, …, En, is arranged such that each emitter, E, transmits light pulses, denoted E(θ1), …, E(θn), that are directed in directions θ1, …, θn outward from an edge of the display and over the display, for detection by n light pulse detectors situated along the perimeter of the display. Each light pulse E(θ) is directed through a lens to create a wide light pulse such that light pulses from neighboring emitters Em and Em+1, directed at the same angle θb, are substantially contiguous. Each such set of parallel beams is thus denoted E1(θa), …, En(θa).
In response to a single touch on the display each set of parallel beams identifies a respective touch location, corresponding to one or more light pulses Ei(θj) that is at least partially blocked by the touch, and having (a) a normalized touch value between 0 and 1, denoted W(Ei(θj)), according to the percentage of blockage of light pulse Ei(θj) by the touch, and (b) a respective screen coordinate, denoted X(Ei(θj)). A touch coordinate, denoted XT, is calculated by interpolating the screen coordinates of the identified touch locations according to the identified touch locations' normalized touch values,
X_T = Σ W(E_i(θ_j)) · X(E_i(θ_j))   (1)
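As an illustration of equation (1), the following sketch interpolates a touch coordinate from the partially blocked beams of one set of parallel beams; the list-of-pairs input format is an assumption made for the example.

```python
def interpolate_touch_coordinate(blocked_beams):
    """Sketch of equation (1).

    blocked_beams: list of (w, x) pairs, where w = W(E_i(θ_j)) is the normalized
    touch value (fraction of the beam blocked) and x = X(E_i(θ_j)) is that beam's
    screen coordinate.
    """
    # For substantially contiguous beams, the weights of a single touch sum to
    # roughly 1; dividing by their sum (a safeguard added here, not part of
    # equation (1) itself) keeps the estimate stable when they do not.
    total = sum(w for w, _ in blocked_beams)
    if total == 0:
        return None
    return sum(w * x for w, x in blocked_beams) / total

# Example: blocking 30% of the beam at x=100 and 70% of the beam at x=110
# yields an interpolated touch coordinate of 107.
print(interpolate_touch_coordinate([(0.3, 100.0), (0.7, 110.0)]))
```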
When an unambiguous pattern of one or more touches is detected, an interpolation of touch locations from even only a few sets of parallel beams, e.g., 2-6 sets, provides a highly accurate touch coordinate. The fewer the touch locations required for interpolation, the faster the scanning speed and/or the lower the power required per screen scan. When the possibility of ghosting is determined to be present in one or more locations on the screen, the system scans additional sets of parallel beams E1(θa), …, En(θa), E1(θb), …, En(θb), …, E1(θm), …, En(θm). In some embodiments, the area containing the potential ghosting is identified, and only a subset of each additional set of beams, namely those beams that traverse the potential ghosting area, is activated. The additional sets of beams resolve the ghosting ambiguity in two stages. The first is explained presently, and the second is described beginning at the section entitled Ghost Point Elimination.
Using a few sets of parallel beams, e.g., two sets, for each of the x and y axes, touch locations are detected. Discussion now turns to resolving whether the detected touch is caused by one touch object or by a multi-touch, namely, a plurality of separate simultaneous touches. One axis, the x-axis, is discussed, but a similar process is performed for the y-axis. When a possible multi-touch is detected, the screen is logically divided into strips parallel to the axis whose coordinates the system is analyzing. Thus, when determining x-axis coordinates, the screen is logically divided into strips, each strip crossing the width of the screen at a different y coordinate. A segment within each strip, containing the possible multi-touch, is identified based on the initial few sets of parallel beams. Next, the system determines which beams, in each additional set of parallel beams, cross the thus identified segment. The result is a table looking like this:
Next, the system activates the sets of additional beams in the table. As mentioned above, neighboring parallel beams in each set are substantially contiguous. Thus, a case of two touches is distinguished from a case of one touch if the pattern of touch detections within a series of neighboring parallel beams has more than one peak. This is explained with reference to
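A sketch of this peak-counting test is given below, assuming the blockage values of the neighboring parallel beams crossing the suspect segment are available as a list; the noise threshold and the simple local-maximum rule are illustrative choices.

```python
def count_touch_peaks(blockage, noise_threshold=0.1):
    """Count local peaks in the blockage values of a series of neighboring
    parallel beams; more than one peak indicates more than one touch."""
    peaks = 0
    for i, v in enumerate(blockage):
        if v < noise_threshold:
            continue
        left = blockage[i - 1] if i > 0 else 0.0
        right = blockage[i + 1] if i < len(blockage) - 1 else 0.0
        if v >= left and v > right:   # local maximum in the detection pattern
            peaks += 1
    return peaks

# One touch produces a single hump; two touches produce two separated humps.
print(count_touch_peaks([0.0, 0.4, 0.9, 0.5, 0.0]))            # -> 1
print(count_touch_peaks([0.0, 0.7, 0.1, 0.0, 0.6, 0.8, 0.0]))  # -> 2
```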
Reference is made to
In prior art touch detection systems that activate many-to-many emitter-receiver pairs, certain ambiguities remain. For example, two intersecting beams are blocked by two touches 910 and 911, respectively. It is unclear whether the touches are located above the intersection point, in which case touch point 910 is to the left of touch point 911, or below the intersection point, in which case touch point 910 is to the right of touch point 911.
Reference is made to
Embodiments of the present invention resolve these ambiguities by analyzing each set of parallel beams separately. Within each set of beams, the beams do not intersect. Therefore, the ambiguity resulting from intersecting blocked beams does not exist.
Ghost Point Elimination
The blocking pattern, designated d, is the set of detected blocking values measured for the activated beams.
In order to remove ghost touches, a blocking pattern is calculated based on a candidate touch combination. The geometry of the light transmitters and receivers, together with the candidate touch combination, is used to model a blocking pattern, m,
where
There are different possibilities for an error metric, but calculating the norm of the difference in blocking patterns was found empirically to work well:
e = Σ_{k=1}^{N} |d_k − m_k|,   (3)
where d_k is the detected blocking value of the k-th beam, m_k is the corresponding modeled blocking value, and N is the number of beams.
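By way of illustration, the sketch below applies the matching error of equation (3) to eliminate ghost points: each candidate touch combination is converted to a modeled blocking pattern, compared with the detected pattern, and the combination with the smallest error is retained. The model_blocking callback stands in for the geometric model described above and is a placeholder, not an actual implementation of it.

```python
def matching_error(detected, modeled):
    """Equation (3): e = sum over all beams of |d_k - m_k|."""
    return sum(abs(d - m) for d, m in zip(detected, modeled))

def eliminate_ghost_points(detected, candidate_combinations, model_blocking):
    """Return the candidate touch combination whose modeled blocking pattern
    best matches the detected pattern, together with its matching error.

    model_blocking(touches) is a placeholder for the geometric model of the
    emitters, receivers and candidate touches described above."""
    best, best_error = None, float("inf")
    for touches in candidate_combinations:
        e = matching_error(detected, model_blocking(touches))
        if e < best_error:
            best, best_error = touches, e
    return best, best_error
```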
Tracking
The matching error can be used as a metric to perform touch tracking of moving touch locations. The method consists of three steps:
Reference is made to
The optimization used in
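Since the three steps are described with reference to the figures, the sketch below illustrates only the general idea of matching-error-based tracking: previously tracked locations are perturbed, each perturbed combination is scored against the newly detected blocking pattern (reusing matching_error from the earlier sketch), and the lowest-error combination becomes the updated track. This simple local search is an illustrative stand-in, not the method of the invention.

```python
import itertools

def track_touches(prev_touches, detected, model_blocking, step=2.0):
    """Illustrative local search: nudge each previously tracked touch location,
    model the resulting blocking pattern, and keep the nudged combination that
    minimizes the matching error against the newly detected pattern."""
    offsets = [(-step, 0.0), (step, 0.0), (0.0, -step), (0.0, step), (0.0, 0.0)]
    best, best_error = prev_touches, float("inf")
    for moves in itertools.product(offsets, repeat=len(prev_touches)):
        candidate = [(x + dx, y + dy)
                     for (x, y), (dx, dy) in zip(prev_touches, moves)]
        e = matching_error(detected, model_blocking(candidate))
        if e < best_error:
            best, best_error = candidate, e
    return best
```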
Improving Touch Accuracy
Optimization of the matching error can alternatively be used to improve touch accuracy in situations where the detection algorithm does not perform well. This may happen close to edges or corners where there are fewer light beams available for coordinate calculation.
The optimization approach can be extended to cover inaccuracies due to finite beam acquisition time. In a multi-touch system based on tens of beams per touch, the acquisition may take long enough for the touches to move a significant distance. This can be taken into account by including estimated touch speeds, as additional unknowns, in the blocking pattern model.
Each touch is assumed to be moving, and the sequence of measuring the beam data is included in the model for m.
This technique reduces the need for parallel beam acquisition, and consequently reduces total hardware cost of the multi-touch solution.
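A sketch of how estimated touch speeds might enter the blocking pattern model as additional unknowns is given below: each beam carries its acquisition time, and the touch positions used to model that beam are advanced along the estimated velocities. The model_beam callback and the per-beam timestamps are assumptions made for the example.

```python
def model_blocking_with_speeds(touches, velocities, beam_times, model_beam):
    """Sketch of a blocking pattern model that treats touch speeds as unknowns.

    touches:    (x, y) positions of the touches at the start of the scan
    velocities: (vx, vy) estimated speeds, the additional unknowns
    beam_times: acquisition time of each beam, relative to the scan start
    model_beam(positions, k): placeholder returning the modeled blocking value
                              of beam k for the given touch positions
    """
    pattern = []
    for k, t in enumerate(beam_times):
        # advance every touch to where it is expected when beam k is acquired
        moved = [(x + vx * t, y + vy * t)
                 for (x, y), (vx, vy) in zip(touches, velocities)]
        pattern.append(model_beam(moved, k))
    return pattern
```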
Touch Screen Assembly
According to embodiments of the present invention, a light guide for directing light beams for touch detection over and across the touch screen surface forms an exposed, outer frame of the screen. Embodiments of the invention are designed for desktop displays and All-in-One devices where there is little risk of trauma to the screen frame in the course of normal use, as compared to mobile phones, laptops and tablets. Nonetheless, the present invention is applicable to mobile phones, laptops and tablets as well.
Reference is made to
In other embodiments, visible-light LEDs are provided to illuminate light guide frame 403. In one embodiment, the system identifies the color of the outermost pixels on the display and illuminates light guide frame 403 with a similar color, so that the frame visually blends into the rendered image.
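A sketch of this color-matching behavior is given below; the frame-buffer sampling and the LED-driver call are hypothetical placeholders, not an actual display or driver API.

```python
def match_frame_color(border_pixels, set_frame_led_color):
    """border_pixels: iterable of (r, g, b) values of the outermost display pixels.
    set_frame_led_color: placeholder for the driver call that sets the color of
    the visible-light LEDs illuminating light guide frame 403."""
    pixels = list(border_pixels)
    if not pixels:
        return
    n = len(pixels)
    avg = tuple(sum(channel) // n for channel in zip(*pixels))
    set_frame_led_color(*avg)   # illuminate the frame with a similar color
```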
Reference is made to
Reference is made to
Reference is made to
Reference is made to
In some embodiments, light guide frame 403 is transparent and, furthermore, visible-light LEDs are mounted within frame 403, e.g., on PCB 502, to illuminate the frame under certain conditions. In some embodiments, the frame is illuminated to notify the user, e.g., upon receiving an email or a video call, the frame is illuminated in a specific color.
Furthermore, the infrared light used for touch detection is transmitted through frame 403 by total internal reflection (TIR). As such, when an object having a higher index of refraction than the frame touches the frame, the object absorbs a portion of the transmitted infrared light. This absorption is detected as a reduction in expected light at the receivers. Significantly, when a user touches one edge of the frame, this reduction occurs only in light beams along one axis, not both axes. Thus, a touch on the frame is distinguished from a touch on the screen, which causes detections on both axes. This touch gesture, namely, a touch or tap on the outside of frame 403, is used in some embodiments to activate a function, e.g., to open the Charms bar in certain Windows operating systems. WINDOWS® is a registered trademark of Microsoft Corporation. In other embodiments, when a notification of an incoming email or video call is received, this gesture opens the email or the video call. In another embodiment, a tap on the frame wakes up the computer from sleep mode. This can be communicated to the user by illuminating the frame in response to the tap.
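A sketch of this distinction is given below: a touch on the screen blocks beams on both axes, whereas a finger on the outer frame absorbs TIR light along one axis only. The per-axis lists of affected beams are an assumed input format.

```python
def classify_touch(blocked_x_beams, blocked_y_beams):
    """blocked_x_beams / blocked_y_beams: beams showing reduced light on each axis.
    A touch on the screen blocks beams on both axes; a touch on the outer frame
    absorbs TIR light along one axis only."""
    if blocked_x_beams and blocked_y_beams:
        return "screen"   # object touching the display from above
    if blocked_x_beams or blocked_y_beams:
        return "frame"    # tap on frame 403, e.g. to open the Charms bar
    return "none"
```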
Still further, a swipe gesture along an outer edge of frame 403 is also detectable because, as the user glides his finger along the edge, his finger absorbs infrared light beams belonging to different emitters or receivers. Therefore, swipe gestures are also enabled on the outer perimeter of the frame.
An advantage of the touch screen assembly described above is the ability to accommodate light-based touch sensitivity for curved display screens. A technical challenge arises when trying to emit light from one edge of a curved screen to an opposite edge. In general, it is not possible for light to travel from one edge of the screen to the opposite edge along a single plane, since the edges are not co-planar. Furthermore, even if two opposite edges are coplanar, use of a single plane to direct light over a curved screen leads to touch detection errors due to the screen dipping underneath the plane, as shown in
Reference is made to
As shown in
Reference is made to
Although the left-to-right light paths could have been generated as straight lines, since the left and right edges are co-planar in the screen of
Reference is made to
Reference is made to
Reference is made to
Reference is made to
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
This application is a continuation of U.S. patent application Ser. No. 14/588,462, now U.S. Pat. No. 9,207,800, entitled INTEGRATED LIGHT GUIDE AND TOUCH SCREEN FRAME AND MULTI-TOUCH DETERMINATION METHOD, and filed on Jan. 2, 2015 by inventors Thomas Eriksson, Alexander Jubner, John Karlsson, Lars Sparf, Saska Lindfors and Robert Pettersson. U.S. patent application Ser. No. 14/588,462 claims priority benefit of U.S. Provisional Patent Application No. 62/054,353, entitled INTEGRATED LIGHT GUIDE AND TOUCH SCREEN FRAME AND MULTI-TOUCH DETERMINATION METHOD, and filed on Sep. 23, 2014 by inventors Saska Lindfors, Robert Pettersson, John Karlsson and Thomas Eriksson.