Providing efficient and intuitive interaction between a computer system and users thereof is essential for delivering an engaging and enjoyable user experience. Today, most computer systems include a keyboard for allowing a user to manually input information into the computer system, and a mouse for selecting or highlighting items shown on an associated display unit. As computer systems have grown in popularity, however, alternate input and interaction systems have been developed. For example, touch-based, or touchscreen, computer systems allow a user to physically touch the display unit and have that touch registered as an input at the particular touch location, thereby enabling a user to interact physically with objects shown on the display of the computer system.
The features and advantages of the invention, as well as additional features and advantages thereof, will be more clearly understood hereinafter as a result of a detailed description of particular embodiments of the invention when taken in conjunction with the following drawings, in which:
The following discussion is directed to various embodiments. Although one or more of these embodiments may be specified, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one skilled in the art will understand that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
While touch technology is an exciting and natural means of user interface, it can still be improved. One fundamental drawback of conventional touchscreen interface systems is that the user must be within a few feet of the touchscreen computer for operation. In addition to limiting user movement and system placement, physical touching of the display screen may also cause arm and shoulder fatigue with extended use of the system.
Embodiments of the present invention provide a system and method for remote touch detection capable of utilizing an existing touchscreen computer system without hardware modification. That is, the touchscreen computer vision system is capable of supporting both physical touch, or “black or shadow” detection, and remote touch, or “white” detection. As such, embodiments in accordance with the present invention allow system interaction with a laser beam or infrared signal, such as one from a laser pointer, thereby enabling users to remotely interface with a touchscreen system while maintaining line-of-sight contact of the laser beam with the front surface of the display.
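For purposes of illustration only, the following Python sketch shows one way such dual-mode detection might be realized in software: a candidate blob in a sensor frame is labeled a physical (“black or shadow”) touch when it is darker than the ambient baseline, or a remote (“white”) touch when it is brighter. The function name, thresholds, and data layout are illustrative assumptions and are not taken from the specification.

    # Illustrative sketch: classify a candidate blob as a physical ("shadow")
    # touch or a remote ("white") infrared touch by comparing its mean
    # intensity against an ambient baseline. Thresholds are assumed values.
    import numpy as np

    def classify_touch(blob_pixels: np.ndarray,
                       baseline: float,
                       dark_margin: float = 30.0,
                       bright_margin: float = 60.0) -> str:
        """Return 'physical', 'remote', or 'none' for a candidate blob."""
        mean_intensity = float(np.mean(blob_pixels))
        if mean_intensity < baseline - dark_margin:
            return "physical"   # shadow cast by a finger or stylus
        if mean_intensity > baseline + bright_margin:
            return "remote"     # bright spot from a laser pointing device
        return "none"           # within ambient range; ignore

Either classification may then be forwarded to the same touch-handling logic, which is what allows remote input to reuse the existing touchscreen pipeline without hardware modification.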
Several advantages are afforded by the remote touch input method of the present embodiments. For example, the operating user does not need to be in close proximity to the touch display in order to interface with it. Accordingly, such a configuration allows for greater user mobility and more flexible placement of the touchscreen display. Furthermore, the remote touch system of the present embodiments helps to alleviate arm and shoulder fatigue caused by extended sessions of physically touching a vertical screen since user input can now be done from a comfortable location with only the press of a button or a flick of the wrist from the remote pointing device.
Referring now in more detail to the drawings in which like numerals identify corresponding parts throughout the views,
The display system 100 may include a display panel 109 and a transparent layer 107 in front of the display panel 109, though the transparent layer 107 may be omitted in certain embodiments. The front side of the display panel 109 is the surface that displays an image, and the back of the panel 109 is opposite the front. Light emitting devices 113a and 113b and optical sensors 110a and 110b can be on the same side of the transparent layer 107 as the display panel 109 to protect the optical sensors from contaminants. In an alternative embodiment, the light emitting devices and optical sensors 110a and 110b may be in front of the transparent layer 107. The transparent layer 107 can be glass, plastic, or another transparent material. The display panel 109 may be a liquid crystal display (LCD) panel, a plasma display, a cathode ray tube (CRT), an organic light emitting diode (OLED) display, or a projection display such as digital light processing (DLP), for example. In one embodiment, mounting the light emitting devices 113a and 113b and optical sensors 110a and 110b in an area of the display system 100 that is outside of the perimeter of the display panel 109 ensures that the clarity of the transparent layer is not reduced by the light emitting devices or optical sensors.
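As a purely hypothetical summary of the arrangement described above, a configuration record for the display system 100 might capture the panel type, the optional transparent layer 107, and the placement of the light emitting devices 113a and 113b and optical sensors 110a and 110b outside of the display perimeter; the field names below are assumptions, not terms from the specification.

    # Hypothetical configuration record summarizing the arrangement above.
    # Reference numerals follow the description (panel 109, layer 107,
    # emitters 113a/113b, sensors 110a/110b); field names are assumed.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DisplaySystemConfig:
        panel_type: str = "LCD"             # LCD, plasma, CRT, OLED, or DLP
        has_transparent_layer: bool = True  # layer 107 may be omitted
        sensors_behind_layer: bool = True   # shields sensors from contaminants
        emitter_positions: List[str] = field(
            default_factory=lambda: ["113a: outside perimeter", "113b: outside perimeter"])
        sensor_positions: List[str] = field(
            default_factory=lambda: ["110a: top corner", "110b: top corner"])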
In one embodiment, optical sensors 110a and 110b may represent two-dimensional cameras including a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) phototransistors, for example, and are configured to receive external light or shadows and convert the light or shadow to data. In another embodiment, optical sensors 110a and 110b represent three-dimensional optical sensors configured to report a three-dimensional depth map to a processor. The three-dimensional optical sensors 110a and 110b can determine the depth of an object located within their respective fields of view 115a and 115b. The depth map changes over time as an object or signal moves in the field of view 115a of optical sensor 110a, or within the field of view 115b of optical sensor 110b. According to one embodiment, the depth of the object can be used to determine whether the object is within a programmed distance of the display panel but not actually contacting the front side of the display panel. For example, the object may be a user's hand or finger approaching the front side of the display panel 109, or an infrared signal emitted from a laser pointing device operated by a user. Still further, and according to one embodiment, optical sensors 110a and 110b are positioned at the topmost corners around the perimeter of the display panel 109 such that each field of view 115a and 115b includes the areas above and surrounding the display panel 109. As such, a touch input such as a user's hand or an infrared signal, for example, may be detected, and any associated motions around the perimeter and in front of the computer system 100 can be accurately interpreted by the computer processor.
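By way of a minimal sketch, and assuming the depth map arrives as a two-dimensional array of distances (in millimetres) from the sensor, the programmed-distance test described above could be expressed as follows; the threshold values and names are illustrative assumptions.

    # Minimal sketch: flag an object that is within a programmed distance of
    # the display panel but not yet contacting its front surface. The depth
    # map is assumed to hold distances in millimetres; thresholds are assumed.
    import numpy as np

    CONTACT_MM = 5.0       # at or below this depth, treat as contact
    PROGRAMMED_MM = 150.0  # hover zone in front of the display panel

    def hover_state(depth_map: np.ndarray) -> str:
        """Classify the nearest object reported in the depth map."""
        nearest = float(np.min(depth_map))
        if nearest <= CONTACT_MM:
            return "contact"   # object touching the front surface
        if nearest <= PROGRAMMED_MM:
            return "hover"     # within programmed distance, not contacting
        return "clear"         # nothing of interest in the field of view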
More particularly, an object such as a user's hand 428 may approach the display 405 and cause a disruption in either light source 417a or 417b at position 425 in
As shown in the embodiment of
In step 608, the processor calculates the surface target position based on measurement data captured by the optical sensor. Next, in step 610, the system determines if the surface target position has moved, or if the user is moving the remote pointing device or physical object across the front display surface, and if so, updates the surface target position accordingly. According to one embodiment, once the surface target position is stationary for a predetermined time, the processor registers the surface target position as a touch input location in step 612 for determining an appropriate operation of the computer system.
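The flow of steps 608 through 612 can be sketched as a simple polling loop, assuming the sensor pipeline supplies one (x, y) surface target position per frame and that a position is considered stationary when it remains within a small radius for a dwell period; the helper names, radius, and dwell time below are assumptions rather than details from the specification.

    # Sketch of steps 608-612: compute the surface target position, follow it
    # as it moves, and register it as a touch input once it holds still for a
    # predetermined time. Helper names and constants are assumed values.
    import math
    import time

    STATIONARY_RADIUS = 8.0   # pixels of allowed jitter
    DWELL_SECONDS = 0.5       # time the target must hold still

    def track_and_register(read_target_position, register_touch):
        last_pos, still_since = None, None
        while True:
            pos = read_target_position()        # step 608: compute position
            if pos is None:
                last_pos, still_since = None, None
                continue
            if last_pos and math.dist(pos, last_pos) <= STATIONARY_RADIUS:
                if still_since and time.monotonic() - still_since >= DWELL_SECONDS:
                    register_touch(pos)         # step 612: register touch input
                    still_since = None
            else:
                still_since = time.monotonic()  # step 610: position moved
            last_pos = pos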
Embodiments of the present invention provide a method for implementing a remote pointing device for use with touchscreen computer vision systems. In particular, the computer vision system of the present embodiments is configured to detect touch inputs caused by physical objects contacting a front surface of the display panel, in addition to touch inputs caused by an infrared light source contacting the front surface of the display panel. As such, a laser pointing device operated by a user and emitting an infrared signal may serve as a remote operating device for controlling and operating a touchscreen computing system.
Many advantages are afforded by the remote touch detection system and method according to embodiments of the present invention. For instance, the display panel of the touchscreen computing system may be placed in a location in which physical touch of the display is difficult or impossible (e.g. ceiling mounted). In such a case, the user may operate the computing system remotely from a more comfortable position than normal (e.g. lying down). Still further, the computer vision system of the present embodiments helps to reduce upper body fatigue caused by prolonged extension of a user's arms when physically contacting the display panel with their hands or a stylus for example. Moreover, embodiments of the present invention can be beneficial for users with physical disabilities by attaching the laser emitter to a headband or body part other than the user's hand for example.
Furthermore, while the invention has been described with respect to particular embodiments, one skilled in the art will recognize that numerous modifications are possible. For example, although exemplary embodiments depict an all-in-one computer as the representative display panel of the computer vision system, the invention is not limited thereto. For example, the computer vision system of the present embodiments may be implemented in a netbook, a tablet personal computer, a cell phone, or any other electronic device having a display panel, light emitting device, and optical sensor.
Still further, a single light emitting device and a single optical sensor may be utilized in the computer vision system in lieu of the two light emitting devices and two optical sensors depicted in the figures. Thus, although the invention has been described with respect to exemplary embodiments, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.