1. Field of the Invention
This invention generally relates to user interfaces, and more specifically to user interfaces with sensing capability and to reflective user interfaces.
2. Description of the Related Art
Various sensors are becoming a ubiquitous part of computers and computing devices such as mobile phones. However, in order to fully take advantage of the information collected from various sensors, the user needs to stay within the range of the sensor. When the data from the sensors are used for computer input, a reliable signal from the sensors is essential. The sensing technology most commonly used today for computer input is eye tracking.
Eye tracking is a promising computer input technique for disabled users or as a complement to existing input methods, particularly when a mouse is not appropriate or when the user's hands are already occupied. However, eye tracking is sensitive to a number of factors, such as light conditions, the characteristics of the user's eye, and whether the target, the eye, is within the field of view of the eye tracker. Some of the factors influencing the performance of eye trackers are possible to control; for instance, it is possible to keep the light conditions stable. Others, such as the characteristics of the user's eye, are impossible to control.
Keeping the user within the field of view has been a concern since the early days of eye tracking. Solutions such as bite bars and chin rests are early approaches still in use today; for instance, SR International sells an eye tracker (Eye Link 1000) 100 with a built-in chin rest 101, as shown in
Another solution is a head-mounted eye tracker, such as the iView commercially available from SMI (SensoMotoric Instruments). However, even lightweight solutions, such as the iView from SMI, require the user to be attached to a computer. This method is not preferred when using eye tracking as a computer input device, since the user's field of view is recorded by a camera and the gaze information is given in relation to this recording. Finding out exactly where on a computer screen a user is looking would require video analysis of the user's video stream. Some eye trackers allow the user a limited range of head motion. These are generally table-mounted eye trackers.
The problem of maintaining a user within the field of view of the eye tracker has attracted much research. Solutions vary, but the most common is to employ a system incorporating moving cameras. One such system is described in Beymer, D. and Flickner, M., 2003, Eye gaze tracking using an active stereo head, In Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 451-458. The technique of using moving cameras will soon become available in a commercial eye tracker developed by LC Technology. However, the high cost of these eye trackers may prevent them from entering the marketplace outside research laboratories. As would be appreciated by persons of skill in the art, low-cost solutions are preferred, in particular if they are easy to implement.
Another solution for keeping the user within the field of view of the eye tracker is to make the user aware of whether he or she is in the field of view of the eye tracker. One method is to display the user's gaze location on the screen. This method is common when eye tracking is used as an input method. However, this method is far from optimal. Eye movements are quite jittery, resulting in a constantly jumping eye cursor. Also, the eye cursor does not signal to the user how to adjust in order to be better tracked by the eye tracker. Another method is to show the camera view, or a representation of the camera view, in a separate window. However, with this method the user needs to keep an eye on that window, which interrupts the natural interaction with the computer.
Therefore, it is desirable to have a low-cost solution for keeping the user's eyes within the field of view of the eye tracker.
The inventive methodology is directed to methods and systems that substantially obviate one or more of the above and other problems associated with conventional techniques for tracking a user's movements.
In accordance with one aspect of the inventive concept, there is provided a user interface system with sensing capability. The inventive system incorporates a sensor having a field of sensitivity; a camera configured to create an image of a user; a sensing module generating information indicative of movements of the user; and a display module coupled at least to the camera and the sensing module and configured to provide the user with feedback based on the image and the information indicative of the movements of the user, the feedback being co-located with one or more objects in a work area of the user.
In accordance with another aspect of the inventive concept, there is provided a method for sensing movements of a user. The inventive method involves: obtaining information about the user using a sensor having a field of sensitivity; obtaining an image of the user using a camera; generating information indicative of movements of the user; and providing the user with feedback based on the image and the information indicative of the movements of the user, the feedback being co-located with one or more objects in a work area of the user.
In accordance with yet another aspect of the inventive concept, there is provided a computer-readable medium embodying a set of instructions which, when executed by one or more processors, cause the one or more processors to perform a method for sensing movements of a user. The aforesaid method involves: obtaining information about the user using a sensor having a field of sensitivity; obtaining an image of the user using a camera; generating information indicative of movements of the user using a sensing module; and providing the user with feedback based on the image and the information indicative of the movements of the user, the feedback being co-located with one or more objects in a work area of the user.
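The method recited above can be summarized as one sensing-and-feedback iteration. The sketch below is illustrative only; the four callables (`read_sensor`, `capture_image`, `derive_movement`, `render`) are hypothetical stand-ins for the sensor, camera, sensing module, and display module of the claimed system, not actual APIs from the disclosure.

```python
def sense_and_feedback(read_sensor, capture_image, derive_movement, render):
    """One iteration of the inventive method.

    read_sensor:     obtains information about the user from a sensor
                     having a field of sensitivity (e.g. an eye tracker).
    capture_image:   obtains an image of the user from a camera.
    derive_movement: generates information indicative of the user's
                     movements from the sensor reading.
    render:          provides feedback based on the image and the
                     movement information, co-located with the user's
                     work area.
    """
    reading = read_sensor()
    frame = capture_image()
    movement = derive_movement(reading)
    return render(frame, movement)
```

In use, a driver loop would call `sense_and_feedback` once per video frame, wiring in the concrete sensor and display components.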
Additional aspects related to the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. Aspects of the invention may be realized and attained by means of the elements and combinations of various elements and aspects particularly pointed out in the following detailed description and the appended claims.
It is to be understood that both the foregoing and the following descriptions are exemplary and explanatory only and are not intended to limit the claimed invention or application thereof in any manner whatsoever.
The accompanying drawings, which are incorporated in and constitute a part of this specification exemplify the embodiments of the present invention and, together with the description, serve to explain and illustrate principles of the inventive technique. Specifically:
a) illustrates an exemplary view from an eye tracker.
b) illustrates an exemplary view from an auxiliary camera.
In the following detailed description, reference will be made to the accompanying drawings, in which identical functional elements are designated with like numerals. The aforementioned accompanying drawings show by way of illustration, and not by way of limitation, specific embodiments and implementations consistent with principles of the present invention. These implementations are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other implementations may be utilized and that structural changes and/or substitutions of various elements may be made without departing from the scope and spirit of the present invention. The following detailed description is, therefore, not to be construed in a limited sense. Additionally, the various embodiments of the invention as described may be implemented in the form of software running on a general-purpose computer, in the form of specialized hardware, or as a combination of software and hardware.
In accordance with an embodiment of the inventive concept, there is provided a novel approach for improving the data collection from an eye tracker with free head motion. Instead of costly hardware improvements to the eye trackers, the embodiment of the invention instruments the user interface with attractive and esthetically pleasing reflections of the user, so-called reflective user interfaces. An exemplary embodiment of a reflective user interface is described in detail in U.S. patent application Ser. No. 12/080,675. The reflective user interface makes the users aware of their movements in a subtle way that does not disturb the interaction or the user experience. An example of a reflective user interface 300 is shown in
The aforesaid U.S. patent application Ser. No. 12/080,675 describes how to construct the reflective user interface. When adapting the reflective user interface for the purpose of improving gaze data collection, it would be possible to get the camera image directly from the sensor's camera, for instance the eye tracker. However, this method has two main limitations when used with eye trackers. First of all, eye trackers use infrared cameras. Since the image is not captured under natural light conditions, the resulting image would not match the user's expectation of a mirrored image. In addition, the image generated by the eye tracker is not taken from the most attractive angle, as illustrated in
Instead of using the image from the eye tracker for generating the reflective user interface, in accordance with an embodiment of the inventive concept, a simple auxiliary camera is used. The auxiliary camera can be of the type used in webcams, and may be located at the top of the user's screen rather than at the bottom, which is the typical angle of view of the eye tracker. As would be appreciated by persons of ordinary skill in the art, the angle of the reflection from this camera would not be the same as with the image from the eye tracking camera; compare
In one embodiment of the invention, the cropped center portion of the image coincides with the actual field of view of the eye tracker camera. In another embodiment of the invention, the cropped video stream does not need to directly correspond to the field of view of the eye tracker since we have observed that people unconsciously try to position themselves so that their reflection is in the center of the camera's field of view.
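The cropping of the auxiliary camera's frame can be sketched as follows. This is a minimal illustration, assuming frames arrive as NumPy arrays (as a webcam library would provide); the `fraction` parameter is a hypothetical knob approximating the ratio of the eye tracker's field of view to the auxiliary camera's, and the horizontal flip makes the crop read as a mirror reflection of the user.

```python
import numpy as np

def crop_center(frame: np.ndarray, fraction: float = 0.5) -> np.ndarray:
    """Return the central portion of a camera frame.

    `fraction` is an illustrative placeholder for how much of the
    auxiliary camera's view corresponds to the tracked region.
    """
    h, w = frame.shape[:2]
    ch, cw = int(h * fraction), int(w * fraction)
    top, left = (h - ch) // 2, (w - cw) // 2
    return frame[top:top + ch, left:left + cw]

def as_reflection(frame: np.ndarray, fraction: float = 0.5) -> np.ndarray:
    """Crop the center and mirror it horizontally, so the user sees
    a reflection rather than a raw camera image."""
    return crop_center(frame, fraction)[:, ::-1]
```

Because users tend to center themselves in the reflection, the crop need not coincide exactly with the eye tracker's field of view, as noted above.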
The cropped image from the auxiliary camera only gives the user a sense of their position along the eye tracker's x- and y-axes. But a user of an eye tracker may also get out of range by moving too close to, or too far away from, the eye tracker. Informing the user about their optimal position in front of the eye tracker along the z-axis is as important as along the x- and y-axes. In one embodiment of the invention, the distance information provided by the eye tracker is used to give the user a cue that they are moving out of range of the eye tracker. This cue is implemented simply by changing the alpha value of the video stream so that the reflection melts into the background, and the user's reflection completely disappears when the user is out of range. Conversely, the reflection is most vivid when the user is located at the optimal distance from the eye tracker camera.
Thus, an embodiment of the invention provides feedback by manipulating the image from a secondary video source rather than from the video source of the sensing device, in this case the eye tracker. This permits the feedback to be non-intrusive, subtle, and attractive to the user. In one embodiment of the invention, the feedback is in the form of a reflective image of the user. To minimize the user's distraction, the feedback may be co-located with the user's work area. Specifically, in one embodiment, the image of the user generated by the reflective user interface is placed in the background or the foreground of the graphical user interface window currently used by the user. In another embodiment, the reflective image of the user is placed on the background of the graphical user interface itself. As would be appreciated by those of skill in the art, the present invention is not limited to any specific location of the user's feedback. Other convenient locations for delivering the feedback to the user may be utilized.
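Compositing the reflection into the window background reduces to standard alpha blending of two image arrays. The sketch below assumes the background and reflection frames have already been brought to the same size; it is one possible realization, not the disclosed implementation.

```python
import numpy as np

def composite_reflection(background: np.ndarray,
                         reflection: np.ndarray,
                         alpha: float) -> np.ndarray:
    """Blend the user's reflection into the work-area background.

    At alpha = 1.0 the reflection is fully vivid; at alpha = 0.0 it
    has melted entirely into the background, signaling that the user
    is out of the sensor's range.
    """
    blended = (alpha * reflection.astype(np.float64)
               + (1.0 - alpha) * background.astype(np.float64))
    return blended.astype(np.uint8)
```

In a running system, the alpha value would be updated per frame from the sensor's distance report before each blend.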
As would be appreciated by persons of ordinary skill in the art, the auxiliary camera is not limited to a webcam, but may be implemented using any known technology, which enables a digital image of the user to be acquired.
It should also be noted that the present invention is not limited to the use of a reflective user interface for facilitating eye tracking. Eye tracking is just one instance of using a sensor to collect information about a user's location in front of a computer. The inventive concept of using a reflective user interface can also be applied to face tracking, motion sensors (for instance, for playing a game on a game console), and other similar devices.
The computer platform 701 may include a data bus 704 or other communication mechanism for communicating information across and among various parts of the computer platform 701, and a processor 705 coupled with bus 704 for processing information and performing other computational and control tasks. Computer platform 701 also includes a volatile storage 706, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 704 for storing various information as well as instructions to be executed by processor 705. The volatile storage 706 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 705. Computer platform 701 may further include a read only memory (ROM or EPROM) 707 or other static storage device coupled to bus 704 for storing static information and instructions for processor 705, such as a basic input-output system (BIOS), as well as various system configuration parameters. A persistent storage device 708, such as a magnetic disk, optical disk, or solid-state flash memory device, is provided and coupled to bus 704 for storing information and instructions.
Computer platform 701 may be coupled via bus 704 to a display 709, such as a cathode ray tube (CRT), plasma display, or a liquid crystal display (LCD), for displaying information to a system administrator or user of the computer platform 701. An input device 710, including alphanumeric and other keys, is coupled to bus 704 for communicating information and command selections to processor 705. Another type of user input device is cursor control device 711, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to processor 705 and for controlling cursor movement on display 709. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
An external storage device 712 may be connected to the computer platform 701 via bus 704 to provide an extra or removable storage capacity for the computer platform 701. In an embodiment of the computer system 700, the external removable storage device 712 may be used to facilitate exchange of data with other computer systems.
The invention is related to the use of computer system 700 for implementing the techniques described herein. In an embodiment, the inventive system may reside on a machine such as computer platform 701. According to one embodiment of the invention, the techniques described herein are performed by computer system 700 in response to processor 705 executing one or more sequences of one or more instructions contained in the volatile memory 706. Such instructions may be read into volatile memory 706 from another computer-readable medium, such as persistent storage device 708. Execution of the sequences of instructions contained in the volatile memory 706 causes processor 705 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware circuitry and software.
The term “computer-readable medium” as used herein refers to any medium that participates in providing instructions to processor 705 for execution. The computer-readable medium is just one example of a machine-readable medium, which may carry instructions for implementing any of the methods and/or techniques described herein. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 708. Volatile media includes dynamic memory, such as volatile storage 706. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise data bus 704. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punchcards, papertape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a flash drive, a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to processor 705 for execution. For example, the instructions may initially be carried on a magnetic disk from a remote computer. Alternatively, a remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 700 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on the data bus 704. The bus 704 carries the data to the volatile storage 706, from which processor 705 retrieves and executes the instructions. The instructions received by the volatile memory 706 may optionally be stored on persistent storage device 708 either before or after execution by processor 705. The instructions may also be downloaded into the computer platform 701 via Internet using a variety of network data communication protocols well known in the art.
The computer platform 701 also includes a communication interface, such as a network interface card 713 coupled to the data bus 704. Communication interface 713 provides a two-way data communication coupling to a network link 714 that is connected to a local network 715. For example, communication interface 713 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 713 may be a local area network interface card (LAN NIC) to provide a data communication connection to a compatible LAN. Wireless links, such as the well-known 802.11a, 802.11b, 802.11g and Bluetooth, may also be used for network implementation. In any such implementation, communication interface 713 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
Network link 714 typically provides data communication through one or more networks to other network resources. For example, network link 714 may provide a connection through local network 715 to a host computer 716, or a network storage/server 717. Additionally or alternatively, the network link 714 may connect through gateway/firewall 717 to the wide-area or global network 718, such as the Internet. Thus, the computer platform 701 can access network resources located anywhere on the Internet 718, such as a remote network storage/server 719. On the other hand, the computer platform 701 may also be accessed by clients located anywhere on the local area network 715 and/or the Internet 718. The network clients 720 and 721 may themselves be implemented based on a computer platform similar to the platform 701.
Local network 715 and the Internet 718 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 714 and through communication interface 713, which carry the digital data to and from computer platform 701, are exemplary forms of carrier waves transporting the information.
Computer platform 701 can send messages and receive data, including program code, through the variety of network(s) including Internet 718 and LAN 715, network link 714 and communication interface 713. In the Internet example, when the system 701 acts as a network server, it might transmit a requested code or data for an application program running on client(s) 720 and/or 721 through Internet 718, gateway/firewall 717, local area network 715 and communication interface 713. Similarly, it may receive code from other network resources.
The received code may be executed by processor 705 as it is received, and/or stored in persistent or volatile storage devices 708 and 706, respectively, or other non-volatile storage for later execution. In this manner, computer system 701 may obtain application code in the form of a carrier wave.
Finally, it should be understood that processes and techniques described herein are not inherently related to any particular apparatus and may be implemented by any suitable combination of components. Further, various types of general-purpose devices may be used in accordance with the teachings described herein. It may also prove advantageous to construct specialized apparatus to perform the method steps described herein. The present invention has been described in relation to particular examples, which are intended in all respects to be illustrative rather than restrictive. Those skilled in the art will appreciate that many different combinations of hardware, software, and firmware will be suitable for practicing the present invention. For example, the described software may be implemented in a wide variety of programming or scripting languages, such as Assembler, C/C++, Perl, shell, PHP, Java, etc.
Moreover, other implementations of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. Various aspects and/or components of the described embodiments may be used singly or in any combination in a user interface with an eye tracking capability. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.