The present invention relates to the field of interaction techniques in gaze-controlled system interfaces, and in particular to a new system that allows the user to create, save and format text documents using eye-tracking devices, through a method of fast cursor positioning.
One of the goals of research in the human-computer interaction field has been, and still is, to increase the bandwidth of communication between the user and the machine: with the introduction of the GUI (graphical user interface), the bandwidth of output data has increased, while the bandwidth of input data has remained mostly the same.
For this purpose, several attempts have been made to use eye tracking to reduce the gap between output and input communication bandwidth, and since the technology has become sufficiently robust, accurate and economical, there is now a need for a real human-computer interface that makes use of input from eye-tracking devices in application development.
It is therefore necessary to find “interaction techniques” suitable for ocular movements, so as to create a natural and advantageous user-computer dialogue, since a user interface based on such input is potentially faster and requires less effort than current interfaces.
Such an interface is difficult to develop for many reasons, in particular because the eyes are perceptive organs: the gaze moves across the screen even when the user is merely acquiring information and does not want to issue any control command. Moreover, the user, who may be a person with disabilities, may have difficulty controlling his own gaze with enough accuracy to control the computer as desired, and this is particularly evident when the objects to be controlled on the screen are small.
In the state of the art there are many systems that, in different ways, have tried to develop interaction methods based on the complete emulation of the mouse. In particular, some of them move a pointer as a function of gaze movement.
An example is international patent application WO 2007/017500, which discloses a standard eye-tracking device for a computer system executing the steps of displaying a user interface, tracking gaze coordinates and processing data related to said gaze coordinates in order to determine which action is to be performed according to the user's gazing activity.
One of these interaction techniques magnifies the areas present on the screen, so that the user can carry out an action more reliably using the pointer, and gives access to practically all Windows applications.
In these cases such a solution is not the best one, because the potential of visual input is reduced to a simple copy of mouse features (moving the cursor with the eyes). Unlike the gestures of arms and hands, which are stable and directly associated with voluntary action, eye movement has different characteristics: it is often unintentional, it is oriented towards acquiring information about the external world, and it does not show a stable trend. Besides, this interaction technique tires the user, slows down the interaction with the artefact and produces a high number of errors.
In another system the cursor is placed, approximately, at the start of the gazed word after a dwell time or after pressing a switch. To move the cursor from letter to letter, after gazing at a desired word for a certain time (the dwell time), the user must gaze at the arrow pointing in the desired direction among the four shown around the gazed word (a set of four arrows in four directions). To move the cursor slowly, the user must keep gazing in the appropriate direction.
In addition, a particular navigation screen is provided, where buttons move the cursor in all directions, for both small and large movements. To select a piece of text to copy or cut, the user must place the cursor at the start of that text, select a particular button and move the cursor to the end of the text to be selected.
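By way of illustration only, dwell-time activation of the kind used by such systems can be realized with a simple accumulator over gaze samples. The sketch below is a minimal Python example, not taken from any cited system; the class name, the pixel radius and the dwell duration are arbitrary assumptions.

```python
import math
import time

class DwellDetector:
    """Fires an activation when the gaze stays within a small radius
    of the same point for a minimum dwell time."""

    def __init__(self, radius_px=30.0, dwell_s=0.8):
        self.radius_px = radius_px   # fixation tolerance (assumed value)
        self.dwell_s = dwell_s       # dwell time in seconds (assumed value)
        self._anchor = None          # point where the current fixation started
        self._start = None           # timestamp of the fixation start

    def update(self, x, y, now=None):
        """Feed one gaze sample; returns the fixation point once the
        dwell time has elapsed, otherwise None."""
        now = time.monotonic() if now is None else now
        if self._anchor is None or math.dist(self._anchor, (x, y)) > self.radius_px:
            # The gaze jumped away: restart the fixation window.
            self._anchor, self._start = (x, y), now
            return None
        if now - self._start >= self.dwell_s:
            fired = self._anchor
            self._anchor, self._start = None, None   # re-arm the detector
            return fired
        return None
```

Feeding every gaze sample to `update()` and acting only on a non-None result is what keeps ordinary perceptive eye movements from being misread as commands.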
Other systems show a cursor-positioning solution that combines ocular and manual control: when a manual activation by the user is detected, the cursor is placed at a starting position determined by the user's gaze within the selected area.
In still other systems, the area of the screen gazed at by the user is enlarged, so that the selection of objects is made easier; the components outside this area shrink and/or move in relation to such expansion.
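One plausible way to realize such an expansion, sketched here purely as an illustration with hypothetical names and parameters, is to enlarge the components near the gaze point and push the remaining ones radially away from it:

```python
import math

def fisheye_layout(components, gx, gy, zoom=2.0, radius=150.0, push=40.0):
    """components: {name: (x, y, w, h)} screen rectangles. Components whose
    centre lies within `radius` of the gaze point (gx, gy) are scaled by
    `zoom` around their centre; the others are shifted `push` pixels away
    from the gaze point to make room for the expansion."""
    out = {}
    for name, (x, y, w, h) in components.items():
        cx, cy = x + w / 2, y + h / 2
        d = math.hypot(cx - gx, cy - gy)
        if d <= radius:
            nw, nh = w * zoom, h * zoom            # enlarge the gazed region
            out[name] = (cx - nw / 2, cy - nh / 2, nw, nh)
        else:
            ux, uy = (cx - gx) / d, (cy - gy) / d  # unit vector away from gaze
            out[name] = (x + ux * push, y + uy * push, w, h)
    return out
```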
It is an object of the present invention to provide a method and an apparatus for processing text documents that use the gaze as input, together with a method of fast cursor positioning, resulting in an intuitive and easy-to-use interface, as described in the claims that are an integral part of the present description.
This apparatus represents a possible embodiment of an extremely innovative assistive technology to create, save and format text documents, based on the use of a natural and alternative input such as the gaze.
In a preferred embodiment of the present invention, the apparatus object of the present invention includes means for processing data and information, means for storing said data and information and means for interfacing them with the user.
Said means for the electronic processing of data and information include an appropriate control section, preferably based on at least one microprocessor, and can be implemented, for instance, by a personal computer.
Said means for storage preferably include a hard disk and flash memory. Said means of user interface include means of data visualization, such as a display, monitor or similar external output unit, and an eye-tracking device to determine the direction of the user's gaze. Said at least one microprocessor is preferably equipped with an appropriate software program whose architecture, described in the annexed figures, includes:
a Set Action module 11, which manages the graphic interface of the application, holds the information about the component areas of the interface the user interacts with, and is responsible for determining which area is currently gazed at by the user and which action is to be performed, and for carrying it out. Said Set Action module 11 holds the information about the action type associated with the activation of a given component. Said Set Action module 11 is formed by three component modules: an Events Management Module 12, which determines the rules for transforming input on the interface into changes of the application state, through a mapping between user actions and application replies; a States Management Module 13, which represents the application data and determines its state and functionalities; and an Interface Management Module 14, which represents the visualization of the interface objects and manages the application graphic interface, since it holds the information related to the component areas of the graphic interface with which the user can interact and determines the interface area currently gazed at by the user.
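The patent gives no source code for these modules; the following Python sketch shows one way the described decomposition could be organized. All class names, the rectangle-based area model and the callback-style bindings are assumptions of this illustration, not the patent's own implementation.

```python
class StatesManagementModule:
    """Application data and current state (module 13)."""
    def __init__(self):
        self.document = []   # text buffer, one string per line
        self.cursor = 0      # caret position

class InterfaceManagementModule:
    """Knows the interactive areas of the GUI and resolves which one
    the user is currently gazing at (module 14)."""
    def __init__(self, areas):
        self.areas = areas   # {name: (x, y, w, h)} screen rectangles

    def gazed_area(self, gx, gy):
        for name, (x, y, w, h) in self.areas.items():
            if x <= gx < x + w and y <= gy < y + h:
                return name
        return None

class EventsManagementModule:
    """Maps activations of interface areas to changes of the
    application state (module 12)."""
    def __init__(self, states):
        self.states = states
        self.bindings = {}   # area name -> callable(states)

    def bind(self, area, action):
        self.bindings[area] = action

    def dispatch(self, area):
        action = self.bindings.get(area)
        if action is not None:
            action(self.states)

class SetActionModule:
    """Module 11: coordinates the three sub-modules, turning a gaze
    activation into an action on the application state."""
    def __init__(self, areas):
        self.states = StatesManagementModule()
        self.interface = InterfaceManagementModule(areas)
        self.events = EventsManagementModule(self.states)

    def on_gaze_activation(self, gx, gy):
        area = self.interface.gazed_area(gx, gy)
        if area is not None:
            self.events.dispatch(area)
```

Read this way, the decomposition mirrors a classic model-view-controller split: module 13 plays the model, module 14 the view, and module 12 the controller.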
Referring to the
The generation and execution of the action, step e) of the sequence described in
In particular, two methods of cursor positioning and text selection, which are an integral part of this patent, are described below; they allow quick and efficient formatting of text documents using gaze control alone.
Referring to
Referring to
After such a selection the user can perform operations of erasing, insertion, formatting, etc., in accordance with the sequence described previously.
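As an illustration of such gaze-driven positioning and selection, the sketch below maps a gaze point to the nearest caret position and treats two successive activations as the start and the end of a selection. The per-character bounding-box model and every name in it are hypothetical, not taken from the patent.

```python
def nearest_caret(gx, gy, char_boxes):
    """char_boxes: list of (x, y, w, h) rectangles, one per character in
    document order. Returns the index of the character whose centre is
    closest to the gaze point, i.e. the caret position to jump to."""
    best_i, best_d = 0, float("inf")
    for i, (x, y, w, h) in enumerate(char_boxes):
        cx, cy = x + w / 2, y + h / 2
        d = (gx - cx) ** 2 + (gy - cy) ** 2
        if d < best_d:
            best_i, best_d = i, d
    return best_i

class GazeTextSelector:
    """First activation anchors the cursor; the second one returns the
    selected character range, on which erase/insert/format operations
    can then be applied."""
    def __init__(self, char_boxes):
        self.char_boxes = char_boxes
        self.anchor = None

    def activate(self, gx, gy):
        caret = nearest_caret(gx, gy, self.char_boxes)
        if self.anchor is None:
            self.anchor = caret          # selection start
            return None
        start, end = sorted((self.anchor, caret))
        self.anchor = None               # re-arm for the next selection
        return (start, end)
```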
Number | Date | Country | Kind
---|---|---|---
FI08A0049 | Mar 2008 | IT | national
Filing Document | Filing Date | Country | Kind | 371c Date
---|---|---|---|---
PCT/IB2009/051007 | Mar 11, 2009 | WO | 00 | Sep 13, 2010
Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2009/113026 | Sep 17, 2009 | WO | A
Number | Name | Date | Kind
---|---|---|---
6204828 | Amir et al. | Mar 2001 | B1 |
6433759 | Richardson et al. | Aug 2002 | B1 |
6873314 | Campbell | Mar 2005 | B1 |
7029121 | Edwards | Apr 2006 | B2 |
7556377 | Beymer | Jul 2009 | B2 |
7809160 | Vertegaal et al. | Oct 2010 | B2 |
20040075645 | Taylor et al. | Apr 2004 | A1 |
20040156020 | Edwards | Aug 2004 | A1 |
20050108092 | Campbell et al. | May 2005 | A1 |
20060256083 | Rosenberg | Nov 2006 | A1 |
20070164990 | Bjorklund et al. | Jul 2007 | A1 |
20090086165 | Beymer | Apr 2009 | A1 |
20090146950 | Maringelli | Jun 2009 | A1 |
Number | Date | Country
---|---|---
1 255 225 | Mar 2004 | EP |
2007017500 | Feb 2007 | WO |
2007050029 | May 2007 | WO |
Number | Date | Country
---|---|---
20110022950 A1 | Jan 2011 | US