1. Field of the Invention
This invention generally relates to a user interface for the visually impaired and, more particularly, to a system for translating display screen icons into an audible signal, and manipulating the icons.
2. Description of the Related Art
Visually handicapped people are unable to operate certain equipment, for either job-related or personal use, without accessibility enhancements. These enhancements are not always available and, when they are, their cost is shared by the sighted community, which has no need for them. For example, visually handicapped or blind people cannot easily operate equipment that relies on interactive touchscreens for feature selection.
Manufacturers are reluctant to add accessibility extensions to their equipment, as these extensions add cost and complexity to the equipment. The increased purchase price associated with these accessibility extensions places their products at a competitive disadvantage.
When accessibility extensions are requested or mandated by law, the additional cost associated with the equipment imposes an unfair burden on the sighted user community. Some low-cost accessibility options do exist, such as the Braille keypads used on elevators and ATMs. Other equipment, however, such as an audio-enabled ATM, is relatively complex and adds significant purchase and maintenance costs. These additional costs must be passed on to all consumers, despite the fact that the features are used by only a small minority.
It would be advantageous if touchscreens and displays could be made more accessible to the visually handicapped.
It would be advantageous if touchscreens and displays could be made more accessible without adding costly modifications to conventional equipment.
The present invention is a display interpreter (DI) that is physically independent of the equipment being accessed by the visually impaired user. The DI can be operated without modifications or additions to conventional equipment, avoiding any cost or complexity that might otherwise be added for accessibility support for the visually impaired.
The DI may be composed of the following hardware and software components: a tactile matrix interface (TMI) and a display interface, which in its simplest form is a cover plate that is temporarily placed on top of the equipment's touchscreen. The cover plate prevents users from accidentally activating an icon with their finger. Instead, it has sensors to locate the physical position of the user's finger on the cover plate. The cover plate sends the position of the user's finger to a map module as described below. The cover plate's physical positioning on top of the touch screen takes advantage of the fact that touchscreens typically are recessed. The recessing provides two edges to use to orient the cover plate.
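By way of illustration only, the position report that the cover plate sends to the map module can be as simple as a time-stamped coordinate sample. The following Python sketch assumes a hypothetical sensor_read callable and invented units; it is not part of the equipment or the DI design itself.

    from dataclasses import dataclass
    import time

    @dataclass
    class PositionReport:
        """One sample of the user's finger position on the cover plate."""
        x_mm: float        # horizontal offset from the cover plate's origin corner
        y_mm: float        # vertical offset from the cover plate's origin corner
        timestamp: float   # seconds, so the map module can track movement in real time

    def sample_position(sensor_read) -> PositionReport:
        """Wrap one raw reading from the cover plate's sensors.

        sensor_read is a hypothetical callable returning (x, y) in millimeters.
        """
        x, y = sensor_read()
        return PositionReport(x_mm=x, y_mm=y, timestamp=time.time())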
A stylus is used for depressing the touchscreen at any icon location desired. In one version it has no circuitry or functionality beyond being a pointer that is pushed through a hole in the cover plate in order to activate an icon on the equipment. An alternate version can be used to activate an icon on the equipment touchscreen and send a signal to the map module automatically.
The map module is a programmable device that contains a database of icons and their attributes for every function displayable on the equipment's touchscreen. The map module may receive the x, y position information from the TMI. It processes these coordinates and identifies the icon and/or screen. However, the invention may use other types of coordinate systems, such as polar coordinates, or positions relative to landmarks or previous selections. Once it has identified the icon, it provides audio feedback to the user describing the icon's function. It is programmable for different touchscreen formats and functions.
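As a non-limiting sketch, the lookup performed by the map module might resemble the following Python fragment; the record fields, names, and units are assumptions made for illustration, not requirements of the map module.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class IconRecord:
        name: str          # e.g., "Copy" or "Fax"
        x: float           # left edge of the icon's bounding box on the screen
        y: float           # top edge of the bounding box
        width: float
        height: float
        audio_file: str    # recording describing the icon's function

    def icon_at(icons: List[IconRecord], x: float, y: float) -> Optional[IconRecord]:
        """Return the icon whose bounding box contains the reported finger position."""
        for icon in icons:
            if icon.x <= x <= icon.x + icon.width and icon.y <= y <= icon.y + icon.height:
                return icon
        return None    # the finger is between icons; no audio feedback is produced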
Furthermore, the map module can store the icons for many different device types and models. The user may manually select the appropriate equipment from a menu supplied by the map module. This selection can be made by the visually handicapped user.
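As one hypothetical illustration of this equipment-selection menu, the map module could cycle audibly through its stored models until the user confirms a choice; the button and audio callbacks below are invented placeholders.

    from typing import Callable, List

    def select_equipment(models: List[str],
                         play: Callable[[str], None],
                         wait_for_button: Callable[[], str]) -> str:
        """Audibly cycle through stored equipment models until the user confirms one.

        play announces the current model name; wait_for_button blocks until the
        user presses a button and returns either "next" or "select".
        """
        i = 0
        while True:
            play(models[i])
            if wait_for_button() == "select":
                return models[i]
            i = (i + 1) % len(models)   # wrap around to the first model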
The various screen layouts and icon functions are stored in a flat file system or database for navigating through the icons and screens, allowing the user to choose from the functionality available in the displays. This database can be stored in the map module. Software, executing in the map module, cross-references the positional information of each icon with the current location of the user's finger on the TMI. Based on the location, the software outputs an audio description of the icon, along with the effect if activated.
The receiver, processor, database creation, and audio response may all be implemented in the map module. In one aspect, the map module can be located in a conventional PDA or other portable device, provided the processor can respond in real-time to the movement of the user's finger across the TMI.
Given the unit and features listed above, a visually handicapped person can locate the icons on any display and navigate from screen display to screen display, making an accurate selection of functions and features. As each desired icon is located, the stylus is used to activate the function.
Accordingly, a method is provided for audibly interpreting a screen display. The method comprises: locating a DI with a tactile matrix of sensors overlying a display screen; in response to sensing a proximate pointer, accepting a tactile matrix region selection; loading a first reference map, cross-referencing DI tactile matrix regions with a first field of screen icons; loading a second reference map, cross-referencing screen icons with audible recordings identifying the screen icons; mapping a screen icon to the selected tactile matrix region; and, audibly identifying the mapped screen icon. That is, the DI uses the second reference map to cross-reference the located screen icon to the audible recording identifying the screen icon, and plays the recording.
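The two reference maps can be pictured as simple lookup tables. The following sketch uses invented regions, icon names, and file names purely to illustrate the mapping and playback step; the actual map formats are an implementation choice.

    # First reference map: DI tactile matrix region -> screen icon (invented data).
    first_reference_map = {
        (0, 0): "Copy",
        (0, 1): "Fax",
        (1, 0): "Scan",
        (1, 1): "Settings",
    }

    # Second reference map: screen icon -> audible recording identifying it.
    second_reference_map = {
        "Copy": "copy.wav",
        "Fax": "fax.wav",
        "Scan": "scan.wav",
        "Settings": "settings.wav",
    }

    def identify(region, play):
        """Map a selected tactile matrix region to a screen icon and play its recording.

        play is a hypothetical audio callback standing in for the audio signal generator.
        """
        icon = first_reference_map.get(region)
        if icon is not None:
            play(second_reference_map[icon])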
In one aspect, the DI can be located over a touchscreen with touchscreen icons. Then, the method may further comprise engaging a touchscreen icon. In situations where the selection of an icon generates a new screen, the method further comprises acknowledging a touchscreen icon engagement. Following the acknowledgement of the touchscreen icon engagement, the DI loads a third reference map, cross-referencing DI tactile matrix regions with a second field of touchscreen icons, and loads a fourth reference map, cross-referencing touchscreen icons with audible recordings identifying the touchscreen icons.
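A minimal sketch of this screen change, assuming the maps for each screen are stored together under a screen name (the store layout and all names are hypothetical):

    def on_acknowledgement(screens, next_screen):
        """After a touchscreen icon engagement is acknowledged, return the maps for
        the newly displayed screen (the third and fourth reference maps).

        screens is a hypothetical store keyed by screen name; each entry holds a
        region-to-icon map and an icon-to-recording map for that screen.
        """
        entry = screens[next_screen]
        return entry["region_to_icon"], entry["icon_to_recording"]

    # Example with invented data: engaging "Copy" on the main screen brings up a
    # "copy_options" screen with its own maps.
    screens = {
        "copy_options": {
            "region_to_icon": {(0, 0): "Darker", (0, 1): "Lighter"},
            "icon_to_recording": {"Darker": "darker.wav", "Lighter": "lighter.wav"},
        },
    }
    third_map, fourth_map = on_acknowledgement(screens, "copy_options")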
Additional details of the above-described method and an audible screen display interpreter (DI) system are provided below.
In one aspect, there are buttons or switches along the edge of the display (not shown) that permit the user to make choices. The identification of icons in close proximity to an off-screen switch helps the user select a desired function. In some aspects the display screen interface 202 is a touchscreen interface (TI) for engaging a touchscreen icon. That is, the display 203 is a touchscreen and, rather than merely audibly identifying a particular icon 206, the DI 200 can act to trigger (touch) the icon. If the engagement of a touchscreen icon leads to the generation of a new display screen with a new field of icons, the TMI 204 generates an acknowledgement (ACK) 212 in response to the engagement of a touchscreen icon.
As used herein, a “display icon”, “screen icon”, “icon”, or “touchscreen icon” is a symbol, representation, choice, or information that is graphically presented on a display screen. In the case of a touchscreen, the icon can be triggered (engaged) to signify the selection of an offered equipment function. As used herein, “equipment” is a conventional device with a display or touchscreen. For example, a conventional MFP can be equipment. However, the invention is applicable to any conventional equipment that uses a display screen or a touchscreen.
For the sake of clarity, the processes of receiving TMI region selections and mapping the selections to recordings have been described as separate functions occurring in different DI modules. However, it should be understood that these processes may be performed in a single DI processing module (DIPM), which may include a microprocessor and a memory containing the maps and program software instructions. The TMI, DIPM, and map module are not limited to any particular means of performing their functions.
For example, a user may run their finger or pointer over the TMI 204, listening for the recording of the icon they are seeking. This search process has been referred to above as a TMI region selection. Once the user has identified an opening associated with the icon they are seeking, a pointer is inserted into the hole. The hole and pointer are sized so that the pointer engages the underlying screen icon. That is, the desired touchscreen icon is touched by the pointer.
The acknowledgement signal may be generated by a stylus (as explained below), or by the TMI 204. For example, the TMI 204 may further comprise a hole sensor 510 associated with each opening 502, to generate an acknowledgement signal in response to sensing a pointer in the opening.
In another simple aspect, the TI 202 and TMI 204 are a thin membrane on standoffs that includes sensing circuitry. The sensing circuitry detects TMI region selections by sensing the user's finger position on the membrane. Once the user hears an audible indication that their finger is over a desired icon, the user merely presses down on the membrane. The membrane is thin enough to directly transfer the pressure to the underlying touchscreen. In some aspects, the generation of sufficient pressure on the TMI 204 generates an acknowledgement signal, indicating that the user has pressed a particular icon. The membrane technology is conventional and is used for keypads and cash register data entry, to name a couple of examples.
In other forms of the DI not shown, the TI 202 generates mechanical pressures or electrical signals that cause the touchscreen icons to be engaged. Although a DI that generates mechanical pressure may be more complicated, or a DI that generates electrical signals may require some equipment modifications (e.g., an electrical jack built into the equipment to accept an electrical connection from the TI 202), this aspect of the invention permits the TMI regions to be decoupled from the underlying touchscreen icons. This aspect may be advantageous if the touchscreen icons are all grouped together in a small region of the screen. By decoupling the 1:1 relationship between TMI regions and touchscreen icons, accidental selections can be minimized. This aspect can also be used to present a common TMI region to the user, regardless of the manufacturer or model of the equipment being used.
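One way to picture this decoupling is a per-model translation table: the user always works with the same TMI layout, and a table converts each region into the touchscreen point that the display interface actually engages. The model names and coordinates below are invented for illustration only.

    # Common TMI region -> touchscreen point to engage, per equipment model
    # (all names and coordinates are hypothetical).
    REGION_TO_TOUCH = {
        "model_A": {(0, 0): (40, 310), (0, 1): (120, 310), (1, 0): (40, 250)},
        "model_B": {(0, 0): (25, 180), (0, 1): (95, 180), (1, 0): (25, 120)},
    }

    def touch_point(model, region):
        """Translate a standard TMI region into the (x, y) point to press on this model."""
        return REGION_TO_TOUCH[model][region]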
Alternately, if the first device 600 has no speaker, the audio signal can be communicated back to the second device 604, which may include an audio signal generator 210 (not shown). In this case receiver 602 and transmitter 606 are transceivers.
Returning to
In another aspect, the stylus 700B includes a switch 708, as well as a transmitter 702, to send an acknowledgement 212 of a touchscreen icon engagement in response to initially selecting a tactile matrix region 304 and engaging the switch 708. This aspect assumes that the stylus 700 (or some other device) has been used to touch a touchscreen icon. Alternately, the stylus transmitter 702 can send an acknowledgement of a touchscreen icon engagement in response to being inserted into a tactile matrix interface opening. For example, stylus 700C includes a sensor 706 that can be used to determine that the stylus has been inserted to a sufficient depth to engage the underlying touchscreen. If the stylus 700 uses a wireless transmitter 702 for generating either an acknowledgement or region select signal, then the map module would also include a wireless receiver accepting these signals from the stylus 700.
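As a rough sketch of the stylus signaling, assuming an invented depth threshold standing in for the role of sensor 706:

    from enum import Enum, auto

    class StylusEvent(Enum):
        REGION_SELECT = auto()   # stylus held over a tactile matrix region
        ACKNOWLEDGE = auto()     # switch 708 pressed, or full insertion sensed

    def stylus_event(switch_pressed: bool,
                     insertion_depth_mm: float,
                     engage_depth_mm: float = 8.0) -> StylusEvent:
        """Decide what the stylus transmitter should send to the map module.

        engage_depth_mm is an invented threshold representing the depth at which
        the stylus is deemed to have engaged the underlying touchscreen icon.
        """
        if switch_pressed or insertion_depth_mm >= engage_depth_mm:
            return StylusEvent.ACKNOWLEDGE
        return StylusEvent.REGION_SELECT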
Functional Description
The TMI can be connected either by wire or wirelessly to the map module, which provides tracking information. The map module uses the tracking information to identify any function to which the user is currently pointing, and sends an audio description of that function back to the user. When the user is notified that their finger is over the correct icon, they push the stylus through a hole over that icon to select the function.
Circuitry on the TMI transmits the physical location of the user's pointer to the map module, as it moves across the TMI surface. In a simple aspect, the DI includes a grid of holes through which a pointer can be inserted to depress the desired icon on the equipment's touchscreen.
A better solution is to provide a mechanism that helps the user avoid activating the wrong icon. Two approaches are presented as examples. The first approach is to design the holes in the DI such that the user inserts the stylus to a depth sufficient to provide an audio confirmation of the hole selected. Once the user has heard the confirmation, the user can proceed with inserting the stylus far enough to engage the touchscreen. If the user is careless when inserting the stylus, this solution has the disadvantage of allowing the stylus to be inserted too far in the first phase, engaging the touchscreen without first obtaining the audio confirmation.
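The two-phase insertion can be summarized with two depth thresholds, one for confirmation and one for engagement; both values below are invented for this sketch.

    CONFIRM_DEPTH_MM = 3.0   # deep enough to identify the hole, not to touch the screen
    ENGAGE_DEPTH_MM = 8.0    # deep enough to press the underlying icon

    def insertion_state(depth_mm: float) -> str:
        """Classify the stylus depth during the two-phase insertion described above."""
        if depth_mm >= ENGAGE_DEPTH_MM:
            return "engaged"      # the icon has been pressed; acknowledge the engagement
        if depth_mm >= CONFIRM_DEPTH_MM:
            return "confirmed"    # play the audio confirmation for the selected hole
        return "idle"             # stylus not yet inserted far enough to confirm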
The problem of accidentally engaging the touchscreen before getting audio confirmation can be resolved in many ways. For example, the stylus can be constructed with a flexible flange, see
The map module is one of the main processing components of the display interpreter. The map module may function in two modes: database creation, and display interpretation and navigation. The map module contains the database of icons and screens.
Buttons on the DI may serve several functions. One button may signal that the user has positioned the DI and is about to start a scan of the icons. Another button may serve notice to the map module software that the stylus has been used to activate an icon on the equipment (acknowledgement signal). This button is used with the simple pointer, for example, whereas a stylus with a transmitter automatically notifies the map module whenever an icon is activated.
Buttons can also be used to support entering, deleting, and correcting entries in the icon database. Another button can be used to initiate the loading of new software for the map module.
The display interpreter has two basic modes, database creation and display interpretation. Database creation is typically accomplished by a sighted person. Information about each icon is stored as a separate entry in the database. Each icon is manually measured for height, width, and coordinates within the touchscreen display of the equipment. Note that this approach is acceptable because the resolution of the touchscreen cannot be altered (unlike traditional computer screens, in which changing the resolution can affect the location and dimensions of an icon). This information is entered into the database. Each database entry also has a screen name, an audio file describing the function, and a next action to be taken whenever the icon is activated. Optionally, the equipment itself can send the icon information directly, or provide it on media for downloading to the map module.
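A flat-file layout for these entries might look like the following sketch, in which the field names, units, and the CSV format are illustrative choices rather than requirements of the database-creation mode.

    import csv
    import os

    # One row per icon: manually measured geometry, the screen it belongs to,
    # the audio file describing its function, and the next action after activation.
    FIELDS = ["screen", "icon", "x", "y", "width", "height", "audio_file", "next_action"]

    def add_entry(path, entry):
        """Append one manually measured icon entry to the flat-file database."""
        new_file = not os.path.exists(path) or os.path.getsize(path) == 0
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if new_file:
                writer.writeheader()
            writer.writerow(entry)

    # Example entry for a hypothetical MFP model's "Copy" icon on its main screen.
    add_entry("mfp_model_x.csv", {
        "screen": "main", "icon": "Copy", "x": 12, "y": 40, "width": 60, "height": 30,
        "audio_file": "copy.wav", "next_action": "load screen 'copy_options'",
    })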
The DI is set up and used as follows. The DI is placed on the touchscreen by orienting the standoff representing the origin in the correct corner of the recessed screen (the correct corner is a matter of implementation and can arbitrarily be any of the four corners). The map module is signaled that the DI is in place by depressing an appropriate key on the TMI.
The user places a finger on the TMI and begins to slowly traverse the surface in any direction. The TMI and map module track the finger's location in real-time, announcing each icon as the finger enters the icon's bounding box. The user stops moving their finger when the correct icon is found. The user inserts a pointer or stylus into the hole under their finger and touches the icon on the equipment touchscreen. If a pointer is used, the user presses an action key on the TMI signaling that an icon was chosen. If a transmitting stylus is used, then this action is unnecessary because the signal is automatically sent to the map module. In another variation, either the TMI or TI senses the insertion of a pointer through a hole and automatically sends an acknowledgement signal to the map module.
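The traversal can be reduced to announcing an icon only when the finger first enters its bounding box, as in the following sketch; the sample coordinates, hit-test helper, and icon names are invented for illustration.

    def announce_on_entry(position_stream, hit_test, play):
        """Announce each icon once, when the finger first enters its bounding box.

        position_stream yields (x, y) finger samples from the TMI, hit_test maps a
        sample to an icon name (or None between icons), and play is the audio callback.
        """
        current = None
        for x, y in position_stream:
            icon = hit_test(x, y)
            if icon is not None and icon != current:
                play(icon)               # announce only on entering a new bounding box
            current = icon

    # Example with invented data: the finger crosses from "Copy" into "Fax".
    boxes = {"Copy": (10, 40, 50, 30), "Fax": (65, 40, 50, 30)}   # x, y, width, height

    def hit(x, y):
        for name, (bx, by, w, h) in boxes.items():
            if bx <= x <= bx + w and by <= y <= by + h:
                return name
        return None

    announce_on_entry([(15, 45), (20, 45), (70, 45), (75, 45)], hit, print)  # "Copy", "Fax"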
A voice message identifies the next action to be taken, e.g., that a new screen has appeared, or identifies the next set of icons that follow the action just taken. The user continues to traverse the TMI until all options have been selected for this task. The use of the DI/equipment touchscreen may coincide with the use of a keypad on the equipment that is marked in Braille, marked with some other tactile identifiers, or unmarked but whose positions have been memorized by the user.
Step 1702 locates a display interpreter (DI) with a tactile matrix of sensors overlying a display screen. Means of locating have been described above (e.g., the use of standoffs and corner locators). Step 1704 loads a first reference map, cross-referencing DI tactile matrix regions with a first field of screen icons. Step 1706 loads a second reference map, cross-referencing screen icons with audible recordings identifying the screen icons. Step 1708, in response to sensing a proximate pointer, accepts a tactile matrix region selection. Step 1710 maps a screen icon to the selected tactile matrix region. Step 1712 audibly identifies the mapped screen icon. More specifically, Step 1712 may include the substeps of: using the second reference map, cross-referencing the located screen icon to the audible recording identifying the screen icon; and, playing the recording.
In some aspects, Step 1702 locates a DI overlying a touchscreen with touchscreen icons. Then, additional steps may occur. Step 1714 engages a touchscreen icon. Step 1716 acknowledges a touchscreen icon engagement. Step 1718, following the acknowledgement of the touchscreen icon engagement, loads a third reference map, cross-referencing tactile matrix regions with a second field of touchscreen icons, and loads a fourth reference map, cross-referencing touchscreen icons with audible recordings identifying the touchscreen icons.
In one aspect, accepting the tactile matrix region selection in Step 1708 includes monitoring a tactile matrix region sensor for a proximately located pointer. Then, mapping the screen icon to the selected tactile matrix region (Step 1710) comprises, in response to sensing a proximately located pointer, cross-referencing the selected tactile matrix region to locate the screen icon, using the first reference map.
Alternately, accepting the tactile matrix region selection in Step 1708 includes the substeps of: accepting a stylus with a transmitter, located proximate to a tactile matrix region; and, generating a region select signal from the stylus in response to locating the stylus proximate to the tactile matrix region.
In another aspect, acknowledging the touchscreen icon engagement in Step 1716 includes the substeps of: defining an opening through the DI associated with the mapped touchscreen icon; and, accepting a pointer in the defined opening. Then, engaging the touchscreen icon in Step 1714 includes directing the pointer to the touchscreen icon through the defined opening. In this aspect it is typical that Step 1702 locates a DI with a matrix of openings interposed between tactile matrix regions and cross-referenced touchscreen icons.
In a different aspect, Step 1702 locates a DI with a communication medium transmitter communicating the tactile matrix region selections. Then, loading the first and second reference maps in Steps 1704 and 1706, respectively, includes loading the reference maps into a DI map module, embedded in a device communicating with the DI via the communication medium.
In one aspect, acknowledging the touchscreen icon engagement in Step 1716 includes the substeps of accepting a stylus with a transmitter, into a DI opening; and, generating an acknowledgement signal from the stylus in response to directing the stylus through the defined opening.
In another aspect, acknowledging the touchscreen icon engagement in Step 1716 includes the substeps of: associating a sensor with each opening through the DI; accepting a pointer in the defined opening; and, in response to sensing the pointer in the defined opening, generating an acknowledgement signal from the hole sensor.
A display interpreter system and method have been presented that permit a visually impaired person to access functions presented in equipment displays, without performing modifications to the equipment. Examples have been given of particular user interfaces and display interfaces. Other examples have been given of particular mapping processes and device usage. However, the invention is not limited to just these examples. Although examples have primarily been provided in the context of MFP equipment, the invention is applicable to almost any equipment that uses a display screen or touchscreen. Other variations and embodiments of the invention will occur to those skilled in the art.