The current invention is generally related to a user interface for operating various devices such as an information input device, an automatic transaction device, a ticket vending machine and an image output device, and more particularly related to a user interface based upon tactile sensation for specifying operations.
Multi-function peripherals (MFPs) perform a predetermined set of combined functions of a copier, a facsimile machine, a printer, a scanner and other office automation (OA) devices. To operate the sizable number of functions in an MFP, an input screen is widely used in addition to a keypad. The screen display shows an operational procedure in text and pictures and provides designated touch screen areas for inputting a user selection in response to the displayed operational procedure.
It is desired to improve the office environment for people with disabilities so that they can contribute to society equally with people without disabilities. In particular, Section 508 of the Rehabilitation Act became effective on Jun. 21, 2001 in the United States, and the federal government is now required by law to purchase information technology devices that are usable by people with disabilities. State governments, related facilities and even the private sector appear to be following the same movement.
Despite the above described movement, the operation of the MFP is becoming more and more sophisticated. Without the instructions displayed on a display screen or a touch panel, it has become difficult to operate an MFP correctly. Because the operation depends upon these displayed instructions, it has become impractical for the visually impaired. For example, when a visually impaired person operates an MFP, since he or she cannot visually confirm a designated touch area on the screen, the operation is generally difficult. For this reason, the visually impaired must memorize a certain operational procedure as well as the touch input areas on the screen. Unfortunately, even if a visually impaired person memorizes the procedure and the input areas, when the operational procedure or the input areas are later changed due to updates or improvements, the memorized information becomes invalid.
One prior art approach addressed the above described problem by providing audio information in place of the visual information when an MFP is notified of use by a visually impaired person. The visually impaired person notifies the MFP by inserting an ID card indicative of his or her visual disability or by plugging an earphone into the MFP. The audio information is provided by a voice generation device. Alternatively, tactile information is provided by a Braille output device.
An automatic teller machine (ATM) is also equipped with a device to recognize a visually impaired person when either a predetermined IC card or a certain earphone is inserted into the ATM. When the ATM recognizes that a visually impaired person is operating it, the instructions for withdrawing money from or depositing money into his or her own account are provided in Braille or audio. Input is made through a keyboard with Braille on its surface.
Unfortunately, given the ratio of the disabled population to the general population, the extra costs associated with the above described additional features are prohibitive, and not every user machine can be equipped with them. Furthermore, if a mixture of ATMs with and without the accessibility features exists, users will probably be confused.
Japanese Patent Publication Hei 11-110107 discloses an information input device that includes a transparent touch panel over the display screen of a display device. A part of the touch panel is devoted to a screen search start button that changes the operation mode to a screen search mode. In the screen search mode, the user interface outputs through a speaker a corresponding voice message describing an operation button on the touch panel. The above voice user interface enables the visually impaired to operate the same operational panel that is commonly used by operators without any visual impairment. On the other hand, it is necessary for the visually impaired to switch to the screen search mode and to explore the entire touch panel with a finger. Because of the above tactile operation, additional handling is required.
For the above reasons, it remains desirable to provide an operational device that allows the visually impaired to specify various operations through a touch panel.
In order to solve the above and other problems, according to a first aspect of the current invention, a method of user interfacing a visually impaired user with a multifunction device, including the steps of: assigning a predetermined function to a predetermined surface area of a touch panel; placing a template over the touch panel, a partial template area of the template corresponding to the predetermined surface area of the touch panel, the partial template area providing a non-visual cue for identification; inputting an inquiry about the partial template area based upon the non-visual cue; outputting a voice message about the partial template area in response to the inquiry; and selecting the predetermined function by making a contact ultimately with the predetermined surface area.
According to a second aspect of the current invention, a method of user interfacing a visually impaired user with a multifunction device, including the steps of: dividing a touch panel into predetermined surface areas that resemble a piano keyboard; assigning a predetermined function to each of the predetermined surface areas; touching one of the predetermined surface areas in a first predetermined manner indicative of an inquiry; outputting a sound output about the one of the surface areas in response to the inquiry; and touching one of the predetermined surface areas in a second predetermined manner to select the predetermined function.
According to a third aspect of the current invention, a user interface system for facilitating a visually impaired operator to use a multi-function device, including: a touch input unit for non-visually indicating predetermined functions and for receiving a tactile input; a control unit connected to the touch input unit for determining a control signal based upon the tactile input; and a sound generating unit connected to the control unit for outputting a sound in response to the control signal.
These and various other advantages and features of novelty which characterize the invention are pointed out with particularity in the claims annexed hereto and forming a part hereof. However, for a better understanding of the invention, its advantages, and the objects obtained by its use, reference should be made to the drawings which form a further part hereof, and to the accompanying descriptive matter, in which there is illustrated and described a preferred embodiment of the invention.
The current application incorporates by reference the entire disclosure of the corresponding foreign priority document (JPAP2001-254779) from which the current application claims priority.
In general, function areas are provided on an operational panel to operate a device, and a particular function is selected by touching the touch panel over the operational panel. In a first preferred embodiment, a template is placed over the operational panel to indicate the corresponding functions of the touch panel so that the visually impaired can also identify a relevant function area. In the following, one of three templates is used to operate a device according to the current invention.
Referring now to the drawings, wherein like reference numerals designate corresponding structures throughout the views.
The above described templates each have a unique template number for each operational display. Similarly, the operational displays each have a unique operation number that matches the unique template number. The uniquely identified templates are stored in a storage area that resembles a juke box. In response to a selected unique number, if the selected template is not yet placed on the touch panel, the selected template is taken out of the juke box and placed over the touch panel of the operational unit. The visually impaired user touches the template and inputs an identified number via the keypad as an inquiry. In response, a voice message is outputted to provide the corresponding function area name and a helpful description for the inputted number. Based upon the voice message, the user selects a desired function by touching the touch panel surface through the corresponding seal area on the third template that indicates the number.
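To make the number-based inquiry concrete, the following is a minimal Python sketch of how a (template number, keypad number) pair could map to the spoken function area name and description; the table contents and the names VOICE_MESSAGES and announce are hypothetical and not part of the disclosed device.

```python
# Minimal sketch of the template/keypad inquiry lookup described above.
# The message table and names are illustrative assumptions only.
VOICE_MESSAGES = {
    # (template number, keypad number) -> (function area name, helpful description)
    (1, 1): ("Copy", "Starts a copy job with the current settings."),
    (1, 2): ("Facsimile", "Sends the scanned document as a fax."),
    (2, 1): ("Duplex", "Prints on both sides of the paper."),
}

def announce(template_number: int, keypad_number: int) -> str:
    """Return the voice message for the number the user read off the template."""
    name, description = VOICE_MESSAGES.get(
        (template_number, keypad_number),
        ("Unknown", "No function is assigned to this number."),
    )
    return f"{name}. {description}"

print(announce(1, 2))  # -> "Facsimile. Sends the scanned document as a fax."
```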
In the following description, the term user is used interchangeably with the visually impaired user. Furthermore, the process mode for visually normal users will be called the visual operation mode, while the process mode for visually impaired users will be called the non-visual operation mode.
After the appropriate template is placed, the user touches the template to identify the numbers or the Braille expressions by tactile sensation in a step S40. It is also determined in the step S40 whether or not the user enters the identified number via the keypad 50. If the number has been entered in the step S40, the keypad 50 sends the inputted number and the current template identification number to the voice output unit 60 in a step S50. The voice output unit 60 in turn searches among the voice data files based upon the inputted number and the current template identification number and retrieves a matching voice data file in the step S50. Furthermore, the voice output unit 60 plays the voice data, including the function name and helpful information, in the step S50. If the above information is stored in a text data format, a voice synthesis process generates the voice data. The user listens to the above voice message through the headset and determines whether or not the function is desirable. It is determined in a step S60 whether or not the desired function has been specified through the touch panel. If no function is desirable, the preferred process returns to the step S40 for additional information on other functions. On the other hand, if the user determines that the described function is desirable, she touches the corresponding function area on the touch screen of the touch input unit 70. After the user selects a particular function, the touch input unit 70 executes the selected function in a step S70.
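As a rough, non-authoritative sketch of the loop formed by the steps S40 through S70, the following Python fragment models the inquiry via the keypad and the subsequent selection on the touch panel; the Event type, the run_session helper and the sample data are assumptions chosen for illustration.

```python
# Rough sketch of the inquiry/selection loop (steps S40 through S70); the
# event model and the helper names are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Event:
    kind: str    # "keypad" for an inquiry, "touch" for a selection
    value: int   # inquired keypad number or touched function-area number

def run_session(events, template_number, voice_files, functions):
    for event in events:
        if event.kind == "keypad":
            # Step S50: retrieve and play the voice data that matches the
            # inputted number and the current template identification number.
            message = voice_files.get(
                (template_number, event.value),
                "No function is assigned to this number.",
            )
            print(f"[voice] {message}")
        elif event.kind == "touch":
            # Steps S60/S70: the user touched a desired function area,
            # so the corresponding function is executed.
            action = functions.get(event.value)
            if action is not None:
                action()

run_session(
    events=[Event("keypad", 1), Event("touch", 1)],
    template_number=1,
    voice_files={(1, 1): "Copy. Starts a copy job with the current settings."},
    functions={1: lambda: print("[exec] copy job started")},
)
```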
Finally, after the execution of the selected function, it is checked in a step S80 whether the current operation is running in the visual or the non-visual mode. For the non-visual mode, it is further determined in a step S110 whether or not a new operation display screen should be provided as a result of the above function execution in the step S70. If so, in a step S120, the control unit 10 replaces the current screen with the new operation display screen, and the template control unit 30 fetches the corresponding template from the juke box storage and places it on the operational panel. If it is determined in the step S110 that no new display is needed, the preferred process returns to the step S40.
The user interface device using the above described template includes the units and components as shown in the corresponding drawing.
In addition to the above described virtual keyboard arrangement, certain special positions are used to facilitate the identification of a position on the touch panel. For example, these special positions include the four corners or central positions. Special functions are associated with these special locations. Exemplary special functions include clearing the settings and jumping to the top layer. The special function keys are placed in the upper portion while the operational function keys are placed in the lower portion where the virtual keyboard keys are placed. The above described arrangement is used to standardize the key arrangement. Finger movements on the virtual keyboard generally involve right-left movements for selecting a function. A vertical movement, on the other hand, is easily distinguished from the above horizontal movement. When a vertical movement exceeds a predetermined speed value, a certain specific function is executed. For example, the above vertical movement causes a “go back” function to be executed, which returns from the current operation screen to the previous operation screen. The above arrangement increases the flexibility in the operation of the system. Similarly, certain predetermined functions are selected for execution when predetermined finger movements in certain shapes are detected over the piano keyboard. For example, when the finger is moved in a circular, triangular or crossing fashion over the piano keyboard, the corresponding function is executed.
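A minimal sketch of distinguishing the ordinary right-left browsing strokes from the fast vertical “go back” stroke might look as follows; the speed threshold and the classification labels are hypothetical values, not figures from the current disclosure.

```python
# Illustrative sketch distinguishing ordinary right-left browsing strokes
# from the fast vertical "go back" stroke; the threshold is hypothetical.
GO_BACK_SPEED = 300.0  # pixels per second; an assumed threshold value

def classify_stroke(x0: float, y0: float, t0: float,
                    x1: float, y1: float, t1: float) -> str:
    """Classify a stroke from its start and end positions and timestamps."""
    dx, dy, dt = x1 - x0, y1 - y0, max(t1 - t0, 1e-6)
    if abs(dy) > abs(dx) and abs(dy) / dt > GO_BACK_SPEED:
        return "go_back"   # fast vertical movement returns to the previous screen
    return "browse"        # ordinary horizontal movement over the virtual keys

print(classify_stroke(100, 400, 0.0, 110, 80, 0.5))  # -> "go_back"
```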
As the touch panel is used directly, movement from one end of the touch panel to the other is accomplished more quickly than with the above described templates. By placing the functions along the edges of the touch panel, it is easier to determine the relative current position based upon tactile sensation. Furthermore, when a finger tip stays in a function area on the touch panel for a predetermined amount of time, the user interface device provides the voice message help for the corresponding function. After the finger is released from the function area, if the finger touches the same function area again and releases within a predetermined amount of time, the corresponding function is selected. The above described touch procedure eliminates the use of the keypad for obtaining the voice message. Alternatively, a certain key on the keypad is predetermined for executing a function that is specified on the touch panel. One smooth operation is that a function is specified by touching the corresponding function area with one hand while the specified function is selected by pressing the predetermined key with the other hand.
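The dwell-then-tap procedure described above can be sketched as a small state machine; the timing constants HELP_DWELL and SELECT_WINDOW, as well as the class layout, are assumptions chosen for illustration.

```python
# Sketch of the dwell-then-tap selection rule described above; the timing
# constants and the class layout are assumptions, not values from the patent.
import time

HELP_DWELL = 1.0     # seconds the finger must rest before help is spoken
SELECT_WINDOW = 0.5  # seconds allowed for the confirming touch-and-release

class FunctionArea:
    def __init__(self, name: str):
        self.name = name
        self._help_spoken_at = None

    def on_dwell(self, duration: float) -> None:
        """Finger has rested inside this area for `duration` seconds."""
        if duration >= HELP_DWELL and self._help_spoken_at is None:
            self._help_spoken_at = time.monotonic()
            print(f"[voice] Help message for {self.name}")

    def on_tap(self) -> None:
        """Finger touched and released the same area again."""
        if (self._help_spoken_at is not None
                and time.monotonic() - self._help_spoken_at <= SELECT_WINDOW):
            print(f"[exec] {self.name} selected")

area = FunctionArea("Duplex copy")
area.on_dwell(1.2)  # resting finger triggers the spoken help
area.on_tap()       # a quick confirming tap then selects the function
```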
The operation method will now be described for the touch panel having operation functions. When a finger touches one key of the above described virtual piano keyboard, a corresponding sound icon such as a piano sound is outputted. The sound icon is relatively short and corresponds to the position of the key in the keyboard. The corresponding information is also provided by a voice message, and the information includes the function name and the function description. The above sound icon is generated in stereo by varying the right and left channels, and the stereo sound corresponds to the currently touched position on the touch panel whose virtual piano keys have been assigned special functions. For example, a louder sound is outputted by the left speaker than the right speaker when a function on the left side of the touch panel is touched. By the same token, a louder sound is outputted by the right speaker than the left speaker when a function on the right side of the touch panel is touched. Thus, the identification of the current position is facilitated by the sound icon.
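A minimal sketch of the stereo behavior, assuming a simple linear panning law (the disclosure does not specify one), derives the left and right channel gains from the horizontal touch position.

```python
# Minimal sketch of panning the sound icon by touch position, assuming a
# simple linear panning law (the panning law itself is an assumption).
def stereo_gains(touch_x: float, panel_width: float):
    """Return (left, right) gains so keys on the left sound louder on the left."""
    pan = touch_x / panel_width   # 0.0 = left edge, 1.0 = right edge
    return (1.0 - pan, pan)

left, right = stereo_gains(touch_x=120.0, panel_width=800.0)
print(f"left gain {left:.2f}, right gain {right:.2f}")  # left channel louder
```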
Certain functions are temporarily disabled for selection due to a combination of selected items. The pitch of the sound icon for a disabled function remains the same, but the tone of the sound icon and the quality of the voice message are modified to clearly indicate the temporarily disabled state to the user. Furthermore, the generation of the sound icon and the voice message is immediately interrupted upon detecting a change in the currently touched piano key on the touch panel. The sound icon and the voice message are then generated for the newly touched piano key. According to the above responsive sound management, since the user does not have to listen to the end of a message after touching a new key, the operation feels smooth to the user. After the user selects a function and the corresponding operation screen evolves, the function assignment on the virtual piano keyboard of the touch panel also changes.
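The interrupt-and-resume behavior can be sketched as follows; the Player class stands in for a real audio back end, and the “muted” tone marker for disabled functions is an illustrative assumption.

```python
# Sketch of interrupting the current announcement when a new key is touched;
# the Player class stands in for a real audio back end.
class Player:
    def __init__(self):
        self.current_key = None

    def on_key_touched(self, key: int, disabled: bool = False) -> None:
        if key != self.current_key:
            self.stop()  # cut off the running sound icon and voice message
            self.current_key = key
            # Same pitch for a disabled key, but an altered tone marks the
            # temporarily disabled state (an illustrative assumption).
            tone = "muted" if disabled else "normal"
            print(f"[audio] key {key}: sound icon ({tone}) and voice message")

    def stop(self) -> None:
        if self.current_key is not None:
            print(f"[audio] key {self.current_key}: playback interrupted")

player = Player()
player.on_key_touched(3)
player.on_key_touched(4, disabled=True)  # key 3 message stops immediately
```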
The control unit 10 performs various initialization steps. The control unit 10 also controls the entire user interface device as well as the user specified information. The determination unit 20 determines whether or not the current user is visually impaired in response to the control unit 10. The determination unit 20 makes the above determination based upon information from the control unit 10. The above information is generated when a headset including a headphone and a microphone is plugged into the user interface device. The information is also generated in response to a certain predetermined key or a non-contact IC card. The non-contact IC card contains information identifying the visually impaired user or the historical operational record of a particular individual. The touch input unit 70 includes the touch panel and determines an area in the virtual keyboard based upon the user finger position and the touch duration. The touch input unit 70 outputs the corresponding function number and the operation screen number to the voice output unit 60. The voice output unit 60 retrieves the voice information from the voice data file based upon the function number and the operation screen number. Furthermore, the voice output unit 60 plays the retrieved voice data. As described before, the voice data includes the function name and helpful information that correspond to the user inputted number. If the above information is stored in a text data format, a voice synthesis process generates voice data for output. After hearing the above voice guide information, if the user determines that the described operation is her desired function, she makes contact with the touch panel, and the touch input unit 70 executes the specified function in the above manner. Based upon the execution result, the control unit 10 displays a new operation display on the display unit 40, and a new operation screen number is assigned.
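A hedged sketch of how the determination unit 20 might decide between the visual and non-visual modes from the cues named above (headset insertion, a predetermined key, or a non-contact IC card); the function signature and the IC card field are assumptions for illustration.

```python
# Hedged sketch of the mode decision in the determination unit 20; the cue
# names and the IC card field are illustrative assumptions.
from typing import Optional

def select_mode(headset_inserted: bool,
                special_key_pressed: bool,
                ic_card: Optional[dict]) -> str:
    """Return "non-visual" when any cue indicates a visually impaired user."""
    if headset_inserted or special_key_pressed:
        return "non-visual"
    if ic_card is not None and ic_card.get("visually_impaired"):
        return "non-visual"
    return "visual"

print(select_mode(True, False, None))                          # -> non-visual
print(select_mode(False, False, {"visually_impaired": True}))  # -> non-visual
```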
The functions of the above described preferred embodiments are implemented in software programs that are stored on recording media such as a CD-ROM. The software on the CD is read by a CD drive into the memory of a computer or onto another storage medium. The recording media include semiconductor memory such as read only memory (ROM) and nonvolatile memory cards, optical media such as DVD, MO, MD or CD-R, and magnetic media such as magnetic tape and floppy disks. The above software implementation also accomplishes the purposes and objectives of the current invention. In the software implementation, the software program itself is a preferred embodiment. In addition, a recording medium that stores the software program is also considered a preferred embodiment.
The software implementation includes the execution of the program instructions as well as other routines such as operating system routines that are called by the software program to perform a part of or the entire process. In another preferred embodiment, the above described software program is loaded into a memory unit of a function expansion board or a function expansion unit. A CPU on the function expansion board or the function expansion unit executes the software program to perform a partial or entire process to implement the above described functions.
Furthermore, the above described software program may be stored in a storage device such as a magnetic disk of a computer server, and the software program is distributed by downloading to a user over the network. In this regard, the computer server is also considered to be a storage medium according to the current invention.
It is to be understood, however, that even though numerous characteristics and advantages of the present invention have been set forth in the foregoing description, together with details of the structure and function of the invention, the disclosure is illustrative only, and that although changes may be made in detail, especially in matters of shape, size and arrangement of parts, as well as implementation in software, hardware, or a combination of both, the changes are within the principles of the invention to the full extent indicated by the broad general meaning of the terms in which the appended claims are expressed.
Foreign Application Priority Data: Japanese Patent Application No. 2001-254779, filed August 2001 (national).

Related U.S. Application Data: parent application Ser. No. 10/226,926, filed August 2002; child application Ser. No. 11/746,942, filed May 2007.