This disclosure relates to electronic display devices, and more particularly, to user interface (UI) techniques for interacting with computing devices.
Electronic display devices such as tablets, eReaders, mobile phones, smart phones, personal digital assistants (PDAs), and other such touch screen electronic display devices are commonly used for displaying consumable content. The content may be, for example, an eBook, an online article or blog, images, a movie or video, or a map, to name a few types. Such display devices are also useful for displaying a user interface that allows a user to interact with an application running on the device. The textual content and/or screen controls may be spoken aloud to the user. The user interface may include, for example, one or more touch screen controls and/or one or more displayed labels that correspond to nearby hardware buttons. The touch screen display may be backlit or not, and may be implemented for instance with an LED screen or an electrophoretic display. Such devices may also include other touch sensitive surfaces, such as a track pad (e.g., capacitive or resistive touch sensor) or touch sensitive housing (e.g., acoustic sensor).
a-b illustrate an example electronic touch screen device having an accessible menu navigation mode configured in accordance with an embodiment of the present invention.
c-d illustrate example configuration screen shots of the user interface of the electronic touch screen device shown in
a illustrates a block diagram of an electronic touch screen device configured in accordance with an embodiment of the present invention.
b illustrates a block diagram of a communication system including the electronic touch screen device of
a-b show tables of two accessible menu navigation gesture sequences, in accordance with an embodiment of the present invention.
a-b collectively illustrate an example accessible menu navigation mode of an electronic touch screen device, in accordance with an embodiment of the present invention.
a-e collectively illustrate example menu navigation mode functions, in accordance with an embodiment of the present invention.
Techniques are disclosed for providing an accessible menu navigation mode in electronic computing devices. The device may be a touch screen mobile device, or any other device with a touch sensitive surface that can detect user gestures. The user can engage a manual reading mode by performing a manual reading mode activation gesture, wherein the manual mode allows the user to navigate through content, share content, adjust the reading rate, font, volume, or other device settings. The user may navigate through a menu structure using menu navigation gestures and the menu and sub-menu options may be read aloud to the user as they are navigated through. The main menu options may be navigated, for example, using vertical up or down swipe gestures, while sub-menu options within a given menu option may be navigated using horizontal swipe gestures. A selection gesture may allow the user to enable or adjust various menu and sub-menu options, and various earcons or sound effects may guide the navigation process and/or confirm a menu or sub-menu selection in some embodiments. The user may configure the navigation gestures and option selection gestures. The menu options may be structured to allow a user to access content navigation options with upward swipe gestures and access device settings options using downward swipe gestures.
General Overview
As previously explained, electronic display devices such as tablets, eReaders, and smart phones are commonly used for displaying user interfaces and consumable content. The user of such devices can typically consume the displayed content with relative ease. In some instances, users who are unable to view or read text on the screen may wish to navigate through the device's menu screens and/or select one or more application options. While some electronic devices may aurally present textual content to a user, offer printed instructions on a screen protector (e.g., using a braille-embossed screen cover), or offer a hunt-and-peck approach for navigating through menu options, an accessible menu navigation user interface as described herein may provide a more intuitive or otherwise positive user experience.
Thus, and in accordance with an embodiment of the present invention, accessible menu navigation techniques are disclosed for use in electronic touch screen devices. The techniques facilitate an accessible user interface that may be supported by multiple reading and navigation modes. In some embodiments, an accessible mode for an electronic device may read aloud the text of an eBook, an email message, or some other textual content that may be displayed on the device. In one such embodiment, the accessible mode has an automatic and a manual mode, wherein the manual mode allows the user to actively navigate and/or adjust device settings. The automatic mode, in some embodiments, does not allow manual navigation and settings adjustment, and a specific gesture or control feature activation may switch the device from automatic to manual mode. In one embodiment, the automatic reading mode facilitates an electronic device reading automatically and continuously from a predetermined point with a selected voice font, volume, and rate. The automatic mode may also, for example, play an earcon or audio cue upon passing a sentence, paragraph, page, chapter, or other boundary. In manual mode, however, the user may adjust the reading rate, font, volume, and other settings. In one embodiment, the entire display screen may be treated as a single button to facilitate transitioning from automatic into manual mode, or the mode transition may be performed using a physical switch or control feature. For example, a single tap on the display screen may transition the device from automatic mode to manual mode.
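By way of illustration only, the following plain-Java sketch models the full-screen single-tap transition described above; the class and method names are hypothetical and are not taken from any particular embodiment.

```java
// Minimal sketch of an automatic/manual reading-mode toggle.
// Class and method names are hypothetical, not part of this disclosure.
public class ReadingModeController {

    public enum Mode { AUTOMATIC, MANUAL }

    private Mode mode = Mode.AUTOMATIC;

    // Called by the touch layer when a single tap lands anywhere on the
    // display; in automatic mode the whole screen acts as one button.
    public void onSingleTap() {
        if (mode == Mode.AUTOMATIC) {
            mode = Mode.MANUAL;
            playEarcon("enter manual mode");     // audio cue confirming the switch
        }
    }

    // A physical switch or other control feature could call this to return
    // to continuous, automatic reading.
    public void onAutomaticModeRequested() {
        mode = Mode.AUTOMATIC;
        playEarcon("enter automatic mode");
    }

    public Mode currentMode() { return mode; }

    private void playEarcon(String name) {
        System.out.println("earcon: " + name);   // stand-in for an audio call
    }
}
```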
In the manual accessibility mode the user can perform, for example, the various navigation gestures and selection gestures described herein. A menu navigation gesture may include, for example, a swipe gesture up or down on the device screen. As used herein, a swipe gesture may include a sweeping or dragging gesture across at least a portion of the touch sensitive surface, whether directly contacting that surface or hovering over that surface (e.g., within a few centimeters or otherwise close enough to be detected by the touch sensitive surface). In some embodiments, the swipe gesture may be performed at a constant speed in a single direction, or may be an accelerated flick gesture. The gestures can be performed, for example, with the tip of a finger or a stylus, or any other suitable implement capable of providing a detectable swipe gesture. To facilitate detection of substantially horizontal and vertical swipe gestures (measured with reference to the bottom edge of the device's screen), any swipe gesture that falls within, for example, 45 degrees of the horizontal or vertical axis may be treated as a horizontal or vertical gesture, respectively.
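One way to apply such a 45-degree tolerance is to compare the horizontal and vertical displacement of the contact and let whichever axis dominates determine the direction, which is equivalent to a 45-degree cutoff. The following sketch takes that approach; the class name and the minimum-travel threshold are illustrative assumptions only.

```java
// Sketch of swipe classification using the 45-degree rule described above.
// Names and thresholds are illustrative only.
public final class SwipeClassifier {

    public enum Direction { UP, DOWN, LEFT, RIGHT, NONE }

    // Minimum travel (in pixels) before a movement counts as a swipe at all.
    private static final float MIN_TRAVEL = 40f;

    // (x0, y0) is the touch-down point and (x1, y1) the release point;
    // screen coordinates are assumed to grow downward, as on most displays.
    public static Direction classify(float x0, float y0, float x1, float y1) {
        float dx = x1 - x0;
        float dy = y1 - y0;
        if (Math.abs(dx) < MIN_TRAVEL && Math.abs(dy) < MIN_TRAVEL) {
            return Direction.NONE;                // too short to be a swipe
        }
        // Whichever axis dominates wins: a swipe within 45 degrees of
        // vertical classifies as vertical, and vice versa.
        if (Math.abs(dy) >= Math.abs(dx)) {
            return dy < 0 ? Direction.UP : Direction.DOWN;
        }
        return dx < 0 ? Direction.LEFT : Direction.RIGHT;
    }
}
```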
Once invoked, the accessible menu navigation mode may present an audio cue to the user (e.g., reading aloud “entering menu” or playing a distinctive sound effect), or the menu navigation mode may simply begin reading aloud the various menu options. In one embodiment, the user input gestures for the accessible menu navigation mode may be categorized into two types: navigation gestures and selection gestures. Navigation gestures include those that allow the user to navigate through the various menus and sub-menus, while selection gestures allow the user to select a menu or sub-menu option, adjust a settings option, or otherwise complete an action within the accessible menu navigation mode. Upward and/or downward swipe gestures may be configured to navigate through various menu levels, and the user may scroll through various sub-menu options within a menu level using sideways swipe gestures (either left or right), in some embodiments. Selecting a given option within a menu or sub-menu may be performed with a selection gesture that could include a swipe gesture, releasing a contact that was being held down, a double-tap gesture, or some other uniquely identifiable selection gesture. As will be apparent, different gesture configurations performed with one or more contact points may be used for any of the navigation or selection gestures described herein. In some embodiments, the accessible menu navigation mode may read aloud each menu option and an earcon or sound effect may be played upon accepting or cancelling a menu option, entering a manual or automatic mode, passing a boundary such as a sentence, page, paragraph, or chapter, or upon performing a certain function. An earcon may be, for example, a brief and distinctive sound or chime used to represent a specific action or event, convey other information, or prompt a user action.
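The split between navigation and selection gestures might be dispatched along the lines of the following sketch, which walks a two-level menu structure with vertical swipes for menu options, horizontal swipes for sub-menu options, and a double tap as the selection gesture. The menu contents, class, and method names are placeholders rather than the structure of any particular embodiment.

```java
import java.util.List;

// Sketch of a two-level accessible menu walker: vertical swipes move between
// menu options, horizontal swipes move between sub-menu options, and a double
// tap selects. Menu contents and names are placeholders.
public class AccessibleMenuNavigator {

    private final List<String> menus;
    private final List<List<String>> subMenus;   // one option list per menu
    private int menuIndex = 0;
    private int optionIndex = 0;

    public AccessibleMenuNavigator(List<String> menus, List<List<String>> subMenus) {
        this.menus = menus;
        this.subMenus = subMenus;
    }

    public void onVerticalSwipe(boolean up) {
        int size = menus.size();
        menuIndex = ((menuIndex + (up ? 1 : -1)) % size + size) % size;   // wrap around
        optionIndex = 0;
        speak(menus.get(menuIndex));             // read the menu option aloud
    }

    public void onHorizontalSwipe(boolean right) {
        List<String> options = subMenus.get(menuIndex);
        if (options.isEmpty()) return;           // this menu option has no sub-menu
        int size = options.size();
        optionIndex = ((optionIndex + (right ? 1 : -1)) % size + size) % size;
        speak(options.get(optionIndex));         // read the sub-menu option aloud
    }

    public void onDoubleTap() {
        List<String> options = subMenus.get(menuIndex);
        String chosen = options.isEmpty() ? menus.get(menuIndex) : options.get(optionIndex);
        speak("selected " + chosen);
        playEarcon();                            // confirm the selection with a sound
    }

    private void speak(String text) { System.out.println("TTS: " + text); }
    private void playEarcon() { System.out.println("earcon: select"); }
}
```

A caller might construct the navigator with, for example, "chapters", "bookmarks", and "font size" menus, each paired with its own option list, and route classified swipe directions to the corresponding methods.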
In addition to adjusting device settings and navigating through content, in some embodiments the accessible menu navigation mode allows for sharing or bookmarking of selected content. In such an embodiment, a user may set up sub-menus to define different communication means, such as Facebook®, email, etc., or to bookmark or highlight selected content. The user may also define a group of friends and create a customized subgroup, such as a reading club or church group, and then forward their selection or bookmark to that particular group using one or more communication means. In one embodiment, if no bookmark is currently on a page when the bookmark option is selected, a new bookmark is added to that page. If there is already a bookmark on the current page, however, selecting the bookmark option may delete the bookmark.
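The page-level bookmark behavior described above amounts to a toggle; a minimal sketch follows, with hypothetical names.

```java
import java.util.HashSet;
import java.util.Set;

// Minimal sketch of the bookmark toggle described above: selecting the
// bookmark option adds a bookmark if the current page has none and removes
// the existing one otherwise. Names are illustrative only.
public class BookmarkToggle {

    private final Set<Integer> bookmarkedPages = new HashSet<>();

    // Returns true if the page is bookmarked after the toggle.
    public boolean onBookmarkOptionSelected(int currentPage) {
        if (bookmarkedPages.remove(currentPage)) {
            return false;          // a bookmark was already there, so it is deleted
        }
        bookmarkedPages.add(currentPage);
        return true;               // no bookmark existed, so one is added
    }
}
```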
Given the global nature and/or uniqueness of the engagement mechanism, in accordance with some example embodiments, the accessible menu navigation UI can be similarly invoked within multiple diverse applications, for example, an eReader, Internet browser, picture viewer, file browser, or any other application containing multiple levels of data, menu options, or settings. In such embodiments, the accessible menu navigation UI may be invoked without conflicting with other global gestures that might also be used by the device's operating system. In other embodiments, the techniques described herein may be combined with drag-and-drop UI techniques or other UI techniques to aid in navigating and organizing content. Numerous uniquely identifiable engagement schemes that exploit a touch sensitive surface can be used as will be appreciated in light of this disclosure. Further note that any touch sensitive device (e.g., track pad, touch screen, or other touch sensitive surface, whether capacitive, resistive, acoustic or other touch detecting technology, regardless of whether a user is physically contacting the device or using some sort of implement, such as a stylus) may be used to detect the user contact, and the claimed invention is not intended to be limited to any particular type of touch sensitive technology, unless expressly stated. For ease of reference, user input is sometimes referred to as contact or user contact; however, direct and/or proximate contact (e.g., hovering within a few centimeters of the touch sensitive surface) can be used. In other words, in some embodiments, a user can operate the accessible menu navigation user interface without physically touching the touch sensitive device.
Architecture
a-b illustrate an example electronic touch sensitive device having an accessible menu navigation user interface configured in accordance with an embodiment of the present invention. As can be seen, in this example embodiment, the touch sensitive surface is a touch screen display. The device could be, for example, a tablet such as the NOOK® tablet or eReader by Barnes & Noble. In a more general sense, the device may be any electronic device having a touch sensitive user interface for detecting direct touch or otherwise sufficiently proximate contact, and capability for displaying content to a user, such as a mobile phone, a laptop or other mobile computing device, a desktop computing system, a television, a smart display screen, or any other device having a touch sensitive display or a non-touch-sensitive display screen that can be used in conjunction with a touch sensitive surface. As will be appreciated in light of this disclosure, the claimed invention is not intended to be limited to any specific kind or type of electronic device or form factor.
As can be seen with this example configuration, the device comprises a housing that includes a number of hardware features such as a power button, control features, and a press-button (sometimes called a home button herein). A user interface is also provided, which in this example embodiment includes a quick navigation menu having six main categories to choose from (Home, Library, Shop, Search, Light, and Settings) and a status bar that includes a number of icons (a night-light icon, a wireless network icon, and a book icon), a battery indicator, and a clock. In one embodiment, an accessible UI may aurally present to the user the various menu categories from which the user may select the desired menu with a touch screen gesture or by activating a control feature. Some embodiments may have fewer or additional such UI features, or different UI features altogether, depending on the target application of the device. Any such general UI controls and features can be implemented using any suitable conventional or custom technology, as will be appreciated.
The hardware control features provided on the device housing in this example embodiment are configured as elongated press-bars and can be used, for example, to page forward (using the top press-bar) or to page backward (using the bottom press-bar), such as might be useful in an eReader application. The power button can be used to turn the device on and off, and may be used in conjunction with a touch-based UI control feature that allows the user to confirm a given power transition action request (e.g., such as a slide bar or tap point graphic to turn power off). Numerous variations will be apparent, and the claimed invention is not intended to be limited to any particular set of hardware buttons or UI features, or device form factor.
In this example configuration, the home button is a physical press-button that can be used as follows: when the device is awake and in use, pressing the button will present to the user (either aurally or visually) the quick navigation menu, which is a toolbar that provides quick access to various features of the device. The home button may also be configured to cease an active function that is currently executing on the device (such as an accessible menu navigation mode), or close a configuration sub-menu that is currently open. The button may further control other functionality if, for example, the user presses and holds the home button. For instance, such a press-and-hold function could engage a power conservation routine in which the device is put to sleep or into an otherwise lower power consumption mode. A user could thus grab the device by the button and keep holding it as the device is stowed into a bag or purse; one physical gesture thereby safely puts the device to sleep. In such an example embodiment, the home button may be associated with and control different and unrelated actions: 1) present the quick navigation menu; 2) exit a configuration sub-menu; and 3) put the device to sleep. As can be further seen, the status bar may also include a book icon (upper left corner). In some cases, selecting the book icon may provide bibliographic information on the content or provide the main menu or table of contents for the book, movie, playlist, or other content.
In one particular embodiment, an accessible menu navigation configuration sub-menu, such as the one shown in
As will be appreciated, the various UI control features and sub-menus displayed to the user are implemented as touch screen controls in this example embodiment. Such UI screen controls can be programmed or otherwise configured using any number of conventional or custom technologies. In general, the touch screen display translates a touch (direct or hovering, by a user's hand, a stylus, or any other suitable implement) in a given location into an electrical signal which is then received and processed by the device's underlying operating system (OS) and circuitry (processor, display controller, etc.). In some instances, note that the user need not actually physically touch the touch sensitive device to perform an action. For example, the touch screen display may be configured to detect input based on a finger or stylus hovering over the touch sensitive surface (e.g., within 3 centimeters of the touch screen or otherwise sufficiently proximate to be detected by the touch sensing circuitry). Additional example details of the underlying OS and circuitry in accordance with some embodiments will be discussed in turn with reference to
The touch sensitive surface (or touch sensitive display, in this example case) can be any surface that is configured with touch detecting technologies, whether capacitive, resistive, acoustic, active-stylus, and/or other input detecting technology, including direct contact and/or proximate contact. In some embodiments, the screen display can be layered above input sensors, such as a capacitive sensor grid for passive touch-based input, such as with a finger or passive stylus contact in the case of a so-called in-plane switching (IPS) panel, or an electro-magnetic resonance (EMR) sensor grid for sensing a resonant circuit of a stylus. In some embodiments, the touch sensitive display can be configured with a purely capacitive sensor, while in other embodiments the touch screen display may be configured to provide a hybrid mode that allows for both capacitive input and EMR input, for example. In still other embodiments, the touch sensitive surface is configured with only an active stylus sensor. Numerous touch screen display configurations can be implemented using any number of known or proprietary screen based input detecting technologies. In any such embodiments, a touch sensitive controller may be configured to selectively scan the touch sensitive surface and/or selectively report user inputs detected directly on or otherwise sufficiently proximate to (e.g., within a few centimeters, or otherwise sufficiently close so as to allow detection) the detection surface (or touch sensitive display, in this example case).
As previously explained, and with further reference to
As can be further seen, a back button arrow UI control feature may be provisioned on the screen for any of the menus provided, so that the user can go back to the previous menu, if so desired. In other embodiments, a universal back screen gesture may be performed in order to return to the previous menu. Note that configuration settings provided by the user can be saved automatically (e.g., user input is saved as selections are made or otherwise provided). Alternatively, a save button or other such UI feature can be provisioned, or a save gesture performed, which the user can engage as desired. The configuration sub-menu shown in
a illustrates a block diagram of an electronic touch screen device configured in accordance with an embodiment of the present invention. As can be seen, this example device includes a processor, memory (e.g., RAM and/or ROM for processor workspace and storage), additional storage/memory (e.g., for content), a communications module, a touch screen, and an audio module. A communications bus and interconnect is also provided to allow inter-device communication. Other typical componentry and functionality not reflected in the block diagram will be apparent (e.g., battery, co-processor, etc.). The touch screen and underlying circuitry are capable of translating a user's contact (direct or proximate) with the touch screen into an electronic signal that can be manipulated or otherwise used to trigger a specific user interface action, such as those provided herein. The principles provided herein equally apply to any such touch sensitive devices. For ease of description, examples are provided with touch screen technology.
In this example embodiment, the memory includes a number of modules stored therein that can be accessed and executed by the processor (and/or a co-processor). The modules include an operating system (OS), a user interface (UI), and a power conservation routine (Power). The modules can be implemented, for example, in any suitable programming language (e.g., C, C++, Objective-C, JavaScript, custom or proprietary instruction sets, etc.), and encoded on a machine readable medium that, when executed by the processor (and/or co-processors), carries out the functionality of the device, including a UI having an accessible menu navigation mode as variously described herein. The computer readable medium may be, for example, a hard drive, compact disk, memory stick, server, or any suitable non-transitory computer/computing device memory that includes executable instructions, or a plurality or combination of such memories. Other embodiments can be implemented, for instance, with gate-level logic or an application-specific integrated circuit (ASIC) or chip set or other such purpose-built logic, or a microcontroller having input/output capability (e.g., inputs for receiving user inputs and outputs for directing other components) and a number of embedded routines for carrying out the device functionality. In short, the functional modules can be implemented in hardware, software, firmware, or a combination thereof.
The processor can be any suitable processor (e.g., Texas Instruments OMAP4, dual-core ARM Cortex-A9, 1.5 GHz), and may include one or more co-processors or controllers to assist in device control. In this example case, the processor receives input from the user, including input from or otherwise derived from the power button and the home button. The processor can also have a direct connection to a battery so that it can perform base level tasks even during sleep or low power modes. The memory (e.g., for processor workspace and executable file storage) can be any suitable type of memory and size (e.g., 256 or 512 Mbytes SDRAM), and in other embodiments may be implemented with non-volatile memory or a combination of non-volatile and volatile memory technologies. The storage (e.g., for storing consumable content and user files) can also be implemented with any suitable memory and size (e.g., 2 GBytes of flash memory). The display can be implemented, for example, with a 7 to 9 inch 1920×1280 IPS LCD touch screen, or any other suitable display and touch screen interface technology. The communications module can be, for instance, any suitable 802.11b/g/n WLAN chip or chip set, which allows for connection to a local network so that content can be exchanged between the device and a remote system (e.g., a content provider or repository, depending on the application of the device). In some specific example embodiments, the device housing that contains all the various componentry measures about 7″ to 9″ high by about 5″ to 6″ wide by about 0.5″ thick, and weighs about 7 to 8 ounces. Any number of suitable form factors can be used, depending on the target application (e.g., laptop, desktop, mobile phone, etc.). The device may be smaller, for example, for smartphone and tablet applications and larger for smart computer monitor and laptop and desktop computer applications.
The operating system (OS) module can be implemented with any suitable OS, but in some example embodiments is implemented with Google Android OS or Linux OS or Microsoft OS or Apple OS. As will be appreciated in light of this disclosure, the techniques provided herein can be implemented on any such platforms. The power management (Power) module can be configured as typically done, such as to automatically transition the device to a low power consumption or sleep mode after a period of non-use. A wake-up from that sleep mode can be achieved, for example, by a physical button press and/or a touch screen swipe or other action. The user interface (UI) module can be, for example, based on touchscreen technology and the various example screen shots and use-case scenarios shown in
Client-Server System
b illustrates a block diagram of a communication system configured in accordance with an embodiment of the present invention. As can be seen, the system generally includes an electronic touch sensitive device (such as the one in
Accessible Menu Navigation Mode Examples
a-b illustrate tables of two accessible menu navigation gesture sequences, in accordance with an embodiment of the present invention. In these examples, the accessible menu navigation mode is configured according to the menu structure shown in
As can be seen in
a-b collectively illustrate an example accessible menu navigation mode of an electronic touch screen device, in accordance with an embodiment of the present invention. This example gesture sequence shows how the accessible menu navigation mode might be implemented on the surface of a touch screen device operating in the manual mode. As can be seen, the device housing surrounds the touch screen of the device, and the user can interact with the touch screen with one or more fingers or any other suitable implement. The example in
a-e collectively illustrate example menu navigation mode functions, in accordance with an embodiment of the present invention.
b shows an example menu navigation function where the “go to page” menu option is selected (using a single finger double tap gesture), in accordance with one embodiment of the present invention. In this example, the user is presented with a phone-pad-style keypad, or other page entry window, that allows the input of page numbers. Each number selected may be added to the display window, and the selection process may be aurally presented and guided using audio cues or earcons, in some embodiments. The user has a delete button to correct mistakes and a “go” button to jump to the selected page. If the entry results in a non-existent page, an appropriate error message or earcon may be aurally presented and the keypad may remain visible with the text field cleared for further input.
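A minimal sketch of such page-entry handling is shown below; the class name, digit echo, and range check are illustrative assumptions rather than the behavior of any particular embodiment.

```java
// Sketch of the "go to page" entry check described above: a valid page number
// jumps to that page, while an out-of-range entry triggers an error cue and
// clears the field so the keypad can accept further input. Names are
// illustrative only.
public class GoToPageHandler {

    private final int pageCount;
    private final StringBuilder entry = new StringBuilder();

    public GoToPageHandler(int pageCount) { this.pageCount = pageCount; }

    public void onDigit(char digit) {
        entry.append(digit);
        speak(String.valueOf(digit));            // echo each digit aurally
    }

    public void onDelete() {
        if (entry.length() > 0) entry.setLength(entry.length() - 1);
    }

    // Returns the target page, or -1 if the entry named a non-existent page.
    public int onGo() {
        int page = -1;
        if (entry.length() > 0 && entry.length() <= 9) {   // guard against overflow
            page = Integer.parseInt(entry.toString());
        }
        if (page < 1 || page > pageCount) {
            speak("page " + entry + " does not exist");    // error message or earcon
            entry.setLength(0);                  // keep the keypad up, clear the field
            return -1;
        }
        return page;
    }

    private void speak(String text) { System.out.println("TTS: " + text); }
}
```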
c shows an example menu navigation function where the “spell word” menu option has been selected, in accordance with one embodiment of the present invention. In this example, the selected word is stated, spelled, and then re-stated, per normal spelling bee rules. In other embodiments, other spelling presentations may be implemented. In this example case, once the word is spelled the device remains in the spelling menu layer such that a subsequent selection gesture (e.g., a double tap gesture) will spell the word again. In some cases, the “spell word” option is used to spell a pre-selected word; however, if no word is currently selected, a word selection may be performed after the “spell word” function begins.
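A minimal sketch of the state-spell-restate presentation follows; the names are illustrative only.

```java
// Sketch of the spelling-bee style presentation described above: the word is
// stated, spelled letter by letter, then re-stated. Names are illustrative only.
public final class WordSpeller {

    public static void spell(String word) {
        speak(word);                              // state the word
        for (char letter : word.toCharArray()) {
            speak(String.valueOf(letter));        // spell it letter by letter
        }
        speak(word);                              // re-state the word
    }

    private static void speak(String text) { System.out.println("TTS: " + text); }
}
```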
d-e show an example menu navigation function for adding and deleting notes, in accordance with one embodiment of the present invention. In this example, when the “add note” menu option is selected (e.g., using a single finger double tap gesture), the device displays a text entry window that allows the user to input the content of a note. The user can save the note using a selection gesture, or exit the add note function using a dismiss gesture. In some cases, the user may be aurally prompted to confirm creating a note, e.g., “Double tap to confirm, single tap to return to text entry.” Once the note creation is confirmed, the text entry window may disappear and the device may return to reading content.
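The confirmation step might be modeled along the lines of the following sketch; the states, prompts, and names are hypothetical.

```java
// Sketch of the note-confirmation step described above: after a save gesture,
// a double tap confirms the note and closes the entry window, while a single
// tap returns to text entry. Names are illustrative only.
public class AddNoteFlow {

    public enum State { ENTERING_TEXT, AWAITING_CONFIRMATION, DONE }

    private State state = State.ENTERING_TEXT;
    private String noteText = "";

    public void onSaveGesture(String text) {
        noteText = text;
        state = State.AWAITING_CONFIRMATION;
        speak("Double tap to confirm, single tap to return to text entry.");
    }

    public void onDoubleTap() {
        if (state == State.AWAITING_CONFIRMATION) {
            state = State.DONE;                   // note is stored, window closes
            speak("Note added: " + noteText);
        }
    }

    public void onSingleTap() {
        if (state == State.AWAITING_CONFIRMATION) {
            state = State.ENTERING_TEXT;          // back to editing the note text
        }
    }

    public State currentState() { return state; }

    private void speak(String text) { System.out.println("TTS: " + text); }
}
```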
Methodology
As can be seen, the method generally includes sensing a user's input by a touch screen display. As soon as the user begins to swipe, drag or otherwise move a contact point, the UI code (and/or hardware) can assume a swipe gesture has been engaged and track the path of the contact point with respect to any fixed point within the touch screen until the user stops engaging the touch screen surface. The release point can also be captured by the UI as it may be used to commit the action started when the user pressed on the touch sensitive screen. In a similar fashion, if the user releases hold without moving the contact point, a tap or press or press-and-hold command may be assumed depending on the amount of time the user was continually pressing on the touch sensitive screen. These main detections can be used in various ways to implement UI functionality, including an accessible menu navigation mode as variously described herein, as will be appreciated in light of this disclosure.
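By way of illustration, the following sketch distinguishes a tap, a press-and-hold, and a swipe from a contact's travel and hold time; the thresholds and names are assumptions, not values prescribed by this disclosure.

```java
// Sketch of the basic contact interpretation described above: a released
// contact with little movement becomes a tap or a press-and-hold depending on
// how long it was held, while a moving contact is treated as a swipe.
// Thresholds and names are illustrative only.
public final class ContactInterpreter {

    public enum Kind { TAP, PRESS_AND_HOLD, SWIPE }

    private static final float MOVE_THRESHOLD_PX = 20f;     // movement allowed for a tap
    private static final long HOLD_THRESHOLD_MS = 500;      // hold time for press-and-hold

    public static Kind interpret(float downX, float downY, long downTimeMs,
                                 float upX, float upY, long upTimeMs) {
        float dx = Math.abs(upX - downX);
        float dy = Math.abs(upY - downY);
        if (dx > MOVE_THRESHOLD_PX || dy > MOVE_THRESHOLD_PX) {
            return Kind.SWIPE;                               // contact point moved
        }
        long heldFor = upTimeMs - downTimeMs;
        return heldFor >= HOLD_THRESHOLD_MS ? Kind.PRESS_AND_HOLD : Kind.TAP;
    }
}
```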
In this example case, the method includes detecting 701 a menu navigation gesture on the touch sensitive interface. As described above, the gesture contact may be performed in any suitable manner using a stylus, the user's finger, or any other suitable implement, and it may be performed on a touch screen surface, a track pad, acoustic sensor, or other touch sensitive surface. The user contact monitoring is essentially continuous. Once a user contact has been detected, the method may continue with reading 702 the menu option aloud to the user. In one embodiment, an upward swipe may prompt a group of navigation menu options to be read aloud, while a downward swipe may prompt a group of settings options to be read aloud. Other embodiments may read aloud all the menu options, either organized or listed at random, upon performing either an upward or downward swipe gesture. Some embodiments may scroll through the entire menu list after a single menu navigation gesture, with each menu option followed by an earcon or a slight pause, giving the user a chance to select that option. Other embodiments, like the one shown in this example method, may require a separate swipe gesture in order to scroll through each menu option.
The method may continue with determining 703 whether the menu option has one or more sub-menus. In one example, a navigation menu may include a “chapters” menu option and this option may include multiple sub-menu options for each chapter in a given book. If sub-menu options are available, the method may continue with determining 704 whether a sub-menu navigation gesture is detected. In one embodiment, if the menu navigation gesture is a vertical swipe gesture, the sub-menu navigation gesture may be a horizontal swipe gesture. If no sub-menu navigation gesture is detected, the method may return to monitoring 701 for another menu navigation gesture. If a sub-menu navigation gesture is detected, the method may read aloud 705 the sub-menu options. The method may continue with determining 706 whether a sub-menu selection gesture is detected, and if none is detected the method may continue with determining 704 whether another sub-menu navigation gesture is detected. If a sub-menu selection gesture is detected, the method may continue with selecting 707 the sub-menu option. In one specific example, a user has just opened a book with an eReader application and performs multiple menu navigation gestures (e.g., upward swipe gestures) in order to access the “chapter” menu. The user then performs three sub-menu navigation gestures (e.g., sideways swipe gestures to the right) in order to access the sub-menu option “chapter 3,” and then performs a sub-menu selection gesture (e.g., a double-tap gesture) in order to access the content of chapter 3.
If the menu option read aloud at 702 does not have sub-menu options, the method may continue with determining 708 whether the menu option selection gesture is detected. One such menu option may be the “spell word” menu option, because the option may either be selected or not, and no sub-menu options are available. Another such menu option may include an “adjust volume” or “adjust rate” menu option, wherein the menu option selection gesture includes a value adjustment gesture, such as a horizontal swipe gesture used to adjust the value for that setting or menu option. If a menu selection gesture is detected, the method may continue with performing 709 the menu option function. If no menu selection gesture is detected, the method may continue with abandoning 710 the menu navigation mode. Abandoning the menu navigation mode may occur if no contact is detected after a certain period of time. Furthermore, at any point during the accessible menu navigation mode the mode may be abandoned if the home button is pressed or if some other hard-coded or configurable abandon action is performed. As discussed above, the various menu navigation gestures and selection gestures may be hard-coded or configured by the user, and different gesture configurations performed with one or more contact points may be used for any of the navigation or selection gestures.
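The value-adjustment case mentioned above might be handled along the lines of the following sketch, in which a horizontal swipe nudges a bounded setting such as volume or reading rate; the names, range, and step size are illustrative assumptions.

```java
// Sketch of a value-adjustment selection gesture: for settings such as volume
// or reading rate, a horizontal swipe steps the value up or down within a
// fixed range and the new value is announced. Names are illustrative only.
public class SettingValueAdjuster {

    private final String settingName;
    private final int min, max, step;
    private int value;

    public SettingValueAdjuster(String settingName, int min, int max, int step, int initial) {
        this.settingName = settingName;
        this.min = min;
        this.max = max;
        this.step = step;
        this.value = initial;
    }

    // A rightward swipe raises the value, a leftward swipe lowers it.
    public int onHorizontalSwipe(boolean right) {
        value = Math.max(min, Math.min(max, value + (right ? step : -step)));
        speak(settingName + " " + value);        // announce the new value
        return value;
    }

    private void speak(String text) { System.out.println("TTS: " + text); }
}
```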
Numerous variations and embodiments will be apparent in light of this disclosure. One example embodiment of the present invention provides a device including a touch sensitive surface for allowing user input. The device also includes a user interface including an accessible menu navigation mode configured to navigate through menu and sub-menu options in response to one or more navigation gestures, wherein each menu and sub-menu option is aurally presented to the user in response to a corresponding navigation gesture, and wherein the accessible menu navigation mode is further configured to adjust and/or enable menu and sub-menu options in response to a selection gesture. In some cases, the accessible menu navigation mode is further configured to activate a manual reading mode in response to a manual mode activation gesture, wherein the manual reading mode allows the user to adjust and select navigation and settings options. In some cases, the accessible menu navigation mode is further configured to aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option. In some cases, the accessible menu navigation mode is configured to navigate through one or more menu options in response to one or more substantially vertical swipe gestures, and to perform at least one of: navigate through one or more sub-menu options, and/or change a value for a menu option in response to one or more substantially horizontal swipe gestures. In some cases, the selection gesture includes at least one of: a vertical swipe gesture, a horizontal swipe gesture, a double-tap gesture, and/or lifting a contact point from the touch sensitive surface. In some cases, the selection gesture is user configurable. In some cases, the accessible menu navigation mode is configured to navigate through one or more content navigation options in response to a first menu navigation gesture, and to navigate through one or more settings options in response to a second menu navigation gesture, the first and second menu navigation gestures having opposite orientations. In some cases, the menu and sub-menu options are user configurable. In some cases, one or more menu and/or sub-menu options allow the user to share selected content over the Internet. In some cases, one or more menu and/or sub-menu options allow the user to create a customized group of people and share selected content with that group. In some cases, the accessible menu navigation mode is configured to abandon if no user contact is detected at the touch sensitive surface after a specific period of time.
Another example embodiment of the present invention provides a mobile computing system including a processor and a touch sensitive surface for allowing user input, and a user interface executable on the processor and including an accessible menu navigation mode configured to navigate through menu and sub-menu options in response to one or more navigation gestures, wherein each menu and sub-menu option is aurally presented to the user in response to a corresponding navigation gesture, and wherein the accessible menu navigation mode is further configured to adjust and/or enable menu and sub-menu options in response to a selection gesture. In some cases, the accessible menu navigation mode is further configured to activate a manual reading mode in response to a manual mode activation gesture, wherein the manual reading mode allows the user to adjust and select navigation and settings options. In some cases, the accessible menu navigation mode is further configured to aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option. In some cases, the accessible menu navigation mode is configured to navigate through one or more menu options in response to one or more substantially vertical swipe gestures, and to perform at least one of: navigate through one or more sub-menu options, and/or change a value for a menu option in response to one or more substantially horizontal swipe gestures.
Another example embodiment of the present invention provides a computer program product including a plurality of instructions non-transiently encoded thereon to facilitate operation of an electronic device according to a process. The computer program product may include one or more computer readable mediums such as, for example, a hard drive, compact disk, memory stick, server, cache memory, register memory, random access memory, read only memory, flash memory, or any suitable non-transitory memory that is encoded with instructions that can be executed by one or more processors, or a plurality or combination of such memories. In this example embodiment, the process is configured to receive at a touch sensitive surface of the electronic device a menu navigation gesture; aurally present a menu option in response to the menu navigation gesture; receive at the touch sensitive surface a selection gesture; and adjust a navigation option and/or settings option in response to the selection gesture. In some cases, the process is further configured to receive at the touch sensitive surface a sub-menu navigation gesture; and aurally present a sub-menu option in response to the sub-menu navigation gesture. In some cases, the process is further configured to receive at the touch sensitive surface a menu option value adjustment gesture; and adjust a menu option value in response to the value adjustment gesture. In some cases, the process is further configured to activate a manual reading mode in response to a manual reading mode activation gesture detected at the touch sensitive surface, wherein the manual reading mode allows the user to adjust and select navigation and settings options. In some cases, the process is further configured to: aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option.
The foregoing description of the embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of this disclosure. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto.
This application claims the benefit of U.S. Provisional Application Nos. 61/674,098 and 61/674,102, both filed on Jul. 20, 2012, each of which is herein incorporated by reference in its entirety.