The present application relates generally to the field of graphical user interfaces and network communications. More specifically, the application relates to a system and method for a user interface for key-pad driven devices, such as mobile phones. The user interface may provide two simultaneous focus elements on a display screen, and each focus element can be controlled by a separate set of keys, for example.
Many technological innovations rely upon user interface design to lessen the technical complexity of a product. Technology alone may not win user acceptance and subsequent marketability; rather, a user's experience, or how the user experiences an end product, may be the key to acceptance. When applied to computer software, user interface design enables human-to-computer interaction.
In wireless communication devices, functions are primarily controlled by using a keyboard, and information is displayed to a user using a display. Some devices may be provided with particular browser keys, which are usually implemented as mechanical keys that can be pressed to select a following or preceding alternative. A user presses a key to select a desired control function, which is indicated by a written command or an illustrative symbol shown on the display in the vicinity of the key. A user typically interacts with controls or displays of a computer or computing device through a user interface.
In typical user interfaces for mobile phones, for example, a user has control of only one interface at any given time. For example, a user may initiate a client browser to load a web page, and thus, the user would only be able to use keys on the mobile phone to navigate within the web page. To navigate or utilize other functions on the mobile phone, the user would need to exit out of or close the client browser to enable selection of another application using the keys on the mobile phone. Thus, while any given interface application is running on the mobile phone, the keys on the mobile phone only operate to navigate within the one interface application.
In the present application, a mobile phone is provided that includes a computer-readable medium containing a set of instructions for causing a processing unit to perform the functions of displaying a main content area on a display screen of the mobile phone that includes a focus element, and at the same time as displaying the main content area on the display screen of the mobile phone, displaying a control content area on the display screen of the mobile phone that includes selectable icons. The functions further include providing a first input function for enabling movement between and action upon the focus element contained in the main content area while not affecting movement between or action upon the selectable icons contained in the control content area, and providing a second input function for enabling movement between and action upon the selectable icons contained in the control content area while not affecting movement between or action upon the focus element contained in the main content area.
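The independence of the two input functions described above can be sketched in code. The following is a minimal illustrative sketch only; the class and method names are hypothetical and do not appear in the application itself.

```python
# Illustrative sketch: two content areas, each driven by its own input
# function, where neither input affects the other area's focus.
# All names here are hypothetical, not from the application.

class DualFocusInterface:
    """Models a display with a main content area and a control content area."""

    def __init__(self, main_elements, control_icons):
        self.main_elements = main_elements   # focusable elements in the main content area
        self.control_icons = control_icons   # selectable icons in the control content area
        self.main_focus = 0                  # index of the focused main-area element
        self.control_focus = 0               # index of the focused control-area icon

    def first_input(self, direction):
        """First input function: moves the main-area focus only."""
        self.main_focus = (self.main_focus + direction) % len(self.main_elements)

    def second_input(self, direction):
        """Second input function: moves the control-area focus only."""
        self.control_focus = (self.control_focus + direction) % len(self.control_icons)


ui = DualFocusInterface(["link1", "link2", "field"], ["back", "zoom", "menu"])
ui.first_input(+1)   # moves the main-area focus...
ui.second_input(+1)  # ...while the control-area focus moves independently
```

In this sketch, each key set maps onto exactly one handler, so pressing navigation keys can never disturb the other area's focus state.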
In another aspect, a mobile phone is provided that includes a computer-readable medium containing a set of instructions for causing a processing unit to perform the functions of displaying a first control content area and a second control content area on a screen of the mobile phone, and providing a first key on the mobile phone for controlling movement between and action upon elements in the first control content area and a second key on the mobile phone for controlling movement between and action upon elements in the second control content area. The first key and the second key enable simultaneous control of the first control content area and the second control content area, respectively. The functions further include controlling movement between and action upon elements in the second control content area based on a received command from the second key by sliding selectable icons left or right within the second control content area to position a desired icon in a focus position of the second control content area.
In still another aspect, a mobile phone is provided that includes a processor that receives inputs from a first input interface and a second input interface, and memory containing a set of instructions executable by the processor to perform the functions of: (i) displaying a main content area on a display screen of the mobile phone that includes a focus element; (ii) at the same time as displaying the main content area on the display screen of the mobile phone, displaying a control content area on the display screen of the mobile phone that includes selectable icons; (iii) receiving inputs from the first input interface for controlling movement between and action upon the focus element contained in the main content area while not affecting movement between or action upon the selectable icons contained in the control content area; and (iv) receiving inputs from the second input interface for controlling movement between and action upon the selectable icons contained in the control content area while not affecting movement between or action upon the focus element contained in the main content area.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
The present application provides a user interface including multiple content areas on one display within which a user may navigate simultaneously. Separate control keys or functions may be provided for each content area to enable interaction within the content areas. For example, a left softkey may control display of one content area, such as to include a menu of actions for a current web page displayed in the content area, and a right softkey may be context sensitive, for example, and may control functions including back, zoom, etc. in another content area.
Portable computing devices, or mobile phones for example, usually include keyboards that contain keys for moving a cursor up, down, to the left, or to the right on the display. A user may control the cursor on the mobile phone in the same way that a user controls a cursor on a personal computer using a mouse, for example. Other keys may be used for selecting functions on a display of the devices. Corresponding functions of a mouse may also be possible using a touch screen for controlling the cursor. According to the present application, using any of these types of control features may enable the user to interact with multiple content areas of a display simultaneously.
Referring now to
In addition, the main content area 102 may include content that extends beyond the displayable area (e.g., window) of the computing device 100, and the 5-way navigation pad 106 enables scrolling both in a horizontal and vertical fashion within the main content area 102. Thus, the 5-way navigation pad 106 enables navigation between elements that are not in the displayable area resulting in the main content area 102 scrolling to display the elements while the control content area 104 may remain fixed in its display location.
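The scrolling behavior described above, in which the main content area scrolls to reveal off-screen elements while the control content area remains fixed, can be sketched as follows. This is a hypothetical illustration only; the class and parameter names are assumptions, not from the application.

```python
# Hypothetical sketch: the main content area scrolls to keep a navigated-to
# element visible, while the control content area's position never changes.

class ScrollableMainArea:
    def __init__(self, content_height, window_height):
        self.content_height = content_height  # total content; may exceed the window
        self.window_height = window_height    # displayable area of the device
        self.scroll_offset = 0                # top of the currently visible region

    def scroll_to(self, element_y):
        """Adjust the offset so the element at element_y falls inside the window."""
        if element_y < self.scroll_offset:
            # element is above the window: scroll up to it
            self.scroll_offset = element_y
        elif element_y >= self.scroll_offset + self.window_height:
            # element is below the window: scroll down just enough
            self.scroll_offset = element_y - self.window_height + 1
        # clamp the offset to the valid scrolling range
        self.scroll_offset = max(0, min(self.scroll_offset,
                                        self.content_height - self.window_height))
```

A separate, fixed control content area would simply not participate in this offset calculation, which is what keeps it stationary on the display.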
The 5-way navigation pad 106 may not enable navigation within the control content area 104. The control content area 104 may be manipulated via a left softkey 110 and a right softkey 112. Of course, a user may program any of the keys of the computing device 100, such as any of the 5-way navigation pad 106, the left softkey 110, the right softkey 112, or any keys of a numeric keypad area 114, to be used for interfacing with either the main content area 102 or the control content area 104. It may be, however, that a key can only perform as a navigation key for one content area at a time so that a user will use at least two different keys in order to navigate both the main content area 102 and the control content area 104 at the same time.
The left softkey 110 and the right softkey 112 refer to keys below the display screen on the computing device 100 that are not contained within the numeric keypad 114, and perform a special function on the computing device 100. The left softkey 110 and the right softkey 112 are positioned on either side of the 5-way navigation pad 106, or alternatively, the 5-way navigation pad 106 is positioned between the left softkey 110 and the right softkey 112. The left softkey 110 and the right softkey 112 permute or enable navigation between elements contained in the control content area 104 by sliding elements left or right to position an element in a center position. The center position is a control content area focus 116; however, other positions besides the center position could also be programmed to be the control content area focus position, for example. When a user selects the focus 116, an application designated by an icon of the focus 116 will be executed.
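The sliding-icon behavior of the softkeys can be sketched as a simple carousel: pressing the left or right softkey rotates the row of icons so that a new icon lands in the fixed focus position. This sketch is a hypothetical illustration; the names are not from the application.

```python
# Hypothetical sketch of the softkey-driven icon carousel: icons slide left
# or right, and whichever icon occupies the fixed focus slot (the center,
# in this example) is the one a selection would execute.

from collections import deque

class ControlCarousel:
    def __init__(self, icons):
        self.icons = deque(icons)
        self.focus_index = len(icons) // 2   # center slot acts as the focus position

    def slide_left(self):
        """Right softkey: icons shift left; the next icon enters the focus slot."""
        self.icons.rotate(-1)

    def slide_right(self):
        """Left softkey: icons shift right; the previous icon enters the focus slot."""
        self.icons.rotate(1)

    @property
    def focused(self):
        return self.icons[self.focus_index]


c = ControlCarousel(["back", "zoom", "home", "menu", "bookmarks"])
c.slide_left()
# selecting the focus would launch the application designated by c.focused
```

The focus slot itself never moves on screen; only the icons move through it, which matches the description of positioning a desired element in the center position.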
Using this configuration, a user may navigate through and within the main content area 102 using the 5-way navigation pad 106, and at the same time, a user may navigate within the control content area 104 using either the left softkey 110, the right softkey 112 or both. The computing device 100 is provided with a graphical user interface (GUI) that enables simultaneous navigation capabilities, for example, within the main content area 102 and the control content area 104. In one example, the main content area 102 and the control content area 104 may be a single graphical user interface within which the left softkey 110 and the right softkey 112 are reserved for switching content screens, and the 5-way navigation pad 106 enables interacting within the screens. For example, the left softkey 110 may control display of the control content area 104 as well as include a menu of actions for a current web page displayed in the main content area 102. The right softkey 112 may be context sensitive, for example, and may control functions including back, zoom, etc.
Alternatively, the computing device 100 may include multiple graphical user interfaces where the main content area 102 comprises a first graphical user interface, and the control content area 104 comprises a second graphical user interface. The computing device 100 can then allow a user to use both the first and second graphical user interfaces at the same time, and the user can navigate through each individually using different keys on the computing device 100 that are designated for use with one of the graphical user interfaces, for example.
Whether a display on the computing device 100 is provided by one or two GUIs, at least two content control areas will be provided. Thus, a user may navigate within the main content area 102 independently of the control content area 104, for example, and a user may do so at the same time, if desired, using separate or different keys for each navigation. The computing device 100 thus provides the opportunity for a user to have multiple focus areas on the same display screen at the same time.
Further, as a user scrolls through icons in the control content area 302, a description of a function of a highlighted icon may be provided, such as shown in
As shown in
The icons within the control content area 302 may correspond to actions that may be performed in the main content area 300, such as zoom, back, forward, etc. Further, as a user navigates within the main content area 300 and changes or executes different applications within the main content area 300, icons within the control content area 302 may adjust to designate action or applications associated with or that may be performed within or by an application running in the main content area 300, for example.
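The context-sensitive adjustment described above, in which the control-area icons change to match the application running in the main content area, can be sketched as a simple lookup. The mapping and function names below are assumptions for illustration only.

```python
# Hypothetical sketch: the control content area's icons are refreshed
# whenever the application running in the main content area changes.
# The mapping below is invented for illustration.

ICONS_BY_APPLICATION = {
    "browser":   ["back", "forward", "zoom", "bookmarks"],
    "messaging": ["reply", "forward", "delete"],
}

def control_icons_for(main_area_application):
    """Return control-area icons appropriate to the running application."""
    return ICONS_BY_APPLICATION.get(main_area_application, ["home"])
```

Each time the main content area switches applications, the control content area would be repopulated from such a mapping, so its icons always designate actions that can actually be performed on the current main-area content.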
The control content area 302 may be hidden from display when not being actively used, resulting in the main content area 300 occupying the entire display screen. Pressing either the left softkey or the right softkey (as described above with respect to
Similarly, actions performed on main content area 400 elements may cause the control content area 402 to react accordingly. For instance, entering a web address into a text field in the main content area 400 may cause the control content area 402 to switch to a different element and perform a related action.
In one example, multiple simultaneous main content areas 400 can be coexisting and a control content area 402 can be used to select which main content area is visible. For example, if multiple windows of a micro-browser on a mobile phone are opened and displayed in the main content area 400, a user may use icons within the control content area 402 to select which window is visible, or to select which window is displayed in a forefront of the main content area.
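The window-selection example above can be sketched as follows: several browser windows coexist, and activating a control-area icon brings the corresponding window to the forefront. The class and method names are hypothetical, not from the application.

```python
# Hypothetical sketch: multiple micro-browser windows coexist in the main
# content area, and a control-area icon selects which one is visible.

class WindowSwitcher:
    def __init__(self):
        self.windows = []     # open windows, in the order they were opened
        self.visible = None   # the window currently in the forefront

    def open(self, window):
        self.windows.append(window)
        self.visible = window          # a newly opened window is shown

    def select(self, window):
        """Invoked when the user activates that window's control-area icon."""
        if window in self.windows:
            self.visible = window
```

Because selection happens through the control content area's own keys, the user can switch windows without first leaving or closing whatever is displayed in the main content area.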
Alternatively, a user may only be able to navigate within either the control content area 402 or the main content area 400 at a given time. For example, as shown in
In general, it should be understood that the computing device 500 could include hardware objects developed using integrated circuit development technologies, or the combination of hardware and software objects that could be ordered, parameterized, and connected in a software environment to implement different functions described herein. Also, the hardware objects could communicate using electrical signals, with states of the signals representing different data. It should also be noted that the computing device 500 generally executes application programs resident at the computing device 500 under the control of an operating system. The application programs, such as a client browser, may be stored on memory within the computing device 500 and may be provided using machine language instructions or software with object-oriented instructions, such as the Java programming language. However, other programming languages (such as the C++ programming language for instance) could be used as well. The computing device 500 may also include other components (not shown), such as a receiver, a transmitter, a microphone, and an audio block for converting a microphone signal from analog to digital form, and for converting a signal to be transmitted from digital to analog form, for example.
The computing device 500 may be an electronic device including any of a wireless telephone, personal digital assistant (PDA), hand-held computer, and a wide variety of other types of electronic devices that might have navigational capability (e.g., keyboard, touch screen, mouse, etc.) and an optional display for viewing downloaded information content. Furthermore, the computing device 500 can include any type of device that has the capability to utilize speech synthesis markups such as W3C (www.w3.org) Voice Extensible Markup Language (VoiceXML). One skilled in the art of computer systems will understand that the example embodiments are not limited to any particular class or model of computer employed for the computing device 500 and will be able to select an appropriate system.
Thus, the computing device 500 generally can range from a hand-held device to a laptop or personal computer.
The processor 502 may be embodied as a processor that accesses internal (or external) memory, such as the memory 506, to execute software functions stored therein. One skilled in the art of computer systems design will understand that the example embodiments are not limited to any particular class or model of processor. The processor 502 may operate according to an operating system, which may be any suitable commercially available embedded or disk-based operating system, or any proprietary operating system. Further, the processor 502 may comprise one or more smaller central processing units, including, for example, a programmable digital signal processing engine or may also be implemented as a single application specific integrated circuit (ASIC) to improve speed and to economize space. In general, it should be understood that the processor 502 could include hardware objects developed using integrated circuit development technologies, or yet via some other methods, or the combination of hardware and software objects that could be ordered, parameterized, and connected in a software environment to implement different functions described herein. Also, the hardware objects could communicate using electrical signals, with states of the signals representing different data.
The processor 502 may further comprise, for example, a micro controller unit (MCU) and a programmable logic circuit (ASIC, Application Specific Integrated Circuit), and may execute software to perform functions of a wireless communication device, such as reception and transmission functions, and input/output (I/O) functions.
The input interface 504 may include a keyboard, a trackball, and/or a two or three-button mouse function, if so desired. The input interface 504 is not, however, limited to the above presented kinds of input means, and the input interface 504 can comprise, for example, several display elements, or merely a touch screen. Further, the input interface 504 may include multiple input functions, or multiple input interfaces, such as a keypad, a touchscreen, etc., depending on the type of computing device 500, for example.
The memory 506 may include a computer readable medium. Computer readable medium may refer to any medium that participates in providing instructions to a processor unit for execution. Such a medium may take many forms, including, but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, optical or magnetic disks, such as storage devices. Volatile media include, for example, dynamic memory, such as main memory or random access memory (RAM). Common forms of computer readable media include, for example, floppy disks, flexible disks, hard disks, magnetic tape, punch cards, CD-ROM, a RAM, a PROM, an EPROM, a FLASH-EPROM, and any other memory chip or cartridge, or any other medium from which a computer can read.
The user interfaces 508 and 510 may be embodied as a module, a segment, or a portion of program code, which includes one or more instructions executable by the processor 502 for implementing specific logical functions or steps. The program code may be stored on any type of computer readable medium, for example, such as a storage device including a disk or hard drive.
In one example, the processor 502 executes software or machine language instructions stored in the memory 506 to perform the functions of the user interfaces 508 and 510. Thus, the processor 502 uses software to create specialized interfaces on the display 512. Each user interface 508 and 510 may be displayed simultaneously on the display 512, and each may be displayed on only a portion of the display 512. A user may navigate within each user interface 508 and 510 separately using respective keys or input functions of the input interface 504. According to one example, when the user interface software 508 and 510 is executed, the processor 502 instructs the display 512 to display a graphical user interface (GUI) that includes multiple content control areas for independent control by a user. Alternatively, the processor 502 may instruct the display 512 to display multiple graphical user interfaces that each include a content control area, and each GUI may be independently controlled by the user.
The user interfaces 508 and 510 may be of a standard type of user interface allowing a user to interact with a computer that employs graphical images in addition to text to represent information and actions available to the user. Actions may be performed through direct manipulation of graphical elements, which include windows, buttons, menus, and scroll bars, for example.
The user interfaces 508 and 510 may include either Java or HTML content, for example. A Java page may be programmed into the computing device 500 for specialized actions, while HTML content may include dynamic content (e.g., web pages, clips, widgets). Content control areas of the GUIs produced by execution of the user interfaces 508 and 510 can be configured using HTML (or XML), for example.
The present disclosure is not to be limited in terms of the particular embodiments described in this application, which are intended as illustrations of various aspects. Many modifications and variations can be made without departing from its spirit and scope, as will be apparent to those skilled in the art. Functionally equivalent methods and apparatuses within the scope of the disclosure, in addition to those enumerated herein, will be apparent to those skilled in the art from the foregoing descriptions. Such modifications and variations are intended to fall within the scope of the appended claims. The present disclosure is to be limited only by the terms of the appended claims, along with the full scope of equivalents to which such claims are entitled. Thus, the various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
The present patent application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application Ser. No. 61/027,159, filed on Feb. 8, 2008, the entire contents of which are incorporated herein by reference as if fully set forth in this description.