A computer program often includes a user interface by which users can interact with the program. The user interface can provide graphical, textual, or other tools for providing inputs to the program and for receiving outputs from the program. Typical user interfaces can include one or more elements or controls, such as menus, windows, buttons, text boxes, labels, and the like. Input devices for interacting with the user interface can include a mouse, keyboard, touch screen, remote control, game controller, or the like.
One user interface element common to many user interfaces is the menu control. The menu control can be an icon, button, drop-down list control, or the like. In some implementations, when the menu control is selected (e.g., by clicking with a mouse or by typing a shortcut key sequence), a menu including a list of items is displayed. This menu can appear to pop up over underlying display elements. These menus are therefore often referred to as “pop-up menus.”
Many user interfaces have so many menus that they can overwhelm a user, and many of those menus are rarely used. Such user interfaces can adversely affect user productivity.
Having several pop-up menus (or menu controls for accessing the pop-up menus) in a user interface window can clutter the window and confuse a user. In addition, some windows include pop-up menus or controls that are infrequently used. Certain users might therefore wish to customize the layout of menu controls and/or the content of the pop-up menus to reduce clutter or otherwise improve organization of the menus. However, currently available user interfaces provide no mechanisms for customizing menus within a user interface window.
Thus, in certain embodiments, systems and methods are provided for customizing menus that address some or all of the above-mentioned problems. In certain embodiments, these systems and methods can include the ability to move, delete, and create menu controls or pop-up menus. In addition, in certain embodiments, pop-up menus can be merged or items from pop-up menus can be moved to other pop-up menus.
For purposes of illustration, the systems and methods described herein are described primarily in the context of menu customization. However, in certain embodiments, user interface elements other than menus can also be customized using the systems and methods described herein. For example, buttons, text boxes, labels, combinations of the same, and the like can be customized in certain embodiments.
The features of these systems and methods will now be described with reference to the drawings summarized above. Throughout the drawings, reference numbers are re-used to indicate correspondence between referenced elements. The drawings, associated descriptions, and specific implementations are provided to illustrate embodiments of the invention and not to limit the scope of the inventions disclosed herein.
In addition, methods and processes described herein are not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. Moreover, the various modules of the systems described herein can be implemented as software applications, modules, or components on one or more computers, such as servers. While the various modules are illustrated separately, they may share some or all of the same underlying logic or code.
At block 102, a first pop-up menu in a user interface window is provided. The first pop-up menu can be accessed, for example, by using an input device to select a menu control corresponding to the pop-up menu. The first pop-up menu can include one or more items or options that can be selected by a user. For example, in some computer systems, a file menu control, when selected, presents several items in the form of textual labels, such as a “Save” option for saving a file or an “Exit” option for closing a file. Example menu controls and pop-up menus are illustrated and described below.
At block 104, it is determined whether a user moves one or more items in the first pop-up menu to a target area. The items can be moved by the user in certain embodiments by selecting the items with an input device such as a mouse and “dragging” the items to the target area. The target area can be any location on a display, such as a toolbar, a location within a window, or the desktop. If it is determined that the user has not moved an item in the pop-up menu to the target area, then process 100 ends.
If, however, the user did move the items to the target area, it is determined at block 106 whether there is a menu control for a second pop-up menu in the target area. If there is a menu control in the target area, then at block 108 the selected items are placed in the second pop-up menu. The selected items from the first pop-up menu can be added to any items already in the second pop-up menu. Alternatively, in certain embodiments, the selected items placed into the second pop-up menu can replace any items that were in the second pop-up menu. If it is instead determined that there is no menu control for a second pop-up menu in the target area, then at block 110 a second pop-up menu and/or corresponding menu control is created that includes the selected items.
Advantageously, if a new pop-up menu is created at block 110, the selected items may be automatically removed from the first pop-up menu. Thus, the new pop-up menu can be intelligently aware of the contents of the first pop-up menu and vice versa. Thereafter process 100 ends.
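By way of illustration only, the logic of process 100 might be sketched in Python as follows. The MenuControl and PopupMenu structures, the function name, and the default label are assumptions made for this sketch and are not part of any described embodiment:

    from dataclasses import dataclass, field

    @dataclass
    class PopupMenu:
        items: list = field(default_factory=list)

    @dataclass
    class MenuControl:
        label: str
        menu: PopupMenu

    def drop_items(source, selected, target_control=None, new_label="New Menu"):
        # Blocks 104-106: the user has dragged `selected` items from `source`
        # to a target area; `target_control` is the menu control found there,
        # if any.
        source.menu.items = [i for i in source.menu.items if i not in selected]
        if target_control is not None:
            # Block 108: add the selected items to the second pop-up menu.
            # (In other embodiments they could instead replace its items.)
            target_control.menu.items.extend(selected)
            return target_control
        # Block 110: create a second pop-up menu and corresponding menu
        # control containing the selected items; the items have already
        # been removed from the first pop-up menu above.
        return MenuControl(label=new_label, menu=PopupMenu(items=list(selected)))

In this sketch, either branch removes the items from the first pop-up menu, although, as noted above, certain embodiments can instead leave the items in place.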
In addition to the embodiments described, in certain alternative embodiments, process 100 can enable pop-up menus or menu controls to be moved to different areas in a user interface. Thus, for example, a user can swap the location of menus, move menus to different parts of a window, and so on. In addition, in some implementations, customization options other than dragging a pop-up menu or menu item using a mouse are provided.
Turning now to the drawings, an example user interface 200 will be described.
In the depicted embodiment, user interface 200 includes window 210 having toolbar 202 and window body 204. One toolbar 202 is shown, although multiple toolbars can be used in certain implementations. Toolbar 202 is located at or near the top of window 210. In certain implementations, toolbar 202 can be in any other location within the window or outside of the window, for example, as a floating toolbar in its own window. Window body 204 includes an area for writing software. Window body 204 can have different functions in other applications.
Example toolbar 202 includes two menu controls 220, 224. Menu controls 220, 224 each include a textual label (“menu 1” and “menu 2,” respectively) as well as arrows 231 for navigating within menu controls 220, 224. In other embodiments, menu controls 220, 224 may not have textual labels but can instead have icons or graphics, a text box for entering a search term, combinations of the same, and the like. Menu control 220 is shown in an unselected state.
In contrast, menu control 224 is currently selected, as illustrated by a darkened color of menu control 224. Because menu control 224 is selected, pop-up menu 230 is displayed beneath menu control 224. The position of pop-up menu 230 can be configured to be any position within or outside of window 210 in various implementations and need not be beneath menu control 224. Pop-up menu 230 includes first and second sets of items, 234 and 236. Each set of items 234, 236 includes items that are related by type. For example, first set of items 234 includes items 1 and 2 that are of type 1, and second set of items 236 includes items A and B which are of type 2. Other pop-up menus may include items that are not grouped by types in certain implementations.
In certain embodiments, the textual labels (or icons) of a menu control 220, 224 can correspond to the types of items 234, 236 provided in corresponding pop-up menus 230. Examples of textual labels are now provided. In these examples, the user interface 200 is a software development program. One example menu control 224 in the software development program might have a textual label of “Device” corresponding to a device for which software is being developed (e.g., replace “Menu 2” with “Device”). A type of items 234 can include, for instance, “Platform” (e.g., replace “Type 1” with “Platform”). Thus, an example pop-up menu 230 for the menu control “Device” is shown as follows, using various example items 234:
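    Platform
        Item 1
        Item 2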
If multiple types of items 234 are shown in the pop-up menu 230, the textual label of the menu control 224 can reflect each type. For example, if a second type (Type 2) in the pop-up menu 230 is “Build Configuration,” the textual label of the menu control 224 might be “Device and Configuration.” A corresponding pop-up menu might be as follows:
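    Platform
        Item 1
        Item 2
    Build Configuration
        Release
        Debug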
However, in one embodiment, if one of the types has only one item, the name of the type may be omitted from the textual label of the menu control 224 to reduce clutter in the user interface 200. Thus, the following example pop-up menu might have the textual label “Device and Configuration” rather than “Device, Configuration, and Architecture”:
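    Platform
        Item 1
        Item 2
    Build Configuration
        Release
        Debug
    Architecture
        Item A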
In another embodiment, if only one type exists in a pop-up menu 230, the textual label corresponding to that type may be used by the menu control 224.
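The labeling behavior described in the preceding paragraphs can be sketched as follows. This Python function is illustrative only; the dictionary-based pop-up menu representation and the mapping from item types to label text (e.g., “Build Configuration” to “Configuration”) are assumptions made for this sketch:

    def derive_label(menu, label_for_type):
        # `menu` maps each item type in a pop-up menu to its list of items.
        # Types with only one item are omitted from the label to reduce
        # clutter in the user interface.
        types = [label_for_type.get(t, t) for t, items in menu.items()
                 if len(items) > 1]
        if not types:
            # If every type has a single item, name all of the types.
            types = [label_for_type.get(t, t) for t in menu]
        if len(types) == 1:
            return types[0]
        if len(types) == 2:
            return types[0] + " and " + types[1]
        return ", ".join(types[:-1]) + ", and " + types[-1]

    menu = {"Platform": ["Item 1", "Item 2"],
            "Build Configuration": ["Release", "Debug"],
            "Architecture": ["Item A"]}
    labels = {"Platform": "Device", "Build Configuration": "Configuration"}
    print(derive_label(menu, labels))  # "Device and Configuration"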
Items in pop-up menus 230 can be moved to other locations in the user interface 200 to create new pop-up menus or be added to existing menu controls. Specific examples of manipulating pop-up menus and menu controls are described below.
Referring again to the depicted embodiment, target area 250 can be a user-selected location for placing items, sets of items, menus, and the like. In the example shown, target area 250 is on toolbar 202. Other options for the location of the target area are described below.
A second set of items 236, when selected by cursor 240 and moved toward the target area, becomes a set of selected items 336 in the depicted embodiment.
Once the selected set of items 336 is dropped onto target area 250, a new menu control 426 can be created.
Thus, moving items 336 from menu control 224 to another area in the user interface (target area 250) can advantageously facilitate creation of another menu control 426. In addition, items 336 can be removed from menu control 224 upon creation of new menu control 426. In certain alternative embodiments, items 336 can be left in original menu control 224 when new menu control 426 is created.
In certain embodiments, creating a new menu control 426 from a previous menu control 224 can cause the textual label of the previous menu control 224 to change. To illustrate certain embodiments of changing textual labels, consider again the software development example introduced above.
If the items corresponding to the “Build Configuration” type (e.g., “Release” and “Debug”) are removed from the pop-up menu 230 to create a new menu control 426, the textual label of old menu control 224 might be modified to “Device,” and pop-up menu 230 might include:
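    Platform
        Item 1
        Item 2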
Likewise, the newly created menu control 426 might have a textual label of “Configuration” and a new pop-up menu 460 as follows:
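    Build Configuration
        Release
        Debug

Continuing the illustrative sketch above, splitting one type of items out of a menu control and relabeling both controls might look as follows (again assuming the hypothetical dictionary-based representation and the derive_label function sketched earlier):

    def split_type(menu, type_name, label_for_type):
        # Move all items of `type_name` (e.g., "Build Configuration") out of
        # `menu` and into a newly created pop-up menu.
        new_menu = {type_name: menu.pop(type_name)}
        # Both the old and new menu controls derive fresh textual labels,
        # e.g., "Device" and "Configuration" in the example above.
        old_label = derive_label(menu, label_for_type)
        new_label = derive_label(new_menu, label_for_type)
        return old_label, new_label, new_menu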
In the depicted embodiment, menu control 224 is selected with cursor 240 in user interface 500 and moved onto menu control 220; the resulting combined menu control 620 is shown in user interface 600.
Thus, user interfaces 500 and 600 illustrate how a user can combine menus. Advantageously, combining menus can reduce clutter within a user interface window, enabling the user to more easily find options in the user interface.
In certain embodiments, combining the menu control 224 with the menu control 220 can cause the textual label of the menu control 220 to change. Thus, returning to our previous example, the old menu control 220 might have previously had the label “Device” and the following pop-up menu:
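    Platform
        Item 1
        Item 2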
Likewise, the old menu control 224 might have had the textual label “Configuration” along with the following items in its pop-up menu:
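    Build Configuration
        Release
        Debug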
Adding the items in the pop-up menu of menu control 224 to the pop-up menu of menu control 220 can result in new menu control 620 having a textual label of “Device and Configuration,” with items in pop-up menu 670 as follows:
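    Platform
        Item 1
        Item 2
    Build Configuration
        Release
        Debug

In the same illustrative sketch, combining two menu controls might merge their items by type and re-derive the combined label (the representation remains an assumption of this example):

    def merge_menus(dest_menu, source_menu, label_for_type):
        # Add the source control's items to the destination pop-up menu,
        # keeping items grouped under their existing types.
        for type_name, items in source_menu.items():
            dest_menu.setdefault(type_name, []).extend(items)
        # The combined label reflects each type, e.g., "Device and
        # Configuration" for the example above.
        return derive_label(dest_menu, label_for_type), dest_menu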
In the depicted embodiment, a single item can also be moved from one pop-up menu to another. Selected item 712 from set of items 674 has been selected by cursor 240 and removed from set of items 674. In window 810, item 712 has been placed into another pop-up menu.
While one item 712 has been shown being moved from a pop-up menu to another, in other embodiments multiple items (including non-consecutive items) can be moved from one pop-up menu to another.
Illustrative computer systems 1000 include general purpose (e.g., PCs) and special purpose (e.g., graphics workstations) computer systems, which may include one or more servers, databases, and the like. In addition, computer system 1000 can be a handheld or portable device, such as a laptop, personal digital assistant (PDA), cell phone, smart phone, or the like. More generally, any processor-based system may be used as computer system 1000.
Computer system 1000 of certain embodiments includes processor 1002 for processing one or more software programs 1006 stored in memory 1004, for accessing data stored in hard data storage 1008, and for communicating with display interface 1010. Display interface 1010 provides an interface to a computer display or displays, such as one or more monitors or screens. In certain embodiments, one or more programs 1006 can use display interface 1010 to effectuate any of the customization features to any user interface described above.
In an embodiment, computer system 1000 further includes, by way of example, one or more processors, program logic, or other substrate configurations representing data and instructions, which operate as described herein. In other embodiments, the processor can comprise controller circuitry, processor circuitry, processors, general purpose single-chip or multi-chip microprocessors, digital signal processors, embedded microprocessors, microcontrollers, graphics processors, and the like.
In some implementations, the mobile device 1100 includes a touch-sensitive display 1102. The touch-sensitive display 1102 can be implemented with liquid crystal display (LCD) technology, light emitting polymer display (LPD) technology, or some other display technology. The touch-sensitive display 1102 can be sensitive to haptic and/or tactile contact with a user.
In some implementations, the touch-sensitive display 1102 can include a multi-touch-sensitive display 1102. A multi-touch-sensitive display 1102 can, for example, process multiple simultaneous touch points, including processing data related to the pressure, degree, and/or position of each touch point. Such processing facilitates gestures and interactions with multiple fingers, chording, and other interactions. Other touch-sensitive display technologies can also be used, e.g., a display in which contact is made using a stylus or other pointing device. Some examples of multi-touch-sensitive display technology are described in U.S. Pat. Nos. 6,323,846, 6,570,557, 6,677,932, and 6,888,536, each of which is incorporated by reference herein in its entirety.
In some implementations, the mobile device 1100 can display one or more graphical user interfaces on the touch-sensitive display 1102 for providing the user access to various system objects and for conveying information to the user. In some implementations, the graphical user interface can include one or more display objects 1104, 1106. In the example shown, the display objects 1104, 1106 are graphic representations of system objects. Some examples of system objects include device functions, applications, windows, files, alerts, events, or other identifiable system objects.
In some implementations, the mobile device 1100 can implement multiple device functionalities, such as a telephony device, as indicated by a Phone object 1110; an e-mail device, as indicated by the Mail object 1112; a map device, as indicated by the Maps object 1114; a Wi-Fi base station device (not shown); and a network video transmission and display device, as indicated by the Web Video object 1116. In some implementations, particular display objects 1104, e.g., the Phone object 1110, the Mail object 1112, the Maps object 1114, and the Web Video object 1116, can be displayed in a menu bar 1118. In some implementations, device functionalities can be accessed from a top-level graphical user interface, such as the graphical user interface displayed on the touch-sensitive display 1102.
In some implementations, the mobile device 1100 can implement a network distribution functionality. For example, the functionality can enable the user to take the mobile device 1100 and provide access to its associated network while traveling. In particular, the mobile device 1100 can extend Internet access (e.g., Wi-Fi) to other wireless devices in the vicinity. For example, mobile device 1100 can be configured as a base station for one or more devices. As such, mobile device 1100 can grant or deny network access to other wireless devices.
In some implementations, upon invocation of a device functionality, the graphical user interface of the mobile device 1100 changes, or is augmented or replaced with another user interface or user interface elements, to facilitate user access to particular functions associated with the corresponding device functionality. For example, in response to a user touching the Phone object 1110, the graphical user interface of the touch-sensitive display 1102 may present display objects related to various phone functions; likewise, touching of the Mail object 1112 may cause the graphical user interface to present display objects related to various e-mail functions; touching the Maps object 1114 may cause the graphical user interface to present display objects related to various maps functions; and touching the Web Video object 1116 may cause the graphical user interface to present display objects related to various web video functions.
In some implementations, the top-level graphical user interface environment or state can be restored by pressing a button of the mobile device 1100.
In some implementations, the top-level graphical user interface can include additional display objects 1106, such as a short messaging service (SMS) object 1130, a Calendar object 1132, a Photos object 1134, a Camera object 1136, a Calculator object 1138, a Stocks object 1140, an Address Book object 1142, a Media object 1144, a Web object 1146, a Video object 1148, a Settings object 1150, and a Notes object (not shown). Touching the SMS display object 1130 can, for example, invoke an SMS messaging environment and supporting functionality; likewise, each selection of a display object 1132, 1134, 1136, 1138, 1140, 1142, 1144, 1146, 1148, and 1150 can invoke a corresponding object environment and functionality.
Additional and/or different display objects can also be displayed in the graphical user interface.
In some implementations, the mobile device 1100 can include one or more input/output (I/O) devices and/or sensor devices. For example, a speaker 1160 and a microphone 1162 can be included to facilitate voice-enabled functionalities, such as phone and voice mail functions. In some implementations, an up/down button 1184 for volume control of the speaker 1160 and the microphone 1162 can be included. The mobile device 1100 can also include an on/off button 1182 for a ring indicator of incoming phone calls. In some implementations, a loud speaker 1164 can be included to facilitate hands-free voice functionalities, such as speaker phone functions. An audio jack 1166 can also be included for use of headphones and/or a microphone.
In some implementations, a proximity sensor 1168 can be included to facilitate the detection of the user positioning the mobile device 1100 proximate to the user's ear and, in response, to disengage the touch-sensitive display 1102 to prevent accidental function invocations. In some implementations, the touch-sensitive display 1102 can be turned off to conserve additional power when the mobile device 1100 is proximate to the user's ear.
Other sensors can also be used. For example, in some implementations, an ambient light sensor 1170 can be utilized to facilitate adjusting the brightness of the touch-sensitive display 1102. In some implementations, an accelerometer 1172 can be utilized to detect movement of the mobile device 1100, as indicated by the directional arrow 1174. Accordingly, display objects and/or media can be presented according to a detected orientation, e.g., portrait or landscape. In some implementations, the mobile device 1100 may include circuitry and sensors for supporting a location determining capability, such as that provided by the global positioning system (GPS) or other positioning systems (e.g., systems using Wi-Fi access points, television signals, cellular grids, Uniform Resource Locators (URLs)). In some implementations, a positioning system (e.g., a GPS receiver) can be integrated into the mobile device 1100 or provided as a separate device that can be coupled to the mobile device 1100 through an interface (e.g., port device 1190) to provide access to location-based services.
In some implementations, a port device 1190, e.g., a Universal Serial Bus (USB) port, a docking port, or some other wired port connection, can be included. The port device 1190 can, for example, be utilized to establish a wired connection to other computing devices, such as other communication devices 1100, network access devices, a personal computer, a printer, a display screen, or other processing devices capable of receiving and/or transmitting data. In some implementations, the port device 1190 allows the mobile device 1100 to synchronize with a host device using one or more protocols, such as, for example, TCP/IP, HTTP, UDP, or any other known protocol.
The mobile device 1100 can also include a camera lens and sensor 1180. In some implementations, the camera lens and sensor 1180 can be located on the back surface of the mobile device 1100. The camera can capture still images and/or video.
The mobile device 1100 can also include one or more wireless communication subsystems, such as an 802.11b/g communication device 1186, and/or a Bluetooth™ communication device 1188. Other communication protocols can also be supported, including other 802.x communication protocols (e.g., WiMax, Wi-Fi, 3G), code division multiple access (CDMA), global system for mobile communications (GSM), Enhanced Data GSM Environment (EDGE), etc.
In some implementations, each of one or more system objects of device 1100 has a set of system object attributes associated with it; and one of the attributes determines whether a display object for the system object will be rendered in the top-level graphical user interface. This attribute can be set by the system automatically, or by a user through certain programs or system functionalities as described below.
Sensors, devices, and subsystems can be coupled to the peripherals interface 1206 to facilitate multiple functionalities. For example, a motion sensor 1210, a light sensor 1212, and a proximity sensor 1214 can be coupled to the peripherals interface 1206 to facilitate the orientation, lighting, and proximity functions described above with respect to the mobile device 1100.
A camera subsystem 1220 and an optical sensor 1222, e.g., a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.
Communication functions can be facilitated through one or more wireless communication subsystems 1224, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 1224 can depend on the communication network(s) over which the mobile device is intended to operate. For example, a mobile device can include communication subsystems 1224 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth™ network. In particular, the wireless communication subsystems 1224 may include hosting protocols such that the mobile device may be configured as a base station for other wireless devices.
An audio subsystem 1226 can be coupled to a speaker 1228 and a microphone 1230 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
The I/O subsystem 1240 can include a touch screen controller 1242 and/or other input controller(s) 1244. The touch-screen controller 1242 can be coupled to a touch screen 1246. The touch screen 1246 and touch screen controller 1242 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 1246.
The other input controller(s) 1244 can be coupled to other input/control devices 1248, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of the speaker 1228 and/or the microphone 1230.
In one implementation, a pressing of the button for a first duration may disengage a lock of the touch screen 1246; and a pressing of the button for a second duration that is longer than the first duration may turn power to the mobile device on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 1246 can, for example, also be used to implement virtual or soft buttons and/or a keyboard.
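As an illustration only, the press-duration behavior described above might be sketched as follows; the threshold value and the device methods are hypothetical and used only for this sketch:

    FIRST_DURATION = 1.0  # hypothetical first duration, in seconds

    def on_button_release(press_duration, device):
        # A press up to the first duration disengages the touch-screen lock;
        # a longer press turns device power on or off. Both
        # `unlock_touch_screen` and `toggle_power` are hypothetical methods.
        if press_duration <= FIRST_DURATION:
            device.unlock_touch_screen()
        else:
            device.toggle_power()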
In some implementations, the mobile device can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the mobile device can include the functionality of an MP3 player, such as an iPod™. The mobile device may, therefore, include a 32-pin connector that is compatible with the iPod™. Other input/output and control devices can also be used.
The memory interface 1202 can be coupled to memory 1250. The memory 1250 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 1250 can store an operating system 1252, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. The operating system 1252 may include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, the operating system 1252 can be a kernel (e.g., UNIX kernel).
The memory 1250 may also store communication instructions 1254 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. The memory 1250 may include graphical user interface instructions 1256 to facilitate graphic user interface processing; sensor processing instructions 1258 to facilitate sensor-related processing and functions; phone instructions 1260 to facilitate phone-related processes and functions; electronic messaging instructions 1262 to facilitate electronic-messaging related processes and functions; web browsing instructions 1264 to facilitate web browsing-related processes and functions; media processing instructions 1266 to facilitate media processing-related processes and functions; GPS/Navigation instructions 1268 to facilitate GPS and navigation-related processes and functions; camera instructions 1270 to facilitate camera-related processes and functions; and/or other software instructions 1272 to facilitate other processes and functions. The memory 1250 may also store other software instructions (not shown), such as web video instructions to facilitate web video-related processes and functions; and/or web shopping instructions to facilitate web shopping-related processes and functions. In some implementations, the media processing instructions 1266 are divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively. An activation record and International Mobile Equipment Identity (IMEI) 1274 or similar hardware identifier can also be stored in memory 1250.
Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. The memory 1250 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
The disclosed and other embodiments and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. The disclosed and other embodiments can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus. The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal (e.g., a machine-generated electrical, optical, or electromagnetic signal) that is generated to encode information for transmission to suitable receiver apparatus.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, the disclosed embodiments can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube), LCD (liquid crystal display) monitor, touch sensitive device or display, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
While this specification contains many specifics, these should not be construed as limitations on the scope of what is being claimed or of what may be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments have been described. Other embodiments are within the scope of the following claims.
The present application claims priority to U.S. Provisional Application No. 61/033,745, filed Mar. 4, 2008, and entitled “CUSTOMIZATION OF USER INTERFACE ELEMENTS.”