User interface for displaying selectable software functionality controls that are contextually relevant to a selected object

Information

  • Patent Grant
  • Patent Number: 8,117,542
  • Date Filed: Thursday, September 30, 2004
  • Date Issued: Tuesday, February 14, 2012
Abstract
An improved user interface is provided for displaying selectable functionality controls that identifies the context of the object to which the user interface is relevant, which contains rich functionality controls for applying contextually relevant functionality to a selected object, and which may be efficiently switched to a different context for applying a different set of functionalities to a different or neighboring editable object. A context menu of functionalities is displayed adjacent to a selected object where the menu of functionalities includes functionalities associated with editing the selected object. The context menu includes an identification of the object context and a control for selectively changing the context and associated functionalities of the menu to a different object context for displaying a different set of functionalities associated with the different context.
Description
FIELD OF THE INVENTION

The present invention generally relates to software application user interfaces. More particularly, the present invention relates to an improved user interface for displaying selectable software controls that are contextually relevant to a selected object.


BACKGROUND OF THE INVENTION

With the advent of the computer age, computer and software users have grown accustomed to user-friendly software applications that help them write, calculate, organize, prepare presentations, send and receive electronic mail, make music, and the like. For example, modern electronic word processing applications allow users to prepare a variety of useful documents. Modern spreadsheet applications allow users to enter, manipulate, and organize data. Modern electronic slide presentation applications allow users to create a variety of slide presentations containing text, pictures, data or other useful objects.


To help users locate and utilize the functionality of a given software application, a user interface containing a plurality of generic functionality controls is typically provided along an upper, lower, or side edge of a displayed workspace in which the user may enter, copy, manipulate, and format text or data. Such functionality controls often include selectable buttons with such names as “file,” “edit,” “view,” “insert,” “format,” and the like. Typically, selection of one of these top-level functionality buttons, for example “format,” causes a drop-down menu to be deployed to expose one or more selectable functionality controls associated with the top-level functionality, for example “font” under the top-level functionality of “format.”


Prior user interface systems provide pop-up menus that deploy onto a user's display screen adjacent to a selected object (e.g., a text selection, data object, or picture object) and display a set of selectable functionality controls, allowing the user to apply a selected functionality of a software application to the selected object. Such context menus are typically deployed upon a user action such as right-clicking a mouse while the mouse cursor is focused on the selected object. Unfortunately, prior context menus have been limited to a small set of selectable controls, and it is typically difficult to identify the editing context of the menu (e.g., text selection, picture object, etc.). Moreover, if a given document being edited by a user contains different (and distinctly editable) objects, it is often difficult to switch the context of such a context menu from one editing context to another (e.g., from a text selection to a picture object).


Accordingly, there is a need in the art for an improved user interface for displaying a menu of selectable functionality controls that identifies the context of the object to which the menu is relevant, that contains rich functionality controls for applying contextually relevant functionality to a selected object, and that may be efficiently switched to a different context for applying a different set of functionalities to a different or neighboring editable object. It is with respect to these and other considerations that the present invention has been made.


SUMMARY OF THE INVENTION

Embodiments of the present invention solve the above and other problems by providing an improved user interface for displaying selectable functionality controls that identifies the context of the object to which the user interface is relevant, which contains rich functionality controls for applying contextually relevant functionality to a selected object, and which may be efficiently switched to a different context for applying a different set of functionalities to a different or neighboring editable object. Generally, according to aspects of the present invention, a menu of functionalities is displayed adjacent to a selected object where the menu of functionalities includes functionalities associated with editing the selected object. The context menu includes an identification of the object context and a control for selectively changing the context and associated functionalities of the menu to a different object context for displaying a different set of functionalities associated with the different context.


According to another aspect of the invention, methods and systems provide a user interface that is contextually relevant to an edited object. A plurality of functionalities available from a software application is provided. Upon receiving a selection of an object for editing via the software application, the contextually relevant user interface is displayed adjacent to the selected object. One or more selectable functionality controls representing a subset of the plurality of functionalities is displayed in the user interface, and the subset of the plurality of functionalities comprises one or more of the plurality of functionalities that are at least substantially the most used in editing an object of the same type as the selected object.


According to other aspects of the invention, the one or more selectable functionality controls may be displayed in the user interface hierarchically based on frequency of previous use. A control may be displayed in the user interface for changing a context of the user interface based on an attribute of the object selected for editing. Upon changing a context of the user interface based on an attribute of the object selected for editing, one or more selectable controls representing a second subset of the plurality of functionalities may be displayed in the user interface, and the second subset of the plurality of functionalities may comprise one or more of the plurality of functionalities that are at least substantially the most used in editing a second attribute of the object selected for editing. A control may be displayed in the user interface for causing a display in a second user interface of all selectable controls that may be used for editing the selected object.
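
For illustration only, the arrangement described in this summary can be pictured with a small data model such as the following TypeScript sketch. None of these names appear in the patent; the object contexts listed are simply those used as examples in the figures, and the structure is an assumption rather than a description of any actual implementation.

```typescript
// Illustrative data model only; every name here is hypothetical and not drawn
// from the patent text.
type ObjectContext = "picture" | "text" | "table" | "row";

interface FunctionalityControl {
  id: string;          // e.g. "copy", "position", "resetPicture"
  label: string;
  usageCount: number;  // historical frequency of use, used for ordering
  apply: () => void;   // invokes the underlying application functionality
}

interface ContextMenuModel {
  // Identifies the context of the currently selected object.
  currentContext: ObjectContext;
  // Other contexts present in the document that the menu can be switched to.
  availableContexts: ObjectContext[];
  // Subset of controls most used when editing objects of this type,
  // displayed hierarchically by frequency of previous use.
  controls: FunctionalityControl[];
}
```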


These and other features and advantages, which characterize the present invention, will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention as claimed.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram showing the architecture of a personal computer that provides an illustrative operating environment for embodiments of the present invention.



FIG. 2 is an illustration of a computer screen display showing a ribbon-shaped user interface for displaying task-based top-level functionality tabs and for displaying a plurality of functionalities available under a selected top-level functionality tab.



FIG. 3 illustrates a computer screen display showing a context menu according to embodiments of the present invention displayed adjacent to a selected object.



FIG. 4 illustrates the screen display of FIG. 3 showing an expanded version of the context menu for providing a menu of formatting options combinations.



FIG. 5 illustrates the computer screen display of FIG. 3 showing the display of a different set of functionality controls available from the displayed context menu.



FIG. 6 illustrates the computer screen display of FIG. 5 showing expansion of the displayed context menu to provide a gallery of images associated with different formatting options combinations that may be applied to an associated document object.



FIG. 7 illustrates a computer screen display showing a context menu according to embodiments of the present invention displayed adjacent to a selected object.



FIG. 8 illustrates the computer screen display of FIG. 7 showing the context menu selectively changed to a different editing context.



FIG. 9 illustrates the computer screen display of FIG. 7 showing the context menu selectively changed to a different editing context.





DETAILED DESCRIPTION

As briefly described above, embodiments of the present invention are directed to an improved user interface for displaying a menu of selectable functionality controls adjacent to or overlaying a selected document or object that identifies the context of the object to which the selectable functionality controls are relevant, which contains rich functionality controls for applying contextually relevant functionality to the selected object, and which may be efficiently switched to a different context for applying a different set of functionalities to a different or neighboring editable object.


In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustrations specific embodiments or examples. These embodiments may be combined, other embodiments may be utilized, and structural changes may be made without departing from the spirit or scope of the present invention. The following detailed description is therefore not to be taken in a limiting sense and the scope of the present invention is defined by the appended claims and their equivalents.


Referring now to the drawings, in which like numerals represent like elements throughout the several figures, aspects of the present invention and the exemplary operating environment will be described. FIG. 1 and the following discussion are intended to provide a brief, general description of a suitable computing environment in which the invention may be implemented. While the invention will be described in the general context of program modules that execute in conjunction with an application program that runs on an operating system on a personal computer, those skilled in the art will recognize that the invention may also be implemented in combination with other program modules.


Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.


Turning now to FIG. 1, an illustrative computer architecture for a personal computer 2 for practicing the various embodiments of the invention will be described. The computer architecture shown in FIG. 1 illustrates a conventional personal computer, including a central processing unit 4 (“CPU”), a system memory 6, including a random access memory 8 (“RAM”) and a read-only memory (“ROM”) 10, and a system bus 12 that couples the memory to the CPU 4. A basic input/output system containing the basic routines that help to transfer information between elements within the computer, such as during startup, is stored in the ROM 10. The personal computer 2 further includes a mass storage device 14 for storing an operating system 16, application programs, such as an application program 105, and data.


The mass storage device 14 is connected to the CPU 4 through a mass storage controller (not shown) connected to the bus 12. The mass storage device 14 and its associated computer-readable media provide non-volatile storage for the personal computer 2. Although the description of computer-readable media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, it should be appreciated by those skilled in the art that computer-readable media can be any available media that can be accessed by the personal computer 2.


By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer.


According to various embodiments of the invention, the personal computer 2 may operate in a networked environment using logical connections to remote computers through a TCP/IP network 18, such as the Internet. The personal computer 2 may connect to the TCP/IP network 18 through a network interface unit 20 connected to the bus 12. It should be appreciated that the network interface unit 20 may also be utilized to connect to other types of networks and remote computer systems. The personal computer 2 may also include an input/output controller 22 for receiving and processing input from a number of devices, including a keyboard or mouse (not shown). Similarly, an input/output controller 22 may provide output to a display screen, a printer, or other type of output device.


As mentioned briefly above, a number of program modules and data files may be stored in the mass storage device 14 and RAM 8 of the personal computer 2, including an operating system 16 suitable for controlling the operation of a networked personal computer, such as the WINDOWS operating systems from Microsoft Corporation of Redmond, Wash. The mass storage device 14 and RAM 8 may also store one or more application programs. In particular, the mass storage device 14 and RAM 8 may store an application program 105 for providing a variety of functionalities to a user. For instance, the application program 105 may comprise many types of programs such as a word processing application, a spreadsheet application, a desktop publishing application, and the like. According to an embodiment of the present invention, the application program 105 comprises a multiple functionality software application for providing word processing functionality, slide presentation functionality, spreadsheet functionality, database functionality and the like. Some of the individual program modules comprising the multiple functionality application 105 include a word processing application 125, a slide presentation application 135, a spreadsheet application 140 and a database application 145. An example of such a multiple functionality application 105 is OFFICE manufactured by Microsoft Corporation. Other software applications illustrated in FIG. 1 include an electronic mail application 130.



FIG. 2 is an illustration of a computer screen display showing a ribbon-shaped user interface for displaying task-based top-level functionality tabs and for displaying a plurality of functionalities available under a selected top-level functionality tab. As briefly described above, the improved user interface of the present invention includes a ribbon-shaped user interface for displaying selectable controls associated with task-based functionality available under a given software application, such as the software application 105 illustrated in FIG. 1. A first section 210 of the user interface 200 includes generic selectable controls for functionality not associated with a particular task, such as word processing versus spreadsheet data analysis. For example, the section 210 includes selectable controls for general file commands such as “file open,” “file save” and “print.” According to one embodiment of the present invention, the selectable controls included in the first section 210 are controls that may be utilized by a variety of software applications comprising a multiple functionality application 105. That is, the selectable controls included in the first section 210 may be controls that are generally found and used across a number of different software applications.


Selectable controls included in the first section 210 may be utilized by all of the applications comprising such a multiple functionality application, but other selectable controls presented in the user interface 200, described below, may be tailored to particular tasks that may be performed by particular software applications comprising the multiple functionality application. On the other hand, it should be appreciated that the user interface 200 described herein may be utilized for a single software application such as a word processing application 125, a slide presentation application 135, a spreadsheet application 140, a database application 145, or any other software application which may utilize a user interface for allowing users to apply functionality of the associated application.


Referring still to FIG. 2, adjacent to the first section 210 of the user interface 200 is a task-based tab section. The tab section includes selectable tabs associated with task-based functionality provided by a given software application. For purposes of example, the task-based tabs illustrated in FIG. 2 are associated with tasks that may be performed using a word processing application 125. For example, a “Writing” tab 215 is associated with functionality that may be utilized for performing writing tasks. An “Insert” tab 220 is associated with functionality for performing insert operations or tasks. A “Page Layout” tab 230 is associated with functionality provided by the associated application for performing or editing page layout attributes of a given document.


As should be appreciated, many other task-based tabs or selectable controls may be added to the tab section of the user interface for calling functionality associated with other tasks. For example, task tabs may be added for text effects, document styles, review and comment, and the like. And, as described above, the user interface 200 may be utilized for a variety of different software applications. For example, if the user interface 200 is utilized for a slide presentation application, tabs contained in the tab section may include such tabs as “Create Slides,” “Insert,” “Format,” “Drawing,” “Effects,” and the like associated with a variety of tasks that may be performed by a slide presentation application. Similarly, tabs that may be utilized in the tab section of the user interface 200 for a spreadsheet application 140 may include such tabs as “Data” or “Data Entry,” “Lists,” “Pivot Tables,” “Analysis,” “Formulas,” “Pages and Printing,” and the like associated with tasks that may be performed using a spreadsheet application.


Immediately beneath the generic controls section 210 and the task-based tab section is a selectable functionality control section for displaying selectable functionality controls associated with a selected tab 215, 220, 230 from the task-based tab section. According to embodiments of the present invention, when a particular tab, such as the “Writing” tab 215, is selected, selectable functionality available from the associated software application for performing the selected task, for example a writing task, is displayed in logical groupings. For example, referring to FIG. 2, a first logical grouping 240 is displayed under a heading “Clipboard.” According to embodiments of the present invention, the clipboard section 240 includes selectable functionality controls logically grouped together and associated with clipboard actions underneath the general task of writing. For example, the clipboard section 240 may include such selectable controls as a cut control, a copy control, a paste control, a select all control, etc. Adjacent to the clipboard section 240, a second logical grouping 250 is presented under the heading “Formatting.”


Selectable controls presented in the “Formatting” section 250 may include such selectable controls as text justification, text type, font size, line spacing, boldface, italics, underline, etc. Accordingly, functionalities associated with formatting operations are logically grouped together underneath the overall task of “Writing.” A third logical grouping 260 is presented under the heading “Writing Tools.” The writing tools section 260 includes such writing tools as find/replace, autocorrect, etc. According to embodiments of the present invention, upon selection of a different task-based tab from the tab section, a different set of selectable functionality controls in different logical groupings is presented in the user interface 200 associated with the selected task-based tab. For example, if the “Insert” task tab 220 is selected, the selectable functionality controls presented in the user interface 200 are changed from those illustrated in FIG. 2 to include selectable functionality controls associated with the insert task. For detailed information regarding the user interface 200, illustrated in FIG. 2, see U.S. patent application Ser. No. 12/372,386, entitled “Command User Interface for Displaying Selectable Software Functionality Controls,” which is incorporated herein by reference as if fully set out herein.
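
As a rough, non-authoritative illustration of the tab-and-grouping arrangement described above, the following sketch models task-based tabs containing logical groupings of controls, with tab selection swapping the displayed groupings. The tab and control names echo the examples in this description; the types and function are hypothetical.

```typescript
// Hypothetical model of the ribbon-shaped user interface; names are illustrative only.
interface ControlGroup {
  heading: string;        // e.g. "Clipboard", "Formatting", "Writing Tools"
  controlIds: string[];   // e.g. ["cut", "copy", "paste"]
}

interface TaskTab {
  label: string;          // e.g. "Writing", "Insert", "Page Layout"
  groups: ControlGroup[];
}

const writingTab: TaskTab = {
  label: "Writing",
  groups: [
    { heading: "Clipboard", controlIds: ["cut", "copy", "paste", "selectAll"] },
    { heading: "Formatting", controlIds: ["font", "fontSize", "bold", "italic", "underline"] },
    { heading: "Writing Tools", controlIds: ["findReplace", "autocorrect"] },
  ],
};

// Selecting a different tab replaces the displayed groupings with those of
// the newly selected tab.
function selectTab(tabs: TaskTab[], label: string): ControlGroup[] {
  const tab = tabs.find((t) => t.label === label);
  return tab ? tab.groups : [];
}
```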


Referring to FIG. 3, an improved context menu according to embodiments of the present invention is illustrated. A document including an embedded picture object 310 is illustrated in a word processing application workspace. According to embodiments of the present invention, the context menu 320 may be launched adjacent to or near a selected object through a variety of methods. One method of launching the context menu 320 is to focus the mouse cursor on the desired object and then click the right mouse button. Other methods may be used for launching the context menu 320, including focusing on the selected object for more than a set amount of time, selecting a button from the user interface 200 programmed for launching the context menu 320, or selecting keyboard keys (for example, F1) programmed for launching the context menu. The modeless nature of the context menu allows the menu to stay visible while changes are made to an underlying object or to the software application providing the context menu. That is, display of the context menu of the present invention does not block execution of the application with which an associated object is edited and with which the menu is displayed.
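
The launch behaviors just described (a right-click on the object, dwelling on the object for a set time, or a programmed keyboard key) might be wired up roughly as in the sketch below. This assumes a DOM-style event API purely for illustration; the 1500 ms dwell time and the F1 binding are arbitrary stand-ins, not values taken from the patent.

```typescript
// Hedged sketch: assumes a DOM-like environment; the dwell time and key
// binding are illustrative choices only.
function attachContextMenuTriggers(
  objectElement: HTMLElement,
  showMenu: (x: number, y: number) => void
): void {
  // Right-click on the object launches the menu next to the cursor.
  objectElement.addEventListener("contextmenu", (e: MouseEvent) => {
    e.preventDefault();
    showMenu(e.clientX, e.clientY);
  });

  // Dwelling on the object for a set amount of time also launches the menu.
  let dwellTimer: number | undefined;
  objectElement.addEventListener("mouseenter", (e: MouseEvent) => {
    dwellTimer = window.setTimeout(() => showMenu(e.clientX, e.clientY), 1500);
  });
  objectElement.addEventListener("mouseleave", () => window.clearTimeout(dwellTimer));

  // A programmed keyboard key launches the menu near the object.
  objectElement.addEventListener("keydown", (e: KeyboardEvent) => {
    if (e.key === "F1") {
      const rect = objectElement.getBoundingClientRect();
      showMenu(rect.right, rect.top);
    }
  });
}
// Because the menu is rendered as an ordinary overlay rather than a modal
// dialog, the host application keeps processing edits while it is visible.
```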


The context menu 320 includes selectable functionality controls that are relevant to editing the selected object in the selected document. That is, the context menu 320 is populated with one or more selectable functionality controls that may be utilized for editing a particular selected object in a selected document. For example, referring to the context menu 320 illustrated in FIG. 3, the context menu is launched in the context of a selected picture object 310. Accordingly, the selectable functionality controls, such as the paste control, copy control, position control, reset picture control, and the like provide functionality to a user for editing attributes of the selected picture object 310. As should be understood by those skilled in the art, if the context menu 320 is launched in the context of another type of object, then the selectable functionality controls populated in the context menu 320 will be related to the other type of object. For example, a context menu 320 launched in the context of a text object will be populated with functionality controls utilized for editing a text selection.
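
One plausible way to picture this population step is as a lookup keyed on the selected object's type, as in the hypothetical sketch below. The control names mirror the examples given in this description; the registry itself is an assumption, not the described implementation.

```typescript
// Hypothetical registry keyed by object type; the control names mirror the
// examples in the description, while the structure is only an illustration.
const controlsByObjectType: Record<string, string[]> = {
  picture: ["paste", "copy", "cut", "position", "resetPicture"],
  text: ["font", "paragraph", "bulletsAndNumbering", "bold", "italic", "underline"],
  row: ["insertRows", "deleteRows", "mergeCells", "distributeRowsEvenly"],
};

// Populates the context menu with the controls relevant to the selected object.
function populateContextMenu(objectType: string): string[] {
  return controlsByObjectType[objectType] ?? [];
}
```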


The one or more selectable controls displayed in the context menu represent a subset of a plurality of functionalities available for use with a selected object. According to an embodiment, the subset of functionalities represented by the one or more controls is selected based on the likelihood of immediate usefulness to end users, as determined from historical use. The subset of the plurality of functionalities may comprise one or more of the plurality of functionalities that are at least substantially the most used in editing an object of the same type as the selected object. Additionally, the one or more selectable functionality controls may be displayed in the user interface hierarchically based on frequency of previous use or according to other ordering criteria.
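
A simple way to realize a "most used" subset ordered by frequency of previous use is to keep per-control usage counts and take the top entries, as in the sketch below. The source of the counts and the cutoff of eight controls are assumptions made only for illustration.

```typescript
// Sketch only: usage counts would come from recorded user history; the cutoff
// of eight controls is an arbitrary illustrative choice.
interface UsageRecord {
  controlId: string;
  uses: number; // how often this control was previously applied to this object type
}

function mostUsedControls(history: UsageRecord[], limit = 8): string[] {
  return [...history]
    .sort((a, b) => b.uses - a.uses) // most frequently used first
    .slice(0, limit)
    .map((r) => r.controlId);
}

// Example: for picture objects, heavily used controls surface at the top.
const pictureHistory: UsageRecord[] = [
  { controlId: "copy", uses: 120 },
  { controlId: "resetPicture", uses: 5 },
  { controlId: "position", uses: 42 },
];
console.log(mostUsedControls(pictureHistory)); // ["copy", "position", "resetPicture"]
```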


According to embodiments of the present invention, the improved context menu 320 includes rich functionality controls such as the paste control, the copy control, the cut control, and the picture orientation controls illustrated in the upper portion of the context menu 320. In addition, a button 325 is illustrated along a bottom edge of the context menu 320 for allowing a user to selectively display an enhanced listing of tools available for editing the selected object. According to one aspect, the button 325 may be colored differently from other controls in the menu 320 to distinguish the button 325 from other controls. As should be understood, the “Show Picture Tools” button 325 is illustrative of a similar button that may be used in other context menus 320 associated with other editing contexts such as text objects, table objects, spreadsheet objects, and the like.
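
A control like the "Show Picture Tools" button 325 can be pictured as opening a secondary surface that lists every control applicable to the object type, rather than only the most-used subset shown in the menu. The following sketch is hypothetical; its names and structure are not drawn from the patent.

```typescript
// Illustrative only: the pane structure and names are invented for this sketch.
interface ToolsPane {
  title: string;        // e.g. "picture tools"
  controlIds: string[]; // the full set of controls for the object type
}

function showAllTools(
  objectType: string,
  fullControlSets: Map<string, string[]>
): ToolsPane {
  return {
    title: `${objectType} tools`,
    controlIds: fullControlSets.get(objectType) ?? [],
  };
}
```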


A context identification and selection tab 328 is illustrated along an upper edge of the context menu 320. The context identification and selection tab 328 both identifies the current context of the context menu 320 and allows the user to change the context of the context menu 320 to provide selectable functionality controls associated with a different editing context in the selected document. For example, referring to FIG. 3, the selected document includes both a picture object and a text object. According to the context identification and selection tab 328, the present context of the context menu 320 is a “Picture” context meaning that the selectable functionality controls displayed in the context menu 320 are associated with functionality for editing a selected picture object. If the user desires to change the context of the context menu 320 to a text editing context, for example, the user may select the context identification and selection tab 328 to drop down a list of available contexts that may be applied to the context menu 320. The user may then select a text context to change the context of the context menu 320 so that selectable functionality controls that may be utilized for text editing will be displayed in the context menu 320.
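
The behavior of the context identification and selection tab 328 can be sketched as a labeled selector whose entries are the other editable contexts detected in the document, where choosing an entry repopulates the menu. The sketch below is illustrative only, and all identifiers are assumptions.

```typescript
// Illustrative sketch of the context identification and selection behavior;
// not the patented implementation.
interface ContextTabState {
  current: string;     // e.g. "Picture"
  available: string[]; // e.g. ["Picture", "Text", "Table"]
}

interface MenuState {
  tab: ContextTabState;
  controls: string[];  // controls shown for the current context
}

function changeContext(
  menu: MenuState,
  selected: string,
  controlsFor: (context: string) => string[]
): MenuState {
  // Ignore selections that are not among the contexts detected in the document.
  if (!menu.tab.available.includes(selected)) return menu;
  return {
    tab: { ...menu.tab, current: selected },
    controls: controlsFor(selected),
  };
}
```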


Referring now to FIG. 4, a pop-out visual picker gallery of images is illustrated adjacent to the context menu 320. According to embodiments of the present invention, selection of certain selectable functionality controls results in a pop-out menu, such as the menu 330, for providing additional selectable functionality controls to the user. The visual picker display 330 provides a gallery of images showing the result of applying a variety of formatting options combinations to the selected object. For example, an image 335 illustrates the way the document will look if the picture object 310 is centered and enlarged. The image 340 illustrates how the document will look if the picture object is moved to a top-left position in the document, and the image 345 illustrates how the document will look if text is positioned both above and below a centered picture object 310. According to embodiments of the present invention, all commands necessary for formatting the selected document according to one of the displayed images 335, 340, 345 are associated with the individual images so that selecting a given image automatically causes the formatting options combination illustrated thereby to be executed on the selected document. For detailed information regarding the gallery of images 330, see U.S. patent application Ser. No. 10/955,942, entitled “An Improved User Interface For Displaying A Gallery Of Formatting Options Applicable To A Selected Object,” which is incorporated herein by reference as if fully set out herein.
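
The gallery behavior, in which selecting a preview image executes the entire combination of formatting commands it depicts, can be sketched as follows. The command structure is an assumption for illustration and does not reflect the referenced application's actual interfaces.

```typescript
// Sketch of a formatting gallery: each preview image carries the set of
// formatting commands it depicts, so selecting the image applies them all.
// All names and commands are illustrative assumptions.
interface FormattingCommand {
  name: string;                   // e.g. "centerPicture", "wrapTextAbove"
  run: (target: unknown) => void; // applies one formatting attribute
}

interface GalleryEntry {
  thumbnailUrl: string;           // preview of the resulting layout
  caption: string;                // textual identification of the combination
  commands: FormattingCommand[];
}

function applyGallerySelection(entry: GalleryEntry, target: unknown): void {
  // Selecting the image executes every command in its combination.
  for (const command of entry.commands) {
    command.run(target);
  }
}
```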


Referring now to FIG. 5, presentation of an additional menu of selectable functionality controls is illustrated adjacent to the context menu 320 in response to selecting a functionality control from the context menu 320. As should be appreciated by those skilled in the art, the pop-out menu 520 contains additional selectable functionality controls that may be selected by the user to apply identified functionality to a selected document or object. As shown in FIG. 6, upon selection of a given functionality control from the pop-out menu 620, for example the “3D” functionality control, an additional pop-out menu 640 is provided for displaying additional functionality associated with the selected control from the pop-out menu 620. As illustrated in FIG. 6, the “3D” control 630 is selected, which causes a pop-out gallery of images 640 to be displayed. As described above with reference to FIG. 4, each of the gallery of images 640 illustrates how the selected picture object will look if a formatting options combination associated with a selected image from the gallery of images 640 is applied to the selected object 310.


Referring now to FIG. 7, the context menu 320 is illustrated adjacent to the selected picture object 310 and over a text object where the picture object 310 and the text object are placed in a table structure. Because the context menu 320 is launched in the context of the selected picture object 310, the context menu 320 is still in a picture context and still has selectable functionality controls relevant to editing a picture object 310. However, referring to FIG. 8, if the user elects to apply functionality from the context menu 320 to the text object contained in the table object 700, the user may change the context of the context menu 320 from a picture context to a text context. By selecting the context identification and selection tab 328, as described above with reference to FIG. 3, and by selecting a text context, the context menu 320 is changed so that the selectable functionality controls displayed in the context menu 320 are in the context of a text selection, as illustrated in FIG. 8. That is, the context menu 320 illustrated in FIG. 8 includes such text-oriented selectable functionality controls as font, paragraph, bullets and numbering, boldfacing, italics, underlining, and the like.


Referring now to FIG. 9, if the user now elects to change the context of the context menu 320 once again so that the functionality controls displayed in the context menu 320 are associated with yet another context, for example the table object 700, the user may once again select the context identification and selection tab 328 to change the context of the context menu 320 from a text context illustrated in FIG. 8 to a table or row context illustrated in FIG. 9. Upon changing the context of the context menu 320 to a table or row context for editing a row 910 of the table object 700, the selectable functionality displayed in the context menu 320 is changed as illustrated in FIG. 9. For example, after changing the context of the context menu 320 to a row context, such selectable functionality controls as “Insert Rows,” “Delete Rows,” “Merge Cells,” “Distribute Rows Evenly,” and the like are displayed in the context menu 320. As should be understood, the example functionality controls illustrated in the context menus 320 described herein and the example picture and text objects illustrated herein are for purposes of example only and are not restrictive of the invention as claimed herein. That is, the context menu 320 may be displayed according to a variety of different editing contexts, and a variety of different selectable functionality controls may be displayed in the context menu 320 according to the associated editing context.


As described herein, an improved user interface for displaying selectable functionality controls in a context menu is provided. It will be apparent to those skilled in the art that various modifications or variations may be made in the present invention without departing from the scope or spirit of the invention. Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein.

Claims
  • 1. A method for providing a contextually relevant user interface, the method comprising: upon receiving a selection of a first object for editing within a document, displaying the contextually relevant user interface adjacent to the selected first object; displaying, in the user interface, a first context identification control for identifying a first context of the user interface based on at least one first attribute of the selected first object; displaying, in the user interface, a second context changing control for changing the first context of the user interface to a second context of the user interface based on at least one second attribute of a second object within the document, the second control being operative to provide a plurality of available contexts simultaneously, each of the plurality of available contexts being associated with an attribute of an unselected object within the document; and displaying, in the user interface, at least one selectable control representing a first subset of a plurality of functionalities, wherein the first subset of the plurality of functionalities comprises at least a first functionality that is at least substantially used in editing objects of a same type as the selected first object.
  • 2. The method of claim 1, further comprising displaying the at least one selectable control in the user interface hierarchically based on frequency of previous use.
  • 3. The method of claim 1, further comprising, displaying, in response to a selection of the second control, the second context associated with the selection of the second control, the second context comprising a second subset of the plurality of functionalities, the second subset of the plurality of functionalities comprising functionalities that are at least substantially used in editing objects of a same type as the second object.
  • 4. The method of claim 1, further comprising displaying in the user interface a third control for causing a display, adjacent to the user interface, of a plurality of additional selectable controls operative to edit the selected first object.
  • 5. The method of claim 1, wherein displaying the at least one selectable control comprises displaying the at least one selectable control further representing functionality of the software operative to edit the selected first object.
  • 6. The method of claim 1, wherein displaying, in the user interface, the at least one selectable control representing the subset of a plurality of functionalities comprises displaying, in the user interface the at least one selectable control representing the subset of the plurality of functionalities associated with a picture object.
  • 7. The method of claim 6, wherein the subset of the plurality of functionalities associated with the picture object comprises functionalities associated with at least one of the following: a paste control, a copy control, a position control, and a reset control.
  • 8. The method of claim 1, further comprising providing a gallery of formatting images in response to a selection of the at least one selectable control, the gallery of formatting comprising at least one formatting image associated with at least one corresponding formatting attribute to be applied to the selected first object, wherein the at least one formatting image provides a visual representation of how the selected object would appear if the at least one formatting image is selected and the at least one corresponding formatting attribute is applied to the selected first object, the at least one formatting image comprising a textual identification of the at least one corresponding formatting attribute.
  • 9. The method of claim 8, further comprising applying the at least one corresponding formatting attribute associated with the at least one formatting image to the selected first object in response to a selection of the at least one formatting image.
  • 10. The method of claim 9, wherein providing the gallery of formatting images includes providing the gallery of formatting images within a second user interface deployed adjacent to the contextually relevant user interface.
  • 11. A system for providing an improved contextually relevant user interface, the system comprising: a memory storage; and a processing unit coupled to the memory storage, wherein the processing unit is operative to: upon receiving a selection of a first object for editing within a document, display the contextually relevant user interface adjacent to the selected first object; display, in the user interface, a context identification tab having a first context identifying control for identifying a first context of the user interface based on at least one first attribute of the selected first object and changing the first context to a second context of the user interface based on at least one second attribute of a second object within the document, the context identification tab being operative to provide a plurality of available contexts simultaneously which, upon selection of one of the plurality of available contexts, causes a replacement of the first context populating the user interface with the second context, each of the available contexts being associated with an attribute of an unselected object in the document; and display, in the user interface, at least one selectable control representing a subset of a plurality of functionalities, wherein the subset of the plurality of functionalities comprises at least one functionality that is at least substantially used in editing objects of a same type as the selected first object.
  • 12. The system of claim 11, further comprising the processing unit being operative to provide a gallery of formatting images in response to a selection of the at least one selectable control, the gallery of formatting comprising at least one formatting image associated with at least one corresponding formatting attribute to be applied to the selected first object, wherein the at least one formatting image provides a visual representation of how the selected object would appear if the at least one formatting image is selected and the at least one corresponding formatting attribute is applied to the selected first object, the at least one formatting image comprising a textual identification of the at least one corresponding formatting attribute.
  • 13. The system of claim 11, further comprising the processing unit being operative to display a third control for selectively causing a display, adjacent to the user interface, of additional selectable controls operative to edit the selected first object.
  • 14. The system of claim 11, wherein the subset of the plurality of functionalities is associated with a picture object.
  • 15. The system of claim 14, wherein the subset of the plurality of functionalities associated with the picture object comprises functionalities associated with at least one of the following: a paste control, a copy control, a position control, and a reset control.
  • 16. A computer readable storage medium storing computer executable instructions which when executed by a computer perform a method for providing a contextually relevant user interface, the method executed by the computer executable instructions comprising: upon receiving a selection of a first object for editing within a document, displaying the contextually relevant user interface adjacent to the selected first object; displaying, in the user interface, a first context identification control for identifying a first context of the user interface based on at least one first attribute of the selected first object; displaying, in the user interface, a second context changing control for changing the first context of the user interface to a second context of the user interface based on at least one second attribute of a second object within the document, the second control being operative to provide a plurality of available contexts simultaneously which, upon selection of one of the plurality of available contexts, causes a replacement of the first context populating the user interface with the second context, each of the plurality of available contexts being associated with an attribute of an unselected object within the document; and displaying, in the user interface, at least one selectable control representing a subset of a plurality of functionalities, wherein the subset of the plurality of functionalities comprises at least a first functionality of the plurality of functionalities that is at least substantially the most used in editing objects of a same type as the selected first object.
  • 17. The computer readable storage medium of claim 16, further comprising displaying the at least one selectable control in the user interface hierarchically based on frequency of previous use.
  • 18. The computer readable storage medium of claim 16, further comprising, displaying, in response to a selection of the second control, the second context associated with the selection of the second control, the second context comprising a second subset of the plurality of functionalities, the second subset of the plurality of functionalities comprising functionalities that are at least substantially used in editing objects of a same type as the second object.
  • 19. The computer readable storage medium of claim 16, further comprising displaying in the user interface a third control for causing a display, adjacent to the user interface, of a plurality of additional selectable controls operative to edit the selected first object.
  • 20. The computer readable storage medium of claim 16, wherein displaying the at least one selectable control comprises displaying the at least one selectable control further representing functionality of the software operative to edit the selected first object.
  • 21. The computer readable storage medium of claim 16, wherein displaying, in the user interface, the at least one selectable control representing the subset of a plurality of functionalities comprises displaying, in the user interface the at least one selectable control representing the subset of the plurality of functionalities associated with a picture object.
  • 22. The computer readable storage medium of claim 21, wherein the subset of the plurality of functionalities associated with the picture object comprises functionalities associated with at least one of the following: a paste control, a copy control, a position control, and a reset control.
  • 23. The computer readable storage medium of claim 16, further comprising providing a gallery of formatting images in response to a selection of the at least one selectable control, the gallery of formatting comprising at least one formatting image associated with at least one corresponding formatting attribute to be applied to the selected first object, wherein the at least one formatting image provides a visual representation of how the selected object would appear if the at least one formatting image is selected and the at least one corresponding formatting attribute is applied to the selected first object, the at least one formatting image comprising a textual identification of the at least one corresponding formatting attribute.
  • 24. The computer readable storage medium of claim 16, further comprising applying the at least one corresponding formatting attribute associated with the at least one formatting image to the selected first object in response to a selection of the at least one formatting image.
  • 25. The computer readable storage medium of claim 24, wherein providing the gallery of formatting images includes providing the gallery of formatting images within a second user interface deployed adjacent to the contextually relevant user interface.
  • 26. A computer readable storage medium storing computer executable instructions which when executed by a computer perform a method for providing a contextually relevant user interface, the method executed by the computer executable instructions comprising: upon receiving a selection of a first object for editing within a document, displaying a first user interface near the selected first object in a software application workspace, wherein receiving the selection comprises one of the following: detecting that a cursor is focused on the first object, receiving a cursor selection of the first object, and receiving a keyboard shortcut; persisting a display of the contextually relevant user interface without preventing the software application workspace from executing further operations on the selected first object; displaying in the first user interface a first context identification control for identifying a context of the user interface based on at least one first attribute of the selected first object; displaying in the first user interface a first set of selectable controls representing a first subset of a plurality of functionalities, the first subset of the plurality of functionalities comprising at least one first functionality that is at least used in editing objects of a same type as the selected first object, wherein displaying the first set of selectable controls representing the first subset of the plurality of functionalities comprises displaying the first set of selectable controls hierarchically based on frequency of previous use; displaying in the first user interface a second context changing control for changing the context of the user interface for editing a second object within the document, the second control being operative to, upon selection, provide a drop-down listing of a plurality of available contexts simultaneously which, upon selection of one of the plurality of available contexts, causes a replacement of a current context populating the user interface with the selected available context, each of the plurality of available contexts being associated with an attribute of an unselected object in the document; displaying in the first user interface a second set of selectable controls representing a second subset of the plurality of functionalities in response to a selection of an available context provided by the second control for changing the context of the first user interface, the second subset of the plurality of functionalities comprising at least one second functionality that is at least substantially used in editing objects of a same type as the available object associated with the selected available context; displaying in the first user interface a third control for causing a display in a second user interface of a third set of selectable controls operative to edit the selected first object; displaying in the second user interface, in response to a selection of the third control, the third set of selectable controls; providing a gallery of formatting images in response to a selection of at least one selectable control of the third set of selectable controls, the gallery of formatting images comprising at least one formatting image associated with at least one corresponding formatting attribute to be applied to an applicable one of the first object and the second object upon selection of the at least one formatting image, wherein providing the gallery of formatting images includes providing the gallery of formatting images within the second user interface, and wherein the at least one formatting image provides a visual representation of how the applicable object would appear if the formatting image is selected and the at least one corresponding formatting attribute is applied to the applicable object, the at least one formatting image comprising a textual identification of the at least one corresponding formatting attribute; and applying the at least one corresponding formatting attribute associated with the at least one formatting image to the applicable object in response to a selection of the at least one formatting image.
  • 27. The computer readable storage medium of claim 26, wherein displaying, in the first user interface, the first set of selectable controls representing the first subset of a plurality of functionalities comprises displaying, in the user interface the first set of selectable controls representing the first subset of the plurality of functionalities associated with a picture object.
  • 28. The computer readable storage medium of claim 27, wherein the first subset of the plurality of functionalities associated with the picture object comprises functionalities associated with at least one of the following: a paste control, a copy control, a position control, and a reset control.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority to U.S. Provisional Application No. 60/601,815, filed Aug. 16, 2004, entitled “Improved User Interfaces for Computer Software Applications.”

20040122789 Ostertag et al. Jun 2004 A1
20040125142 Mock et al. Jul 2004 A1
20040128275 Moehrle Jul 2004 A1
20040133854 Black Jul 2004 A1
20040142720 Smethers Jul 2004 A1
20040153968 Ching et al. Aug 2004 A1
20040164983 Khozai Aug 2004 A1
20040168153 Marvin Aug 2004 A1
20040186775 Margiloff et al. Sep 2004 A1
20040189694 Kurtz et al. Sep 2004 A1
20040192440 Evans et al. Sep 2004 A1
20040215612 Brody Oct 2004 A1
20040221234 Imai Nov 2004 A1
20040230508 Minnis et al. Nov 2004 A1
20040230906 Pik et al. Nov 2004 A1
20040239700 Bacshy Dec 2004 A1
20040243938 Weise et al. Dec 2004 A1
20040260756 Forstall et al. Dec 2004 A1
20040261013 Wynn et al. Dec 2004 A1
20040268231 Tunning Dec 2004 A1
20040268270 Hill et al. Dec 2004 A1
20050004989 Satterfield et al. Jan 2005 A1
20050004990 Durazo et al. Jan 2005 A1
20050005235 Satterfield et al. Jan 2005 A1
20050005249 Hill et al. Jan 2005 A1
20050010871 Ruthfield et al. Jan 2005 A1
20050021504 Atchison Jan 2005 A1
20050022116 Bowman et al. Jan 2005 A1
20050033614 Lettovsky et al. Feb 2005 A1
20050039142 Jalon et al. Feb 2005 A1
20050043015 Muramatsu Feb 2005 A1
20050044500 Orimoto et al. Feb 2005 A1
20050055449 Rappold, III Mar 2005 A1
20050057584 Gruen et al. Mar 2005 A1
20050086135 Lu Apr 2005 A1
20050091576 Relyea et al. Apr 2005 A1
20050097465 Giesen et al. May 2005 A1
20050114778 Branson et al. May 2005 A1
20050117179 Ito et al. Jun 2005 A1
20050132010 Muller Jun 2005 A1
20050132053 Roth et al. Jun 2005 A1
20050138576 Baumert et al. Jun 2005 A1
20050144241 Stata et al. Jun 2005 A1
20050144284 Ludwig et al. Jun 2005 A1
20050144568 Gruen et al. Jun 2005 A1
20050172262 Lalwani Aug 2005 A1
20050177789 Abbar et al. Aug 2005 A1
20050183008 Crider et al. Aug 2005 A1
20050203975 Jindal et al. Sep 2005 A1
20050216863 Schumacher et al. Sep 2005 A1
20050223066 Buchheit et al. Oct 2005 A1
20050223329 Schwartz et al. Oct 2005 A1
20050234910 Buchheit et al. Oct 2005 A1
20050251757 Farn Nov 2005 A1
20050256867 Walther et al. Nov 2005 A1
20050278656 Goldthwaite et al. Dec 2005 A1
20050289109 Arrouye et al. Dec 2005 A1
20050289156 Maryka et al. Dec 2005 A1
20050289158 Weiss et al. Dec 2005 A1
20060015816 Kuehner et al. Jan 2006 A1
20060020962 Stark Jan 2006 A1
20060026033 Brydon et al. Feb 2006 A1
20060026213 Yaskin et al. Feb 2006 A1
20060026242 Kuhlmann et al. Feb 2006 A1
20060036580 Stata Feb 2006 A1
20060036945 Radtke et al. Feb 2006 A1
20060036946 Radtke et al. Feb 2006 A1
20060036950 Himberger et al. Feb 2006 A1
20060036964 Satterfield et al. Feb 2006 A1
20060036965 Harris et al. Feb 2006 A1
20060041545 Heidloff et al. Feb 2006 A1
20060047644 Bocking et al. Mar 2006 A1
20060064434 Gilbert et al. Mar 2006 A1
20060069604 Leukart et al. Mar 2006 A1
20060069686 Beyda et al. Mar 2006 A1
20060080303 Sargent et al. Apr 2006 A1
20060095865 Rostom May 2006 A1
20060101051 Carr et al. May 2006 A1
20060101350 Scott May 2006 A1
20060111931 Johnson et al. May 2006 A1
20060117249 Hu et al. Jun 2006 A1
20060117302 Mercer et al. Jun 2006 A1
20060129937 Shafron Jun 2006 A1
20060132812 Barnes et al. Jun 2006 A1
20060155689 Blakeley et al. Jul 2006 A1
20060161849 Miller et al. Jul 2006 A1
20060161863 Gallo Jul 2006 A1
20060168522 Bala Jul 2006 A1
20060173824 Bensky Aug 2006 A1
20060173961 Turski et al. Aug 2006 A1
20060218500 Sauve et al. Sep 2006 A1
20060242557 Nortis, III Oct 2006 A1
20060242575 Winser Oct 2006 A1
20060248012 Kircher et al. Nov 2006 A1
20060259449 Betz et al. Nov 2006 A1
20060271869 Thanu et al. Nov 2006 A1
20060271910 Burcham et al. Nov 2006 A1
20060282817 Darst et al. Dec 2006 A1
20060294452 Matsumoto Dec 2006 A1
20060294526 Hambrick et al. Dec 2006 A1
20070006206 Dhanjal et al. Jan 2007 A1
20070050182 Sneddon et al. Mar 2007 A1
20070050401 Young et al. Mar 2007 A1
20070055936 Dhanjal et al. Mar 2007 A1
20070055943 McCormack et al. Mar 2007 A1
20070061306 Pell et al. Mar 2007 A1
20070061307 Hartwell et al. Mar 2007 A1
20070061308 Hartwell et al. Mar 2007 A1
20070061738 Taboada et al. Mar 2007 A1
20070106951 McCormack et al. May 2007 A1
20070143662 Carlson et al. Jun 2007 A1
20070143671 Paterson et al. Jun 2007 A1
20070180040 Etgen et al. Aug 2007 A1
20070185826 Brice et al. Aug 2007 A1
20070203991 Fisher et al. Aug 2007 A1
20070240057 Satterfield et al. Oct 2007 A1
20070260996 Jakobson Nov 2007 A1
20070279417 Garg et al. Dec 2007 A1
20070282956 Staats Dec 2007 A1
20070300168 Bosma et al. Dec 2007 A1
20080005686 Singh Jan 2008 A1
20080034304 Feuerbacher et al. Feb 2008 A1
20080040682 Sorenson et al. Feb 2008 A1
20080052670 Espinosa et al. Feb 2008 A1
20080104505 Keohane et al. May 2008 A1
20080109787 Wang et al. May 2008 A1
20080134138 Chamieh et al. Jun 2008 A1
20080141242 Shapiro Jun 2008 A1
20080155555 Kwong Jun 2008 A1
20080178110 Hill et al. Jul 2008 A1
20080244440 Bailey Oct 2008 A1
20090007003 Dukhon et al. Jan 2009 A1
20090012984 Ravid et al. Jan 2009 A1
20090083656 Dukhon et al. Mar 2009 A1
20090100009 Karp Apr 2009 A1
20090106375 Carmel et al. Apr 2009 A1
20090217192 Dean et al. Aug 2009 A1
20090222763 Dukhon et al. Sep 2009 A1
20090319619 Affronti Dec 2009 A1
20090319911 McCann Dec 2009 A1
20100060645 Garg et al. Mar 2010 A1
20100180226 Satterfield et al. Jul 2010 A1
20100191818 Satterfield et al. Jul 2010 A1
20100211889 Durazo et al. Aug 2010 A1
20100223575 Leukart et al. Sep 2010 A1
20100293470 Zhao et al. Nov 2010 A1
20110072396 Giesen et al. Mar 2011 A1
20110138273 Radtke et al. Jun 2011 A1
Foreign Referenced Citations (35)
Number Date Country
0 910 007 Apr 1999 EP
1 077 405 Feb 2001 EP
1 672 518 Jun 2001 EP
1 223 503 Jul 2002 EP
1 376 337 Feb 2004 EP
1 462 999 Sep 2004 EP
1 542 133 Jun 2005 EP
1 835 434 Sep 2007 EP
2391148 Jan 2004 GB
P 0027717 Mar 2011 ID
P 0027754 Mar 2011 ID
05-204579 Aug 1993 JP
06-342357 Dec 1994 JP
10-074217 Mar 1998 JP
10-326171 Dec 1998 JP
11-175258 Jul 1999 JP
2001-503893 Mar 2001 JP
2001-337944 Dec 2001 JP
2003-101768 Apr 2003 JP
2003-256302 Sep 2003 JP
2004-342115 Dec 2004 JP
2005-236089 Sep 2011 JP
4832024 Sep 2011 JP
10-2005-0023805 Mar 2005 KR
10-2005-0036702 Apr 2005 KR
1-2005-000404 Aug 2011 PH
WO 9904353 Jan 1999 WO
WO 9927495 Jun 1999 WO
WO 01055894 Aug 2001 WO
WO 02091162 Nov 2002 WO
WO 03003240 Sep 2003 WO
WO 03098500 Nov 2003 WO
WO 2007033159 Mar 2007 WO
WO 2007027737 Aug 2007 WO
WO 2008121718 Sep 2008 WO
Related Publications (1)
Number Date Country
20060036945 A1 Feb 2006 US
Provisional Applications (1)
Number Date Country
60601815 Aug 2004 US