Lightweight reference user interface

Information

  • Patent Grant
  • Patent Number
    7,992,085
  • Date Filed
    Tuesday, May 15, 2007
  • Date Issued
    Tuesday, August 2, 2011
Abstract
A lightweight information user interface is provided for displaying information about a focused-on (e.g., mouse-over) text item, data item or other object in an electronic document that minimizes interruption of workflow with the electronic document. Upon focus on a word or other object in an electronic document a quick look-up function may be invoked for retrieving information from a local or remote information source about the focused-on item. Retrieved information, for example, translations, dictionary definitions and research information, is displayed in close proximity to the focused-on item in a lightweight information user interface. Information may be displayed according to a variety of media types including text, audio, video, pictures, bitmap images, etc.
Description
BACKGROUND

Often when a user is reading, editing or otherwise reviewing an electronic document, the user finds he or she needs additional information about a word, name or other information contained in a given document. For example, the user may need contact information for a name contained in a document or electronic mail message. For another example, the user may need a translation of one or more words contained in a document or message written in a language other than the user's native language. For another example, the user may need research information about a company or other institution identified in a document or message.


According to prior methods and systems, the user typically must interrupt the flow of his or her work with the document to launch and utilize some type of external information or research tool. For example, the user may launch a contacts application to obtain contact information on a name contained in a document or message. The user may launch a dictionary or translation tool to obtain a definition or translation for one or more words contained in a document or message. The user may launch a research tool, for example, an Internet or intranet browsing application, associated with the application in use or separate from the application in use to obtain research information on one or more words, data items or objects contained or referenced in a document or message. While such methods and systems may provide the user with the desired information, the interruption to the user's work flow is cumbersome, time consuming and distracting, particularly when the user must obtain needed information many times for a given document or message.


It is with respect to these and other considerations that the present invention has been made.


SUMMARY

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.


The above and other problems are solved by methods, systems and computer products for connecting users with needed information via a lightweight user interface that minimizes interruption of workflow. According to an aspect of the invention, a lightweight information user interface is deployed in an electronic document or electronic mail message in close proximity to a word, data item or other object for providing quick access to information about the word, data item or other object. The lightweight information user interface is initially deployed according to a default size, but the user interface may be selectively expanded to provide additional information.


According to other aspects of the invention, the lightweight information user interface is automatically deployed with information about a selected text item, data item or other object. The lightweight information user interface may be deployed by selecting deployment from a menu or in response to a keyboard accelerator combination, for example, ALT or CTRL key plus mouse click.


According to other aspects of the invention, information about a selected text item, data item or other object may be obtained for display in the lightweight information user interface from local or remote information sources. According to one aspect, a selected text or data item may be “smart tagged” as a particular text or data type by sending a selected text or data item to a recognizer module for identification. Once the selected text or data item is recognized as belonging to a particular type, information associated with the identified type may be provided in the lightweight user interface. For example, if a given word is identified as a name, the word may then be used to obtain contact information from a local or remote contacts information source.


According to another aspect of the invention, information may be obtained for the lightweight information user interface via an Extensible Markup Language (XML) protocol. According to this aspect, an XML-based information query is utilized for obtaining a limited amount of information associated with a selected text item, data item or object for increasing speed and efficiency of information retrieval.


According to another aspect, a lightweight information user interface may automatically provide functionality on a focused-on word, data item or object, including automatic conversion of a word or text string from one language to another language. In addition, information on a focused-on word, data item or object may be provided according to a variety of media types, including text, audio, video, music, bitmap images, etc. The multi-media information may be provided via an extended XML-based schema.


These and other features and advantages, which characterize the present invention, will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of the invention as claimed.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an exemplary computing operating environment for embodiments of the present invention.



FIG. 2 illustrates a computer screen display of an electronic document showing a lightweight information user interface in proximity to a selected text item.



FIG. 3 illustrates a lightweight information user interface in both a default size and an expanded size according to embodiments of the present invention.



FIG. 4 illustrates a drop-down menu for selecting deployment of a lightweight information user interface according to embodiments of the present invention.



FIG. 5 illustrates a simplified block diagram of a computing architecture for obtaining information for populating a lightweight information user interface according to embodiments of the present invention.



FIG. 6 illustrates a simplified block diagram of an XML-based information query for information required for populating a lightweight information user interface according to embodiments of the present invention.



FIG. 7 illustrates a computer screen display of an electronic document showing a lightweight information user interface deployed for language translation.



FIG. 8 illustrates a computer screen display of an electronic document showing a user interface component for selectively activating the lightweight information user interface of FIG. 7.



FIG. 9 illustrates a computer screen display of an electronic document showing a user interface component associated with a deployed context menu, the user interface component allowing activation of the lightweight information user interface of FIG. 7.



FIG. 10 illustrates a computer screen display of an electronic document showing a lightweight information user interface deployed with language translation information for a focused-on word.



FIG. 11 illustrates a computer screen display of an electronic document showing a lightweight information user interface deployed for language translation and showing icons for selectively receiving information according to different media types.





DETAILED DESCRIPTION

As briefly described above, embodiments of the present invention are directed to methods, systems and computer products for providing information via a lightweight information user interface about a focused-on or selected text item, data item or other object in an electronic document that minimizes interruption of workflow with the electronic document. In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustrations specific embodiments or examples. These embodiments may be combined, other embodiments may be utilized, and structural changes may be made without departing from the spirit or scope of the present invention. The following detailed description is therefore not to be taken in a limiting sense and the scope of the present invention is defined by the appended claims and their equivalents.


Referring now to the drawings, in which like numerals refer to like elements throughout the several figures, aspects of the present invention and an exemplary computing operating environment will be described. FIG. 1 and the following discussion are intended to provide a brief, general description of a suitable computing environment in which the invention may be implemented. While the invention will be described in the general context of program modules that execute in conjunction with an application program that runs on an operating system on a personal computer, those skilled in the art will recognize that the invention may also be implemented in combination with other program modules.


Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.


Embodiments of the invention may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process. The computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process.


With reference to FIG. 1, one exemplary system for implementing the invention includes a computing device, such as computing device 100. In a basic configuration, the computing device 100 typically includes at least one processing unit 102 and system memory 104. Depending on the exact configuration and type of computing device, the system memory 104 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two. System memory 104 typically includes an operating system 105 suitable for controlling the operation of a networked personal computer, such as the WINDOWS® operating systems from MICROSOFT CORPORATION of Redmond, Wash. The system memory 104 may also include one or more software applications 106 and may include program data 107. This basic configuration is illustrated in FIG. 1 by those components within dashed line 108.


According to embodiments of the invention, the application 106 may comprise many types of software applications, such as an electronic mail program, a calendaring program, an Internet browsing program, and the like. An example of such programs is OUTLOOK® manufactured by MICROSOFT CORPORATION. The application 106 may include a number of other types of software applications including a multiple-functionality software application for providing many other types of functionalities. Such a multiple-functionality application may include a number of program modules, such as a word processing program, a spreadsheet program, a slide presentation program, a database program, and the like. An example of such a multiple-functionality application is OFFICE® manufactured by MICROSOFT CORPORATION. According to embodiments of the present invention, the application 106 is illustrative of any software application with which an electronic document (including electronic mail messages) may be created or edited and in which a lightweight information user interface may be utilized for providing information associated with a selected text item, data item or other object in the electronic document.


The computing device 100 may have additional features or functionality. For example, the computing device 100 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 1 by removable storage 109 and non-removable storage 110. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. System memory 104, removable storage 109 and non-removable storage 110 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 100. Any such computer storage media may be part of device 100. Computing device 100 may also have input device(s) 112 such as keyboard, mouse, pen, voice input device, touch input device, etc. Output device(s) 114 such as a display, speakers, printer, etc. may also be included. These devices are well known in the art and need not be discussed at length here.


The computing device 100 may also contain communication connections 116 that allow the device to communicate with other computing devices 118, such as over a network in a distributed computing environment, for example, an intranet or the Internet. Communication connection 116 is one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.


The example user interface 200 shown in FIG. 2 is illustrative of an electronic document workspace provided by a software application 106, for example, a word processing application, a slide presentation application, a spreadsheet application, an electronic mail application, and the like, in which an electronic document or electronic mail message may be displayed. The user interface 200 includes one or more functionality tools, buttons or controls 205, 210, 215 for providing functionality of the application in use with text, data or other objects displayed in the workspace of the user interface 200. As should be appreciated, the example user interface 200 is for purposes of illustration and example only and is not limiting of the vast number of layouts and designs that may be utilized for a user interface 200 in which a text item, data item or other object may be displayed and with which the lightweight information user interface of the present invention may be utilized.


Referring still to FIG. 2, a text item 220 is displayed in the workspace of the user interface 200. As should be appreciated, the text item 220 is illustrative of text or data displayed in a word processing application user interface, a spreadsheet application user interface, a slide presentation user interface, or the text item 220 is illustrative of the text of an electronic mail message displayed in an electronic mail message display area of an electronic mail application user interface. According to embodiments of the present invention, the lightweight information user interface 230 is deployed in close proximity to a selected text item, data item or other object in an electronic document or electronic mail message for providing one or more types of information to a user about the selected item. For example, the user interface 230 may be utilized for providing a definition, translation or research information about a selected text item in a word processing document, slide presentation document or electronic mail message document. Similarly, the user interface 230 may be utilized for providing information about a numerical value or formula contained in a spreadsheet application document. Similarly, the user interface 230 may be utilized for providing information about a selected object, for example, a picture, shape or other object contained in an electronic document.


In general, it should be understood that items in a document or electronic mail for which information may be obtained according to the present invention include a wide range of document content such as text items, images, sounds, shapes, tables or other objects. Thus, any discussion of embodiments of the present invention with respect to a particular item, such as a text item, is for purposes of example and is not limiting of the wide range of document content for which information may be obtained as described herein.


An example use of the lightweight information user interface 230 is illustrated in FIG. 2 where the user interface 230 is deployed in close proximity to a selected text item 225, “John Brown.” According to embodiments of the present invention, the example text item 225 may have been selected because a user desires information about the selected text item, for example, a definition of the text item, a translation of the text item into a different language, or information about the text item such as contact information for the text item if the text item is a name or organizational information about the text item if the text item is a company or other institution name or symbol. According to the example lightweight information user interface 230, illustrated in FIG. 2, contact information for the name “John Brown” is obtained from a local or remote contacts file and is provided in the user interface 230 for providing the user quick and efficient contact information for the selected name without the need for interrupting the user's workflow to launch a separate contacts application.


As illustrated in FIG. 3, the user interface 230 may be deployed in a variety of display sizes. According to one embodiment, a default display size for the user interface 230 has a width of 200 pixels and a height of 100 pixels when information from a single information source is provided. If information from two or more information sources is provided, a default size for the user interface 230 may include a width of 300 pixels and a height of 150 pixels. As should be appreciated, however, a variety of other default sizes may be utilized for the user interface 230 depending on available display space and depending upon the desires of a given user or application developer.
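By way of illustration only, the default sizing rule described above may be sketched as follows; the function name and its use are illustrative assumptions rather than part of the described interface:

def default_pane_size(number_of_sources):
    # One information source: 200 x 100 pixel pane (width, height).
    # Two or more information sources: 300 x 150 pixel pane.
    if number_of_sources <= 1:
        return (200, 100)
    return (300, 150)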


An ellipsis control 330 is provided in the user interface 230 for selectively expanding the user interface 230 to the expanded version 335 illustrated in the lower portion of FIG. 3. According to one embodiment, the expanded user interface 335 may grow to any of a variety of sizes. According to a preferred embodiment, the user interface 230 may be expanded as required to fit displayed content up to a width of 600 pixels and a height of 480 pixels. According to this embodiment, a preferred width/height ratio of 2:1 is desired when determining a size to which to expand the user interface 230 for displaying additional information content. According to embodiments, information that will not fit in the display space of the user interface 230 prior to expanding the user interface 230 is truncated from display. Likewise, if the user interface 230 is expanded to the expanded version 335, information that still will not fit in the available display space is truncated from display. A user interface control 370 is provided in the expanded version 335 of the user interface 230 for collapsing the expanded version 335 back to the default size user interface 230.
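A sketch of the expansion behavior, assuming the content measurements are available in pixels; the 600 x 480 cap and the 2:1 preference come from the description above, while everything else is illustrative:

def expanded_pane_size(required_width, required_height):
    # Expand only as far as the displayed content requires, up to a
    # maximum of 600 x 480 pixels.
    width = min(required_width, 600)
    height = min(required_height, 480)
    # Prefer a 2:1 width/height ratio when choosing the expanded size.
    if width < 2 * height:
        width = min(2 * height, 600)
    # Content that still does not fit after expansion is truncated.
    return (width, height)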


According to embodiments of the present invention, information displayed in the lightweight information user interface 230, 335 may be prioritized for display. For example, contact information about a selected name may receive a first priority, electronic mail functionality associated with a selected name may receive a second priority and navigation to a personal website associated with the selected name may receive a third priority. For another example, if a dictionary definition is obtained for a selected word, a first dictionary definition may receive a first priority, a second dictionary definition may receive a second priority, and so on. For another example, if research information is obtained about a company stock symbol contained in a text document or spreadsheet document, a company name associated with the symbol may receive a first priority, and research information about the associated company may receive a second priority.


Information may be displayed in the user interface 230 according to priority level and according to available display space. That is, the display of the information may be prioritized according to a preferred display order where a most preferred information section is displayed first and a least preferred information section is displayed last. For example, priority one information may be displayed first, followed by priority two, and so on. If available display space dictates that displayed information must be truncated, as described above, the information having the lowest priority may be truncated first, leaving the maximum available display space for information having the highest priority. If information from every priority level fits in the available display space, then all of the information is displayed. In addition, information with different priority levels may be displayed in different sections of the user interface in the same manner as information from different sources, as described below. As should be appreciated, if the default size user interface 230 is expanded, as described above, then additional information is displayed as display space permits.
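The prioritized display and truncation just described might be expressed as in the following sketch, where each section is assumed to carry a priority and a measured display height:

def sections_to_display(sections, available_height):
    # 'sections' is a list of (priority, height_in_pixels, content) tuples.
    # Higher-priority information (priority one, then two, and so on) is
    # placed first; lower-priority sections are truncated when space runs out.
    shown = []
    used = 0
    for priority, height, content in sorted(sections, key=lambda s: s[0]):
        if used + height > available_height:
            break  # this and all lower-priority sections are truncated
        shown.append(content)
        used += height
    return shown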


When information is returned for display in the user interface 230, 335 from multiple information sources, information from the different information sources may be displayed in different sections of the user interface 230, 335. If the available display space in the user interface 230, 335 does not allow for display of all information for each section of information, one or more sections of information may be displayed, while other sections of information may be reduced or collapsed. Sectionalized information displayed in the user interface 230, 335 may be provided in different display orientations. For example, a first section of information may be displayed in the upper left-hand corner of the user interface 230, a second section of information may be displayed in the upper right-hand corner of the user interface 230, a third section of information may be displayed in the lower left-hand corner of the user interface 230, and so on. Alternatively, a first section of displayed information may be displayed in the upper left-hand corner of the user interface 230, a second section of information may be displayed in the lower left-hand corner of the user interface 230, a third section of information may be displayed in the upper right-hand corner of the user interface 230, and so on.


As illustrated in FIG. 3, the user interface 230 includes a first information section 315 containing example contact information for a named individual. Other sections of information 320, 325 are illustrated in a collapsed form so that enough display space for displaying the information section 315 is provided. According to an embodiment, selection of one of the collapsed sections of information, for example, section 320, causes full display of the information contained in section 320 and simultaneously causes a collapse of the presently displayed information section 315. If the user interface 230 is expanded to the expanded version 335, illustrated in FIG. 3, information from previously collapsed information sections may be fully displayed as display space permits. For example, referring to the expanded version 335, personal contact information for the example named person is provided in the left-hand side of the expanded user interface 330 to include a name 310, title 312, office department 313, company name 315 and other information. In a second section, on the right-hand side, the previously collapsed “contacts” section 320 and “email” section 325 are fully displayed for displaying free/busy information 337, contacts editing functionality 339, electronic mail functionality 338 and website access functionality 350. As should be appreciated, the information illustrated in the example user interface 230, 335 is for purposes of example only and is not restrictive of the different types of information that may be displayed in the user interface 230, 335 according to embodiments of the present invention.


Referring to the user interface 335, if information from one or more information sources is truncated because the information will not fully fit in the available display space, a numbered tab 340, 345 is provided for each truncated section of information. Selection of a displayed numbered tab causes a display of truncated information in the display space of the user interface 230, 335. For example, if a first section of information provides contact information for a selected name in a document and the contact information is truncated so that only a first half of the contact information is displayed, a numbered tab 340 is deployed for the truncated information. If the numbered tab 340 is subsequently selected, the truncated information, for example, the second half of the contact information, is displayed for review by the user. According to one embodiment, the number of deployed truncated information tabs cannot exceed the vertical length of the associated user interface 230, 335.


Content displayed in the lightweight information user interface 230, 335 may be displayed according to a variety of formatting properties. For example, the information may be displayed as rich text or other text display types, and the information may be displayed according to different font sizes, different font colors, text styles, etc. In addition, information displayed in the user interface 230, 335 may include images/icons, audio files, tables, hyperlinks to other content, hyperlinks to external files, functionality buttons or controls, forms or templates.


According to embodiments of the present invention, a search for information about a selected text item, data item or object may be initiated, and the lightweight user interface 230 may be deployed according to different means. According to a first embodiment, if an electronic document is being utilized in an edit mode, where normal edit functionality, for example, cut, copy, paste, formatting, and the like may be applied to content of the electronic document, the lightweight information user interface 230 is typically invoked and deployed by first selecting a text item, data item or object in the electronic document followed by a user action for initiating an information search on the selected item. As should be appreciated, a secondary user action, as described below, is required when the electronic document is in edit mode because a number of edit functionalities may be applied to a selected item in the electronic document 220 after selection of the item.


Referring back to FIG. 2, if the document 220 is being utilized in an edit mode, initiation of a search on the text item 225 and deployment of the user interface 230 containing information about the selected text item is performed by first selecting the text item, for example, by mouse-clicking over the text item, followed by a secondary user action. According to one embodiment, a quick information look-up may be initiated using a keyboard accelerator combination, for example, ALT plus mouse click or CTRL plus mouse click, or any other suitable keyboard accelerator combination programmed for initiating an information search on the selected item. According to another embodiment, a quick information look-up button or control may be positioned in a toolbar of buttons and controls such as the controls 205, 210, 215 illustrated in FIG. 2.
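In edit mode the look-up therefore requires a secondary action. A minimal sketch, assuming a hypothetical event object exposing modifier-key state and hypothetical quick_lookup and show_lightweight_ui helpers:

def on_item_click(event, selected_item):
    # ALT + mouse click or CTRL + mouse click initiates the quick look-up;
    # a plain click remains an ordinary editing selection.
    if event.alt_key or event.ctrl_key:
        information = quick_lookup(selected_item)        # query local/remote sources
        show_lightweight_ui(selected_item, information)  # deploy UI 230 near the item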


Referring to FIG. 4, according to another embodiment, a drop-down menu 400 may be deployed in the user interface 200 for containing various edit mode functionalities that may be selected for application to the selected item in the document 220. For example, upon selecting a particular item in the document 220, the user may perform many editing functions, for example, the cut function 410, the copy function 415, the paste function 420, and the like. In addition, a look-up function 425 is provided which when selected causes a pop-out menu 430 that provides additional look-up functionality. According to embodiments of the present invention, selection of the quick look-up function 435 causes initiation of an information search for retrieving information on the selected item for population in the lightweight information user interface 230, described herein. A research function 440 may be selected for obtaining exhaustive research information on the selected term via an external research tool. As should be appreciated, the menu 400 may be displayed according to a variety of mechanisms including pop-up dialog boxes, drop down menus or as a context menu that may be deployed upon selection of a text item, data item or object contained in the electronic document 220 that provides functionality applicable to the selected item, including the user interface 230 of the present invention.


If an electronic document, including an electronic mail message, is deployed in a reading mode where normal edit functionalities, such as cut, copy, paste, formatting, and the like are disabled, the information search and display via the lightweight information user interface 230 may be initiated and provided in an automatic mode upon the selection of an item in the electronic document 220. That is, because the electronic document 220 is in a reading mode, selection of an item in the electronic document 220 may be utilized for triggering an automatic information search on the selected item followed by a deployment of the lightweight information user interface 230 containing the results of the search. Because there is no expectation of the selection of an editing functionality following the selection of an item in the electronic document when the electronic document is in reading mode, selection of an item in the electronic document may be used for triggering an automatic information search and display, as described herein.


According to embodiments, after a user is finished reviewing the contents of the user interface 230, it may be dismissed from display according to a number of means. A user interface dismissal control 331 is provided in the default and expanded user interfaces 230, 335. Selection of the control 331 causes the user interface to be dismissed from display. Another means for dismissing the user interface 230 includes selecting a different text item, data item or object in the electronic document. Similarly, selecting, for example, by mouse clicking, in another location in the electronic document or on a different user interface component may dismiss the user interface 230, 335. In addition, a button or control may be provided in a toolbar or menu for selectively dismissing the user interface 230, 335. In addition, keyboard keys, for example, the “ESC” key, or combinations of keys, for example, “CTRL” plus “D,” may be designated for dismissing the user interface 230, 335 upon selection by a user.
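A sketch of the dismissal handling, under the same assumptions about the event object; the key names and the ui object are hypothetical:

def on_key_press(event, ui):
    # The ESC key, or CTRL plus D, dismisses the deployed user interface.
    if event.key == "ESC" or (event.ctrl_key and event.key == "D"):
        ui.dismiss()

def on_document_click(event, ui):
    # Clicking elsewhere in the document, or selecting a different item,
    # likewise dismisses the user interface.
    if not ui.contains(event.position):
        ui.dismiss()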



FIG. 5 illustrates a simplified block diagram of a computing architecture for obtaining information for populating a lightweight information user interface according to embodiments of the present invention. When an item is selected in an electronic document 220, as described above, and when an information search or lookup is initiated according to one of the methods described above, the application 106 in use with the electronic document 220 may obtain the requested information from a local or remote source. As should be appreciated, the local source 515 may be maintained in memory on the local computer 100 in use by the user. Alternatively, the application 106 may query a remote source 525 via a distributed computing network 520, such as the Internet or an intranet. For example, the remote source 525 may be in the form of a server maintained in a corporate network from which individual users may obtain information associated with a selected text item in the electronic document 220 being reviewed or edited by the user. Alternatively, the remote source 525 may be a research site maintained by a third party that is accessible via the network 520.
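One possible routing of the look-up between local and remote sources is sketched below; the ordering (local first, then remote) and the source objects are assumptions for illustration:

def lookup_information(item, local_source, remote_sources):
    # Query the information source maintained on the local computer 100,
    # then fall back to remote sources 525 reached over the network 520.
    results = local_source.query(item)
    if results:
        return results
    for remote in remote_sources:
        results = remote.query(item)
        if results:
            return results
    return []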


According to one embodiment, the application 106 may obtain dictionary information or translation information on a selected text item from a local or remote source 515, 525 by comparing the selected text item against a dictionary or translation service contained on the local or remote sources. Language tools, for example, dictionary sources and translation services may be provided in the lightweight information user interface 230 by comparing the selected item against items (e.g., words or phrases) contained in the dictionary sources or translation services. According to one embodiment, information retrieval from a dictionary source may be based on the user interface language in use for the application 106. For example, if the current user interface language in use for the application 106 is French, then upon the initiation of an information lookup for display in the user interface 230, described above, a French dictionary source at the local source 515 or remote source 525 will be utilized. Alternatively, a default language may be set for the application 106, and dictionary sources associated with the default language may be utilized. Alternatively, any editing languages that have been enabled by the user for use with the application 106 may be utilized for obtaining dictionary information from the local source 515 or remote source 525. Alternatively, the language, for example, French, German, Italian, etc., of text being entered or edited into an associated electronic document or mail document may be used to control the language of an associated dictionary source.
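The alternatives above may be read as a precedence order for choosing a dictionary source; the ordering in the following sketch is an assumption, since the description presents the options as alternatives rather than a fixed sequence:

def dictionary_language(ui_language, default_language, editing_languages, text_language):
    # Candidate languages, in one possible order of preference: the
    # application's user interface language, a configured default language,
    # any enabled editing language, then the language of the text itself.
    candidates = [ui_language, default_language, *editing_languages, text_language]
    for language in candidates:
        if language:
            return language
    return "en"  # fallback if nothing is configured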


According to another embodiment, bilingual dictionary and translation sources may be utilized where a selected term may be automatically translated from a first language, for example, English, to the selected user interface language and vice versa. Or, an automatic translation from a first language to a selected default language and vice versa may be selected. Alternatively, an automatic translation of the selected term from a first language to an enabled editing language and vice versa may be selected. Or, an automatic translation may be obtained for a selected text item for any editing language enabled for the application 106 to any other editing language enabled for the application 106 and vice versa. As should be appreciated, in addition to obtaining translations of selected text items, as described above, definitions for a selected text item may be obtained in different languages, for example, the user interface language, a default language, or any editing languages enabled for the application 106. In addition, the language, for example, French, German, Italian, etc., of text being entered or edited into an associated electronic document or mail document may be used for determining a first or starting language for translation of an item to a second language or for obtaining a definition in a second language.


In addition to obtaining dictionary definitions or translations of selected text items, as described above, selected items in an electronic document 220 may be “smart tagged” for identifying a type for the selected item which may then be compared against an information source applicable to the identified text or data item type. As described below, “smart tagging” an item allows the item to be recognized and tagged in a manner that facilitates a more accurate information lookup based on the context and/or meaning of the tagged item. For example, if a selected text item may be identified as a name, then the name may be compared against a database of names, for example, a contacts database, for retrieving information about the identified name, for example, name, address, telephone number, and the like, for population in the lightweight information user interface 230. Similarly, if a number string, for example, a five-digit number, may be identified as a ZIP Code, then the number string may similarly be compared against ZIP Codes contained in a database, for example, a contacts database for retrieving information associated with the identified ZIP Code.


Referring to FIG. 5, according to this embodiment, when a text or data item is selected by the user, the selected text or data item is passed to a recognizer module 530 where the selected text or data item is compared against text or data items of various types for recognizing and identifying the text or data item as a given type. For example, if a text item 225, such as the name “John Brown,” is selected by a user from an electronic document 220, or from an electronic mail message displayed by the application 106, the selected text item is passed to the recognizer module 530. At the recognizer module 530, the selected text item is compared against one or more databases of text items. For example, the text item “John Brown” may be compared against a contacts database for finding a matching entry in the contacts database. For another example, the text item “John Brown” may be compared against a telephone directory for finding a matching entry in a telephone directory. For another example, the text item “John Brown” may be compared against a corporate or other institutional directory for a matching entry.


For each of these examples, if the text item or other content is matched against content contained in any available information source, then information applicable to the selected text item of the type associated with the matching information source may be returned. According to one embodiment, once a given text item is identified as associated with a given type, for example, a name, an action module 535 may be invoked for passing the identified text item to a local information source 515 or to a remote source 525 for retrieval of information applicable to the text item according to its identified type. For example, if the text item “John Brown” is recognized by the recognizer module 530 as belonging to the type “name,” then the action module 535 may pass the identified text item to all information sources contained at the local source 515 and/or the remote source 525 for obtaining available information associated with the selected text item of the type name. For example, if the local source 515 and/or remote source 525 contains a contacts database, a telephone directory database, and a corporate directory database where each of the example databases contain information associated with the data type name, the identified text item “John Brown” may be compared against data contained in each of those databases for matching entries.
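A compact sketch of the recognizer/action flow just described; the recognizer, action and source interfaces are hypothetical stand-ins for the modules 530 and 535:

def smart_tag_lookup(selected_item, recognizers, information_sources):
    # The recognizer module 530 identifies the item as a type (e.g., "name",
    # "zip_code"); the action module 535 then queries every source holding
    # information for that type, one result section per matching source.
    for recognizer in recognizers:
        item_type = recognizer.identify(selected_item)
        if item_type is None:
            continue
        sections = []
        for source in information_sources:
            if source.supports(item_type):
                sections.append(source.query(item_type, selected_item))
        return item_type, sections
    return None, []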


Information matching the selected text item from each available source may be returned to the application 106 for populating the lightweight information user interface 230. Thus, following from the present example, if the user selects the text item “John Brown” and information associated with the selected text item is found in each of a contacts database, telephone directory, and corporate directory, three information sections may be populated in the lightweight information user interface 230 for providing the user contact information, telephone directory information, and corporate directory information for the selected text item.


As should be appreciated, the recognizer module may be programmed for recognizing many data types, for example, book titles, movie titles, addresses, important dates, geographic locations, and the like. Accordingly, as should be understood, any text or data item passed to the recognizer module 530 from the application 106 that may be recognized and identified as a particular data type may be compared against a local or remote information source for obtaining information applicable to the selected text or data item according to the text or data item type identified for the selected text or data item. As will be described below, according to an embodiment of the invention, information may be returned according to a variety of media types. For example, if the recognizer module recognizes a text string as a movie title, an audio clip from the associated movie may be returned which may be played to a user via the lightweight reference user interface. Similarly, a returned video clip may be played. In addition, information about a selected word, text item or object may be returned in different media types, for example, audio, video, picture, bitmap image, etc.


According to another embodiment, the recognizer module 530 and action module 535 may be provided by third parties for conducting specialized information retrieval associated with different data types. For example, a third-party application developer may provide a recognizer module 530 and action module 535 for recognizing text or data items as stock symbols. Thus, if a user selects a stock symbol contained in an electronic document 220 or received in an electronic mail message, the stock symbol may be passed to the recognizer module 530 supplied by the third-party developer for recognizing the selected text item as a stock symbol. Once the selected text item is recognized as a stock symbol, for example, the associated action module 535 may pass the identified text or data item to a local or remote information source, for example, an information source provided by a financial information network, for obtaining a company name and/or information about a company associated with the identified stock symbol.
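Third-party recognizer/action pairs might be registered with the application in a plugin fashion, as in the following sketch; the registry and the stock-symbol classes are purely illustrative:

registered_pairs = []

def register_recognizer(recognizer, action):
    # A third-party developer supplies a recognizer that identifies stock
    # symbols and an action that queries a financial information source.
    registered_pairs.append((recognizer, action))

# Hypothetical usage:
# register_recognizer(StockSymbolRecognizer(), StockQuoteAction())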


As should be appreciated, any number of text or data types may be utilized for identification via the recognizer module 530 and for obtaining information on a selected text or data item identified as a given text or data item type, as described herein. Further, according to embodiments of the present invention, the action module 535 may be programmed for providing additional research functionality in the lightweight information user interface 230. For example, if a search in the local or remote information source obtains a vast amount of information on an identified text or data item, for example, a company name, an executable functionality, for example, an “additional research” button, may be populated in the lightweight information user interface which, when subsequently selected by a user, causes additional research for expanded search information on the selected text or data item.



FIG. 6 illustrates a simplified block diagram of an XML-based information query for information required for populating a lightweight information user interface 230. According to embodiments of the present invention, speed and efficiency of information retrieval and display to the lightweight information user interface 230 is enhanced by use of a lightweight Extensible Markup Language (XML) protocol for information retrieval. The XML representation 600 illustrates example XML markup for a query for contacts information associated with a selected text item identified as a name in an electronic document or electronic mail message. An associated schema 650 is illustrated which provides XML grammar, syntax and validation rules governing the XML markup applied to the query 600. As described below, according to one embodiment, the schema 650 is extensible to support information returns to the lightweight reference user interface according to a variety of media types, including text, audio (including voice), video, music, pictures (in multiple formats), bitmap images, etc.


According to embodiments of the present invention, when an information source query is passed from the application 106 to a local source 515 or remote source 525, the XML-based query 600 is utilized as a lightweight query for returning a limited amount of information, for example, contact information responsive to the selected text item, data item or object selected in the electronic document.


When the XML-based query 600 is received at the local or remote information source, the query is parsed in association with the attached schema 650 for quickly determining the types of data that should be returned in response to the query. As should be appreciated, because the local source 515 and remote source 525 may parse the XML-based query based on the associated schema, data from the local and/or remote sources 515, 525 may be efficiently matched to corresponding XML markup in the query 600 for producing responsive information to the query. For example, referring to the query 600, illustrated in FIG. 6, once the local and/or remote sources 515, 525 receive the XML-based query, information in the local and/or remote sources 515, 525 corresponding to XML markup contained in the query 600 and corresponding to the identified text item, data item or object may be populated into a response that is passed back to the application 106 for populating the lightweight information user interface 230, 335 described herein. Thus, a limited and/or targeted amount of information may be obtained from the local and/or remote information sources for presentation in the lightweight information user interface 230, 335 as opposed to obtaining all available information associated with a given text item, data item, or object selected in an electronic document 220.
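A sketch of building and consuming such a lightweight XML query with Python's standard library; the element and attribute names are assumptions, since the markup of FIG. 6 is not reproduced here:

import xml.etree.ElementTree as ET

def build_contact_query(selected_name):
    # Ask only for the fields needed to populate the user interface 230,
    # so the source returns a limited, targeted amount of information.
    query = ET.Element("lookup", type="contact")
    ET.SubElement(query, "match").text = selected_name
    for field in ("displayName", "title", "department", "phone"):
        ET.SubElement(query, "return", field=field)
    return ET.tostring(query, encoding="unicode")

def parse_contact_response(response_xml):
    # The application reads back only the fields it asked for.
    root = ET.fromstring(response_xml)
    return {value.get("field"): value.text for value in root.findall("value")}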


As should be appreciated, the example XML markup 600 is not intended to show well-formed XML, but is for purposes of illustration only. Further, the example XML-based query 600 is not limiting of the vast number of different markup-based queries that may be generated for obtaining different amounts and types of information from one or more information sources for populating the lightweight information user interfaces 230, 335, described herein.


As briefly described above, according to an embodiment of the present invention, the lightweight information user interface 230 may be utilized for providing functionality on a word, data item or object, including automatic conversion of a word or text string from one language to another language. In addition, information on a selected word, data item or object may be provided according to a variety of media types, including text, audio, video, music, pictures (in multiple formats), bitmap images, etc.


Display of Linguistic and Translation Information on Focus


Referring now to FIG. 7, another embodiment of a lightweight information user interface is illustrated and described. According to this embodiment, the user interface 720 may be automatically deployed in a displayed document 702 for providing information on a focused-on word or object 725. For example, upon focus on a given word in a displayed document, such as the word “insert” 725, the lightweight user interface 720 may be automatically deployed to provide a translation, definition or other useful information for the focused-on word or object. According to one embodiment, focus on a given word or object may be in the form of a “mouse-over” on the desired word or object. Other focus methods may include touching the screen over a desired word or object with an electronic pen or stylus. As should be appreciated, any suitable method of applying focus to a given word or object in an electronic document 702 may be utilized for triggering automatic deployment of the user interface 720.


As described above, when the functionality associated with the user interface (UI) 720 is activated (“turned on”), the user interface 720 is automatically deployed in an associated document 702 upon focus on a given word or object for providing the activated functionality. For example, if a translation functionality is activated, then a translation for a focused-on word will automatically be displayed in the user interface 720 upon focus (e.g., mouse-over) on a given word according to a selected translation language. That is, if a French translation language is selected for the UI 720, then if a user mouses over a given word, for example, “insert” 725, then the UI 720 automatically will be displayed, as illustrated in FIG. 7, and will include a translation, for example, the French translation “inserer” for the focused-on word. As illustrated in FIG. 7, a definition for the focused-on word is provided in addition to a translation according to a selected translation language. In the example definitions provided, a couple of possible definitions are displayed, including an example use of the focused-on word according to the translation language applied to the focused-on word.
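A sketch of the mouse-over behavior when the translation functionality is turned on; the translate, define and show_screentip helpers are hypothetical:

def on_mouse_over(word, translation_language):
    # With the screentip activated, focus alone (e.g., a mouse-over) triggers
    # the look-up; no secondary action is required.
    if translation_language is None:
        return  # functionality is turned off
    translation = translate(word, to_language=translation_language)
    definition = define(word, language=translation_language)
    show_screentip(word, translation, definition)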


A great variety of information may be provided in the UI 720 upon focus on a given word or object. For example, in addition to translation and definition information, thesaurus information, antonyms, synonyms and the like may be provided for a focused-on word or object. In addition, a focused-on word or object may be sent to a recognizer function, as described above with respect to FIG. 5, and a variety of other information may be provided, for example, research information on a given word or object (e.g., Internet-based, encyclopedia-based, etc.). As should be appreciated, the layout and display location of the example user interface 720, illustrated in FIG. 7, are for purposes of example only and are not limiting of the different ways in which the translation and definition information may be displayed to a requesting user.


As described above, deployment of the user interface 720 is automatic upon the focus (e.g., mouse over) on a desired word or object. According to embodiments, the user interface 720 may be activated for automatic deployment via a variety of entry points and/or deployment means. FIG. 8 illustrates a computer screen display of an electronic document showing a drop down menu with which the functionality of the user interface 720 may be activated (“turned on”) and with which a given translation language may be selected for application to focused-on words or objects. As illustrated in FIG. 8, a row of functionality buttons or controls at the top of the example user interface 700 provide functionalities that may be applied to text, data items or objects in an associated document.


A drop down menu 830 may be deployed in response to the selection of or focus on a “Translation Screentip” control 710. The drop down menu includes a number of translation languages that may be applied to a focused-on word or object, for example, the word “insert” 725. As should be appreciated, the translation languages illustrated in FIG. 8 are for purposes of example only and are not exhaustive of the numerous translation languages available for use in accordance with embodiments of the invention. The drop down menu 830 also illustrates a “turn off” control which may be selected for deactivating the UI 720 and associated functionality.


After a given translation language is selected from the drop down menu 830, the translation functionality is activated or “turned on,” and the menu 830 is dismissed. Subsequently, when a user focuses (e.g., mouse-over) on a given word, for example, “insert” 725, the focused-on word is sent to a translation module as described above with reference to FIG. 5, and a returned translation is automatically populated in a deployed user interface 720. According to one embodiment, in addition to a translation, a definition for the focused-on word may be populated in the UI 720, or a concise version of the UI 720 may be deployed to show only the translation, as illustrated below with respect to FIG. 10. As should be appreciated, whether a definition is shown with the translation, as illustrated in FIG. 7, is readily configurable.


As mentioned above, the UI 720 may be utilized for displaying information other than translations and definitions, for example, a “Research” button or control 705 may cause deployment of the menu 830 for providing research on a selected word, data item or object in the associated document. For example, the menu 830 deployed in response to selection of or focus on “Research” button 705 may provide resources, such as encyclopedias, Internet-based research tools, or other lookup methods for finding information associated with a word, data item or object focused on in the associated document.


The user interface 720 may be activated by other means in addition to the drop down menu 830, illustrated in FIG. 8. For example, in response to a “mouse-over” focus on a given word when the UI 720 has not been activated, a context menu 920 may be displayed, as illustrated in FIG. 9. In FIG. 9, a document 702 is illustrated in which a context menu 920 has been deployed for providing functionality of an associated application, for example, a word processing application, based on the context of a document focus point or insertion point. For example, upon focus or placement of an insertion point in a text selection contained in the document 702, a context menu 920 may be automatically deployed for providing functionality associated with the editing of a text selection. For example, text editing functions such as “cut,” “copy” and “paste” are illustrated. In contrast, if an object, such as an embedded picture, is selected or focused on, the context menu 920 may be deployed and may be populated with functionality associated with the editing of embedded pictures.


As illustrated in FIG. 9, the context menu 920 includes, among other things, a “Translate” functionality, which when selected, may cause deployment of the fly out menu 925, which, like the drop down menu 830, may automatically activate the functionality of the lightweight information user interface 720 and may provide one or more translation languages for application to a focused-on word. As described above with reference to FIG. 8, the fly out menu 925 may be launched in response to selection of a “research” type function from the context menu 920 for providing access to other information that may be displayed for a focused-on word or object.


As described above, a selected word, data item or object may be passed to a recognition resource, such as the recognizer module 530 illustrated and described with respect to FIG. 5, for returning information and actions responsive to the request. Thus, if a user selects a research, lookup or translation functionality for application to a particular word, text string or object, a focused-on word, text string or object may be passed to the recognizer module 530 and then on to an appropriate local or remote source for research information and potential actions responsive to the request.


As described above with reference to FIG. 6, an XML-based query may be used to query one or more local or remote resources for information and/or actions, for example, a translation or dictionary lookup, in association with a schema 650. According to an embodiment, a dictionary schema 650 may be structured and defined with property tags for enhancing the display of dictionary information as illustrated in FIG. 7. An example XML-based schema 650 for dictionary information is set out in Table 1 below.


TABLE 1

<Headword><B>insert</B></Headword>
<Num>1.</Num> <Text><I>noun in magazine etc encart masculin</I></Text>
<Num>2.</Num> <Text>transitive verb; <B>insert something into something inserer something dans something</B></Text>

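For illustration only, the following sketch shows one way an entry structured according to Table 1 might be parsed for display in the UI 720, using Python's standard XML parser; the wrapping Entry element is an assumption added so the fragment is well-formed for parsing:

# Parse a Table 1 style dictionary entry into a headword and numbered senses.
import xml.etree.ElementTree as ET

entry_xml = """
<Entry>
  <Headword><B>insert</B></Headword>
  <Num>1.</Num><Text><I>noun in magazine etc encart masculin</I></Text>
  <Num>2.</Num><Text>transitive verb; <B>insert something into something inserer something dans something</B></Text>
</Entry>
"""

root = ET.fromstring(entry_xml)
headword = "".join(root.find("Headword").itertext()).strip()
print(headword)                                   # insert
for num, text in zip(root.findall("Num"), root.findall("Text")):
    # Numbered senses, laid out one per line as in FIG. 7.
    print(num.text, "".join(text.itertext()).strip())
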
With a defined schema structure, extra display properties may be provided in associated user interface 720 for displaying responsive information. For example, through use of a “concise” tag (XML-based), the user interface 720 may be used to display a concise version of responsive information instead of a full display of information, as illustrated in FIG. 7. Thus, as illustrated in FIG. 10, an example document 702 is provided in which a user has focused on a desired word “insert” after activating the translation function for translating words from English to French. Through use of the “concise” tag, a smaller version of the user interface 1020 is deployed and displays only a translation for the focused-on word, but does not display additional information, such as a definition for the word. As illustrated in FIG. 10, the lightweight information user interface 1020 is deployed in the document 702 and shows the French translation “inserer” for the focused-on word “insert.” Rather than display a lengthy text string such as “The French translation of the English term ‘insert’ is ‘inserer,’” a concise display simply provides the translated word corresponding to the starting word. An example XML-based schema 650 for translation services utilizing an example “concise” tag is set out in Table 2 below.


TABLE 2

<Headword><B>insert</B></Headword>
<Num>1.</Num> <Text><I>noun in magazine etc encart masculin</I></Text>
<Num>2.</Num> <Text>transitive verb; <B>insert something into something; inserer something dans something</B></Text>
<concise>inserer</concise>

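For illustration, a sketch of honoring the example “concise” tag is set out below: when a concise element is present and a concise display is requested, only its value is rendered (as in the smaller UI 1020 of FIG. 10); otherwise the full entry text is rendered. The wrapping Entry element is again an assumption added for well-formedness:

# Choose between the concise translation and the full entry text.
import xml.etree.ElementTree as ET

entry_xml = """
<Entry>
  <Headword><B>insert</B></Headword>
  <Text>transitive verb; <B>insert something into something; inserer something dans something</B></Text>
  <concise>inserer</concise>
</Entry>
"""

def render(xml_text, concise=True):
    """Return the text to show in the lightweight UI for this entry."""
    root = ET.fromstring(xml_text)
    concise_node = root.find("concise")
    if concise and concise_node is not None and concise_node.text:
        return concise_node.text.strip()           # shown in the smaller UI 1020
    return " ".join("".join(t.itertext()).strip() for t in root.findall("Text"))

print(render(entry_xml))                 # inserer
print(render(entry_xml, concise=False))  # full entry text
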
FIG. 11 illustrates a computer screen display of an electronic document showing the lightweight information user interface 720 deployed for language translation and showing icons for selectively receiving information according to different media types. As briefly described above, information may be provided on a focused-on word, data item or object according to a variety of media types, including text, audio, video, music, pictures (in multiple formats), bitmap images, etc.


According to the embodiment illustrated in FIG. 11, a focused-on word, data item or object may be matched with information available for the focused-on word according to different media types, and each type of information may be provided via the lightweight information user interface 720, described herein. As illustrated in FIG. 11, a number of icons 1135, 1140, 1145 are displayed in the user interface 720 for providing information on a focused-on word according to different media types. For example, selection of the picture control 1135 may result in a picture associated with the focused-on word being displayed to a user in similar fashion as illustrated above with reference to FIGS. 7 and 10. For example, upon focus on the word “insert” and upon selection of the picture icon 1135, a picture showing an object being inserted into a container may be displayed for providing a visual definition for the word. Selection of the movie control 1140 may cause the display of a movie clip associated with the focused-on word, for example, a video clip of an object being inserted into a container, in response to a mouse-over on the word “insert.” Likewise, selection of the voice control 1145 may cause the playing of an audio clip that speaks the focused-on word in the starting language and again in the translated language, followed by a speaking of a definition for the focused-on word. Other media type controls may be provided for providing other types of responsive media, such as music.


According to an embodiment, the media type controls are populated in the user interface 720 based on availability of the associated media. That is, if information responsive to a research, lookup or translation, for example, is available in an audio clip, then the audio clip will be returned from the appropriate local or remote resource, and the audio (e.g., voice) icon will be populated in the user interface 720 for selecting the returned audio clip for play. Similarly, if a movie clip is available as a responsive resource, then the movie clip will be returned to the user interface 720, and the movie icon will be populated in the user interface 720, as illustrated in FIG. 11.
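
A minimal sketch of this availability-based population is set out below; the keys of the result dictionary are assumptions mirroring the voice, picture and movie tags of Table 3 below, and the returned strings simply name the controls of FIG. 11:

# Populate only those media-type controls for which media was actually returned.
def media_controls(result):
    """Return the icons to show in UI 720 for the media types present in the result."""
    controls = []
    if result.get("picture"):
        controls.append("picture control 1135")
    if result.get("movie"):
        controls.append("movie control 1140")
    if result.get("voice"):
        controls.append("voice control 1145")
    return controls

# Example: only an audio clip came back with the translation, so only the
# voice control is populated.
result = {
    "text": "inserer",
    "voice": r"\Program Files\Common Files\Microsoft Shared\TRANSLAT\ENFR\voice\245685123.mpg",
}
print(media_controls(result))   # ['voice control 1145']
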


For an example of the provision of information via alternate media types, if a given word or text string is selected from an associated document for translation from English to French, a French translation of the word may be provided, as illustrated and described above with reference to FIG. 10. On the other hand, if an audio clip for the translation is available, the audio clip may be returned and the associated audio icon will be populated in the user interface 720. If the audio icon 1145 is selected, an audio clip may be played with which the user is able to hear a voice translation of the selected word. For example, the user may hear a voice translation such as “The French translation of the word ‘insert’ is ‘inserer.’”


For another example, if the user focuses on a name in a given document, for example, “Neil Armstrong,” and then selects a research functionality, for example, an encyclopedia functionality, from the menu 830, the user may be provided a variety of information on the example subject according to a variety of different media types. For example, selection of the picture control may provide the user with a picture, according to a variety of picture formats, of the research subject, for example, “Neil Armstrong.” Similarly, selection of the movie control may provide the user a movie clip of Neil Armstrong's famous walk on the surface of the moon. Similarly, selection of the voice control 1145 may cause the playing of an audio clip to the user, for example, Neil Armstrong's famous quotes from the surface of the moon. Likewise, if the user selects another functionality, for example, a dictionary functionality associated with a focused-on word, for example, “astronaut,” the user may receive a picture associated with the defined word or text string, for example, a picture of an astronaut in a space suit. Of course, in all of the above instances, the user may receive textual information, as illustrated in FIGS. 7 and 10, in addition to the other media types discussed above.


In order to facilitate the presentation of the multimedia information in the UI 720, as described above, the schema provided for display of information in the UI 720 is extended to provide additional multimedia information. An example schema 650 for providing multimedia information via the user interface 720 is set out below in Table 3. For example, in the schema set out below, use of a “multimedia” tag is illustrated in a dictionary data schema definition for extending the dictionary data schema to include additional media types for providing definition information for focused-on words or text strings.


TABLE 3

<Headword><B>insert</B></Headword>
<Num>1.</Num> <Text><I>noun in magazine etc encart masculin</I></Text>
<Num>2.</Num> <Text>transitive verb; <B>insert something into something; inserer something dans something</B></Text>
<multimedia>
  <voice>\Program Files\Common Files\Microsoft Shared\TRANSLAT\ENFR\voice\245685123.mpg</voice>
  <picture>\Program Files\Common Files\Microsoft Shared\TRANSLAT\ENFR\picture\245685123.jpg</picture>
  <movie>\Program Files\Common Files\Microsoft Shared\TRANSLAT\ENFR\movie\245685123.avi</movie>
</multimedia>

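For illustration, the following sketch reads the example multimedia extension of Table 3 to discover which media files, and therefore which media-type controls, are available for the focused-on word; the wrapping Entry element is an assumption added so the fragment parses as a single document:

# Read the <multimedia> extension to find available media for the focused-on word.
import xml.etree.ElementTree as ET

entry_xml = r"""
<Entry>
  <Headword>insert</Headword>
  <multimedia>
    <voice>\Program Files\Common Files\Microsoft Shared\TRANSLAT\ENFR\voice\245685123.mpg</voice>
    <picture>\Program Files\Common Files\Microsoft Shared\TRANSLAT\ENFR\picture\245685123.jpg</picture>
    <movie>\Program Files\Common Files\Microsoft Shared\TRANSLAT\ENFR\movie\245685123.avi</movie>
  </multimedia>
</Entry>
"""

root = ET.fromstring(entry_xml)
media = {child.tag: child.text for child in root.find("multimedia")}
for media_type, path in sorted(media.items()):
    # Each available media type maps to a control (picture 1135, movie 1140, voice 1145).
    print(media_type, "->", path)
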
As described herein, methods, systems and computer products provide a lightweight information user interface for displaying information about a focused-on text item, data item or other object in an electronic document that minimizes interruption of workflow with the electronic document. It will be apparent to those skilled in the art that various modifications or variations may be made in the present invention without departing from the scope or spirit of the invention. Other embodiments of the present invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein.

Claims
  • 1. A method of providing information associated with a focused-on item in an electronic document, comprising: receiving a first user-initiated selection comprising a focus on an item contained in an electronic document; evaluating a current mode of the electronic document selected from a first mode in which a plurality of editing functions of the electronic document are disabled, and a second mode in which the plurality of editing functions of the electronic document are enabled; automatically initiating a search for information on the focused-on item when the current mode is evaluated as the first mode; receiving a second user-initiated selection for initiating the search for information on the focused-on item when the current mode is evaluated as the second mode, wherein the second user selection is selected from a group comprising: a simultaneous keyboard and mouse selection; and a control selection accessed in one of a control toolbar and a drop-down menu; passing data representing the focused-on item to an information source following initiating the search for information on the focused-on item; receiving the information associated with the focused-on item from the information source; automatically displaying the returned information associated with the focused-on item in a user interface in the electronic document in proximity to the focused-on item; and when the information associated with the focused-on item will not fit in an available display space provided in the user interface: truncating a portion of the information such that only a portion of the information that will fit in the available display space provided in the user interface is displayed; and disposing a selectable control tab on a first side of the user interface which, when selected, causes a display of the truncated portion of the information, the selectable control tab including: a first selectable tab which, when selected, causes to be displayed a first portion of the truncated portion of the information; and a second selectable tab which, when selected, causes to be displayed a second portion of the truncated portion of the information; the first and second selectable tabs extending from the user interface, with the first selectable tab being positioned above the second selectable tab.
  • 2. The method of claim 1, wherein receiving the information associated with the focused-on item includes receiving the information associated with the focused-on item in a format associated with each of a plurality of media types, and further comprising providing access to the information associated with the focused-on item in the format associated with each of the plurality of media types via the displayed user interface.
  • 3. The method of claim 2, wherein providing access to the information associated with the focused-on item in the format associated with each of the plurality of media types includes providing an audio file that may be selected from the displayed user interface for providing the information associated with the focused-on item in audio format.
  • 4. The method of claim 2, wherein providing access to the information associated with the focused-on item in the format associated with each of the plurality of media types includes providing a video file that may be selected from the displayed user interface for providing the information associated with the focused-on item in video format.
  • 5. The method of claim 2, wherein providing access to the information associated with the focused-on item in the format associated with each of the plurality of media types includes providing a picture that may be formatted according to one or more formatting types that may be displayed in the user interface for providing the information associated with the focused-on item in picture format.
  • 6. The method of claim 1, wherein passing data representing the focused-on item to an information source includes passing an Extensible Markup Language (XML) formatted query to the information source that identifies a type of information associated with the focused-on item that is requested from the information source.
  • 7. The method of claim 6, further comprising providing a schema definition for information contained in the information source for allowing the XML formatted query to obtain the type of information associated with the focused-on item that is requested from the information source.
  • 8. The method of claim 7, wherein providing a schema definition for information contained in the information source for allowing the XML formatted query to obtain the type of information associated with the focused-on item that is requested from the information source, includes providing a schema definition for the information associated with the focused-on item in each of the plurality of media types for allowing the XML formatted query to request the information associated with the focused-on item according to any media type supported by the schema definition.
  • 9. The method of claim 7, wherein providing a schema definition for information contained in the information source for allowing the XML formatted query to obtain the type of information associated with the focused-on item that is requested from the information source, includes providing a schema definition for the information associated with the focused-on item for allowing the XML formatted query to return the information associated with the focused-on item for presentation in the displayed user interface according to a prescribed layout.
  • 10. The method of claim 1, wherein passing data representing the focused-on item to an information source includes: passing the data representing the focused-on item to a recognizer module for determining whether the focused-on item corresponds to a given data type; when the focused-on item corresponds to a given data type, passing the data representing the focused-on item to an information source containing information associated with the given data type; and wherein receiving the information associated with the focused-on item in a format associated with each of a plurality of media types includes receiving information associated with the focused-on item that is further associated with the given data type in a format associated with each of a plurality of media types.
  • 11. The method of claim 1, wherein passing data representing the focused-on item to an information source includes passing the data representing the focused-on item to a dictionary source for obtaining a definition of the focused-on item.
  • 12. The method of claim 1, wherein passing data representing the focused-on item to an information source includes passing the data representing the focused-on item to a translation source for obtaining a translation of the focused-on item from a first language to a second language.
  • 13. The method of claim 1, wherein if the information associated with the focused-on item is received from more than one information source, displaying the information in the user interface in one or more sections where information from a first information source is displayed in a first section and where information from a second information source is displayed in a second section.
  • 14. The method of claim 13, wherein if an available display space in the user interface will not accommodate a display of all sections of information received for display in the user interface, displaying information from a number of sections that will fit in the user interface; collapsing any sections of information the display of which will not fit in the user interface; and providing a selectable control in the user interface, which when selected, causes a display of an associated collapsed section of information.
  • 15. The method of claim 14, further comprising prioritizing the sections of information according to a preferred display order where a most preferred section is displayed first and a least preferred section is displayed last.
  • 16. A computer readable storage medium containing computer executable instructions which when executed by a computer perform a method of providing information associated with a focused-on item in an electronic document, comprising: receiving a first user selection comprising a focus on an item contained in an electronic document; passing data representing the focused-on item to a recognizer module for determining whether the focused-on item corresponds to a given data type; evaluating a current mode of the electronic document selected from a first mode in which a plurality of editing functions of the electronic document are disabled, and a second mode in which the plurality of editing functions of the electronic document are enabled; automatically initiating a search for information on the focused-on item when the current mode is evaluated as the first mode; receiving a second user selection for initiating the search for information on the focused-on item when the current mode is evaluated as the second mode, wherein the second user selection is selected from a group comprising: a simultaneous keyboard and mouse selection; and a control selection accessed in one of a control toolbar and a drop-down menu; when the focused-on item corresponds to a given data type, passing the data representing the focused-on item to an information source containing information associated with the given data type following initiating the search for information on the focused-on item; at the information source, parsing a data source for information associated with the focused-on item; returning information associated with the focused-on item; displaying a user interface in the electronic document in proximity to the focused-on item; and displaying the returned information associated with the focused-on item in the user interface, wherein, when the information associated with the focused-on item will not fit in an available display space provided in the user interface: truncating a portion of the information such that only a portion of the information that will fit in the available display space provided in the user interface is displayed; and disposing a selectable control tab on a first side of the user interface, which when selected, causes a display of the truncated portion of the information, the selectable control tab including: a first selectable tab which, when selected, causes to be displayed a first portion of the truncated portion of the information; and a second selectable tab which, when selected, causes to be displayed a second portion of the truncated portion of the information; the first and second selectable tabs extending from the user interface, with the first selectable tab being positioned above the second selectable tab.
  • 17. A computer readable storage medium containing computer executable instructions which when executed by a computer perform a method of providing information associated with a focused-on item in an electronic document, comprising: receiving a first user-initiated selection comprising a focus on an item contained in an electronic document; evaluating a current mode of the electronic document selected from a first mode in which a plurality of editing functions of the electronic document are disabled, and a second mode in which the plurality of editing functions of the electronic document are enabled; automatically initiating a search for information on the focused-on item when the current mode is evaluated as the first mode; receiving a second user selection for initiating the search for information on the focused-on item when the current mode is evaluated as the second mode, wherein the second user selection is selected from a group comprising: a simultaneous keyboard and mouse selection; and a control selection accessed in one of a control toolbar and a drop-down menu; querying one or more data sources for information associated with the focused-on item following initiating the search for information on the focused-on item; displaying a user interface in the electronic document in proximity to the focused-on item; when the information associated with the focused-on item is received from more than one source, displaying the information in the user interface in one or more sections where information from a first data source is displayed in a first section and where information from a second data source is displayed in a second section; and when an available display space in the user interface will not accommodate a display of all sections of information received for display in the user interface: displaying information from a number of sections that will fit in the user interface; collapsing any sections of information the display of which will not fit in the user interface; and providing a selectable control tab on a first side of the user interface which, when selected, causes a display of an associated collapsed section of information, the selectable control tab including: a first selectable tab which, when selected, causes to be displayed a first portion of the information; and a second selectable tab which, when selected, causes to be displayed a second portion of the information; the first and second selectable tabs extending from the user interface, with the first selectable tab being positioned above the second selectable tab.
RELATED APPLICATIONS

This application is a continuation-in-part of U.S. patent application, Ser. No. 11/234,968, filed with the United States Patent and Trademark Office on Sep. 26, 2005 entitled “Lightweight Reference User Interface,” which is incorporated herein by reference.

Related Publications (1)
Number Date Country
20080021886 A1 Jan 2008 US
Continuation in Parts (1)
Number Date Country
Parent 11234968 Sep 2005 US
Child 11803689 US