Dynamically displaying current status of tasks

Information

  • Patent Grant
  • Patent Number
    7,877,686
  • Date Filed
    Wednesday, October 11, 2006
  • Date Issued
    Tuesday, January 25, 2011
Abstract
The current status of a list of tasks to be performed is dynamically displayed. The tasks may be performed by a user (e.g., data entered by the user, words spoken by the user, actions taken by the user, and so forth) or alternatively by a computer (e.g., the steps it follows in carrying out a programmed task). At least a portion of the list is displayed at any given time along with an indication of which task is the next task to be performed. As the tasks are completed, the current status of the progression through the items on the list is dynamically updated so as to readily inform the user (or someone else) as to what the current task is that needs to be performed, as well as what tasks have already been performed and/or what tasks remain to be performed.
Description
TECHNICAL FIELD

The present invention is directed to graphical user interfaces and more particularly to dynamically displaying the current status of tasks.


BACKGROUND

As computers become increasingly powerful and commonplace, they are being used for an increasingly broad variety of tasks. For example, in addition to traditional activities such as running word processing and database applications, computers are increasingly becoming an integral part of users' daily lives. Programs to schedule activities, generate reminders, and provide rapid communication capabilities are becoming increasingly popular. Moreover, computers are increasingly present during virtually all of a person's daily activities. For example, hand-held computer organizers (e.g., PDAs) are increasingly common, and communication devices such as portable phones are increasingly incorporating computer capabilities. More recently, the field of wearable computers (e.g., with eyeglass displays) has begun to expand, creating a further presence of computers in people's daily lives.


Computers often progress through a particular series of steps when allowing a user to accomplish a particular task. For example, if a user desires to enter a new name and address to an electronic address book, the computer progresses through a series of steps prompting the user to enter the desired information (e.g., name, street address, city, state, zip code, phone number, etc.). On computers with large displays (e.g., typical desktop computers), sufficient area exists on the display to provide an informative and useable user interface (UI) that allows the user to enter the necessary data for the series of steps. However, problems exist when attempting to guide the user through the particular series of steps on smaller displays. Without the large display area, there is frequently insufficient room to provide the prompts in the same informative and useable manner.


Additionally, the nature of many new computing devices with small displays (e.g., PDAs and wearable computers) is that the computing devices are transported with the user. However, traditional computer programs are not typically designed to efficiently present information to users in a wide variety of environments. For example, most computer programs are designed with a prototypical user being seated at a stationary computer with a large display device, and with the user devoting full attention to the display. In that environment, the computer program can be designed with the assumption that the user's attention is predominately on the display device. However, many new computing devices with small displays can be used when the user's attention is more likely to be diverted to some other task (e.g., driving, using machinery, walking, etc.). Many traditional computer programs, designed with large display devices in mind, frequently do not allow the user to quickly and easily reorient him- or her-self to the task being carried out by the computer. For example, if the user is performing a task by following a series of steps on a wearable computer, looks away from the display to focus his or her attention on crossing a busy intersection, and then returns to the task, it would be desirable for the user to be able to quickly and easily reorient him- or her-self to the task (in other words, readily know what steps he or she has accomplished so far and what the next step to be performed is).


Accordingly, there is a need for new techniques to display the current status of tasks to a user.


SUMMARY

Dynamically displaying current status of tasks is described herein.


According to one aspect, a list of items corresponding to tasks that are to be performed is displayed. The tasks may be performed by a user (e.g., data entered by the user, words spoken by the user, actions taken by the user, and so forth) or alternatively by a computer (e.g., the steps followed in carrying out a programmed task). At least a portion of the list is displayed at any given time along with an indication of which task is the next task to be performed. As the user progresses through the set of tasks, the current status of his or her progression through the corresponding items on the list is dynamically updated so as to readily inform the user (or someone else) as to what the current task is that needs to be performed, as well as what tasks have already been performed and/or what tasks remain to be performed.


According to another aspect, only a subset of the list of items is displayed at any given time. The list is scrolled through as the tasks are performed, so that different items are displayed as part of the subset over time.


According to another aspect, multiple lists of tasks to be performed by multiple individuals (or computing devices) are displayed on a display of the user. As the multiple individuals (or computing devices) finish the tasks in their respective lists, an indication of such completion is forwarded to the user's computer, which updates the display to indicate the next task in the list to be performed. The user is thus able to monitor the progress of the multiple individuals (or computing devices) in carrying out their respective tasks.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings. The same numbers are used throughout the figures to reference like components and/or features.



FIG. 1 illustrates an exemplary computing device such as may be used in accordance with certain embodiments of the invention.



FIG. 2 illustrates an exemplary user interface display in accordance with certain embodiments of the invention.



FIG. 3 illustrates an exemplary display of an item list and current location marker such as may be used in accordance with certain embodiments of the invention.



FIGS. 4A and 4B illustrate different ways in which the prompt in a sequence can be changed.



FIG. 5 is a flowchart illustrating an exemplary process for displaying the current status of tasks in accordance with certain embodiments of the invention.



FIGS. 6 and 7 illustrate alternative displays of the item list and current location identifiers with reference to a sequence of tasks to be completed in order to record a new inspection (e.g., a building inspection).



FIG. 8 illustrates an exemplary distributed environment in which the status of tasks being performed by multiple users can be monitored.



FIG. 9 illustrates an exemplary group of lists that may be displayed for the distributed environment of FIG. 8.





DETAILED DESCRIPTION

Dynamically displaying the current status of tasks is described herein. A list of items or prompts that is to be traversed by a user in a particular order is displayed to the user (e.g., a set of tasks the user is to perform in a particular sequence as part of his or her job, a set of words to be spoken, a list of questions or fields to be answered, and so forth). At least a portion of the list is displayed at any given time along with an indication of which item in the list is the next item that the user needs to handle (e.g., the next task to perform, the next word to speak, the next question to answer, and so forth). As the user progresses through the list of tasks, the current status of his or her progression through the prompts on the list is dynamically updated so as to readily inform the user as to what the current task is that needs to be performed, as well as what tasks have already been performed and/or what tasks remain to be performed.



FIG. 1 illustrates an exemplary computing device 100 such as may be used in accordance with certain embodiments of the invention. Computing device 100 represents a wide variety of computing devices, such as wearable computers, personal digital assistants (PDAs), handheld or pocket computers, telephones (e.g., cell phones), laptop computers, gaming consoles or portable gaming devices, desktop computers, Internet appliances, etc. Although the dynamic displaying of current status of tasks described herein is particularly useful if computing device 100 has a small display, any size display may be used with the invention.


Computing device 100 includes a central processing unit (CPU) 102, memory 104, a storage device 106, one or more input controllers 108, and one or more output controllers 110 (alternatively, a single controller may be used for both input and output) coupled together via a bus 112. Bus 112 represents one or more conventional computer buses, including a processor bus, system bus, accelerated graphics port (AGP), universal serial bus (USB), peripheral component interconnect bus (PCI), etc.


Memory 104 may be implemented using volatile and/or non-volatile memory, such as random access memory (RAM), read only memory (ROM), Flash memory, electrically erasable programmable read only memory (EEPROM), disk, and so forth. Storage device 106 is typically implemented using non-volatile “permanent” memory, such as ROM, EEPROM, magnetic or optical diskette, memory cards, and the like.


Input controller(s) 108 are coupled to receive inputs from one or more input devices 114. Input devices 114 include any of a variety of conventional input devices, such as a microphone, voice recognition devices, traditional qwerty keyboards, chording keyboards, half qwerty keyboards, dual forearm keyboards, chest mounted keyboards, handwriting recognition and digital ink devices, a mouse, a track pad, a digital stylus, a finger or glove device to capture user movement, pupil tracking devices, a gyropoint, a trackball, a voice grid device, digital cameras (still and motion), and so forth.


Output controller(s) 110 are coupled to output data to one or more output devices 116. Output devices 116 include any of a variety of conventional output devices, such as a display device (e.g., a hand-held flat panel display, an eyeglass-mounted display that allows the user to view the real world surroundings while simultaneously overlaying or otherwise presenting information to the user in an unobtrusive manner), a speaker, an olfactory output device, tactile output devices, and so forth.


One or more application programs 118 are stored in memory 104 and executed by CPU 102. When executed, application programs 118 generate data that may be output to the user via one or more of the output devices 116 and also receive data that may be input by the user via one or more of the input devices 114. For discussion purposes, one particular application program is illustrated with a user interface (UI) component 120 that is designed to present information to the user including dynamically displaying the current status of tasks as discussed in more detail below.


Although discussed herein primarily with reference to software components and modules, the invention may be implemented in hardware or a combination of hardware, software, and/or firmware. For example, one or more application specific integrated circuits (ASICs) could be designed or programmed to carry out the invention.



FIG. 2 illustrates an exemplary user interface display in accordance with certain embodiments of the invention. User interface display 150 can be, for example, the display generated by user interface 120 of FIG. 1. UI display 150 includes an item or prompt list portion 152, a user choices portion 154, and an applet window portion 156. Additional labels or prompts 158 may also be included (e.g., a title for the task being handled, the current time, the amount of time left to finish the task, etc.). List portion 152 displays a list that prompts the user as to the tasks that are to be handled in a particular order. An indication is also made to the user within list portion 152 of where the user currently is in that list (that is, what the next item or task is that needs to be handled by the user); the indication also identifies items or tasks (if any) that have already been handled by the user as well as future items or tasks (if any) that need to be handled by the user. The manner in which an item or task is handled by the user is dependent on the nature of the list, as discussed in more detail below.


User choices portion 154 displays the options for the user to select from based on the next item or task in the list that needs to be handled by the user. For example, assume that the list in portion 152 is a list prompting the user regarding what information needs to be gathered in order for the user to set up a meeting with a potential customer. The list of prompts in list portion 152 could be a list of tasks the user must perform—that is, a list of information that needs to be collected (e.g., the customer's name, the location of the meeting, the time of the meeting, and so forth). If we further assume that the current task that needs to be handled by the user is entry of the location of the meeting, user choices portion 154 could display the various permissible inputs for the location of the meeting (e.g., at the user's main office, at a remote office, at the customer's facility, and so forth).


By way of another example, the item list may be a list of prompts for the information to be verbally input by the user in each step, with user choices portion 154 displaying a list of which words can be spoken in each step.


Applet window portion 156 displays additional information clarifying or amplifying the choices in user choices portion 154 (or the current item or task in item list portion 152). Following the previous example, if the current task that needs to be handled by the user is entry of the location of the meeting, applet window portion 156 could display additional descriptive information for one or more of the permissible inputs for the location of the meeting (e.g., a street address, a distance from the user's home, a map flagging the locations of the various meeting locations, and so forth).
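
The patent describes this three-portion layout only in prose. As a rough illustration, the layout could be modeled with a small data structure such as the hypothetical Python sketch below; the class name TaskDisplay, its fields, and the sample content are assumptions made for illustration and are not taken from the patent.

```python
from dataclasses import dataclass, field


@dataclass
class TaskDisplay:
    """Hypothetical model of UI display 150 and its three portions."""
    prompts: list[str]                                 # item/prompt list portion 152
    current_index: int = 0                             # next task to be handled
    choices: list[str] = field(default_factory=list)   # user choices portion 154
    applet_text: str = ""                              # applet window portion 156

    def render(self) -> str:
        # Bracket the current prompt so the user can reorient at a glance.
        marked = [f"[{p}]" if i == self.current_index else p
                  for i, p in enumerate(self.prompts)]
        lines = ["list:    " + "  ".join(marked)]
        if self.choices:
            lines.append("choices: " + ", ".join(self.choices))
        if self.applet_text:
            lines.append("applet:  " + self.applet_text)
        return "\n".join(lines)


# Meeting-scheduling example: the current task is entering the meeting location.
display = TaskDisplay(
    prompts=["who?", "when?", "how long?", "where?", "bring?"],
    current_index=3,
    choices=["main office", "remote office", "customer facility"],
    applet_text="map flagging the candidate meeting locations",
)
print(display.render())
```

Labels or prompts 158 (a title, the current time, time remaining, and so forth) could be carried as further fields in the same structure.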


The list displayed in list portion 152 is a list of items that is to be traversed by a user in a particular order. This can be a list of task prompts regarding tasks that the user is to perform, a list of task prompts regarding tasks to be performed by another user or computer, and so forth. Any of a wide variety of lists can be displayed, such as a set of tasks the user is to perform in a particular sequence as part of his or her job (this can be used, for example, to assist in training users to do their jobs), a set of tasks the user is to perform in a particular sequence in order to assemble or install a product he or she has purchased, a set of words to be spoken (e.g., cues as to what voice inputs the user is to make in order to carry out a task), a list of questions or fields to be answered, and so forth. Alternatively, the list of items may be a list of tasks or steps to be performed by a computer or computer program. Such a list can be used, for example, by a user to track the progress of the computer or program in carrying out the particular sequence of steps. Additionally, depending on the nature of the sequence of tasks being performed, multiple lists of items may be displayed (e.g., a multi-tiered item list). Situations can arise in which the list of items or prompts is too large to be displayed in its entirety. In such situations, only a portion of the list is displayed (e.g., centered on the item or prompt for the next task to be performed). This subset of the steps to be performed is then scrolled as tasks are completed, resulting in a dynamic list display that changes when a task is completed.
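
The patent does not prescribe how the displayed subset is chosen; one simple policy consistent with the description (keeping the subset roughly centered on the next task) is a sliding window over the list, sketched below. The function name and window width are illustrative assumptions.

```python
def visible_window(items: list[str], current: int, width: int = 5) -> list[str]:
    """Return the subset of `items` to display, roughly centered on `current`.

    The window is clamped to the list bounds so no display slots are left
    empty, mirroring the centering behavior described for the item list.
    """
    if len(items) <= width:
        return items
    start = max(0, min(current - width // 2, len(items) - width))
    return items[start:start + width]


prompts = ["who?", "when?", "how long?", "where?", "bring?", "cc?"]
for step in range(len(prompts)):
    print(step, visible_window(prompts, step))
```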


By displaying the list of prompts (or at least a portion thereof), the user is able to readily identify the status of the set of tasks being performed (in other words, the user is able to obtain a feel for where he or she is (or where the user or computer being monitored is) in progressing through the sequence of tasks). The user is able to quickly identify one or more previous tasks (if any) in the sequence, as well as one or more future tasks (if any) in the sequence. Such information is particularly helpful in reorienting the user to the sequence of tasks if his or her attention has been diverted away from the sequence. For example, the user's attention may be diverted away from the sequence to answer a question from another employee. After answering the question, the user can look back at display 150 and quickly reorient him- or her-self into the sequence of tasks being performed.


Item lists may be a set of predetermined items, such as a particular set of steps to be followed to assemble a machine or a set of words to be uttered to carry out a task for a speech-recognizing computer. Alternatively, item lists may be dynamic, changing based on the user's current location, current activity, past behavior, etc. For example, computer 100 of FIG. 1 may detect where the user is currently located (e.g., in his or her office, in the assembly plant, which assembly plant, etc.), and provide the appropriate instructions to perform a particular task based on that current location. Additional information regarding detecting the user's current context (e.g., current location, current activity, etc.) can be found in a co-pending U.S. patent application Ser. No. 09/216,193, entitled “Method and System For Controlling Presentation of Information To a User Based On The User's Condition”, which was filed Dec. 18, 1998, and is commonly assigned to Tangis Corporation. This application is hereby incorporated by reference.
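
As a minimal sketch of such a dynamic list, the selection could be as simple as a lookup keyed by the sensed location; the mapping below and its contents are invented for illustration only and do not reflect the context-awareness mechanism of the referenced application.

```python
# Hypothetical mapping from a sensed location to the item list to display.
LISTS_BY_LOCATION = {
    "office": ["who?", "when?", "how long?", "where?", "bring?"],
    "assembly plant": ["inventory parts", "assemble intake", "lubricate core",
                       "install intake", "verify charge", "run diagnostic"],
}


def list_for_context(current_location: str) -> list[str]:
    """Pick the item list that matches the user's sensed location."""
    return LISTS_BY_LOCATION.get(current_location, ["select a task..."])


print(list_for_context("assembly plant"))
```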



FIG. 3 illustrates an exemplary display of an item list and current location marker such as may be used in accordance with certain embodiments of the invention. Assume that the sequence of items on the list is a set of prompts regarding information that needs to be supplied by the user in order to schedule a meeting. In the illustrated example, this list includes the following information: who the meeting is with (who), the date and time for the meeting (when), the duration of the meeting (how long), the location of the meeting (where), an indication of any materials to bring to the meeting (bring), and an indication of anyone else that should be notified of the meeting (cc).



FIG. 3 illustrates an example item list displayed in list portion 152 of FIG. 2. Initially, the item list 170 is displayed, including the following prompts: “who?”, “when?”, “how long?”, “where?”, and “bring?”. The prompts in list 170 provide a quick identification to the user of what information he or she needs to input for each task in the sequence of tasks for scheduling a meeting. Due to the limited display area, list 170 does not include the prompts for each step in the sequence, but rather scrolls through the prompts as discussed in more detail below. A current location marker 172 is also illustrated in FIG. 3 to identify to the user what the current step is in the sequence. Assuming the meeting scheduling process has just begun, the first step in the sequence is to identify who the meeting is with (who), which is identified by current location marker 172 being situated above the prompt “who?”. In the illustrated example, location marker 172 is a circle or ball. Alternatively, other types of presentation changes may be made to alter the appearance of a prompt (or area surrounding a prompt) in order to distinguish the current step from other steps in the sequence. For example, different shapes other than a circle or ball may be used for a location marker, the text for the prompt may be altered (e.g., a different color, a different font, a different size, a different position on screen (e.g., slightly higher or lower than other prompts in the list), and so forth), the display around the prompt may be altered (e.g., the prompt may be inverted so that it appears white on a black background rather than the more traditional black on a white background, the prompt may be highlighted, the prompt may be encircled by a border, and so forth), etc. Those skilled in the art can easily determine a variety of alternate methods for marking the current step.
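
A minimal text-only sketch of the default presentation change (a ball or circle positioned above the current prompt) is shown below; the rendering function and the use of a plain character as the marker are assumptions for illustration.

```python
def render_with_marker(prompts: list[str], current: int, marker: str = "o") -> str:
    """Render the prompt list with a location marker above the current prompt."""
    marker_row, prompt_row = [], []
    for i, prompt in enumerate(prompts):
        prompt_row.append(prompt)
        # Center the marker over the current prompt; leave blanks elsewhere.
        marker_row.append(marker.center(len(prompt)) if i == current else " " * len(prompt))
    return "  ".join(marker_row) + "\n" + "  ".join(prompt_row)


# Marker over "who?" as in list 170 of FIG. 3.
print(render_with_marker(["who?", "when?", "how long?", "where?", "bring?"], current=0))
```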


One additional presentation change that can be made to distinguish the current step from other steps in the sequence is to change the prompt itself. The prompt could be replaced with another prompt, or another prompt could be superimposed on the prompt for the current step. For example, the user may have a set of individuals that he or she typically meets with, and these may be superimposed on the “who?” prompt when it is the current step. FIGS. 4A-4B illustrate different ways in which the prompt in a sequence can be changed. FIG. 4A illustrates an example item list with the prompt for the current step in the sequence being superimposed with various input options. A list 190 is illustrated and the current step is to input who the meeting is to be with (the “who?” prompt). As illustrated, a set of common people that the user schedules meetings with (Jane, David, Lisa, and Richard) are superimposed on the “who?” prompt. The appearance of the underlying prompt “who?” may be changed (e.g., shadowed out, different color, etc.) in order for the overlying input options to be more easily viewed. It is to be appreciated that the exact location of the superimposed set of input options can vary (e.g., the characters of one or more input options may overlap the prompt, or be separated from the prompt).



FIG. 4B illustrates an example item list with the prompt for the current step in the sequence being replaced by the set of input options. A list 192 is illustrated and the current step is to input who the meeting is to be with (the “who?” prompt). However, as illustrated, the “who?” prompt is replaced with a set of common people that the user schedules meetings with (Jane, David, Lisa, and Richard).


The user is thus given an indication of both the current step in the sequence as well as common responses to that step. The type of information that is superimposed on or replaces the prompt can vary based on the current step. For example, when the “when?” prompt is the current step it may have superimposed thereon the times that the user is available for the current day (or current week, and so forth).
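
The replacement variant of FIG. 4B can be sketched as a simple substitution of the current prompt with its common responses; the per-prompt option table below is hypothetical.

```python
# Hypothetical per-prompt sets of common responses (the FIG. 4B replacement style).
COMMON_OPTIONS = {
    "who?": ["Jane", "David", "Lisa", "Richard"],
    "when?": ["9 am", "10 am", "2 pm"],
}


def prompts_with_options(prompts: list[str], current: int) -> list[str]:
    """Replace the current prompt with its common input options, if any are known."""
    result = list(prompts)
    options = COMMON_OPTIONS.get(prompts[current])
    if options:
        result[current] = " ".join(options)
    return result


print(prompts_with_options(["who?", "when?", "how long?"], current=0))
# ['Jane David Lisa Richard', 'when?', 'how long?']
```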


Returning to FIG. 3, once the user enters the information identifying who the meeting is with (assume for purposes of this example the meeting is with Bob Smith), list 170 is changed to list 174 in which the prompt “who?” is replaced with the name “Bob Smith” and the current location marker 172 is changed to indicate the next prompt (“when?”) is the current task that needs to be handled by the user. Assuming the user inputs that the meeting is to occur at 10 am on October 31, list 174 is changed to list 176 in which the prompt “when?” is replaced with the date and time of the meeting, and the current location marker 172 is changed to indicate the next prompt (“how long?”) is the current task that needs to be handled by the user. Thus, as can be seen from lists 170, 174, and 176, the current location marker 172 “bounces” along the list from item to item, making the user readily aware of what the current task is that he or she should be performing (that is, which data he or she should be inputting in the present example).


Once the user inputs the duration of the meeting, list 176 is changed to list 178. Given the limited display area, the user interface now scrolls the list so that the leftmost item is no longer shown but a new item is added at the right. Thus, the identification of “Bob Smith” is no longer shown, but a prompt for who else should be notified of the meeting (“cc?”) is now shown. Once the user enters the location for the meeting (“home office”), list 178 is changed to list 180 and current location marker 172 is changed to indicate the next prompt (“bring?”) is the current task that needs to be handled by the user. Thus, as can be seen with lists 176, 178, and 180, current location marker 172 may not be moved in response to an input but the list may be scrolled.


Thus, as can be seen in FIG. 3, the item list provides a series of prompts identifying what tasks (if any) in the sequence have already been performed and what tasks (if any) remain to be performed. For those tasks that have already been performed, an indication is made in the list as to what action was taken by the user for those tasks (e.g., what information was entered by the user in the illustrated example). Thus, the user can readily orient him- or her-self to the sequence of steps, even if his or her attention is diverted from the display for a period of time. Alternatively, the prompts in the list need not be changed when the user enters the data (e.g., “who?” need not be replaced by “Bob Smith”). The data input by the user can alternatively be displayed elsewhere (e.g., in applet window portion 156).
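
The progression through FIG. 3 amounts to a small state update: record the entered data in place of the handled prompt and advance the location marker. The sketch below assumes a plain list-of-strings model; the function name is illustrative.

```python
def handle_input(items: list[str], current: int, value: str) -> tuple[list[str], int]:
    """Record the user's input for the current item and advance the marker.

    The handled prompt is replaced by the entered data (e.g. "who?" becomes
    "Bob Smith", as in list 174 of FIG. 3) and the marker moves to the next prompt.
    """
    updated = list(items)
    updated[current] = value
    return updated, min(current + 1, len(items) - 1)


items = ["who?", "when?", "how long?", "where?", "bring?", "cc?"]
items, current = handle_input(items, 0, "Bob Smith")
items, current = handle_input(items, current, "Oct 31, 10 am")
print(items, current)  # marker (index 2) now sits on "how long?"
```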


One advantage of the item lists described herein is that the lists present the multiple steps or items in a concise manner; these steps or items can also be referred to as idioms. When these idioms are presented together in a sequence, they provide more information to the user than when presented in independent form. For example, the idiom “bring?” by itself does not present as much information to the user as the entire sequence of idioms “who?”, “when?”, “how long?”, “where?”, and “bring?”.


The use of item lists as described herein also allows an individual to “zoom” in on (and thus gain more information about) a particular task. For example, with reference to FIG. 3, the user is able to select and zoom in on the “where?” prompt and have additional information about that task displayed (e.g., the possible locations for the meeting). The user is able to “backtrack” through the list (e.g., by moving a cursor to the desired item and selecting it, or using a back arrow key or icon, or changing the current location marker (e.g., dragging and dropping the location marker to the desired item), etc.) and see this additional information for tasks already completed. Alternatively, the “backtracking” may be for navigational rather than informational purposes. Moving back through the list (whether by manipulation of the location marker or in some other manner) may also be used to accomplish other types of operations, such as defining a macro or annotation.
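
A backtracking step of this kind could be as simple as moving the marker index to an earlier, already-handled item, as in the hypothetical helper below; how the additional information is then displayed is left to the interface.

```python
def backtrack(current: int, target: int) -> int:
    """Move the location marker back to an already-handled item.

    Used when the user reviews or re-enters a completed task; requests to
    jump forward past the current task are ignored.
    """
    return target if 0 <= target <= current else current


print(backtrack(current=3, target=1))  # marker returns to the second item
```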


Additionally, by displaying the prompts for future items, the speed of handling of the sequence of the items by the user can potentially be increased. For example, the user can see the prompt for the next one or more items in the list and begin thinking about how he or she is going to handle that particular item even before the computing device is finished processing the input for the item he or she just handled.


According to another embodiment, multiple location markers are displayed along with the item list—one marker identifying the current item to be handled by the user and another marker identifying the current item being processed by the computing device. Situations can arise where the user can input data quicker than it can be processed by the computing device. For example, the user may be able to talk at a faster rate than the computing device is able to analyze the speech.


The use of two such markers allows the user to identify whether the computing device is hung up on or having difficulty processing a particular input (e.g., identifying a particular word spoken by the user, misrecognition of the input, improper parsing, etc.). The user can identify this situation and go back to the task the computing device is having difficulty processing and re-enter the speech.
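
A sketch of this two-marker arrangement, assuming the user's position and the device's processing position are tracked as two separate indices, is shown below; the field names are invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class DualMarkers:
    """Track the user's input position and the device's processing position."""
    user_index: int = 0        # next item awaiting user input
    processing_index: int = 0  # item the computing device is still processing

    def lag(self) -> int:
        # How far the device has fallen behind the user (e.g. during fast speech).
        return self.user_index - self.processing_index


markers = DualMarkers(user_index=4, processing_index=1)
if markers.lag() > 2:
    print("device may be stuck on item", markers.processing_index)
```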



FIG. 5 is a flowchart illustrating an exemplary process for displaying the current status of tasks in accordance with certain embodiments of the invention. The process of FIG. 5 is carried out by the user interface of a computing device (e.g., interface 120 of FIG. 1), and may be performed in software. Although FIG. 5 is discussed with reference to a location marker, it is to be appreciated that any of the presentation changes discussed above can be used to identify items in the list.


Initially, an item list is displayed (act 200), which is a sequence of items or prompts for the user to follow. A current location marker is also displayed to identify the first item in the list (act 202), and input corresponding to the first item in the list is received (act 204). The nature of this input can vary depending on the sequence of tasks itself (e.g., it may be data input by a user, an indication from another computer program that the task has been accomplished, etc.). A check is then made as to whether the end of the list has been reached (act 206). If the end of the list has been reached then the process stops (act 208), waiting for the next sequence of tasks to begin or for the user to backtrack to a previously completed task.


However, if the end of the list has not been reached, then a check is made as to whether scrolling of the list is needed (act 210). Whether scrolling of the list is needed can be based on a variety of different factors. For example, the user interface may attempt to make sure that there are always at least a threshold number of prompts before and/or after the current location marker, the user interface may attempt to make sure that the current task remains as close to the center of the item list as is possible but that no portions of the item list be left empty, etc. These factors can optionally be user-configurable preferences, allowing the user to adjust the display to his or her particular likes and/or dislikes (e.g., the user may prefer to see more future tasks than previous tasks).


If scrolling is needed, then the item list is scrolled by one item (or alternatively more items) in the appropriate direction (act 212). The amount that the item list is scrolled can vary (e.g., based on the sizes of the different items in the list). The appropriate direction for scrolling can vary based on the activity being performed by the user and the layout of the list (e.g., in the example of FIG. 3, the scrolling is from right to left when progressing forward through the list, and left to right when backtracking through the list). Regardless of whether the ordered item list is scrolled, after act 210 or 212 the current location marker is moved as necessary to identify the next item in the list that is to be handled by the user (act 214). In some situations, movement of the current location marker may not be necessary due to the scrolling performed (e.g., as illustrated with reference to lists 176 and 178 in FIG. 3). At some point after the current location marker is moved (if necessary), user input is received corresponding to the identified next item in the list (act 216). The process then returns to determine whether the end of the list has been reached (act 206).
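
Acts 200 through 216 of FIG. 5 map naturally onto a loop. The hypothetical sketch below collapses the flowchart into a generator that yields the visible subset and marker position after each step, using the centered-window policy described above; the function signature and the idea of supplying input through a callback are assumptions, not the patent's implementation.

```python
from typing import Callable, Iterator


def run_task_sequence(prompts: list[str],
                      get_input: Callable[[str], str],
                      window: int = 5) -> Iterator[tuple[list[str], int]]:
    """Hypothetical rendering of acts 200-216 of FIG. 5 as a generator.

    After each step it yields the visible subset of the list and the marker's
    position within that subset, so a caller can redraw the display.
    """
    items = list(prompts)
    for current in range(len(items)):
        # Acts 210/212: scroll so the current item stays near the window center.
        start = max(0, min(current - window // 2, len(items) - window))
        yield items[start:start + window], current - start   # act 214: marker position
        # Acts 204/216: receive input for the current item and record it in the list.
        items[current] = get_input(items[current])
    # Acts 206/208: end of the list reached; wait for the next sequence.


answers = iter(["Bob Smith", "Oct 31, 10 am", "30 min", "home office", "laptop", "Alice"])
sequence = ["who?", "when?", "how long?", "where?", "bring?", "cc?"]
for visible, marker in run_task_sequence(sequence, lambda prompt: next(answers)):
    print(visible, "marker at", marker)
```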


The item list and current location identifier or marker can be displayed in a wide variety of different manners. FIGS. 6 and 7 illustrate alternative displays of the item list and current location identifiers with reference to a sequence of tasks to be completed in order to record a new inspection (e.g., a building inspection). In the exemplary display 240 of FIG. 6, an item list portion 242 and an applet window portion 244 are illustrated. The item list portion 242 includes a list of tasks that are to be handled by the user, each of which is information to be entered by the user. Once entered, the information is displayed in applet window portion 244. A current location marker 246 advances down the list in portion 242 to identify the current information that the user needs to input (the customer's state in the illustrated display). Additional information is displayed at the top of display 240, including a prompt 248 identifying a type of information being entered by the user (inspection information).


In the exemplary display 260 of FIG. 7, a multi-tiered item list is displayed including list portion 262 and list portion 264. In list portion 262, prompts for the overall process of recording a new inspection are listed, including selecting a new inspection option and then entering inspection information. Two current location markers 266 and 268 are illustrated, each providing a visual indication of where in the overall process the current user is (inspection info in the illustrated display). A prompt 270 provides a further identification to the user of where he or she is in the overall process. List portion 264 includes prompts for the process of entering inspection information, with a current location marker 272 providing a visual indication of where in the inspection information entry process the user currently is (customer state in the illustrated display).
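
A multi-tiered list such as the one in FIG. 7 can be represented as one (items, marker) pair per tier, as in the small sketch below; the tier contents and bracket-style marker are illustrative.

```python
# Hypothetical two-tier list for FIG. 7: an outer process list and, for the
# current outer step, an inner list of detail steps, each with its own marker.
tiers = [
    (["new inspection", "inspection info"], 1),                          # markers 266/268
    (["customer name", "street", "city", "customer state", "zip"], 3),   # marker 272
]

for level, (items, marker) in enumerate(tiers):
    rendered = [f"[{item}]" if i == marker else item for i, item in enumerate(items)]
    print("  " * level + " | ".join(rendered))
```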


In addition to tracking the status of tasks being performed by a single user, the dynamic displaying of the current status of tasks of the present invention can further be used to track the status of tasks being performed by multiple users. In this situation, information indicating the status of tasks being performed by multiple users is communicated back to the computing devices of one or more other users, who in turn can view the status information of multiple users on a single display.



FIG. 8 illustrates an exemplary distributed environment in which the status of tasks being performed by multiple users can be monitored. In the illustrated example, multiple users Jamie, John, Max, and Carol each have a wearable computer with an eyeglass display 300, 302, 304, and 306, respectively. An item list is displayed on the eyeglass display for each of these users, with a current location marker to identify to the respective users where they are in the task sequences they are performing. Information regarding their current location is also communicated to another computing device of their supervisor Jane, who is also wearing an eyeglass display 308. The information communicated to Jane's computer can be simply an identification of the current location (e.g., Jane's computer may already be programmed with all of the tasks in the list), or alternatively the entire item list (or at least a portion of it). The information for one or more of the users Jamie, John, Max, and Carol can then be displayed on display 308, allowing Jane to keep track of the status of each of the users Jamie, John, Max, and Carol in performing their tasks. This allows Jane, as the supervisor, to see if people are proceeding through their tasks too quickly or too slowly (e.g., a user may be having difficulty and need assistance), to know when the individual users will be finished with their tasks, etc. If a multi-tiered item list is being used, then the supervisor can also zoom in on the particular step of a user and get additional information regarding where the user is stuck.
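
The forwarding described for FIG. 8 can be sketched as a small report-and-collect step: each worker's computer sends its current step index, and the supervisor's display keeps the latest value per worker. The message format, names, and abbreviated step labels below are hypothetical.

```python
# Shared task sequence, abbreviated as in portion 310 of FIG. 9 (labels invented).
STEPS = ["inv parts", "asm intake", "lube core", "inst intake", "verify chg", "run diag"]

# Supervisor-side view: the latest reported step index for each worker.
status: dict[str, int] = {}


def report(worker: str, step_index: int) -> None:
    """Handle a status update forwarded from a worker's computer."""
    status[worker] = step_index


# Updates as they might arrive from the workers' wearable computers.
for worker, step in [("John", 1), ("Jamie", 3), ("Max", 3), ("Carol", 4)]:
    report(worker, step)

# Render one item list per worker with a marker on the current step.
for worker, step in status.items():
    row = [f"({s})" if i == step else s for i, s in enumerate(STEPS)]
    print(f"{worker:6} " + "  ".join(row))
```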



FIG. 9 illustrates an exemplary group of lists that may be displayed on eyeglass display 308 of FIG. 8. Assume that the users John, Jamie, Max, and Carol are each performing a machine assembly process involving the following tasks: inventory the necessary parts, assemble an intake, lubricate a core part of the machine, install the assembled intake, verify that the batteries are fully charged, and then run a diagnostic program. The tasks in the machine assembly process are illustrated in a portion 310 of display 308 in an abbreviated form. Alternatively, the tasks illustrated in portion 310 may not be abbreviated, or may be represented in some other manner (e.g., as icons). A separate item list is displayed on display 308 for each of the users along with a corresponding current location marker in the shape of a ball or circle. Thus, as illustrated in FIG. 9, the viewer of display 308 can readily identify that John is at the “assemble intake” step, Jamie and Max are both at the “install intake” step, and Carol is at the “verify charge” step. The supervisor viewing display 308 can quickly and easily determine, based on the item lists and current location markers, that each of Jamie, Max, and Carol is proceeding normally through the assembly process, but that John is hung up on the “assemble intake” step, so the supervisor can check with John to see if he is experiencing difficulties with this step.


Conclusion

Although the description above uses language that is specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the invention.

Claims
  • 1. One or more computer-readable media storing a computer program that, when executed by one or more processors on a mobile computer, causes the one or more processors to: present, on a display, a subset of a plurality of steps in an order to be performed by a user; alter an appearance of a current step in the subset of steps that needs to be performed by the user to distinguish the current step from other steps in the subset; receive information about a current context of the user from a context awareness component that receives sensed information from multiple sources and that mediates amongst the multiple sources to build a model of the current context of the user, the information about the current context of the user comprising information on the user's current location, current activity and/or previous behavior; in response to the received information about the current context of the user, alter instructions for performing one or more of the subset of steps that need to be performed by the user based on the received information; allow the user to input data corresponding to the current step; when input data is not received from the user for the current step and information received from the context awareness component indicates that the user currently has a high cognitive load, alter a manner of presenting information to the user relating to the current step, comprising selecting a manner of presentation that is less intrusive upon the user and formatting the information relating to the current step based on the selected manner of presentation; scroll, in response to user input of data corresponding to the current step, the plurality of steps so that a new subset of the plurality of steps is presented to the user; and amend the step for which the data input was received from the user with indications of that data input.
  • 2. One or more computer-readable media as recited in claim 1, wherein the computer program further causes the one or more processors to: alter, in response to user input of data corresponding to the current step, the appearance of another step as necessary to identify the new current step in the subset of steps that needs to be performed by the user.
  • 3. One or more computer-readable media as recited in claim 1, wherein altering the appearance of the current step comprises marking the current location with a ball.
  • 4. One or more computer-readable media as recited in claim 1, wherein altering the appearance of the current step comprises displaying the current step differently than other steps in the subset.
  • 5. One or more computer-readable media as recited in claim 1, wherein altering the appearance of the current step comprises replacing the current step with a set of one or more input options for the current step.
  • 6. One or more computer-readable media as recited in claim 1, wherein altering the appearance of the current step comprises superimposing, on the current step, a set of one or more input options for the current step.
  • 7. One or more computer-readable media as recited in claim 1, wherein the computer program further causes the one or more processors to: replace, in the subset, the display of the current step with a display of the input data.
  • 8. One or more computer-readable media as recited in claim 1, wherein the computer program further causes the one or more processors to: display a current processing marker that identifies which step in the subset of steps is currently being processed by the one or more processors.
  • 9. One or more computer-readable media as recited in claim 1, wherein the one or more computer-readable media comprise a computer memory of a wearable computer.
  • 10. A method comprising: displaying a list of items to be handled by a user in a particular order; identifying one item in the list of items that is a current item; displaying instructions for performing the current item, the instructions specifying one or more possible inputs by the user in relation to the current item; receiving information about a current context of the user; in response to the received information about the current context of the user, altering a manner in which the instructions for performing the current item are presented based on the received information, the altering comprising selecting a manner of presentation that is less intrusive upon the user and formatting the instructions for performing the current item based on the selected manner of presentation; receiving a user input corresponding to the current item; updating, in response to receiving the user input, the list to reflect the received user input corresponding to the current item; and updating the identification of the one item that is the current item to indicate the next item in the list of items as the current item.
  • 11. A method as recited in claim 10, wherein displaying the list of items comprises displaying at least one item corresponding to a task that has already been performed and at least one item corresponding to a task that still needs to be performed by the user.
  • 12. A method as recited in claim 10, wherein displaying the list of items comprises displaying, after the user input is received, the user input in place of the corresponding item.
  • 13. A method as recited in claim 10, wherein displaying the list of items comprises displaying only a subset of the list of items at any given time, the subset of the list of items comprising a plurality of items.
  • 14. A method as recited in claim 13, further comprising scrolling through the list of items to display different subsets as items in the list are handled by the user.
  • 15. A method as recited in claim 10, further comprising displaying a current processing marker identifying an item in the list of items corresponding to a current user input being processed.
  • 16. A method as recited in claim 10, wherein the list of items comprises a list of tasks to be completed by the user, and wherein handling of an item by the user comprises the user completing the task.
  • 17. A method as recited in claim 16, wherein the list of tasks comprises a list of prompts corresponding to data to be entered into the computer by the user.
  • 18. A method as recited in claim 10, wherein the list of items comprises a list of prompts of words to be spoken by the user, and wherein handling of an item by the user comprises speaking one or more words corresponding to the prompt.
  • 19. A system comprising: one or more output devices; a user interface component, coupled to an output device, causing a user interface to be output on the output device; a module that provides information about the current context of the user to the user interface component; wherein the user interface includes a list portion in which a list of a plurality of items to be handled by a user are output; wherein the user interface further includes a current location marker identifying one of the items in the list as the current item to be handled by the user; wherein the user interface further displays information relating to the current item, the information comprising one or more possible inputs by the user in relation to the current item in the list; wherein the user interface further updates the list, in response to the user providing an input in relation to the current item in the list, to reflect the provided input; wherein the user interface component further automatically updates the current location marker to identify a new item in the list in response to the user handling the current item in the list; wherein the user interface component, in response to the received information about the current context of the user, alters a manner of presenting information to the user, comprising selecting a manner of presentation that is less intrusive upon the user and formatting the information to be presented based on the selected manner of presentation.
  • 20. A system as recited in claim 19, wherein the user interface component further replaces, after the user has handled the current item, a user input in place of the current item.
RELATED APPLICATIONS

This application is a continuation of co-pending U.S. patent application No. 09/879,829 filed Jun. 11, 2001 and entitled “Dynamically Displaying Current Status of Tasks”, which is hereby incorporated by reference, and which claims the benefit under 35 USC 119(e) of U.S. Provisional Application No. 60/240,685, filed Oct. 16, 2000, entitled “Method for Dynamically Displaying the Current Status of Tasks”.

Related Publications (1)
Number Date Country
20070089067 A1 Apr 2007 US
Provisional Applications (1)
Number Date Country
60240685 Oct 2000 US
Continuations (1)
Number Date Country
Parent 09879829 Jun 2001 US
Child 11548569 US