The present invention is related to the following copending U.S. patent applications filed concurrently herewith, assigned to the assignee of the present invention, and hereby incorporated by reference in their entireties:
“Method and System for Auxiliary Display of Information for a Computing Device,” U.S. patent application Ser. No. 10/429,932;
“Real-Time Communications Architecture and Methods for use with a Personal Computer System,” U.S. patent application Ser. No. 10/429,905;
“Method and System for Auxiliary Processing Of Information for a Computing Device,” U.S. patent application Ser. No. 10/429,930;
“System and Method for Activating a Computer System,” U.S. patent application Ser. No. 10/429,903;
“Computer System with Do Not Disturb System and Method,” U.S. patent application Ser. No. 10/430,369;
“Computer Camera System and Method for Reducing Parallax,” U.S. patent application Ser. No. 10/429,943;
“Control and Communications Panel for a Computer System,” U.S. patent application Ser. No. 10/439,933; and
“Notification Lights, Locations and Rules for a Computer System,” U.S. patent application Ser. No. 10/439,931.
The invention relates generally to audio controls and computer systems.
Personal computers have evolved over time to accept various kinds of input. Contemporary computing devices allow audio input via a microphone. Such audio input is supported by contemporary operating systems, such as Microsoft Windows®-based operating systems, which provide a sound recorder to record, mix, play, and edit sounds, and also to link sounds to a document, or insert sounds into a document. Application programs and integrated programs such as Microsoft® Office XP offer speech recognition, which converts speech to text.
While audio input to a computer system is thus supported by software, a certain level of skill and effort is required to use these audio features to any reasonable extent. For example, to insert an audio comment (voice comment or voice annotation) into a word processing document such as via the Microsoft® Word word processing program, the user needs to know to put up a reviewing toolbar, click an arrow next to a new comment icon, click a voice comment icon, and click a record button on a sound object dialog and speak to record the voice comment. When finished recording, the user needs to click a stop button, and then exit the dialog to resume typing. Other programs have like requirements for entering a voice comment. Even a skilled user still has to perform a fair amount of work and manipulate the pointing device a significant amount to enter such comments.
Other uses for audio input include speech recognition to enter text, and command and control, in which a user speaks commands to the computer system to perform operations. These tasks also require a fair amount of familiarity with the audio programs and a fair amount of skill and effort to perform.
Video input via a camera is also becoming popular. In general, video input suffers from the same drawback as audio input, namely that it is difficult to use.
What is needed is a way for users to efficiently and intuitively leverage the audio and video capabilities provided with contemporary computer systems, operating systems and applications. The method and system should be simple and fast for users to learn and operate, and configurable to some extent to meet various user scenarios and usage patterns.
Briefly, the present invention provides a record button (human interface device) that facilitates audiovisual input without requiring manual interaction (direct manipulation interaction) with software. Simple actuation of the record button is all that may be required for a user to enter audio or video into the system, or record other types of information, as the record button connects with (operably couples to) the operating system to provide the information needed for the user's given operating context. For example, when working with a document, a single press of the record button may insert an audio annotation into the document at an appropriate location, such as at the cursor location in a word processing document, a selected cell of a spreadsheet, and so on. When listening to a voicemail message recorded in the computer, a press of the record button will allow the user to record an audio reply to that message, e.g., for sending to the party that left the message.
The record button may be configured to respond to different methods of actuation, each of which may correspond to a different audio operating mode. For example, a single press-and-release actuation may be handled as a straight audio recording operation that passes audio into a top-level window, a hold-and-release (hold for some threshold amount of time but release before too long) actuation may be handled as a speech-to-text conversion prior to sending the text to the top-level window, while a hold-indefinitely actuation may be handled as a command and control function, e.g., recognize the command and pass it to the operating system. A double-press mode can send voice-recognized commands to the top-level window. Video may be entered as well, such as when the selected audio mode provides an audio stream to a program, the application context accepts video, and the camera shutter is open.
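The actuation-to-mode correspondence can be pictured as a configurable lookup table. The following is a minimal Python sketch; the enum names are hypothetical, and the default assignments follow the example mapping given above:

```python
from enum import Enum, auto

class Actuation(Enum):
    SINGLE_PRESS = auto()
    DOUBLE_PRESS = auto()
    HOLD_AND_RELEASE = auto()
    HOLD_INDEFINITELY = auto()

class RecordMode(Enum):
    RECORD_AUDIO = auto()   # pass raw audio into the top-level window
    DICTATION = auto()      # speech-to-text conversion before sending text
    COMMAND_OS = auto()     # command and control directed to the OS
    COMMAND_APP = auto()    # voice-recognized commands to the top-level window

# Default mapping per the example above; user-reconfigurable.
DEFAULT_MODE_MAP = {
    Actuation.SINGLE_PRESS: RecordMode.RECORD_AUDIO,
    Actuation.HOLD_AND_RELEASE: RecordMode.DICTATION,
    Actuation.HOLD_INDEFINITELY: RecordMode.COMMAND_OS,
    Actuation.DOUBLE_PRESS: RecordMode.COMMAND_APP,
}

def mode_for(actuation: Actuation, mode_map=None) -> RecordMode:
    """Resolve an actuation method to its configured operating mode."""
    return (mode_map or DEFAULT_MODE_MAP)[actuation]
```

Because the mapping is a plain table, a user-supplied `mode_map` can remap, disable, or duplicate modes without changing the detection logic.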
The various types of actuation methods and the time limits may be configurable by the user beyond default values, and the operating modes that result may be mapped to the actuation methods, as well as individually selected or deactivated.
A record light may be provided to notify the user of the current action, corresponding to the current recording mode. Colors and/or flash patterns may be varied to provide the information. For example, off may indicate not selected, red may indicate recording straight audio and/or video, yellow when speech to text recognition is active, and green when command and control is active. Flash patterns may also be provided, such as to indicate noisy conditions, or provide similar warnings and other information. Other colors and flash patterns, and alternative meanings, are feasible.
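The indicator behavior described above amounts to a small state-to-display table. This sketch uses hypothetical state names; the color and pattern assignments follow the examples in the text:

```python
# (color, pattern) per recording state; entries follow the examples above.
INDICATOR = {
    "off":       ("none",   "steady"),  # no recording mode selected
    "recording": ("red",    "steady"),  # straight audio and/or video
    "dictation": ("yellow", "steady"),  # speech-to-text recognition active
    "command":   ("green",  "steady"),  # command and control active
    "noisy":     ("red",    "flash"),   # active but noisy conditions
}

def indicator_for(state: str) -> tuple:
    """Return the (color, pattern) for a state, defaulting to off."""
    return INDICATOR.get(state, ("none", "steady"))
```

Keeping the table data-driven makes the alternative colors and meanings the text mentions (e.g., for color-blind users) a configuration change rather than a code change.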
The record button can work with manual software switching as well, e.g., to change modes to an extent based on software commands rather than solely on a method of actuation. Thus, by software switching, a user can use as little as one record button actuation technique, such as a single press and release actuation, for different modes of operation.
Other advantages will become apparent from the following detailed description when taken in conjunction with the drawings, in which:
Exemplary Operating Environment
The invention is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to: personal computers, server computers, hand-held or laptop devices, tablet devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in local and/or remote computer storage media including memory storage devices.
With reference to
The computer 110 typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by the computer 110 and includes both volatile and nonvolatile media, and removable and non-removable media. By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 110. Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation,
The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media, discussed above and illustrated in
The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in
When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160 or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
Record Button
The present invention is generally directed towards a record button coupled to the computer system, which facilitates the inputting of audio and/or video input (referred to herein as audiovisual input generally, even if only audio or only video is being input) into a computer system. As will be understood, one or more record buttons may be built into a computer system (such as on the keyboard, mouse or monitor) or external and coupled to the computer system, such as within an add-on set of control buttons or on an external microphone. Indeed, one of the mouse buttons may be assigned to act as the record button. Moreover, it will be readily apparent that the various examples herein, such as with respect to operating modes, actuation methods, indicator displays and so on are non-limiting examples, and that the present invention may be implemented in numerous ways. For example, as will be understood, although the present invention primarily uses examples herein of using the record button with a microphone and/or camera, the record button may be used for recording of content or system states other than via microphone or camera, such as to or from a series of system events, physical user input, user interactions, data transmissions, computations, renderings, and broadcast or removable media. Thus, as used herein, the record button may be used with any data capable of being recorded.
Turning to
Also represented in
In accordance with an aspect of the present invention, the computer system, such as on the keyboard 162 and/or monitor 191 (and/or possibly the pointing device 161) provides a record button, shown in
As generally represented in
Turning to an explanation of the use of the record button, as generally represented in
By way of example, pressing the record button 222 while in an active application 636 that supports recording video or audio streams may cause the application 636 to display an audio/video monitor dialog, set the record indicator (LED 602), and begin capturing the stream to memory or disk. Again pressing the record button 222, or pressing a stop button (e.g., in the transport controls), stops the recording and automatically saves, or offers options for saving, the stream that was captured. Note that a pause button may be used to pause and resume the recording process.
Other operating modes are feasible. The application may be set to perform dictation, in which received audio is converted to text and text entered into a document. These varying operating modes may be set by manual interaction with the application program and/or operating system, but in keeping with the present invention, instead may be controlled by the record button. For example while momentary contact of the record button may toggle the record mode, the user can press and hold the record button to capture quick voice notes, or perform other functions. Various example actuation methods and example modes that result are described below with reference to
The operating system 134, or application program 636 via the operating system 134, may control the display of the indicator 602. In general, the record indicator 602 is lit as steady red when the foreground application is recording live audio or video streams, although this may be configurable, e.g., by color-blind individuals. Other colors and/or patterns may indicate additional information to users, as described below.
Depending on the mode of operation and the application program's current context, speech to text recognition may be active, as performed by a speech recognition engine 604 or the like. For example, the application may be incapable of handling speech, but the system set up so that the operating system sends the audio to a speech recognition engine 604 that converts the speech to text, which the operating system then sends to the application program. Applications that are speech-aware can receive audio and contact the speech recognition engine 604 on their own for conversion to text. Further, the speech recognition engine 604 can be employed in command-and-control scenarios, in which the operating system 134 or application receives audio commands, which are recognized and treated as commands instead of as text to enter.
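The routing decision described above, in which the operating system either hands raw audio to a speech-aware application or converts it to text first, can be sketched as follows. The callback names are hypothetical; they stand in for whatever delivery mechanism the operating system uses:

```python
def route_audio(audio_chunk, app_speech_aware, recognize,
                deliver_audio, deliver_text):
    """Route captured audio to the foreground application.

    recognize:     a speech-to-text engine call (e.g., engine 604's role)
    deliver_audio: hands raw audio to a speech-aware application
    deliver_text:  hands converted text to an application
    """
    if app_speech_aware:
        # Speech-aware applications receive the audio directly and may
        # contact the recognition engine on their own.
        deliver_audio(audio_chunk)
    else:
        # Otherwise the operating system performs the conversion and
        # sends only the resulting text to the application.
        deliver_text(recognize(audio_chunk))
```

The same dispatch point is where a command-and-control mode would divert recognized phrases to a command handler instead of a text sink.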
Note that with speech recognizers, typically code words or phrases (e.g., “voice command” versus “dictation”) are used to switch modes, or to distinguish commands from text, or icons are clicked for this purpose. The present invention enables the record button to change modes, and further, the current mode may indicate whether the command is directed to the application program or to the operating system without having to make this selection by mouse clicking or speaking (which can be misinterpreted).
By way of example, consider four modes of recording operation, namely a record mode that sends audiovisual data to an application (e.g., the top-level one), a dictation mode that uses speech recognition (whether by the application or the operating system to send text to the application), a command and control mode for issuing commands to the operating system, and a command and control mode for issuing commands to the application. The camera's settings in conjunction with the application's capabilities can determine whether video accompanies the audio.
A user can select from among these modes by actuating the record button 222 in different ways. For example, a user can press and release the button once, press-and-release it twice quickly (like a double-click), hold the button for a threshold time that indicates that this is more than a single or double press (e.g., greater than one-half of a second) and then release it, or hold the button indefinitely, which may trigger the mode at some other threshold time (e.g., one second). Each of these may be mapped to one of the above-described recording/operating modes.
The process shown in
As represented at step 702, if the user has not released the record button by the press and hold threshold time, then the user is considered to have entered the “hold indefinitely” (or simply “hold”) mode, as represented by step 704. Depending on what this mode is mapped to operation-wise, step 704 will perform the necessary initial operations, such as to control the indicator 602, and possibly to pass a message to the application program if the selected record mode involves the application. Steps 706 and 708 represent operating in the hold mode until the button is released or the stop button is pressed (or some other appropriate indication is given).
Returning to step 700, if the record button is released before the press and hold (hold indefinitely) threshold is reached, then step 700 instead branches to step 710 to process this released state. Step 710 represents whether the hold and release threshold was met, which should be a long enough hold duration to clearly distinguish the user's actions from a quick press and release (like a single-click or double-click) action. If the hold and release threshold was met, step 710 branches to step 712, to initially enter the hold and release mode, such as to control the indicator 602, and possibly to pass an initialization message to the application program as needed. Steps 716 and 718 represent operating in the hold and release mode until the record button is again pressed, or the stop button is pressed (or some other appropriate indication is given).
Returning to step 710, if the hold and release time was not met, then the button was quickly pressed (clicked), either as a single-press action or as part of double-press action in this example. Step 720 determines whether the time was too long to be considered part of a double-press action, in which event step 720 branches to step 722 to enter a single-press mode, performing any initialization operations, before continuing to steps 724 and 726 to operate in this mode until stopped in some manner.
If step 720 instead determines that the press-and-release action was fast enough so as to possibly be part of a double-press action, steps 720 and 730 represent waiting for the second press, or timing out if too much time expires and entering the above-described single press mode. If a second press is received within the double-press time window, step 730 branches to step 732 to enter the double-press mode, performing any initialization operations, before continuing to steps 734 and 736 to operate in this double-press mode until stopped in some manner.
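The branching just described reduces to a timing classification. The sketch below follows the flow of steps 704, 712, 722 and 732; the threshold values mirror the example figures mentioned earlier (one-half second for hold-and-release, one second for hold-indefinitely) and, like the flow itself, are user-configurable:

```python
def classify_actuation(press_duration_s, second_press_within_window,
                       hold_threshold_s=1.0, hold_release_threshold_s=0.5):
    """Classify a record-button action by its timing.

    press_duration_s: how long the button was held before release
        (a hold reaching hold_threshold_s is reported while still held).
    second_press_within_window: whether a second press arrived inside
        the double-press time window.
    """
    if press_duration_s >= hold_threshold_s:
        return "hold"               # hold-indefinitely mode (step 704)
    if press_duration_s >= hold_release_threshold_s:
        return "hold_and_release"   # hold-and-release mode (step 712)
    # A quick click: either a single press or part of a double press.
    if second_press_within_window:
        return "double_press"       # double-press mode (step 732)
    return "single_press"           # single-press mode (step 722)
```

Each returned label would then be looked up in the actuation-to-mode mapping to start the corresponding recording mode and set the indicator.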
Note that while recording, if no significant audio input has been received for a long time, a recording mode may time out, possibly after some warning (which may occur at least in part via the LED recording indicator 602), so as to not fill an annotation or the like with background noise if the user becomes distracted after entering such a mode. Instead of exiting the recording mode in such a situation, a suspension of the actual recording may be performed, thus operating like conventional voice-operated recording devices. In other words, the record button may be configured to work in conjunction with a voice-activated recording mode. Video motion sensing may be similarly employed.
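The silence handling described above can be modeled as a small guard object that tracks the time since the last significant audio and reports whether to keep recording, warn the user, or suspend. This is a sketch; the class name and the timeout values are illustrative, not from the source:

```python
import time

class SilenceGuard:
    """Warn, then suspend recording, when no significant audio arrives."""

    def __init__(self, warn_after_s=20.0, suspend_after_s=30.0):
        self.warn_after_s = warn_after_s
        self.suspend_after_s = suspend_after_s
        self.last_audio = time.monotonic()

    def audio_detected(self):
        # Voice activity resets the clock (and would resume a suspension,
        # like a conventional voice-operated recorder).
        self.last_audio = time.monotonic()

    def state(self, now=None):
        now = time.monotonic() if now is None else now
        quiet = now - self.last_audio
        if quiet >= self.suspend_after_s:
            return "suspended"   # e.g., signaled by an orange indicator
        if quiet >= self.warn_after_s:
            return "warning"     # e.g., signaled by an LED flash pattern
        return "recording"
```

A video variant would reset the clock on motion detection rather than on audio level.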
As can be appreciated, the above actuation methods and resultant operating modes are only examples of how a user may want a system configured. Default values for threshold times, default mappings of the actuation methods to recording modes, and default colors and patterns for controlling the record indicator, such as determined by empirical testing, may be sufficient for most users. Other users however may want to change these values from the default values to match their own needs. Further, users may choose not to use a given mode or actuation method, and may customize modes and methods, e.g., a triple-press (click) of the record button may also turn the camera on.
In this manner, for example, the user settings or default settings may be set so that a single click sends audiovisual input to an application program. Then, when working with a document, a single press of the record button may insert an audio annotation of whatever is recorded thereafter (until stopped) into a document at an appropriate location, such as at the cursor location in a word processing document, a selected cell of a spreadsheet, and so on. In another application context, when listening to a voicemail message recorded in the computer, a press of the record button may be all that is needed to allow the user to record an audio reply to that message, e.g., for later sending to the party that left the message.
In sum, use of the record button may provide any audiovisual input data to an application program for handling in that application program's current context. Likewise, command and control can be activated through the use of the record button rather than through trigger words or mouse/software control. This gives users an intuitive and efficient way to integrate audiovisual data into their programs and to control their computer systems.
As described above, the record light (e.g., LED) 602 may be controlled to notify the user of the current action, corresponding to the current recording mode. Colors and/or flash patterns may be varied to provide the information. For example, off may indicate not selected, steady red may indicate recording or passing straight audio, yellow when speech-to-text recognition is active, or green when command and control is active. One flash pattern may indicate active but noisy, while another flash pattern may indicate that no sufficient sounds have been detected and that the recording will soon stop automatically unless the user starts or resumes speaking. Another color or flash pattern such as orange can indicate that voice recording is suspended until voice activated.
Although the record button may be configured to handle all of a user's audiovisual input needs, the record button can work in combination with software-based switching. For example, a user may prefer to change modes to an extent based on software commands rather than solely on various methods of actuation. For example, a user familiar with a single on-off switch may want all actuation methods to simply toggle between on and off with respect to providing audiovisual data to an application program. Note however that a user may still accomplish each of the recording/operating modes by software switching, e.g., recognized code words and icon clicks.
As can be seen from the foregoing detailed description, there is provided a record button on a computer system that can significantly improve a user's computing experience with respect to entering audiovisual information. The record button may be set for various actuation methods corresponding to various operating modes, which may be user configurable to meet specific user data input needs. The method and system thus provide significant advantages and benefits needed in contemporary computing and communications.
While the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention.