System and method of using a remote control and apparatus

Information

  • Patent Grant
  • Patent Number
    8,228,224
  • Date Filed
    Friday, October 26, 2007
  • Date Issued
    Tuesday, July 24, 2012
Abstract
A method includes receiving a first identification signal, where the first identification signal corresponds to a first control. The method also includes determining an active device function to which the first control corresponds, where the active device function is a first function of a first device when the first device is active and where the active device function is a second function of a second device when the second device is active. The method also includes triggering emission of an audible signal identifying the active device function.
Description
BACKGROUND

1. Field of the Disclosure


The present disclosure relates to remote controls, apparatuses, and systems, and methods of using the same, and more particularly to remote controls, apparatuses, and systems, any one or more of which can produce a non-visible signal to identify a control before activating a function associated with the control.


2. Description of the Related Art


Remote controls can provide audible signals, whether in the form of words or tones, to notify a user after a key has been depressed. An example of a remote control with such a function is a remote control made by Accenda of Port Washington, N.Y. The Accenda remote control is designed for use with a TV, VCR, cable box, or satellite.


Similar to many other remote controls, the Accenda remote control announces the key after the key has been depressed and the function associated with the key has been activated. Announcing a key after a function has been activated can be undesired. For example, a VCR tape may be over ten years old and include images of a deceased friend or relative. If the key for the record function was pressed instead of the key for the play function, the valuable VCR tape may be recorded over with undesired content. The user may need to quickly find the stop key to prevent further recording. If the user is blind, visually impaired, or has normal vision but is in a dark room, locating the correct key may be difficult. Therefore, providing an “after-the-fact” announcement to notify the user of the function that was activated may provide feedback too late to the user. Accordingly, there is a need for an improved remote control and method of using a remote control.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 includes a block diagram of a home entertainment system;



FIG. 2 includes an illustration of a control layout for a remote control that can be used with the home entertainment system of FIG. 1;



FIGS. 3 and 4 include block diagrams that illustrate embodiments of the remote control of FIG. 2;



FIG. 5 includes a block diagram of an apparatus that can be used with the home entertainment system of FIG. 1;



FIGS. 6 and 7 include flow diagrams of methods of using the system of FIG. 1;



FIG. 8 includes a diagram of controls within an automobile; and



FIG. 9 includes a flow diagram of a method of using the controls of FIG. 8.





Skilled artisans appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale.


DETAILED DESCRIPTION

A system provides a non-visible signal to the user of the system before a control or function is activated by the user. In this manner, the system can serve a user who is visually impaired, who is in a dark environment, or who is in a position where visual confirmation of a control may be undesired. In one embodiment, a remote control can be used with an apparatus, such as a set-top box. When the user places an object near a control on the remote control, the control or a function associated with the control may be announced to the user before he or she decides to activate the control. In another embodiment, equipment, such as an automobile, can be the system. Similar to the remote control, when the user places an object near a control within the equipment, the control or a function associated with the control may be announced to the user before he or she decides to activate the control. The likelihood of activating the wrong control is substantially reduced or eliminated. Also, the likelihood of causing irreversible damage (e.g., unintentionally recording over existing content) can be substantially reduced.


In one aspect, a method of using a remote control controls an operation of an apparatus. The remote control includes a plurality of controls including a first control that corresponds to a first function. The method includes sensing that a first object is near the first control before the first function is activated. In response to sensing, the method also includes providing a first audible signal that corresponds to a first identifier of the first control. The method further includes sending a first activation signal to the apparatus to identify activation of the first control.


In one embodiment, the method further comprises sensing a first force of at least a first activation threshold at the first control, or allowing a predetermined amount of time to pass before sensing a second force of at least a second activation threshold at any control within the plurality of controls other than the first control.


In another embodiment, the method further includes sensing that a second object is near a second control before a second function is activated, wherein the plurality of controls includes the second control that corresponds to the second function, and the second object is the same or different from the first object. In response to sensing that the second object is near the second control, the method also includes providing a second audible signal that corresponds to a second identifier for the second control. Sensing the second object is near the second control and providing the second audible signal are performed before sensing the first object is near the first control and providing the first audible signal. The second function is not activated during a time period between providing the second audible signal and sensing the first object is near the first control.


In still another embodiment, the method further includes receiving a language selection signal associated with the first audio signal. In yet another embodiment, the method further includes receiving a user-defined signal associated with the first audio signal.


In another aspect, a remote control controls an operation of an apparatus. The remote control includes a plurality of controls including a first control that corresponds to a first function, and a control module. The control module is configured to receive a first sensing signal when a first object is near the first control before the first function is activated, in response to receiving the first sensing signal, provide a first audio signal that corresponds to a first identifier of the first control, and send a first activation signal to the apparatus to identify activation of the first control in response to a predetermined activity.


In one embodiment, the predetermined activity includes sensing a first force of at least a first activation threshold at the first control. Alternatively, the predetermined activity includes allowing a predetermined amount of time to pass before sensing a second force of at least a second activation threshold at any control within the plurality of controls other than the first control.


In another embodiment, the plurality of controls includes a second control that corresponds to a second function. The control module is further configured to not provide an audio signal that corresponds to a second identifier associated with the second control, and send a second activation signal to the apparatus to identify activation of the second control after the second control receives a force of at least the activation threshold.


In still another embodiment, the plurality of controls includes a second control that corresponds to a second function, wherein the second control is different from the first control. The control module is further configured to receive a second sensing signal when a second object is near the second control before the second function is activated, wherein the second object is the same or different compared to the first object, and in response to receiving the second sensing signal, provide a second audio signal that corresponds to a second identifier of the second control.


In a further embodiment, the remote control further includes a sensing module responsive to the first control and coupled to the control module and a transmitter responsive to the control module. In a particular embodiment, the remote control further includes an audio module responsive to the control module and a speaker responsive to the audio module.


In still another aspect, a method can be used to operate a system including an apparatus and a remote control that controls an operation of the apparatus. The remote control includes a plurality of controls including a first control, wherein the first control corresponds to a plurality of functions including a first function. The method includes sensing that a first object is near the first control during a first time period, wherein sensing is performed by the remote control. The method also includes determining a first state of the apparatus, wherein the apparatus is capable of being in at least one state of a plurality of states including the first state. The method further includes determining a first function corresponds to the first control, based at least in part on the first state of the apparatus. The method still further includes providing a first audio signal, wherein the first audio signal corresponds to a first identifier of the first function.


In one embodiment, determining the first state of the apparatus includes determining which one or more input devices coupled to the apparatus is active, determining which one or more output devices coupled to the apparatus is active, or any combination thereof. In a particular embodiment, the method further includes sensing a second object is near the first control during a second time period, wherein sensing is performed by the remote control. The method still further includes determining a second state of the apparatus during the second time period, wherein the plurality of states includes the second state that is different from the first state. The method yet further includes determining a second function corresponds to the first control, based at least in part on the second state of the apparatus, wherein the second function is different from the first function. The method also includes providing a second audio signal, wherein the second audio signal corresponds to a second identifier of the second function.


In another embodiment, the method further includes activating the first control in response to a predetermined activity. Providing the second audio signal is performed before activating the first control. The predetermined activity includes sensing a first force of at least a first activation threshold at the first control. Alternatively, the predetermined activity includes allowing a predetermined amount of time to pass before sensing a second force of at least a second activation threshold at any control within the plurality of controls other than the first control.


In a particular embodiment, the method further includes sensing a second object is near a second control during the first time period, wherein the plurality of controls includes the second control that is different from the first control. The method also includes determining a second function corresponds to the second control, based at least in part on the first state of the apparatus, wherein the plurality of functions includes the second function that is different from the first function. The method further includes providing a second audio signal that corresponds to a second identifier of the second function. Sensing the second object is near the second control and providing the second audio signal are performed before sensing the first object is near the first control and providing the first audio signal. The second function is not activated during a time period between providing the second audio signal and sensing the first object is near the first control.


In a further aspect, a remote control includes a plurality of controls including a first control, wherein the first control corresponds to a plurality of functions including a first function and a control module. The control module is configured to receive a first sensing signal when a first object is near the first control during a first time period, in response to receiving the first sensing signal, provide a first identification signal to a remote apparatus, wherein the first identification signal corresponds to the first control, receive a second identification signal from the remote apparatus, wherein the second identification signal corresponds to the first function, and provide a first audio signal, wherein the first audio signal corresponds to a first identifier of the first function.


In one embodiment, the control module is further configured to receive another first sensing signal when a second object is near the first control during a second time period, wherein the second object is the same or different from the first object. In response to receiving the other first sensing signal, the control module is further configured to provide the first identification signal to the apparatus, wherein the first identification signal corresponds to the first control. The control module is still further configured to receive a third identification signal from the apparatus, wherein the third identification signal corresponds to a second function, and wherein the plurality of functions includes the second function that is different from the first function. The control module is further configured to provide a second audio signal different from the first audio signal, wherein the second audio signal corresponds to a second identifier of the second function.


In another embodiment, the control module is further configured to send a first activation signal to the apparatus in response to a predetermined activity. The predetermined activity includes sensing a first force of at least a first activation threshold at the first control. Alternatively, the predetermined activity includes allowing a predetermined amount of time to pass before sensing a second force of at least a second activation threshold at any control within the plurality of controls other than the first control.


In still another embodiment, the remote control further includes an audio module responsive to the control module and a speaker responsive to the audio module.


In yet a further aspect, an apparatus is configured to be operated at least in part from a remote control that includes a plurality of controls including a first control. The apparatus includes a control module configured to receive a first identification signal from the remote control, wherein the first identification signal corresponds to the first control, determine a state of the apparatus, wherein the apparatus is capable of being in at least one state of a plurality of states, determine a function to which the first control corresponds, based at least in part on the state of the apparatus, and send a second identification signal to an audio system, wherein the second identification signal corresponds to the function.


In one embodiment, the control module is configured to determine the state of the apparatus by determining which one or more input devices coupled to the apparatus is active, determining which one or more output devices coupled to the apparatus is active, or any combination thereof.


In another embodiment, the audio system lies within the remote control. In still another embodiment, the audio system lies outside of the remote control.


In a further embodiment, the control module is further configured to receive a first activation signal from the remote control to identify activation of the first control and send a signal to activate the first function.


In yet a further embodiment, the apparatus further includes an I/O module coupled to the control module and a transceiver coupled to the control module. In a particular embodiment, the apparatus further includes a hard drive coupled to the control module.


In another aspect, a method is used for a system that includes a plurality of controls including a first control. The method includes sensing a first object is near the first control before a first function associated with the first control is activated, in response to sensing, providing a first audible signal, wherein the first audible signal corresponds to a first identifier of the first control or the first function, and sending a first activation signal to identify activation of the first control.


In one embodiment, the method further includes sensing a second object is near a second control that corresponds to a second function before the second function is activated, wherein the plurality of controls includes the second control that is different from the first control. In response to sensing, the method also includes providing a second audible signal that corresponds to a second identifier of the second control. Sensing the second object is near the second control and providing the second audible signal are performed before sensing the first object is near the first control and providing the first audible signal. The second function is not activated during a time period between providing the second audible signal and sensing the first object is near the first control.


In yet another aspect, a system includes a plurality of controls including a first control and a control module. The control module is configured to receive a first sensing signal when a first object is near the first control before a first function associated with the first control is activated. In response to receiving the first sensing signal, the control module is still further configured to provide a first audio signal, wherein the first audio signal corresponds to an identifier for the first control or the first function. The control module is yet further configured to send a first activation signal to identify activation of the first control in response to a predetermined activity.


In one embodiment, the predetermined activity includes sensing a first force of at least a first activation threshold at the first control. Alternatively, the predetermined activity includes allowing a predetermined amount of time to pass before sensing a second force of at least a second activation threshold at any control within the plurality of controls other than the first control.


In another embodiment, the plurality of controls includes a second control that corresponds to a second function. In still another embodiment, the plurality of controls includes a second control that corresponds to a second function, wherein the second control is different from the first control. The control module is further configured to receive a second sensing signal when a second object is near the second control before the second function is activated, and in response to receiving the second sensing signal, provide a second audio signal that corresponds to a second identifier of the second control.


Before addressing details of embodiments described below, some terms are defined or clarified. The term “audible signal” refers to a signal that can be heard and understood by a human. The term “audio signal” refers to a signal corresponding to one or more audible signals that can be transferred between machines or processed by a machine. The relationship between an audible signal and an audio signal is analogous to the relationship between source code and object code for software programs.


The term “control” refers to a button, lever, key, switch, or nearly any other physical item that is capable of activating a function. The term control is to be construed broadly.


As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).


Additionally, for clarity purposes and to give a general sense of the scope of the embodiments described herein, the words “a” or “an” are employed to describe one or more of the articles to which they refer. Therefore, the description should be read to include one or at least one whenever “a” or “an” is used, and the singular also includes the plural unless it is clear that the contrary is meant.


Unless stated otherwise, any combination of parts of a system may be bi-directionally or uni-directionally coupled to each other, even though a figure may illustrate only a single-headed arrow or a double-headed arrow. Arrows within the drawings are illustrated, as a matter of convenience, to show principal information, data, or signal flow within the system or between the system and one or more components outside the system, one or more modules outside the system, another system, or any combination thereof in accordance with an embodiment. Coupling should be construed to include a direct electrical connection in one embodiment and, alternatively, may include any one or more of an intervening switch, resistor, capacitor, inductor, router, firewall, network fabric, or the like between any combination of one or more components, one or more devices, or one or more modules.


Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. All publications, patent applications, patents, and other references mentioned herein are incorporated by reference in their entirety. In case of conflict, the present specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and not intended to be limiting.


To the extent not described herein, many details regarding specific network, hardware, software, and firmware components and acts are conventional and may be found in textbooks and other sources within any one or more of the multimedia, information technology, networking and telecommunications arts.



FIG. 1 includes a block diagram of a system 100. The system 100 can be centrally controlled by an apparatus 120. The apparatus 120 may receive input from any one or more sources including a subscriber line 142, which may be connected to an internet service provider, a cable service provider, a satellite dish, a telephone line, another conventional type of subscriber line (wired or wireless), or any combination thereof. The apparatus 120 may also be connected to an input device 144. An example of the input device 144 can include a video cassette recorder (“VCR”), a digital video disk (“DVD”) player, an audio compact disc (“CD”) player, another conventional device that may be used in conjunction with a home entertainment system, or any combination thereof. The apparatus 120 may provide output to a personal computer (“PC”) 162, a television (“TV”) 164, or other output device 166. An example of the output device 166 can include a VCR, a DVD player, a CD burner, speakers, another conventional output device used with a home entertainment system, or any combination thereof. In one embodiment, each of the subscriber line 142, input device 144, personal computer 162, television 164, and output device 166 is bi-directionally coupled to the apparatus 120. In another embodiment, the subscriber line 142, input device 144, personal computer 162, television 164, output device 166, or any combination thereof may be directly connected to the apparatus 120, or may be uni-directionally coupled or connected to the apparatus 120 (allowing signals to flow in only one direction).


The apparatus 120 can be controlled by a remote control 180. The remote control 180 can communicate with the apparatus 120 using electronic signals, radio-frequency signals, optical signals, signals using other electromagnetic radiation, or any combination thereof. In one embodiment, the remote control 180 does not need to contact or otherwise be tethered to the apparatus 120. In another embodiment (not illustrated), the remote control 180 can be coupled to the apparatus 120 using one or more wires.



FIG. 2 includes an illustration of the remote control 180 that includes a plurality of controls that by themselves or in conjunction with one another can be used to activate a function of the apparatus 120. The controls include buttons and keys in one embodiment. The remote control 180 includes an activation indicator 210 that indicates when a control in the remote control 180 has been activated. The remote control 180 includes a plurality of different sections including a QWERTY keyboard section 220, an Internet navigation section 230, a special features section 240, a volume control section 250, a multimedia control section 260, and a number pad section 270. The remote control 180 also includes an apparatus power control 282, a TV power control 284, a “last” button 286, which allows the user to go to the immediately prior channel that the user was viewing, and channel controls 288. The special features section 240 includes controls for play, summary, move, show/hide adult content, delete, or the like. In other embodiments, more, fewer, or other controls may be part of the special features section 240.



FIGS. 3 and 4 include block diagrams to better illustrate some of the components and modules that provide functionality within the remote control 180. Referring to FIG. 3, the remote control 180 includes a control 302 that is coupled to a sensing module 304. The control 302 may be any of the keys or buttons previously described with respect to the remote control 180. The sensing module 304 is coupled to a control module 320. The control module 320 is coupled to an audio module 342 that is coupled to a speaker 344. The combination of the audio module 342 and the speaker 344 is an example of an audio system. The speaker 344 allows audible signals, such as tones, words, music, or other sounds to be heard by a user of the system 100, and more particularly the user of the remote control 180. The control module 320 is also coupled to a transmitter 360 that can send signals to the apparatus 120.


Referring to FIG. 4, the illustrative embodiment of the remote control 180 is substantially the same as the one illustrated in FIG. 3, except that a transceiver 460 is used instead of the transmitter 360. The transceiver 460 can allow bi-directional communication between the apparatus 120 and the remote control 180. More or fewer modules and other components than illustrated may be used in other embodiments. For example, the audio system, which includes the audio module 342 and the speaker 344, is not required to be within the remote control 180. In an alternate embodiment, an audio system can be part of or coupled to the apparatus 120. Although not illustrated, the remote control 180 may include one or more memory devices that can be used to store tones, words, or other sounds in the form of audio signals that can be converted to audible signals.



FIG. 5 includes a block diagram to better illustrate some of the components and modules that provide functionality within the apparatus 120. In one embodiment, the apparatus 120 is a set-top box that can be connected to one or more input devices, one or more output devices, or any combination thereof. The apparatus 120 includes a control module 520 that controls a wide array of functions within the apparatus 120. In one embodiment, the control module 520 can include a microcontroller, a microprocessor, a chipset, a motherboard, or a collection of different modules that provide the functionality described in this specification. The control module 520 is bi-directionally coupled to I/O modules 542. The I/O modules 542 are coupled to the subscriber line 142, the input device 144, the PC 162, the TV 164, and the output device 166 as illustrated. In another embodiment, more or fewer input devices, more or fewer output devices, or a combination thereof, may be used with the apparatus 120. The control module 520 is also bi-directionally coupled to a transceiver 560. The transceiver 560 is capable of receiving signals from and sending signals to the remote control 180. In still another embodiment, the transceiver 560 can be replaced by a receiver (not illustrated) that receives signals from the remote control 180 and is coupled to the control module 520. A hard disk (“HD”) 580 is coupled to the control module 520. Stored content, such as movies, broadcast programs, pictures, audio files, or any combination thereof may be stored in the HD 580. The HD 580 can also include one or more software programs for operating part or all of the system 100, and the apparatus 120 in particular.


Although not illustrated, the apparatus 120 can also include an audio system similar to the audio system described with respect to the remote control 180. The audio module could be coupled to the control module 520, and the speaker would be coupled to that audio module. In another embodiment, the audio system may be part of an output device, such as the PC 162, the TV 164, or the output device 166. Therefore, the audio system may lie within the remote control 180, within the apparatus 120, or outside both the remote control 180 and the apparatus 120.


The control module 320, the control module 520, or both may include a central processing unit (“CPU”) or controller. Each of the apparatus 120 and the remote control 180 is an example of a data processing system. Although not shown, other connections and memories may reside in or be coupled to the control module 320, the control module 520, or any combination thereof. Such memories can include content addressable memory, static random access memory, cache, first-in-first-out (“FIFO”), other memories, or any combination thereof. The memories, including the HD 580, can include media that can be read by a controller, CPU, or both.


Portions of the methods described herein may be implemented in suitable software code for carrying out the disclosed methods. In one embodiment, the computer-executable instructions may be lines of assembly code or compiled C++, Java, or other language code. In another embodiment, the code may be contained on a data storage device, such as a hard disk, magnetic tape, floppy diskette, optical storage device, networked storage device(s), or other appropriate data processing system readable medium or storage device.


The functions of the remote control 180 may be performed at least in part by the apparatus 120 or by a computer. Additionally, a software program or its software components with such code may be embodied in more than one data processing system readable medium in more than one computer or other item having a CPU.


Attention is now directed to methods of using the system 100 in accordance with some illustrative, but not limiting, embodiments. A couple of embodiments of methods are illustrated in the process flow diagrams of FIGS. 6 and 7.


The method illustrated in FIG. 6 can be performed with the remote control 180 having modules as illustrated in FIG. 3 or 4. In one embodiment, the remote control 180 can be used to provide an audible signal to a user regarding any one or more of the controls of the remote control 180 before the control is activated. The method can include sensing an object that is near a control before a function associated with the control is activated (block 622). As used in this specification, “near” is to be construed to cover when the object is close to but not in contact with the control 302, or when the object contacts, but does not activate, the control 302. The object can include a finger, a stylus, a pen, a pencil, or nearly anything else that can be used to press or otherwise activate the control 302 of the remote control 180.


Sensing may occur in any one or more of several different ways. In one embodiment, proximity sensing can be used. When proximity sensing is used, the object may be detected by the sensing module 304 using electronic or optical signals within a circuit. For example, light from a light source near the control 302 may be reflected by the object as it moves near the control 302. The light is reflected into a detector within the remote control 180. The detector may be part of the sensing module 304. In another embodiment, another form of radiation may be used instead of light. In still another embodiment, sensing may occur as a change in resistance or capacitance within a circuit when the object is near or contacts the control 302. In still another embodiment, other conventional proximity detection schemes may be used.
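For illustration only, the following sketch shows one way such a proximity determination could be made in software, assuming hypothetical normalized sensor readings (a reflected-light level and a capacitance change) and threshold values chosen solely for this example; the disclosure does not prescribe specific readings, values, or interfaces.

    # Hypothetical proximity check for a single control (illustrative values only).
    REFLECTANCE_THRESHOLD = 0.6        # normalized reflected-light level (optical sensing)
    CAPACITANCE_DELTA_THRESHOLD = 0.2  # normalized change in circuit capacitance

    def object_is_near(reflectance: float, capacitance_delta: float) -> bool:
        """Return True when either sensing scheme indicates an object near the control."""
        optical_near = reflectance >= REFLECTANCE_THRESHOLD
        capacitive_near = capacitance_delta >= CAPACITANCE_DELTA_THRESHOLD
        return optical_near or capacitive_near

    # A finger hovering over the control reflects light strongly but has not yet
    # changed the circuit's capacitance.
    print(object_is_near(reflectance=0.8, capacitance_delta=0.05))  # True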


In a particular embodiment, the object may contact but not activate the control 302. More specifically, a force may be applied to the control 302. In a particular embodiment, the force used for sensing would be no greater than an activation threshold force that may be used to activate the control 302. For example, if 0.2 newton (N) is the activation threshold force used to activate the control 302, the force applied to the control 302 for sensing should be less than the activation threshold force, for example 0.1 N. In another particular embodiment, the force used for sensing may need to exceed a minimum force (i.e., a sensing threshold force), for example 0.02 N, to account for incidental contact. For example, when the remote control 180 is resting on a chair with the controls facing the chair (e.g., the control 302 contacts the chair), the contact would not be treated as sensing the control 302. Skilled artisans will appreciate that other forces or ranges of forces may be used.
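A minimal sketch, in software form, of how a measured force could be compared against the two thresholds described above; the 0.02 N and 0.2 N values follow the example in the text, while the function name and return labels are assumptions for illustration.

    SENSING_THRESHOLD_N = 0.02    # minimum force treated as intentional sensing
    ACTIVATION_THRESHOLD_N = 0.2  # force at or above which the control is activated

    def classify_force(force_newtons: float) -> str:
        """Classify a measured force on a control as ignored, sensed, or activated."""
        if force_newtons < SENSING_THRESHOLD_N:
            return "ignore"    # incidental contact, e.g., remote resting face-down on a chair
        if force_newtons < ACTIVATION_THRESHOLD_N:
            return "sense"     # announce the control's identifier, but do not activate
        return "activate"      # send the activation signal to the apparatus

    print(classify_force(0.01))  # ignore
    print(classify_force(0.1))   # sense
    print(classify_force(0.3))   # activate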


In another embodiment, a timer circuit (not illustrated) may be used in conjunction with or as part of the sensing module 304. In this embodiment, the force used during sensing would be sufficient to exceed a minimum force (e.g., 0.02 N), such that incidental contact of any one or more of the controls in the remote control 180 would not be sensed by the sensing module 304. More details regarding the timer will be discussed with respect to sending an activation signal.


In response to sensing, the method also includes providing an audible signal that corresponds to a first identifier of the first control (block 642). The identifier can be one or more tones, one or more words, music, or other sound that is uniquely associated with the control. For example, the words “set-top box power” may be announced when an object gets near the apparatus power control 282, and the word “zero” may be announced when an object gets near the zero key within the number pad section 270.


In an alternative embodiment, a user of the system 100 or a manufacturer of the remote control 180 or the apparatus 120 may allow a language selection to be made. The language can include English, Spanish, French, German, Japanese, or nearly any other language. In an alternative embodiment, a user may be able to create a user-defined audible signal. In a particular embodiment, the user may record his or her own voice or that of a relative (e.g., a child) that will be played as the audible signal. In another particular embodiment, a user may be able to program the home key within the Internet navigation section 230, such that the audible signal will announce “There's no place like home” when an object gets near the home key. In still another particular embodiment, the space key within the keyboard section 220 may have a corresponding audible signal that announces “Space, the final frontier.”
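One possible organization of per-control identifiers, a language selection, and user-defined announcements is sketched below; the table contents, control identifiers, and function name are hypothetical and provided for illustration only.

    # Hypothetical announcement table keyed by language, with user-defined overrides.
    ANNOUNCEMENTS = {
        "en": {"stb_power": "set-top box power", "0": "zero", "home": "home page"},
        "es": {"stb_power": "encendido del decodificador", "0": "cero", "home": "página de inicio"},
    }
    USER_DEFINED = {"home": "There's no place like home"}  # user-recorded or user-programmed phrases

    def announcement_for(control_id: str, language: str = "en") -> str:
        """Return the audible identifier for a control, preferring user-defined phrases."""
        if control_id in USER_DEFINED:
            return USER_DEFINED[control_id]
        return ANNOUNCEMENTS.get(language, ANNOUNCEMENTS["en"]).get(control_id, control_id)

    print(announcement_for("home"))     # There's no place like home
    print(announcement_for("0", "es"))  # cero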


In yet another embodiment, any one or more controls, any one or more sections of controls, or any combination thereof for the remote control 180 may be configured so that the audible signal(s) for one or more controls are not announced. In a particular embodiment, the sensing module 304 may be deactivated for those specific controls or sections, the control module 320 may not send an audio signal to the audio module 342, the audio module 342 may be deactivated for the specific control(s), or any combination thereof. For example, a user may not want to have the controls within the keyboard section 220 announced every time a control within the keyboard section 220 is used. Otherwise, typing a text message may be distracting if the system 100 is also being used for other purposes, such as listening to music or watching a movie. In another example, the controls within the volume control section 250 may not need to be announced because they affect the sound level of the system 100, and their effect may be perceived as the volume of the sound changes. In still another embodiment, one or more functions provided by one or more controls may not cause an irreversible adverse effect. Unlike recording, changing a channel for viewing may not be considered irreversible, and therefore, the identity of the control may not need to be announced.


The method can further include sending an activation signal to the apparatus to identify activation of the control in response to a predetermined activity (block 662). The predetermined activity can vary depending on the design of the remote control 180. In one embodiment, a force greater than an activation threshold force may be used to activate the function associated with the control 302. For example, in one particular embodiment, the control 302 may receive a force of 0.3 N, which is greater than the activation threshold force of 0.2 N. When this occurs, the sensing module 304 can generate a signal that is sent to the control module 320. The control module 320 sends an activation signal to the transmitter 360 (FIG. 3) or the transceiver 460 (FIG. 4), which in turn transmits the activation signal to the apparatus 120. The control module 320 will also send a signal to the activation indicator 210 so that the indicator will become lit. This embodiment allows different levels of force to be used with the control 302: a relatively lighter force for sensing and a relatively heavier force for activation.


In another embodiment, the predetermined activity can be used in conjunction with a timer. In one embodiment, after the control 302 has been pressed one time, the user may need to press the control 302 (i.e., the same control) for a second time within a predetermined time period. The predetermined time period may be nearly any length of time, and may be set in hardware or firmware, or may be adjustable in software. The predetermined time period may start right after the control 302 is pressed for the first time, after the control 302 has been announced (at the end of the audible signal), or at nearly any other time. The first time the control 302 is pressed, the identifier for the control 302 may be announced using the audible signal, and the second time the control 302 is pressed within the predetermined time period, the activation signal will be sent from the remote control 180 to the apparatus 120, as previously described. If the control 302 is not pressed for a second time within the time period, the remote control 180 will not generate an activation signal for the control 302. Skilled artisans will appreciate that pressing the same control twice within the predetermined time period is similar to “double clicking” as used with PCs.
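A minimal sketch of the press-once-to-announce, press-again-to-activate behavior described above, using a timestamp comparison in place of a hardware timer; the two-second window, class name, and return strings are assumptions for illustration.

    import time

    CONFIRM_WINDOW_S = 2.0  # assumed predetermined time period

    class DoublePressConfirm:
        """First press announces the control; a second press within the window activates it."""

        def __init__(self):
            self._pending_control = None
            self._pending_since = 0.0

        def press(self, control_id: str, now: float | None = None) -> str:
            now = time.monotonic() if now is None else now
            if (self._pending_control == control_id
                    and now - self._pending_since <= CONFIRM_WINDOW_S):
                self._pending_control = None
                return f"activate {control_id}"  # send the activation signal to the apparatus
            self._pending_control = control_id
            self._pending_since = now
            return f"announce {control_id}"      # provide the audible identifier only

    remote = DoublePressConfirm()
    print(remote.press("record", now=0.0))  # announce record
    print(remote.press("record", now=1.0))  # activate record
    print(remote.press("play", now=10.0))   # announce play (window expired, different control)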


In still another embodiment, the control 302 is pressed for a first time, and a function associated with the control 302 is announced (as an audible signal) over the speaker 344 of the remote control 180. After a predetermined time period (tracked using a timer), an activation signal associated with the control 302 is sent from the remote control 180 to the apparatus 120, unless the same or another control is pressed within the predetermined time period. If another control is pressed, the timer may be reset for that control, and its activation signal is sent automatically unless that control or yet another control is pressed before the timer expires. When the control 302 is pressed twice within the time period, logic within the control module 320 determines that the activation signal for the control 302 is not to be sent to the apparatus 120.
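The alternative behavior described above, in which the activation signal is sent automatically after the time period expires and a second press cancels it, could be sketched as follows; the window length, method names, and return strings are again assumptions for illustration.

    AUTO_SEND_WINDOW_S = 2.0  # assumed predetermined time period

    class TimeoutAutoActivate:
        """Announce on press; auto-activate when the window expires with no further press.

        A second press of the same control within the window cancels activation;
        a press of a different control restarts the announcement and the timer.
        """

        def __init__(self):
            self._pending = None  # (control_id, deadline) or None

        def press(self, control_id: str, now: float) -> str:
            if self._pending and now < self._pending[1]:
                pending_id = self._pending[0]
                self._pending = None
                if pending_id == control_id:
                    return f"cancel {control_id}"  # do not send the activation signal
            self._pending = (control_id, now + AUTO_SEND_WINDOW_S)
            return f"announce {control_id}"

        def tick(self, now: float) -> str | None:
            """Poll periodically; returns an activation once the window has expired."""
            if self._pending and now >= self._pending[1]:
                control_id = self._pending[0]
                self._pending = None
                return f"activate {control_id}"
            return None

    rc = TimeoutAutoActivate()
    print(rc.press("record", now=0.0))  # announce record
    print(rc.tick(now=2.5))             # activate record (no further press within the window)
    print(rc.press("record", now=3.0))  # announce record
    print(rc.press("record", now=4.0))  # cancel record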


In another embodiment, the control 302 may correspond to more than one function, depending in part on the state of the apparatus 120. The state of the apparatus 120 may depend on which one or more input devices or one or more output devices within the system 100 are active. For example, if the subscriber line 142 and the TV 164 are active, the apparatus 120 may be in a broadcast mode where signals received over the subscriber line 142 are processed and routed to the TV 164. In another embodiment, the input device 144 may be active. Depending upon the type of input device, one of many different functions may be associated with the control 302. For example, when the input device 144 is an audio CD player, audio signals may be provided to the output device 166, which in one embodiment can be a set of speakers. The control module 520 within the apparatus 120 may be able to determine the state of the apparatus 120.
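A small sketch of how the apparatus state might be derived from which input and output devices are active; the device identifiers and state names are assumptions for illustration, not terms from the disclosure.

    def determine_state(active_inputs: set[str], active_outputs: set[str]) -> str:
        """Derive a coarse apparatus state from the active input and output devices."""
        if "subscriber_line" in active_inputs and "tv" in active_outputs:
            return "broadcast"       # route subscriber-line signals to the TV
        if "cd_player" in active_inputs and "speakers" in active_outputs:
            return "audio_playback"  # route CD audio to the speakers
        if "vcr" in active_inputs:
            return "vcr_playback"
        return "idle"

    print(determine_state({"subscriber_line"}, {"tv"}))  # broadcast
    print(determine_state({"cd_player"}, {"speakers"}))  # audio_playback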


In still another embodiment, information regarding which devices are active can be sent from the apparatus 120 using the transceiver 560 of the apparatus 120 to the transceiver 460 of the remote control 180. In this embodiment, the control module 320 within the remote control 180 may have logic that can determine the state of the apparatus 120, using at least in part, the information received from the apparatus 120. In this embodiment, signals may be sent and received by each of the remote control 180 and the apparatus 120.



FIG. 7 includes a flow diagram for a method that can be used when there is bi-directional flow of information between the apparatus 120, as illustrated in FIG. 5, and the remote control 180 having the transceiver 460 as illustrated in FIG. 4. The method can include sensing that an object is near a control during a time period, wherein sensing is performed by the remote control 180 (block 722 in FIG. 7). This portion of the method can be performed using any one or more of the embodiments as previously described with respect to sensing. The method can also include determining a state of the apparatus, wherein the apparatus is capable of being in at least one of a plurality of states (block 742). Logic within the control module 320 of the remote control 180, the control module 520 of the apparatus 120, or a combination thereof can be used to access a table or other data indicating the various states of the apparatus 120 based at least in part on which input or output devices coupled to the apparatus 120 are active. The table may be kept in memory at the remote control 180, the apparatus 120, or a combination thereof. In a particular embodiment, the table having the state information is within the HD 580 of the apparatus 120.


The method can further include determining a specific function corresponding to the control, based at least in part on the state of the apparatus 120 (block 762). The control module 320 of the remote control 180 or the control module 520 of the apparatus 120 may perform this function, based on the configuration of the remote control 180 or the apparatus 120. The same table described with respect to determining the state of the apparatus (block 742), or a different table, includes a listing of the controls and the different functions provided by the controls depending on the state. Similar to determining the state, logic within the control module 320 of the remote control 180, the control module 520 of the apparatus 120, or a combination thereof can be used to access the table to determine the specific function associated with the control. The table may be kept in memory at the remote control 180, the apparatus 120, or a combination thereof. In one particular embodiment, the table having the state information is within the HD 580 of the apparatus 120. The method can still further include providing an audio signal, wherein the audio signal corresponds to an identifier of the specific function (block 782).


An example is provided to better illustrate how the method illustrated in the flow diagram of FIG. 7 is performed. In one embodiment, a double-headed arrow and bar (“>>|”) control within the multimedia control section 260 (FIG. 2) of the remote control 180 may correspond to a fast-forward function that may terminate at the end of a tape if the input device 144 is a VCR. However, if the input device 144 is an audio CD player, the same control (>>|) may correspond to advancing the audio CD player to the beginning of the next song. If the input device 144 is a DVD player, the same control can correspond to advancing to the beginning of the next chapter. When the PC 162 is the only output device that is currently active, the multimedia control section 260 may be deactivated because the controls within the multimedia control section 260 may not be used by the PC 162. In other words, no function would correspond to the >>| control within the multimedia control section 260. In another embodiment, the multimedia control section 260 may be active when the PC 162 is active in order to operate a multimedia player on the PC 162.
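The mapping in this example lends itself to a small lookup table; the sketch below assumes the state names and the per-state function labels, which are illustrative rather than taken from the disclosure.

    # Function announced for the ">>|" control, keyed by the apparatus state.
    # A PC-only state maps to no function in the embodiment where the
    # multimedia control section is deactivated.
    SKIP_FORWARD_FUNCTIONS = {
        "vcr": "fast forward",
        "cd_player": "next song",
        "dvd_player": "next chapter",
        "pc_only": None,
    }

    def function_for_skip_forward(state: str) -> str | None:
        """Return the function (if any) that the ">>|" control performs in a given state."""
        return SKIP_FORWARD_FUNCTIONS.get(state)

    for state in ("vcr", "cd_player", "dvd_player", "pc_only"):
        print(state, "->", function_for_skip_forward(state))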


The control module 320 within the remote control 180 or the control module 520 within the apparatus 120 can generate an audio signal that can be used by an audio system within the remote control 180, within the apparatus 120, or within an output device 166 coupled to the apparatus 120. The audio system can convert the audio signal into an audible signal that the user of the system 100 can understand. After hearing the audible signal, the user can determine whether to activate the function associated with that control. Any one or more of the predetermined activities previously described with respect to any disclosed embodiment may be performed. When the predetermined activity is performed, an activation signal can be generated within the remote control 180 and sent to the apparatus 120.


A benefit of certain embodiments described herein is that an identifier of the control, or an identifier of a function associated with the control, in the form of an audible signal, is provided to the user of the remote control 180 before an activation signal is sent from the remote control 180 to the apparatus 120. Therefore, the likelihood that a user will activate a control or function that he or she does not desire may be substantially reduced or even eliminated. In one embodiment, a user may place an object near a first control, wherein the object is sensed by the sensing module 304. An audible signal can be generated so that the user hears an identifier for the first control or a function associated with the first control. Before the first control is activated, the user can determine that he or she selected the wrong control and then move the same or a different object to a second control, which may be the control that the user initially desired. The second control, or a function associated with the second control, may be announced (as an audible signal), and the user can confirm that it corresponds to his or her intended selection. At this point, the user can activate the second control.


The concepts described herein can be extended to other embodiments in which the user cannot obtain, or does not desire, visual confirmation of one or more controls. In one embodiment, a user operating an automobile, a truck, an aircraft, or other equipment may benefit from such an audible signal. FIG. 8 includes an illustration of a portion of an automobile 800 that includes a dashboard 810, a control module 880, and an audio system including an audio module 892 and a speaker 894. In one embodiment, this audio system may be part of the automobile's existing sound system. The dashboard 810 includes lighting controls, such as a headlight control 802, a fog light control 804, and a panel light control 806. Above the steering column are gauges and an odometer reset control 812. The dashboard further includes audio controls, such as a volume adjust and on/off control 820, and selectors 822, 823, 824, and 825 that may correspond to preset channels or a disk selector for an audio CD player (not illustrated) within the automobile 800. Controls 842, 844, and 846 may correspond to audio input selection. For example, the control 842 may correspond to an FM radio (not illustrated), the control 844 may correspond to the audio CD player, and the control 846 may correspond to a tape player (not illustrated). Ventilation controls can include a vent selection control 862, a temperature control 864, and a fan speed control 866. Some of the signal connections between the controls and the control module 880 are illustrated with dashed lines. Although not fully illustrated, each of the controls may be bi-directionally coupled to the control module 880. In a particular embodiment, the sensing module may be incorporated within the control module 880.


Similar to the prior embodiments, a control or a function associated with a control may be identified before an activation signal is generated. FIG. 9 includes a flow diagram of a method that may be performed when operating the automobile 800. The method includes sensing that an object is near a control before a function associated with the control is activated (block 922). The sensing may be performed as previously described. The method also includes, in response to sensing, providing an audible signal, wherein the audible signal corresponds to an identifier for the control or the function associated with the control (block 942). In one particular embodiment, a user of the automobile 800 may move an object close to or in contact with the headlight control 802. A sensing signal would be sent to or generated by the control module 880 indicating that an object is near the headlight control 802. In one embodiment, an audio signal can be generated by the control module 880 and sent to the audio module 892. The audio module 892 can provide a signal to the speaker 894 that announces “headlight controls” (as an audible signal).


The user may turn the headlight control 802 to a first position, which is construed by the control module 880 to be the parking lights for the automobile 800. The user may then turn the headlight control 802 to a second position, which is construed by the control module 880 to be the headlights. An audible signal may be generated after the user turns the headlight control 802 to the first position (“park lights” announced), the second position (“headlights” announced), or both.
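A brief sketch of position-based announcements for a multi-position control such as the headlight control; the numeric positions and announcement strings follow the example above, while the structure itself is an assumption for illustration.

    HEADLIGHT_ANNOUNCEMENTS = {
        0: "off",
        1: "park lights",
        2: "headlights",
    }

    def announce_headlight_position(position: int) -> str:
        """Return the audible announcement for a headlight-control position."""
        return HEADLIGHT_ANNOUNCEMENTS.get(position, "unknown position")

    print(announce_headlight_position(1))  # park lights
    print(announce_headlight_position(2))  # headlights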


The method can further include sending an activation signal to identify activation of the control in response to a predetermined activity (block 962). In one embodiment, activation may occur when the user pushes the knob for the headlight control 802 into the dashboard 810. In another embodiment, a different predetermined activity, such as any one or more of the predetermined activities previously described, may be used. By using a control panel that produces audible signals, a user can focus on driving or other visual tasks while operating the automobile 800 or other equipment without having to visually confirm that the correct control or position of the control has been selected.


While the focus of the flow diagrams (FIGS. 6, 7, and 9) has been on methods, after reading this specification, skilled artisans will appreciate that appropriate logic can be generated for the remote control 180, the apparatus 120, or both to perform part or all of the methods described herein. Skilled artisans will appreciate that they have many options regarding the design and use of the system 100. In one implementation, minimal interaction between the remote control 180 and the apparatus 120 may be desired. In another implementation, a significantly higher level of interaction between the remote control 180 and the apparatus 120 may be desired. Skilled artisans will be able to design a system 100 that meets the needs or desires of an equipment manufacturer, a user of the system 100, another person or entity involved with the system 100 (e.g., a service provider for the subscriber line 142), or any combination thereof.


Skilled artisans will appreciate that many other embodiments are possible. The embodiments described should be viewed as illustrative and not limiting to the scope of the present invention.


Note that not all of the activities described in the general description or the examples are required, that a portion of a specific activity may not be required, and that one or more further activities may be performed in addition to those described. Still further, the order in which activities are listed is not necessarily the order in which they are performed.


In the foregoing specification, the invention has been described with reference to particular embodiments. However, one of ordinary skill in the art will appreciate that one or more modifications or one or more other changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and any and all such modifications and other changes are intended to be included within the scope of the invention.


Any one or more benefits, one or more other advantages, one or more solutions to one or more problems, or any combination thereof have been described above with regard to one or more particular embodiments. However, the benefit(s), advantage(s), solution(s) to problem(s), or any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced is not to be construed as a critical, required, or essential feature or element of any or all the claims.

Claims
  • 1. A method comprising: detecting that an agent is within a first proximate distance of a first control of a device, wherein the first proximate distance defines an activation threshold; and triggering a particular audible signal in response to detection of the agent within the first proximate distance of the first control; wherein the particular audible signal identifies a particular function of the first control to which the first control is configured to operate, the particular function selected from a plurality of functions of the first control, the selection of the particular function based on an indication of a particular configured state of an apparatus selected from a plurality of configurable states of the apparatus, wherein the apparatus is remote from the device; and wherein the particular audible signal is selected from a plurality of audible signals, each of the plurality of audible signals identifying a corresponding function of the plurality of functions.
  • 2. The method of claim 1, further comprising: receiving a selection of a spoken language; and selecting at least a portion of the particular audible signal based at least in part on the selection of the spoken language.
  • 3. The method of claim 1, further comprising: receiving a user-defined audible message; and including at least a portion of the user-defined audible message in the particular audible signal.
  • 4. The method of claim 1, wherein the indication is received at the device from the apparatus.
  • 5. The method of claim 4, wherein the first control is an automobile control to control one or more functions that are associated with an automobile.
  • 6. The method of claim 1, wherein the agent is detected to be within the particular proximate distance of the first control by using an optical sensor.
  • 7. The method of claim 1, wherein the agent is detected to be within the particular proximate distance of the first control by detecting that an electrical property of a circuit satisfies a threshold value.
  • 8. The method of claim 1, wherein the particular audible signal is emitted by an audio system, the audio system located within the device.
  • 9. The method of claim 1, wherein the particular audible signal is emitted by an audio system, the audio system located within the apparatus.
  • 10. A device comprising:
    a detector to detect that an agent is within a first proximate distance of a first control of a device, wherein the first proximate distance defines an activation threshold; and
    a trigger to trigger a particular audible signal in response to detection that the agent is within the first proximate distance of the first control;
    wherein the particular audible signal identifies a particular function of the first control to which the first control is configured to operate, the particular function selected from a plurality of functions of the first control, the selection of the particular function based on an indication of a particular configured state of an apparatus selected from a plurality of configurable states of the apparatus, wherein the apparatus is remote from the device; and
    wherein the particular audible signal is selected from a plurality of audible signals, each of the plurality of audible signals identifying a corresponding function of the plurality of functions.
  • 11. The device of claim 10, wherein the detector includes an optical sensor, the optical sensor to detect a position of the agent with respect to the first control.
  • 12. The device of claim 10, wherein the detector includes an electrical sensor, the electrical sensor to detect that an electrical property of a circuit satisfies a threshold value in response to a position of the agent being within the first proximate distance of the first control, and to detect that the electrical property of the circuit fails to satisfy the threshold value in response to the position of the agent being outside of the first proximate distance of the first control.
  • 13. The device of claim 10, wherein the trigger is configured to trigger the particular audible signal prior to activation of the first control, wherein the activation causes the first control to perform the particular function.
  • 14. The device of claim 10, further comprising an audio system, the audio system to emit the particular audible signal that is selected.
  • 15. The device of claim 10, wherein the apparatus includes an audio system, the audio system to emit the particular audible signal that is selected.
  • 16. The device of claim 10, wherein the apparatus includes a memory device, the memory device storing one or more audio signals, wherein in response to the selection of the particular audible signal, a corresponding audio signal stored in the memory device is retrieved and converted into the particular audible signal prior to triggering the particular audible signal.
  • 17. The device of claim 16, wherein the corresponding audio signal includes information that is converted to words that are included in the particular audible signal.
  • 18. A non-transitory computer readable medium storing processor-executable instructions that, when executed by a processor, cause the processor to:
    detect that an agent is within a first proximate distance of a first control of a device, wherein the first proximate distance defines an activation threshold; and
    trigger a particular audible signal in response to detection of the agent within the first proximate distance of the first control;
    wherein the particular audible signal identifies a particular function of the first control to which the first control is configured to operate, the particular function selected from a plurality of functions of the first control, the selection of the particular function based on an indication of a particular configured state of an apparatus selected from a plurality of configurable states of the apparatus, wherein the apparatus is remote from the device; and
    wherein the particular audible signal is selected from a plurality of audible signals, each of the plurality of audible signals identifying a corresponding function of the plurality of functions.
  • 19. The non-transitory computer readable medium of claim 18, wherein the particular configured state of the apparatus is determined based at least in part upon data stored in a table.
  • 20. The non-transitory computer readable medium of claim 18, wherein the indication of the particular configured state of the apparatus is received from the apparatus.
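The following is an illustrative, non-limiting sketch of the detect-then-announce sequence recited in claim 1; it forms no part of the claims or the specification. The Python below assumes hypothetical control identifiers ("key_1", "key_2"), hypothetical apparatus states ("TV", "DVR"), a hypothetical normalized proximity threshold, and a speak() placeholder standing in for the audio system; an actual embodiment would read a real proximity or capacitance sensor and emit audio through the device or the remote apparatus.

    # Illustrative, non-limiting sketch only; all names and the 0.6 threshold
    # below are hypothetical assumptions, not part of the granted claims.

    PROXIMITY_THRESHOLD = 0.6  # hypothetical normalized sensor value (cf. claims 7 and 12)

    # The same physical control maps to a different function depending on the
    # configured state indicated by the remote apparatus (e.g., a set-top box).
    FUNCTION_MAP = {
        ("TV", "key_1"): "channel up",
        ("DVR", "key_1"): "skip forward",
        ("TV", "key_2"): "channel down",
        ("DVR", "key_2"): "skip back",
    }


    def speak(text: str) -> None:
        """Stand-in for the audio system that emits the audible signal."""
        print(f"[audio] {text}")


    def announce_if_proximate(sensor_value: float, control_id: str, apparatus_state: str) -> None:
        """Announce the active function of a control before the control is activated.

        Detection is modeled as an electrical property (sensor_value) satisfying a
        threshold; the announced function is selected based on the indicated
        configured state of the remote apparatus.
        """
        if sensor_value >= PROXIMITY_THRESHOLD:
            function = FUNCTION_MAP.get((apparatus_state, control_id), "unknown function")
            speak(function)


    if __name__ == "__main__":
        # Simulated reading: a finger hovers near "key_1" while the apparatus
        # indicates the DVR state; the function is announced before any key press.
        announce_if_proximate(sensor_value=0.8, control_id="key_1", apparatus_state="DVR")

In this sketch, the same physical key is announced as "channel up" or "skip forward" depending on the configured state indicated by the apparatus, and the announcement is triggered before the key is ever activated.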
RELATED APPLICATIONS

The present application is a continuation of and claims priority to U.S. patent application Ser. No. 11/049,629, filed Feb. 2, 2005, the contents of which are incorporated by reference in their entirety.

Related Publications (1)
Number Date Country
20080100492 A1 May 2008 US
Continuations (1)
Number Date Country
Parent 11049629 Feb 2005 US
Child 11924757 US