Technical Field
The present disclosure relates to streaming audio data to a wireless headset. More particularly, the present disclosure relates to conserving the battery life of a wireless headset that receives streamed audio data.
Description of the Related Art
Media programs such as television programming, movies, video games, etc. typically include a video portion and an audio portion. The video portion of the media programs is commonly displayed on a television or computer monitor. The audio portion of the media programs is commonly output from speakers connected to the television or monitor, or from a home entertainment sound system including a large arrangement of speakers. However, it has become increasingly common for users to receive the audio portion of a media program through the headphones of a wireless headset. The wireless headset receives the audio portion of the media program wirelessly from a television receiver, a game console, a DVD player, stereo system, etc. The wireless headset reproduces the audio portion for the user via the earphones of the wireless headset.
Wireless headsets are typically powered by a battery or batteries. A comparatively large amount of power is consumed by the wireless headset when the wireless transceiver, which receives the audio portion of the media program, is active. There are many instances in which the wireless transceiver of the wireless headset continues to function when the user is no longer listening. This wastes battery power and forces the user to replace or recharge the batteries more frequently than desired.
One embodiment is a television receiver that transmits the audio portion of a media program to a wireless headset worn by a user. The television receiver is configured to receive user input indicating that after a particular media program or at a particular time, the television receiver should transmit a command to the wireless headset that causes the wireless headset to turn off. Thus, when a particular media program has ended, or when a particular time has arrived, the television receiver transmits a command to the wireless headset causing the wireless headset to turn off. In this way, if a user intends to watch a media program and plans to fall asleep during the program or plans to go to bed after the program, the wireless headset will not needlessly consume batteries long after the user has stopped using the wireless headset.
One embodiment is a wireless headset that is configured to receive the audio portion of the media program from a television receiver or another media device. The wireless headset includes a sensor that monitors a physical trait of the user. If the physical trait of the user indicates that the user has fallen asleep, then the wireless headset turns off. In one embodiment, the sensor includes an inertial sensor that detects the movements of the user's head. If the movements of the user's head indicate that the user is asleep, then the wireless headset turns off. Alternatively, the inertial sensor can detect the orientation of the user's head, for example whether the user's head is upright or tilted to one side. If the orientation of the user's head indicates that the user is asleep, then the wireless headset turns off.
In one embodiment the monitoring and sending of commands is done by the television receiver or other media device that is configured to transmit an audio portion of the media program to the wireless headset.
In one embodiment, the sensor includes a camera that monitors the user's eyes to see if they are closed for a prolonged period of time. In one embodiment the camera monitors the orientation of the user's head to detect if the orientation of the user's head indicates that the user has fallen asleep.
The television receiver 22 receives media content from a television programming distributor such as a cable television distributor, a satellite television distributor, an Internet television distributor, or a terrestrial broadcast television distributor. The media content includes media programs such as television programs, movies, pay-per-view movies, radio programs, or other types of media content.
The television receiver 22 typically displays the video portion of a media program on a display coupled to the television receiver 22. The television receiver 22 outputs the audio portion of the media program to the wireless headset 24 worn by a user. In particular, the television receiver 22 wirelessly transmits a signal including the audio portion of the media program to the wireless headset 24. The transceiver 26 of the wireless headset 24 receives the signal from the television receiver 22 and outputs the audio portion of the media program to headphones of the wireless headset 24.
The wireless headset 24 will typically be powered by batteries. If the batteries become depleted, the wireless headset 24 will become inoperable until the batteries are replaced or recharged. The transceiver 26 of the wireless headset 24 consumes a relatively large amount of energy when it is receiving the audio portion of a media program. To avoid the inconvenience of having to frequently replace or recharge the batteries of the wireless headset 24, the system 20 provides the power-conservation features described below.
In one embodiment, the television receiver 22 includes an electronic programming guide which can be accessed by the user to view which media programs are available on particular channels at particular times. By operating a remote control, or by utilizing inputs coupled directly to the television receiver 22, the user can access the electronic programming guide and can select a media program to view. When the user selects a media program to view, the user can also enter input directing the television receiver to send a command to the wireless headset 24 to turn off at the end of the selected media program. At the end of the media program, the television receiver 22 will transmit a wireless command signal to the wireless headset 24 directing the wireless headset 24 to enter a reduced power state or to turn off entirely.
This can be of particular use when the user anticipates that he will stop using the wireless headset 24 at the end of the selected media program, or if the user anticipates that he may fall asleep during the media program. In many cases a user plans to stop using the wireless headset at the end of the selected media program, but forgets to turn off the wireless headset 24 or the television receiver 22. This is particularly common in an instance in which the user turns off a television coupled to the television receiver 22, but fails to turn off the television receiver 22. The television receiver 22 may continue to broadcast the audio portion of a subsequent media program to the wireless headset 24. If the user has also forgotten to turn off the wireless headset 24, the transceiver 26 of the wireless headset 24 will continue to operate and receive the audio portion of the subsequent media program. The continued operation of the transceiver 26 will deplete the batteries of the wireless headset 24 even though the user is no longer using the wireless headset 24. When the user returns at a future time to use the wireless headset 24, he may find that the batteries are entirely depleted. It is both inconvenient and expensive to repeatedly recharge the batteries or purchase new batteries.
However, the functionality of the system 20 allows the user to avoid this situation by enabling the user to choose to turn off the wireless headset 24 at the end of a selected media program. Thus, if the user has scheduled the television receiver 22 to turn off the wireless headset 24 at the end of a selected media program, the television receiver 22 will transmit a command to the wireless headset 24 instructing the wireless headset 24 to turn off the transceiver 26 or to shut down altogether. If the user then forgets to turn off the wireless headset 24 or the television receiver 22, the wireless headset 24 will nevertheless cease operating the transceiver 26. In this way, the battery life of the wireless headset 24 is not needlessly wasted.
Alternatively, the user of the wireless headset 24 can instruct the television receiver 22 to turn off the wireless headset 24 at a particular time of day. For instance, the user may plan to relax and channel surf later in the evening without any particular media program in mind, while expecting to be in bed by midnight. The user can thus instruct the television receiver 22 to turn off the wireless headset 24 at midnight. Then, whether the user has gone to bed or has fallen asleep while watching a media program, at midnight the television receiver 22 will transmit a command to the wireless headset 24 causing the wireless headset 24 to turn off the transceiver 26 or to shut down entirely. The user can also store a long-term schedule that turns off the wireless headset 24 at selected times each day. In this way, the batteries of the wireless headset 24 are preserved when the user is no longer viewing the media program.
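By way of illustration only, the scheduling behavior described above might be sketched in software as follows. The class and method names (HeadsetPowerScheduler, schedule_at_program_end, and so on) are hypothetical and are not part of the disclosed television receiver 22; the sketch simply assumes that some callable is available for transmitting a power-down command to the wireless headset 24.

```python
import sched
import time
from datetime import datetime, timedelta


class HeadsetPowerScheduler:
    """Illustrative sketch of scheduling a headset power-down command."""

    def __init__(self, send_power_down):
        # send_power_down is a callable that transmits the power-down
        # command to the wireless headset (e.g., over the wireless link).
        self._send_power_down = send_power_down
        self._scheduler = sched.scheduler(time.time, time.sleep)

    def schedule_at_program_end(self, program_end: datetime):
        # Power the headset down when the selected media program ends.
        delay = max(0.0, (program_end - datetime.now()).total_seconds())
        self._scheduler.enter(delay, 1, self._send_power_down)

    def schedule_at_time_of_day(self, hour: int, minute: int = 0):
        # Power the headset down at a fixed time of day (e.g., midnight).
        now = datetime.now()
        target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
        if target <= now:
            target += timedelta(days=1)  # next occurrence of that time
        self._scheduler.enter((target - now).total_seconds(), 1, self._send_power_down)

    def run(self):
        # Blocks until all scheduled power-down commands have been sent.
        self._scheduler.run()


if __name__ == "__main__":
    # Example: turn the headset off at midnight regardless of what is playing.
    scheduler = HeadsetPowerScheduler(lambda: print("power-down command sent"))
    scheduler.schedule_at_time_of_day(0)
    scheduler.run()
```

Either trigger, the end of a selected program or a selected time of day, ultimately invokes the same transmission of the power-down command to the wireless headset 24.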
This functionality can also be used by media devices other than a television receiver 22. For example, the wireless headset may receive the audio portion of a media program from a game console, a computer, a tablet, a stereo system, or other kinds of media devices. The functionality described above with respect to the television receiver 22 can also be implemented in these other media devices.
In one example, a user of the wireless headset 24 may be playing a video game and receiving an audio portion of the video game, as well as audio communication from other players, through the wireless headset 24. The user can schedule the game console or other device to turn off the wireless headset 24 at a particular time or after the user has finished playing a particular game. In this way, the wireless headset 24 does not needlessly deplete the batteries after the user is no longer using the wireless headset 24. Those of skill in the art will recognize, in light of the present disclosure, that the energy-saving functionality can be implemented in many other kinds of devices that communicate with a wireless headset 24. All such other devices fall within the scope of the present disclosure.
In one embodiment, the sensor 28 of the wireless headset 24 detects when the user of the wireless headset 24 has fallen asleep. The sensor 28 monitors a physical state of the user and detects whether the user is awake or asleep based on the monitored physical state of the user. When the sensor 28 detects that the user has fallen asleep, the sensor 28 outputs a signal to control circuitry of the wireless headset 24 causing the wireless headset 24 to turn off the transceiver 26 or to shut down altogether.
In one example, the sensor 28 is an inertial sensor that detects the motion of the user's head. Commonly, when a user is awake, the user's head makes characteristic shifting movements such as nodding, quickly turning to look in another direction and then turning back, and many other kinds of movements. In contrast, when the user is asleep, the head moves very little or makes only certain kinds of movements particular to a state of sleep. Based on these movements, the sensor 28 can detect whether the user is awake or asleep. If the motion of the user's head, as detected by the sensor 28, indicates that the user is asleep, the sensor 28 can output a signal causing the wireless headset 24 to turn off the transceiver 26 or to shut down altogether.
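A minimal sketch of such motion-based detection, assuming the sensor 28 supplies raw accelerometer samples as (x, y, z) values, might look like the following. The window length, the variance threshold, and the class name are illustrative placeholders rather than values taken from the disclosure.

```python
from collections import deque
from statistics import pvariance


class HeadMotionSleepDetector:
    """Illustrative sketch: classify awake/asleep from head-motion activity."""

    def __init__(self, window_size=600, motion_variance_threshold=0.02):
        # window_size: number of recent accelerometer samples to consider
        # motion_variance_threshold: below this variance the head is "still"
        self._magnitudes = deque(maxlen=window_size)
        self._threshold = motion_variance_threshold

    def add_sample(self, x: float, y: float, z: float):
        # Track the acceleration magnitude; gravity contributes a constant
        # offset, so the variance reflects movement rather than orientation.
        self._magnitudes.append((x * x + y * y + z * z) ** 0.5)

    def user_appears_asleep(self) -> bool:
        # A nearly constant magnitude over the whole window suggests the
        # characteristic stillness of sleep rather than normal waking motion.
        if len(self._magnitudes) < self._magnitudes.maxlen:
            return False  # not enough data collected yet
        return pvariance(self._magnitudes) < self._threshold
```

When user_appears_asleep() returns True, the control circuitry of the wireless headset 24 could turn off the transceiver 26 or shut the headset down as described above.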
Alternatively, the sensor 28 can include a microphone that senses the breathing of the user. The breathing pattern of the user can provide an indication of whether the user is asleep. When a user falls asleep, the user's breathing pattern changes in a known manner; in particular, the frequency of breathing decreases. The sensor 28 can detect the user's breathing pattern via the microphone and can determine whether the user has fallen asleep based on the breathing pattern. If the sensor 28 determines that the user has fallen asleep based on the user's breathing pattern, the sensor 28 can cause the wireless headset 24 to turn off the transceiver 26 or to shut down altogether.
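For illustration, a breathing-rate estimate could be derived from the microphone signal roughly as sketched below, assuming the audio is available as a one-dimensional floating-point NumPy array. The frame length, smoothing, and threshold are placeholder assumptions, not values specified by the disclosure.

```python
import numpy as np


def estimate_breaths_per_minute(audio: np.ndarray, sample_rate: int) -> float:
    """Illustrative sketch: estimate a breathing rate from a microphone signal.

    The signal is reduced to a coarse energy envelope and breaths are counted
    as local peaks of that envelope; a real device would use more robust
    processing.
    """
    frame = int(0.25 * sample_rate)          # 250 ms analysis frames
    n_frames = len(audio) // frame
    if n_frames < 8:
        return 0.0                           # not enough signal to estimate
    envelope = np.array([
        np.sqrt(np.mean(audio[i * frame:(i + 1) * frame] ** 2))
        for i in range(n_frames)
    ])
    # Smooth the envelope so each breath produces a single broad peak.
    kernel = np.ones(8) / 8.0
    smooth = np.convolve(envelope, kernel, mode="same")
    level = np.median(smooth)
    # Count local maxima that rise above the median level as breaths.
    peaks = sum(
        1 for i in range(1, len(smooth) - 1)
        if smooth[i] > smooth[i - 1] and smooth[i] >= smooth[i + 1] and smooth[i] > level
    )
    minutes = (n_frames * 0.25) / 60.0
    return peaks / minutes


def breathing_indicates_sleep(breaths_per_minute: float, threshold: float = 12.0) -> bool:
    # Breathing typically slows during sleep; the threshold here is only a
    # placeholder and is not a value taken from the disclosure.
    return 0.0 < breaths_per_minute < threshold
```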
Sensor 28 can also include a pulse rate monitor that is capable of measuring the heart rate of the user. The heart rate of the user can provide an indication of whether the user has fallen asleep. In particular, when the user falls asleep, the heart rate of the user typically decreases to a level that is significantly lower than the heart rate of the user when the user is awake. If the pulse rate monitor detects that the pulse rate has decreased to a level indicative of the user being asleep, the pulse rate monitor can cause the wireless headset 24 to turn off the transceiver 26 or to shut down altogether.
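A hedged sketch of the pulse-rate test follows. The baseline resting rate and the fractional drop used as a threshold are illustrative assumptions only; in practice the awake resting rate could be learned over time from the pulse rate monitor.

```python
def pulse_indicates_sleep(current_bpm: float, awake_resting_bpm: float,
                          fraction: float = 0.85) -> bool:
    """Illustrative sketch: flag sleep when the measured heart rate falls
    well below the user's typical awake resting rate.

    The 15% drop implied by the default fraction is only a placeholder.
    """
    return current_bpm > 0 and current_bpm < fraction * awake_resting_bpm


# Example: an awake resting rate of 70 bpm and a measured 58 bpm would be
# treated as an indication that the user has fallen asleep.
assert pulse_indicates_sleep(58.0, 70.0)
```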
In one embodiment, when the sensor 28 causes the wireless headset 24 to turn off the transceiver 26 or to shut down altogether, the wireless headset 24 first transmits a signal to the television receiver 22 indicating that the user is asleep. The television receiver 22 can then enter a low power state while the user is asleep. The low power state can include ceasing transmission of the audio portion to the wireless headset 24 and ceasing transmission of the video portion to the display. The low power state can further include turning off the television receiver 22 altogether. The television receiver 22 can also forward the shutdown command to the display or to other media devices coupled to the television receiver 22. In this way, when the wireless headset 24 detects that the user has fallen asleep, the wireless headset 24 can also cause other media devices to enter a reduced power state or to shut down altogether, thereby reducing the power consumed by the media devices while the user is asleep.
When the television receiver 22 or other media device receives a signal from the wireless headset 24 indicating that the user is asleep, the television receiver 22 or the media device can take steps to ensure that the user does not miss any portion of the media program that the user is watching. For example, if the user is watching a television program broadcast at a particular time, upon being notified that the user has fallen asleep the television receiver 22 can either pause the program or automatically record it to a DVR. The recording can cover the remaining portion of the program, or the entire program, which can be recovered easily because the last few hours of viewed program content are stored in a buffer. In this way, when the user wakes up she can immediately unpause the television program and watch the remaining portion, or go back to a prior portion that was missed as the user was falling asleep. Alternatively, the user can enter the DVR menu and select the remaining portion of the program from among the titles recorded in the DVR. In a similar manner, if the user is watching a movie on DVD or Blu-ray, the DVD or Blu-ray player can immediately stop playback upon being notified by the wireless headset 24 that the user has fallen asleep. Those of skill in the art will recognize that many other actions can be taken by the television receiver 22 or other media devices for the user's convenience upon being notified that the user has fallen asleep.
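The receiver-side reaction to a sleep notification might be organized as in the following sketch, where program, dvr, and the recorder method names (pause, record_from, record_from_buffer_start) are hypothetical stand-ins for whatever objects the television receiver 22 actually uses.

```python
from enum import Enum, auto


class SleepAction(Enum):
    PAUSE_LIVE = auto()
    RECORD_REMAINDER = auto()
    RECORD_ENTIRE_PROGRAM = auto()


def handle_sleep_notification(program, dvr, action=SleepAction.RECORD_REMAINDER):
    """Illustrative sketch of how a receiver might react when the headset
    reports that the user has fallen asleep."""
    if action is SleepAction.PAUSE_LIVE:
        dvr.pause(program)                          # freeze live playback in the buffer
    elif action is SleepAction.RECORD_REMAINDER:
        dvr.record_from(program, position="now")    # keep the rest of the program
    elif action is SleepAction.RECORD_ENTIRE_PROGRAM:
        # The last few hours of viewed content sit in the buffer, so the whole
        # program can be recovered even after it has partially aired.
        dvr.record_from_buffer_start(program)
```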
The sensor 28 can also cause the transceiver 26 of the wireless headset 24 to turn back on when the user wakes up. For example, if the sensor 28 has caused the wireless transceiver 26 to turn off because the sensor 28 has detected that the user has fallen asleep, the sensor 28 can still be in a functioning state and continue to monitor the physical state of the user. If the physical state of the user indicates that the user has woken up, the sensor 28 can cause the transceiver 26 to turn back on and to continue to receive the audio portion of the media program. The wireless headset 24 can also transmit signals to the television receiver 22 or other media devices indicating that the user has woken up. The television receiver 22 or other media devices that have entered a low power mode and/or paused or recorded a media program can immediately resume playing the media program upon notification that the user has woken up. Alternatively, the television receiver 22 or other media device can notify the user that the media program was paused or recorded upon detecting that the user fell asleep. The television receiver 22 or other media device can prompt the user for input regarding whether the user would like to immediately begin playing the paused or recorded program.
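As a rough sketch, under the assumption that the sensor 28, the transceiver 26, and a notification channel to the television receiver 22 are each exposed as simple objects with the hypothetical methods shown, the sleep/wake handling might look like this:

```python
import time


def monitor_user_state(sensor, transceiver, notify_receiver, poll_interval_s=1.0):
    """Illustrative sketch: the sensor keeps running after the audio link is
    shut off so that waking up can restore normal operation.

    sensor, transceiver, and notify_receiver stand in for the headset's actual
    components; is_user_asleep(), power_off(), and power_on() are hypothetical
    method names used only for this sketch.
    """
    asleep = False
    while True:
        if not asleep and sensor.is_user_asleep():
            asleep = True
            notify_receiver("asleep")   # receiver may pause/record or power down
            transceiver.power_off()     # stop the power-hungry audio link
        elif asleep and not sensor.is_user_asleep():
            asleep = False
            transceiver.power_on()      # resume receiving the audio portion
            notify_receiver("awake")    # receiver may resume playback or prompt
        time.sleep(poll_interval_s)
```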
In one embodiment, the television receiver 22 includes a sensor 29 that can monitor a physical state of the user. If the sensor 29 of the television receiver 22 detects that the user has fallen asleep, the television receiver 22 can transmit a signal via the transceiver 27 to the wireless headset 24 indicating that the user has fallen asleep. In response to receiving the signal from the television receiver 22, the wireless headset 24 can enter a low power mode by turning off the transceiver 26 or by shutting down altogether.
In one embodiment, the sensor 29 of the television receiver 22 includes a camera that can monitor the eyes of the user. The sensor 29 can detect whether the user's eyes are closed. If the sensor 29 detects that the user's eyes are closed for an extended period of time, then the television receiver 22 determines that the user is asleep. The television receiver 22 then transmits a signal to the wireless headset 24 causing the wireless headset 24 to enter a low power or shutdown mode as described previously. Further details regarding the features of a television receiver 22 or other media device that monitors a user's eyes can be found in U.S. patent application Ser. No. 13/910,804, hereby incorporated by reference in its entirety. Other systems known in the art, such as the Xbox One and Kinect, can also be used.
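A minimal sketch of the eyes-closed timing test follows. The sixty-second threshold is a placeholder, and how the camera classifies the eyes as open or closed is outside the scope of the sketch.

```python
import time


class EyeClosureMonitor:
    """Illustrative sketch: decide the user is asleep when the eyes have been
    closed continuously for longer than a threshold period."""

    def __init__(self, closed_duration_threshold_s: float = 60.0):
        self._threshold = closed_duration_threshold_s
        self._closed_since = None

    def update(self, eyes_closed: bool, now: float = None) -> bool:
        # eyes_closed would come from the camera-based detection of sensor 29.
        now = time.monotonic() if now is None else now
        if not eyes_closed:
            self._closed_since = None    # eyes opened; reset the timer
            return False
        if self._closed_since is None:
            self._closed_since = now     # eyes just closed; start timing
        return (now - self._closed_since) >= self._threshold
```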
The television receiver 22 can also monitor and dynamically learn the user's habits and routines and use that information to determine when to automatically power down the wireless headset 24. For example, the television receiver 22 may detect that the user commonly watches the evening news and then turns off the television receiver 22 and the wireless headset 24 after the news has ended. On a particular day, the television receiver 22 may detect that the user has not powered down the television receiver 22 and the wireless headset 24 after the conclusion of the evening news. The television receiver 22 can assume that the user might have fallen asleep and that this is the reason for the break from the user's normal routine. The television receiver 22 outputs a prompt on a display indicating that the system 20 will be powered down unless the user provides feedback, such as an audible voice command detected by the headset 24 or the television receiver 22, a button press on the wireless headset 24 or on a remote control, etc. If the user does not respond, then the wireless headset 24 and the television receiver 22 are powered down.
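The routine-learning behavior might be sketched as follows, assuming the television receiver 22 logs the times at which the user normally powers the system down. The amount of history required, the grace period, the prompt timeout, and the names used are illustrative assumptions only.

```python
from statistics import median


class RoutineLearner:
    """Illustrative sketch of learning a nightly power-off routine and
    prompting before powering the system down."""

    def __init__(self, grace_minutes: float = 30.0):
        self._observed_off_times = []   # minutes past midnight of observed power-offs
        self._grace = grace_minutes

    def record_power_off(self, minutes_past_midnight: float):
        self._observed_off_times.append(minutes_past_midnight)

    def past_usual_off_time(self, now_minutes: float) -> bool:
        if len(self._observed_off_times) < 5:
            return False                 # not enough history to infer a routine
        usual = median(self._observed_off_times)
        return now_minutes > usual + self._grace


def maybe_power_down(learner, now_minutes, show_prompt, power_down):
    # show_prompt is assumed to display the on-screen prompt and return True
    # only if the user responds (voice command, headset button, remote, etc.).
    if learner.past_usual_off_time(now_minutes) and not show_prompt(timeout_s=60):
        power_down()
```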
While many of the features of the system 20 have been described in relation to a television receiver 22 and the wireless headset 24, principles of the present disclosure extend to 3-D glasses or other types of headwear or devices that can be worn by a user in conjunction with viewing media programs. Thus, the 3-D glasses can include the sensor 28 that detects whether the user has fallen asleep and can cause the 3-D glasses to enter into a low power state. Likewise, the television receiver 22 or other media device can transmit a signal to the 3-D glasses causing the 3-D glasses to enter a low power or shutdown state at the end of a selected media program, at a selected time, or upon detecting that the user has fallen asleep.
While the wireless headset 24 has been shown to include a single transceiver 26, those of skill in the art will understand that the wireless headset 24 can include multiple wireless receivers and transmitters. Upon detecting that a user is asleep, the wireless headset 24 may shut down one or more of the wireless receivers and transmitters while leaving other wireless receivers and transmitters functioning. For example, in one embodiment the transceiver 26 includes a Bluetooth transceiver that receives the audio portion of the media program. The Bluetooth transceiver can be shut down when the user falls asleep while other transceivers remain active. Many configurations of the transceiver 26 are apparent in light of the present disclosure. All such configurations fall within the scope of the present disclosure.
In one embodiment, the television receiver 22 receives media content from a satellite television service provider, cable television provider, the Internet, terrestrial broadcast signals, etc. The television receiver 22 displays media programs on the television 34. The user 30 can operate a remote control 32 to control the television receiver 22. The user 30 can select media programs to be displayed on the television 34 by the television receiver 22. The audio portion of the media programs is transmitted from the transceiver 27 to the wireless headset 24. The user 30 hears the audio portion of the media program via the headphones of the wireless headset 24.
By using the remote control 32, the user 30 can access menu screens of the television receiver 22. In the menu screens, the user can select a particular media program after which the television receiver 22 should transmit a command to the wireless headset 24 to enter a low power mode or to shut down altogether. For example, the user may wish to watch Sports Center at 10 PM on ESPN. Prior to or during viewing of Sports Center, the user can access the programming guide and can select Sports Center as a final media program to be viewed that night. In this way the user can tell the television receiver 22 to transmit the signal to the wireless headset 24 causing the wireless headset to enter the low power or shutdown mode at the conclusion of the program. In another example, at the end of Sports Center, the television receiver 22 can cease transmitting the audio portion to the wireless headset 24. The wireless headset 24 can preserve power by not actively receiving the audio portion of the broadcast.
Alternatively, from the menu screens of the television receiver 22, the user can select a particular time at which to transmit the signal to the wireless headset 24 causing the wireless headset to enter a reduced power mode or to shut down altogether.
In one example, the user sits down to watch various television programs on the television 34. The user expects to be done watching television by 1 AM. In particular, the user expects either to have fallen asleep while watching television or to have gone to bed by 1 AM. The user therefore accesses the menu screens of the television receiver 22 and designates 1 AM as a time after which the wireless headset 24 should enter a low power mode and/or the audio portion of the media programs should no longer be transmitted to the wireless headset 24 from the television receiver 22. Therefore, at 1 AM the television receiver 22 transmits a signal to the wireless headset 24 causing the wireless headset 24 to enter the low power or shutdown state. The television receiver 22 can also turn off or cease transmitting the audio portion of the media program to the wireless headset 24.
In one embodiment, the sensor 28 of the wireless headset 24 monitors a physical state of the user such as head motion, head orientation, pulse, breathing rate etc. to detect when the user has fallen asleep. If the sensor 28 detects that the user 30 has fallen asleep, then the sensor 28 can cause the wireless headset 24 to enter a low power mode by shutting down the transceiver 26 or a particular portion of the transceiver 26. The sensor 28 can also cause the entire wireless headset 24 to shut down.
In one embodiment, the television receiver 22 includes a sensor 29, to monitor a physical state of the user 30 such as whether the user's eyes are open. If the sensor 29 detects that the user has fallen asleep, then the television receiver 22 can transmit a signal to the wireless headset 24 causing the wireless headset 24 to enter a low power or shutdown mode.
The media device 36 can be a game console, a DVD player, a stereo system, or another electronic media device that plays media programs that include an audio portion. The media device 36 transmits the audio portion of the media program to the wireless headset 24. The television receiver 22 can be configured to cause the media device 36 to shut down at a particular time or after a particular program selected by the user 30 has ended. The television receiver 22 can also cause the media device 36 to stop transmitting an audio portion of the media program to the wireless headset 24 at the particular time or after the particular media program has ended. Alternatively, the media device 36 can include functionality allowing the user to select a particular time to cease transmission of the audio portion to the wireless headset 24 or to send a signal to the wireless headset 24 causing the wireless headset 24 to enter the low power or shutdown mode as described previously. Those of skill in the art will recognize that many configurations of the media device 36 and the television receiver 22 are possible in light of the present disclosure. All such other configurations of the media device 36 and the television receiver 22 fall within the scope of the present disclosure.
The memory 44 can include one or more of an EEPROM, ROM, SRAM, DRAM, flash RAM, or other types of memory devices. The controller 40 executes instructions stored in the memory 44 to perform the functions of the wireless headset 24.
The wireless transceiver 26 includes one or more wireless transmitters and receivers by which the wireless headset 24 communicates with other devices. The controller 40 controls the wireless transceiver 26. The wireless transceiver 26 receives the audio portion of the media program from a television receiver 22 or other media device 36 as described previously. In one embodiment, the wireless transceiver 26 includes IR and RF transmitters and receivers including a Bluetooth transceiver that receives the audio portion of the media program from the television receiver 22 or other media device 36. The wireless transceiver 26 can also transmit signals to the television receiver 22 and other electronic media devices 36 indicating that the user 30 has fallen asleep. In this way the wireless transceiver 26 can cause the television receiver 22 or other electronic media devices 36 to pause or record the media program, to enter a low power mode, etc., as described previously.
The earphones 46 include speakers that output the audio portion of the media program as an audible sound to the user 30. In particular, the earphones 46 fit on or inside the ears of the user 30 and output sound to the user 30 received via the wireless transceiver 26.
The user input keys 48 are the inputs by which a user 30 can control the wireless headset 24. User input keys 48 can include on, off, and standby keys, volume control keys, wireless transceiver control keys or any other keys suitable for allowing the user 30 to interact with and control the wireless headset 24.
The user inputs 48 can also be provided via the remote control for the television receiver 22. The remote control sends signals to the television receiver 22, which stores the programmed power-down settings for the wireless headset 24 and then outputs signals to control the wireless headset 24 accordingly.
The sensor 28 monitors the physical state of the user. The sensor 28 can detect whether the user 30 has fallen asleep based on the physical state monitored by the sensor 28. The sensor 28 can include one or more accelerometers, gyroscopes, microphones, pulse rate monitors, breathing monitors, cameras, or any other suitable device for detecting whether the user 30 has fallen asleep. The controller 40 controls the sensor 28 and receives signals from the sensor 28 indicating the physical state of the user 30. In one embodiment, the controller 40 detects whether or not the user has fallen asleep based on comparing the signals received from the sensor 28 to data stored in the memory 44. If the controller 40 determines that the user has fallen asleep, the controller 40 can cause the wireless transceiver 26 to output a signal to the television receiver 22, the television 34, or any other electronic media devices 36. The controller 40 can shut down the wireless transceiver 26 or a portion of the wireless transceiver 26 based on instructions stored in the memory 44. The controller 40 can also cause the entire wireless headset 24 to shut down. In this way, the sensor 28 and the controller 40 can preserve the life of the battery 42 by shutting down one or more portions of the wireless headset 24 when the sensor 28 indicates that the user 30 has fallen asleep. The controller 40 can also cause the wireless transceiver 26 or other components of the wireless headset 24 to wake up and resume full functionality when the sensor 28 indicates that the user has woken up.
Those of skill in the art will understand that the wireless headset 24 can include many more or fewer components than those disclosed in the block diagram described above.
The media input 58 receives media program data or signals from a satellite television provider, cable television provider, terrestrial broadcast signals, other electronic media devices coupled to the television receiver 22, or any other suitable source of media programs. The media input 58 is controlled by the controller 50.
The media output 60 outputs media programs to a display 34 or other electronic media devices coupled to the television receiver 22 either by a wired connection or a wireless connection. For example, when the television receiver 22 receives a media program from a content provider via the media input 58, the controller 50 processes the input media program and outputs the video portion of the media program to the display 34 via the media output 60.
The digital video recorder (DVR) 56 records media programs selected by the user and stores them in memory. In one embodiment, when the television receiver 22 receives a signal from the wireless headset 24 indicating that the user has fallen asleep, the controller 50 causes the DVR 56 to record the remaining portion of the media program currently being viewed.
The memory 54 stores data and software instructions for execution by the controller 50. In particular, the controller 50 controls the various components of the television receiver 22 in accordance with instructions stored in the memory 54 and input received from the user 30.
The wireless transceiver 27 includes one or more wireless receivers and transmitters. The wireless transceiver 27 can include one or more infrared receivers and transmitters, one or more RF receivers and transmitters, a Bluetooth transceiver, etc. In one embodiment, the wireless transceiver 27 transmits to the headset 24 a signal causing the wireless headset 24 to enter a low power or shutdown mode as described previously. The wireless transceiver 27 can also transmit signals to the television 34 or the other electronic media devices 36 causing them to enter a low power or shutdown mode as described previously. The wireless transceiver 27 also receives signals from the remote control 32 by which the user controls the television receiver 22.
The user input 62 can include one or more keys, buttons or other input controls on the face of the television receiver 22. The user input 62 can include keys for allowing the user 30 to manually turn off the television receiver 22, to change the channel of the television receiver 22, or to perform other common input commands for controlling a television receiver 22.
The sensor 29 monitors a physical state of the user 30 while the user is wearing the wireless headset 24. As described previously, if the sensor 29 detects that the user 30 has fallen asleep while viewing a media program, the television receiver 22 outputs a signal to the wireless headset 24 causing the wireless headset 24 to enter a low power or shutdown mode. In one embodiment, the sensor 29 includes one or more cameras that track the movements of the user's eyes or head to determine whether the user is asleep. The cameras can also detect whether the user's eyes are open or closed. The television receiver 22 can determine whether the user is asleep based on the sensor 29 as described previously.
In one embodiment, the sensor 28 includes one or more accelerometers and/or gyroscopes that detect the orientation of the user's head, for example whether the user's head is upright or has tilted to one side. If the orientation of the user's head indicates that the user is asleep, for example because the head has remained tilted to one side for an extended period, the sensor 28 causes the wireless headset 24 to power down as described previously.
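For illustration, the orientation test could be reduced to a tilt-angle computation on the accelerometer's gravity vector, as sketched below; the tilt threshold is a placeholder value, and the axis convention is an assumption of the sketch.

```python
import math


def head_tilt_degrees(ax: float, ay: float, az: float) -> float:
    """Illustrative sketch: estimate how far the head is tilted from upright.

    The accelerometer reading (ax, ay, az) is assumed to be dominated by
    gravity when the head is still, with the z axis pointing up when the
    head is upright.
    """
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if magnitude == 0:
        return 0.0
    # Angle between the measured gravity direction and the upright z axis.
    cos_angle = max(-1.0, min(1.0, az / magnitude))
    return math.degrees(math.acos(cos_angle))


def orientation_suggests_sleep(ax, ay, az, tilt_threshold_degrees: float = 35.0) -> bool:
    # A head that stays tilted well away from upright (for example, slumped
    # to one side) is taken as an indication of sleep; the threshold is only
    # a placeholder.
    return head_tilt_degrees(ax, ay, az) > tilt_threshold_degrees
```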
In one embodiment, the sensor 28 monitors the movements of the user's head. While a user is awake, the user's head will typically make small movements from time to time, such as briefly looking away from the television 34, nodding, or jostling due to laughter. When the sensor 28 detects such characteristic head movements, the sensor 28 determines that the user is still awake. However, when the user has fallen asleep, the user's head will typically not move at all for relatively long periods of time. If the sensor 28 determines that the user's head has not moved significantly for a duration longer than a threshold period of time, the sensor 28 determines that the user has fallen asleep. The sensor 28 then causes the transceiver 26 to power down as described previously.
The sensor 28 can be utilized in many ways to determine if the user has fallen asleep. The sensor 28 can determine whether the user has fallen asleep based on a combination of head orientation and head movements or other factors as will be apparent to those of skill in the art in light of the present disclosure. All such ways of determining whether the user has fallen asleep fall within the scope of the present disclosure.
In one embodiment, the sensor 29 includes one or more cameras that monitor the user's eyes. The camera can monitor the user's eyes to determine whether the user is awake or asleep. If the sensor 29 detects that the user's eyes are closed for a period of time longer than a threshold period of time, the sensor 29 determines that the user has fallen asleep and the television receiver 22 transmits the power down command to the wireless headset 24 as described previously.
Alternatively, the sensor 29 can monitor the orientation and/or movements of the user's head. As described previously, the orientation and movements of the user's head provide an indication of whether the user is awake or asleep. If the sensor 29 determines that the user has fallen asleep based on the movements and/or orientation of the user's head, the television receiver 22 transmits the power down command to the wireless headset 24 as described previously.
In one embodiment, the sensor 29 is a video camera that detects when the user is wearing the wireless headset 24. If the video camera 29 indicates that the user is not wearing the wireless headset 24, then the television receiver 22 can transmit a command to power down the wireless headset 24. In a similar manner, if the video camera indicates that the user has put on the wireless headset 24, then the television receiver 22 can transmit a command to turn on the wireless headset 24. Alternatively, the sensor 29 can be a camera that periodically takes a picture. The television receiver 22 then analyzes the picture to determine whether or not the wireless headset is being worn by the user and powers down or powers on the wireless headset 24 accordingly.
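A minimal sketch of this presence-based switching follows; how the captured image is analyzed to decide whether the headset is being worn is assumed to be handled elsewhere, and the callable names are hypothetical.

```python
def update_headset_power(headset_worn_now: bool, headset_powered: bool,
                         power_on, power_off) -> bool:
    """Illustrative sketch: power the headset on or off as the camera of
    sensor 29 observes it being put on or taken off.

    headset_worn_now would come from analyzing a periodically captured image;
    power_on and power_off are placeholders for transmitting the corresponding
    commands to the wireless headset.
    """
    if headset_worn_now and not headset_powered:
        power_on()        # user has put the headset on
        return True
    if not headset_worn_now and headset_powered:
        power_off()       # user is no longer wearing the headset
        return False
    return headset_powered
```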
At 72, the television receiver 22 outputs to the wireless headset 24 the audio portion of a media program that the user is viewing on the display coupled to the television receiver 22. At 73, the selected program ends or the selected stop time arrives and the television receiver 22 transmits a power down signal to the wireless headset 24. When the wireless headset 24 receives the power down signal, the wireless headset 24 turns off the wireless transceiver 26 or shuts down altogether.
The various embodiments described above can be combined to provide further embodiments. All of the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet are incorporated herein by reference, in their entirety. Aspects of the embodiments can be modified, if necessary to employ concepts of the various patents, applications and publications to provide yet further embodiments.
These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.
Other Publications

U.S. Appl. No. 13/910,804, filed Jun. 5, 2013 (44 pgs.).

Allseen Alliance, "AllJoyn™ Onboarding Framework," URL=http://allseenalliance.org/framework/documentation/learn/base-services/onboarding, download date Mar. 13, 2017, 8 pages.

Amulet Devices, "Voice Activated TV using the Amulet Remote for Media Center," URL=http://www.amuletdevices.com/index.php/Features/television.html, download date Mar. 13, 2017, 1 page.

Bravedo.com, "Do you want to know how to find water leaks?," URL=http://bravedo.com, download date Mar. 13, 2017, 10 pages.

Cheng et al., "A Wireless Sensor System for Prognostics and Health Management," IEEE Sensors Journal 10(4):856-862, 2010.

Crestron, "Control App for Samsung Smart TV®," Aug. 2014, URL=www.crestron.com/downloads/pdf/product_misc/qs_crestron-app-sstv.pdf, download date Mar. 13, 2017, 3 pages.

Flood, "Acoustic/Ultrasound Ultrasonic Flowmeter Basics," Oct. 1, 1997, URL=http://www.sensorsmag.com/sensors/acoustic-ultrasound/ultrasonic-flowmeter-basics-842, download date Mar. 13, 2017, 5 pages.

Fong et al., "Indoor Air Quality Control for Asthma Patients Using Smart Home Technology," IEEE 15th International Symposium on Consumer Electronics, Singapore, Singapore, Jun. 14-17, 2011, pp. 18-19.

LaMonica, "CES 2010 preview: Green comes in many colors," Dec. 22, 2009, URL=https://www.cnet.com/news/ces-2010-preview-green-comes-in-many-colors/, download date Mar. 13, 2017, 2 pages.

Mitchell, "Addison Fire Department Access Control Installation," 2012 International Fire Code Section 1008.1.9.8, URL=https://addisontexas.net/ckeditorfiles/files/forms/Fire%20Department/Fire_Prevention_files/2015-Addison-Access-Control-Installation-Requirements.pdf, download date Mar. 13, 2017, 2 pages.

Omega, "Ultrasonic Flowmeters," URL=http://www.omega.com/prodinfo/ultrasonicflowmeters.html, download date Mar. 13, 2017, 2 pages.

Pulsar Process Measurement, "Flow Pulse®: Non-invasive clamp-on flow monitor for pipes," URL=https://www.pulsar-pm.com/product-types/flow/flow-pulse.aspx, download date Mar. 13, 2017, 5 pages.

RSHydro, "Ultrasonic Flow Meters," URL=http://www.rshydro.co.uk/flow-meters/ultrasonic-flow-meters/, download date Mar. 13, 2017, 6 pages.

Securitron, "International Building Code Excerpts: Updated with Recent Code Changes that Impact Electromagnetic Locks," URL=http://accesshardware.com/wp-content/uploads/2014/08/IBC-IFC-Access-Controlled-Egress-Doors.pdf, download date Mar. 13, 2017, 2 pages.

Wang et al., "Mixed Sound Event Verification on Wireless Sensor Network for Home Automation," IEEE Transactions on Industrial Informatics 10(1):803-812, 2014.