Cue-aware privacy filter for participants in persistent communications

Information

  • Patent Grant
  • Patent Number
    9,704,502
  • Date Filed
    Friday, July 30, 2004
  • Date Issued
    Tuesday, July 11, 2017
Abstract
A cue, for example a facial expression or hand gesture, is identified, and a device communication is filtered according to the cue.
Description
TECHNICAL FIELD

The present disclosure relates to inter-device communication.


BACKGROUND

Modern communication devices are growing increasingly complex. Devices such as cell phones and laptop computers are now often equipped with cameras, microphones, and other sensors. Depending on the context of a communication (e.g., where the device user is located, to whom they are communicating, and the date and time of day, among other possible factors), it may not always be advantageous to communicate the information collected by the device in its entirety and/or unaltered.


SUMMARY

The following summary is intended to highlight and introduce some aspects of the disclosed embodiments, but not to limit the scope of the invention. Thereafter, a detailed description of illustrated embodiments is presented, which will permit one skilled in the relevant art to make and use aspects of the invention. One skilled in the relevant art can obtain a full appreciation of aspects of the invention from the subsequent detailed description, read together with the figures, and from the claims (which follow the detailed description).


A device communication is filtered according to an identified cue. The cue can include at least one of a facial expression, a hand gesture, or some other body movement. The cue can also include at least one of opening or closing a device, deforming a flexible surface of the device, altering an orientation of the device with respect to one or more objects of the environment, or sweeping a sensor of the device across the position of at least one object of the environment. Filtering may also take place according to identified aspects of a remote environment.


When the device communication includes images or video, filtering the device communication can include applying a visual effect, such as blurring, de-saturating, modifying the color of, or snowing one or more images communicated from the device. When the device communication includes audio, filtering the device communication can comprise at least one of altering the tone of, altering the pitch of, altering the volume of, adding echo to, or adding reverb to audio information communicated from the device.


Filtering the device communication may include substituting image information of the device communication with predefined image information, such as substituting a background of a present location with a background of a different location. Filtering can also include substituting audio information of the device communication with predefined audio information, such as substituting at least one of a human voice or functional sound detected by the device with a different human voice or functional sound.


Filtering may also include removing information from the device communication, such as suppressing background sound information of the device communication, suppressing background image information of the device communication, removing a person's voice information from the device communication, removing an object from the background information of the device communication, and removing the image background from the device communication.





BRIEF DESCRIPTION OF THE DRAWINGS

The headings provided herein are for convenience only and do not necessarily affect the scope or meaning of the claimed invention.


In the drawings, the same reference numbers and acronyms identify elements or acts with the same or similar functionality for ease of understanding and convenience. To easily identify the discussion of any particular element or act, the most significant digit or digits in a reference number refer to the figure number in which that element is first introduced.



FIG. 1 is a block diagram of an embodiment of a device communication arrangement.



FIG. 2 is a block diagram of an embodiment of an arrangement to produce filtered device communications.



FIG. 3 is a block diagram of another embodiment of a device communication arrangement.



FIG. 4 is a flow chart of an embodiment of a method of filtering device communications according to a cue.



FIG. 5 is a flow chart of an embodiment of a method of filtering device communications according to a cue and a remote environment.





DETAILED DESCRIPTION

The invention will now be described with respect to various embodiments. The following description provides specific details for a thorough understanding of, and enabling description for, these embodiments of the invention. However, one skilled in the art will understand that the invention may be practiced without these details. In other instances, well known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the invention. References to “one embodiment” or “an embodiment” do not necessarily refer to the same embodiment, although they may.



FIG. 1 is a block diagram of an embodiment of a device communication arrangement. A wireless device 102 comprises logic 118, a video/image sensor 104, an audio sensor 106, and a tactile/motion sensor 105. A video/image sensor (such as 104) comprises a transducer that converts light signals (e.g. a form of electromagnetic radiation) to electrical, optical, or other signals suitable for manipulation by logic. Once converted, these signals may be known as images or a video stream. An audio sensor (such as 106) comprises a transducer that converts sound waves (e.g. audio signals in their original form) to electrical, optical, or other signals suitable for manipulation by logic. Once converted, these signals may be known as an audio stream. A tactile/motion sensor (such as 105) comprises a transducer that converts contact events with the sensor, and/or motion of the sensor, to electrical, optical, or other signals suitable for manipulation by logic. Logic (such as 116, 118, and 120) comprises information represented in device memory that may be applied to affect the operation of a device. Software and firmware are examples of logic. Logic may also be embodied in circuits, and/or combinations of software and circuits.


The wireless device 102 communicates with a network 108, which comprises logic 120. As used herein, a network (such as 108) comprises a collection of devices that facilitate communication between other devices. The devices that communicate via a network may be referred to as network clients. A receiver 110 comprises a video/image display 112, a speaker 114, and logic 116. A speaker (such as 114) comprises a transducer that converts signals from a device (typically optical and/or electrical signals) to sound waves. A video/image display (such as 112) comprises a device to display information in the form of light signals. Examples are monitors, flat panels, liquid crystal devices, light emitting diodes, and televisions. The receiver 110 communicates with the network 108. Using the network 108, the wireless device 102 and the receiver 110 may communicate.


The device 102 or the network 108 identifies a cue, either by using its logic or by receiving a cue identification from the device 102 user. Device 102 communication is filtered, either by the device 102 or the network 108, according to the cue. Cues can comprise conditions that occur in the local environment of the device 102, such as body movements, for example a facial expression or a hand gesture. Many more conditions or occurrences in the local environment can potentially be cues. Examples include opening or closing the device (e.g. opening or closing a phone), deforming a flexible surface of the device 102, altering the device 102 orientation with respect to one or more objects of the environment, or sweeping a sensor of the device 102 across at least one object of the environment. The device 102, the user, or the network 108 may identify a cue in the remote environment. The device 102 and/or the network 108 may filter the device communication according to the cue and the remote environment. The local environment comprises those people, things, sounds, and other phenomena that affect the sensors of the device 102. In the context of this figure, the remote environment comprises those people, things, sounds, and other signals, conditions, or items that affect the sensors of, or are otherwise important in the context of, the receiver 110.
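As a concrete illustration of the association between cues and filters described above, the mapping might be modeled as a simple lookup table. This is a minimal sketch, not from the patent itself; the cue names and filter identifiers are hypothetical.

```python
# Hypothetical sketch: associating identified cues with filter operations.
# All cue names and filter identifiers below are illustrative.

CUE_FILTERS = {
    "hand_gesture:palm_out": "mute_audio",
    "facial_expression:frown": "blur_video",
    "device:closed": "suppress_video",
    "device:flex_surface": "substitute_background",
}

def filter_for_cue(cue):
    """Return the filter operation associated with a cue, or None if
    no filter is associated with that cue."""
    return CUE_FILTERS.get(cue)
```

In this sketch, a cue with no entry in the table simply yields no filtering, mirroring the pass-through behavior the disclosure describes.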


The device 102 or network 108 may monitor an audio stream, which forms at least part of the communication of the device 102, for at least one pattern (the cue). A pattern is a particular configuration of information to which other information, in this case the audio stream, may be compared. When the at least one pattern is detected in the audio stream, the device 102 communication is filtered in a manner associated with the pattern. Detecting a pattern can include detecting a specific sound. Detecting the pattern can include detecting at least one characteristic of an audio stream, for example, detecting whether the audio stream is subject to copyright protection.
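The pattern-monitoring step above can be sketched as a sliding-window comparison over a sampled stream. This is a deliberately minimal illustration: a real system would use correlation, spectral analysis, or audio fingerprinting rather than exact sample matching.

```python
def detect_pattern(stream, pattern):
    """Return the index at which `pattern` first occurs in `stream`,
    or -1 if the pattern is not present.

    A minimal sketch of monitoring a sampled stream for a stored
    pattern (the cue); exact matching stands in for the more robust
    comparison a real implementation would perform.
    """
    n, m = len(stream), len(pattern)
    for i in range(n - m + 1):          # slide a window of length m
        if stream[i:i + m] == pattern:  # compare window to the pattern
            return i
    return -1
```

When `detect_pattern` returns a non-negative index, the communication would be filtered in the manner associated with that pattern.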


The device 102 or network 108 may monitor a video stream, which forms at least part of a communication of the device 102, for at least one pattern (the cue). When the at least one pattern is detected in the video stream, the device 102 communication is filtered in a manner associated with the pattern. Detecting the pattern can include detecting a specific image. Detecting the pattern can include detecting at least one characteristic of the video stream, for example, detecting whether the video stream is subject to copyright protection.



FIG. 2 is a block diagram of an embodiment of an arrangement to produce filtered device communications. Cue definitions 202 comprise hand gestures, head movements, and facial expressions. In the context of this figure, the remote environment information 204 comprises a supervisor, spouse, and associates. The filter rules 206 define operations to apply to the device communications and the conditions under which those operations are to be applied. The filter rules 206, in conjunction with at least one of the cue definitions 202, are applied to the local environment information to produce filtered device communications. Optionally, the remote environment information 204 may be applied to the filter rules 206, to determine at least in part the filter rules 206 applied to the local environment information.


Filtering can include modifying the device communication to incorporate a visual or audio effect. Examples of visual effects include blurring, de-saturating, color modification of, or snowing of one or more images communicated from the device. Examples of audio effects include altering the tone of, altering the pitch of, altering the volume of, adding echo to, or adding reverb to audio information communicated from the device.
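Two of the audio effects named above, volume alteration and echo, can be sketched over a plain list of amplitude samples. This is an illustrative toy, assuming samples are already decoded floats; production code would operate on real audio buffers.

```python
def alter_volume(samples, gain):
    """Scale every sample amplitude by `gain`
    (gain < 1 attenuates, gain > 1 amplifies)."""
    return [s * gain for s in samples]

def add_echo(samples, delay, decay):
    """Mix a delayed, attenuated copy of the original signal back into
    itself: out[i] = samples[i] + samples[i - delay] * decay."""
    out = list(samples)
    for i in range(delay, len(samples)):
        out[i] += samples[i - delay] * decay
    return out
```

Pitch shifting, reverb, and the visual effects (blurring, de-saturating) follow the same shape: a transform applied to the outgoing stream before transmission.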


Filtering can include removing (e.g. suppressing) or substituting (e.g. replacing) information from the device communication. Examples of information that may be suppressed as a result of filtering include background sounds, the background image, a background video, a person's voice, and the image and/or sounds associated with an object within the image or video background. Examples of information that may be replaced as a result of filtering include background sound information, which is replaced with potentially different sound information, and background video information, which is replaced with potentially different video information. Multiple filtering operations may occur; for example, background audio and video may both be suppressed by filtering. Filtering can also result in the application of one or more effects, the removal of part of the communication information, and the substitution of part of the communication information.
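The suppression case can be illustrated with a crude noise gate that zeroes low-amplitude samples, a very rough stand-in for background-sound suppression. Real suppression would separate sources, not merely threshold amplitude; this sketch is only meant to show the shape of a removal filter.

```python
def suppress_background(samples, threshold):
    """Zero out samples whose amplitude falls below `threshold`
    (a simple noise gate, standing in for background suppression)."""
    return [s if abs(s) >= threshold else 0.0 for s in samples]
```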



FIG. 3 is a block diagram of another embodiment of a device communication arrangement. The substitution objects 304 comprise office, bus, and office sounds. The substitution objects 304 are applied to the substitution rules 308 along with the cue definitions 202 and, optionally, the remote environment information 204. Accordingly, the substitution rules 308 produce a substitution determination for the device communication. The substitution determination may result in filtering.


Filtering can include substituting image information of the device communication with predefined image information. An example of image information substitution is substituting the background of a present location with a background of a different location, e.g. substituting an office background for the local environment background when the local environment is a bar.


Filtering can include substituting audio information of the device communication with predefined audio information. An example of audio information substitution is substituting at least one of a human voice or functional sound detected by the device with a different human voice or functional sound, e.g. substituting the bar background noise (the local environment background noise) with tasteful classical music.
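The substitution operation described above can be sketched as swapping out the samples classified as background for samples from a predefined replacement stream. The classification mask here is assumed to come from some upstream source separator; everything in this sketch is hypothetical.

```python
def substitute_background(foreground, background_mask, replacement):
    """Replace samples flagged as background with replacement audio.

    `background_mask[i]` is True where sample i was classified as
    background (e.g. bar noise); those samples are swapped for the
    corresponding samples of `replacement` (e.g. classical music).
    """
    return [r if is_bg else f
            for f, is_bg, r in zip(foreground, background_mask, replacement)]
```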



FIG. 4 is a flow chart of an embodiment of a method of filtering device communications according to a cue. At 402 it is determined that there is a cue. If at 404 it is determined that no filter is associated with the cue, the process concludes. If at 404 it is determined that a filter is associated with the cue, the filter is applied to device communication at 408. At 410 the process concludes.
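The FIG. 4 flow can be sketched as a single function: look up the filter associated with the cue and apply it if one exists, otherwise pass the communication through unchanged. The `filters` mapping and the filter callables are hypothetical, not part of the patent.

```python
def process_communication(cue, filters, communication):
    """Sketch of the FIG. 4 flow. `filters` maps cue identifiers to
    filter callables (illustrative names, not from the patent)."""
    filter_fn = filters.get(cue)
    if filter_fn is None:
        return communication          # 404: no filter associated with the cue
    return filter_fn(communication)   # 408: apply the associated filter
```

The FIG. 5 variant would key the lookup on both the cue and at least one aspect of the remote environment, e.g. `filters.get((cue, remote_aspect))`.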



FIG. 5 is a flow chart of an embodiment of a method of filtering device communications according to a cue and a remote environment. At 502 it is determined that there is a cue. At 504 at least one aspect of the remote environment is determined. If at 506 it is determined that no filter is associated with the cue and with at least one remote environment aspect, the process concludes. If at 506 it is determined that a filter is associated with the cue and with at least one remote environment aspect, the filter is applied to device communication at 508. At 510 the process concludes.


Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” Words using the singular or plural number also include the plural or singular number respectively. Additionally, the words “herein,” “above,” “below” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. When the claims use the word “or” in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list and any combination of the items in the list.

Claims
  • 1. A system comprising: at least one communication device including at least: circuitry configured for engaging at least one synchronous communication between the at least one communication device and at least one receiving device in a remote environment; one or more sensors including one or more of at least one audio sensor configured for sensing at least one of an audio signal stream or at least one video sensor configured for sensing at least one visual signal stream in a local environment for transmission to the at least one receiving device in the remote environment; circuitry configured for obtaining remote environment information including at least one identifier of at least one participant in the at least one synchronous communication in the remote environment; circuitry configured for detecting at least one manipulation of the at least one communication device by at least one user of the at least one communication device, wherein the at least one manipulation includes at least one of opening of the at least one communication device, closing of the at least one communication device, deforming a flexible surface of the at least one communication device, or altering an orientation of the at least one communication device; circuitry configured for determining one or more filter rules based at least partly on the detected at least one manipulation of the at least one communication device by the at least one user of the at least one communication device and the at least one identifier of at least one participant in the at least one synchronous communication in the remote environment; circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules; and circuitry configured for transmitting the filtered at least one of the audio signal stream or the visual signal stream to the at least one receiving device.
  • 2. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for replacing at least some content of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules.
  • 3. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for removing at least one voice of the at least one audio signal stream according to the one or more filter rules.
  • 4. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for removing at least some video content of the at least one visual signal stream according to the one or more filter rules.
  • 5. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for replacing at least some video content of the at least one visual signal stream according to the one or more filter rules.
  • 6. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for substituting at least one voice of the at least one communication with at least one different voice in the at least one audio signal stream according to the one or more filter rules.
  • 7. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for removing at least one background sound of the at least one audio signal stream according to the one or more filter rules.
  • 8. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for replacing at least one background sound of the at least one communication with at least one different background sound according to the one or more filter rules.
  • 9. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for replacing at least one background sound of the at least one communication with at least one audio effect according to the one or more filter rules.
  • 10. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for replacing at least one background noise of the at least one communication with at least some music according to the one or more filter rules.
  • 11. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for altering at least one of tone, pitch, or volume of the at least one communication according to the one or more filter rules.
  • 12. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for filtering at least part of the at least one communication including adding one or more audio effects according to the one or more filter rules.
  • 13. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for suppressing at least part of the at least one communication according to the one or more filter rules.
  • 14. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for filtering at least part of the at least one phone communication according to the one or more filter rules.
  • 15. The system of claim 1, wherein the circuitry configured for filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules comprises: circuitry configured for filtering at least part of the at least one audiovisual communication according to the one or more filter rules.
  • 16. The system of claim 1, wherein the circuitry configured for obtaining remote environment information including at least one identifier of at least one participant in the at least one synchronous communication in the remote environment includes at least one of: circuitry configured for receiving a cue identification from the at least one communication device; circuitry configured for identifying participants in the at least one communication present in the remote environment; circuitry configured for detecting one or more signals in a context of the at least one receiving device; circuitry configured for detecting one or more sounds in the remote environment; circuitry configured for detecting at least one specific sound in the remote environment; circuitry configured for detecting at least one pattern of an audio stream from the remote environment; circuitry configured for detecting at least one specific image in the remote environment; circuitry configured for detecting at least one pattern of a video stream from the remote environment; circuitry configured for detecting one or more conditions in the context of the at least one receiving device; or at least one video sensor configured to detect at least one of hand gestures, head movements, facial expressions, body movements, or sweeping a sensor of the device across at least one object of an environment.
  • 17. The system of claim 1, wherein the at least one communication device includes: at least one of a cell phone, a wireless device, or a computer.
  • 18. The system of claim 1, wherein the circuitry configured for detecting at least one manipulation of the at least one communication device by at least one user of the at least one communication device comprises: at least one of: circuitry configured for detecting at least one manipulation of the at least one communication device by at least one user of the at least one communication device including at least one body movement of the at least one user of the at least one communication device; circuitry configured for detecting at least one manipulation of the at least one communication device by at least one user of the at least one communication device including at least one hand gesture of the at least one user of the at least one communication device; circuitry configured for detecting at least one manipulation of the at least one communication device by at least one user of the at least one communication device including at least one facial expression of the at least one user of the at least one communication device; or circuitry configured for detecting at least one manipulation of the at least one communication device by at least one user of the at least one communication device including at least one head movement of the at least one user of the at least one communication device.
  • 19. The system of claim 1 wherein the at least one receiving device includes at least one of a cell phone, a wireless device, a computer, a video/image display, or a speaker.
  • 20. A method at least partly performed using one or more processing components in at least one communication device, the method comprising: engaging at least one synchronous communication between at least one communication device and at least one receiving device in a remote environment; sensing at least one of an audio signal stream via at least one communication device audio sensor or a visual signal stream via at least one communication device video sensor in a local environment for transmission to the at least one receiving device in the remote environment; obtaining remote environment information including at least one identifier of at least one participant in the at least one synchronous communication in the remote environment; detecting at least one manipulation of the at least one communication device by at least one user of the at least one communication device, wherein the at least one manipulation includes at least one of opening of the at least one communication device, closing of the at least one communication device, deforming a flexible surface of the at least one communication device, or altering an orientation of the at least one communication device; determining one or more filter rules based at least partly on the detected at least one manipulation of the at least one communication device by the at least one user of the at least one communication device and the at least one identifier of at least one participant in the at least one synchronous communication in the remote environment; filtering at least part of the at least one of an audio signal stream or a visual signal stream according to the one or more filter rules; and transmitting the filtered at least one of an audio signal stream or a visual signal stream to the at least one receiving device.
US Referenced Citations (195)
Number Name Date Kind
4531228 Noso et al. Jul 1985 A
4532651 Pennebaker, Jr. et al. Jul 1985 A
4757541 Beadles Jul 1988 A
4802231 Davis Jan 1989 A
4829578 Roberts May 1989 A
4952931 Serageldin et al. Aug 1990 A
5126840 Dufresne et al. Jun 1992 A
5278889 Papanicolaou et al. Jan 1994 A
5288938 Wheaton Feb 1994 A
5297198 Butani et al. Mar 1994 A
5323457 Ehara et al. Jun 1994 A
5386210 Lee Jan 1995 A
5436653 Ellis et al. Jul 1995 A
5511003 Agarwal Apr 1996 A
5548188 Lee Aug 1996 A
5617508 Reaves Apr 1997 A
5666426 Helms Sep 1997 A
5675708 Fitzpatrick et al. Oct 1997 A
5764852 Williams Jun 1998 A
5880731 Liles Mar 1999 A
5918222 Fukui et al. Jun 1999 A
5949891 Wagner et al. Sep 1999 A
5966440 Hair Oct 1999 A
5983369 Bakoglu Nov 1999 A
6037986 Zhang et al. Mar 2000 A
RE36707 Papanicolaou et al. May 2000 E
6169541 Smith Jan 2001 B1
6184937 Williams Feb 2001 B1
6212233 Alexandre et al. Apr 2001 B1
6243683 Peters Jun 2001 B1
6259381 Small Jul 2001 B1
6262734 Ishikawa Jul 2001 B1
6266430 Rhoads Jul 2001 B1
6269483 Broussard Jul 2001 B1
6317716 Braida et al. Nov 2001 B1
6317776 Broussard et al. Nov 2001 B1
6356704 Callway et al. Mar 2002 B1
6377680 Foladare et al. Apr 2002 B1
6377919 Burnett et al. Apr 2002 B1
6396399 Dunlap May 2002 B1
6400996 Hoffberg Jun 2002 B1
6438223 Eskafi et al. Aug 2002 B1
6473137 Godwin et al. Oct 2002 B1
6483532 Girod Nov 2002 B1
6597405 Iggulden Jul 2003 B1
6611281 Strubbe Aug 2003 B2
6617980 Endo et al. Sep 2003 B2
6622115 Brown et al. Sep 2003 B1
6690883 Pelletier Feb 2004 B2
6720949 Pryor et al. Apr 2004 B1
6724862 Shaffer et al. Apr 2004 B1
6727935 Allen Apr 2004 B1
6749505 Kunzle et al. Jun 2004 B1
6751446 Kim et al. Jun 2004 B1
6760017 Banerjee Jul 2004 B1
6771316 Iggulden Aug 2004 B1
6775835 Ahmad et al. Aug 2004 B1
6819919 Tanaka Nov 2004 B1
6825873 Nakamura et al. Nov 2004 B2
6829582 Barsness Dec 2004 B1
6845127 Koh Jan 2005 B2
6882971 Craner Apr 2005 B2
6950796 Ma et al. Sep 2005 B2
6968294 Gutta et al. Nov 2005 B2
7043530 Isaacs et al. May 2006 B2
7110951 Lemelson et al. Sep 2006 B1
7113618 Junkins Sep 2006 B2
7120865 Horvitz et al. Oct 2006 B1
7120880 Dryer Oct 2006 B1
7129927 Mattsson Oct 2006 B2
7149686 Cohen et al. Dec 2006 B1
7162532 Koehler Jan 2007 B2
7203635 Oliver et al. Apr 2007 B2
7203911 Williams Apr 2007 B2
7209757 Naghian et al. Apr 2007 B2
7233684 Fedorovskaya et al. Jun 2007 B2
7319955 Deligne et al. Jan 2008 B2
RE40054 Girod Feb 2008 E
7336804 Steffin Feb 2008 B2
7379568 Movellan et al. May 2008 B2
7409639 Dempski et al. Aug 2008 B2
7418116 Fedorovskaya et al. Aug 2008 B2
7424098 Kovales et al. Sep 2008 B2
7472063 Nefian Dec 2008 B2
7496272 DaSilva Feb 2009 B2
7587069 Movellan et al. Sep 2009 B2
7624076 Movellan et al. Nov 2009 B2
7634533 Rudolph et al. Dec 2009 B2
7647560 Macauley Jan 2010 B2
7660806 Brill et al. Feb 2010 B2
7664637 Deligne et al. Feb 2010 B2
7680302 Steffin Mar 2010 B2
7684982 Taneda Mar 2010 B2
7689413 Hershey et al. Mar 2010 B2
7768543 Christiansen Aug 2010 B2
7860718 Lee et al. Dec 2010 B2
7953112 Hindus et al. May 2011 B2
7995090 Liu et al. Aug 2011 B2
8009966 Bloom et al. Aug 2011 B2
8132110 Appelman Mar 2012 B1
8416806 Hindus et al. Apr 2013 B2
8571853 Peleg et al. Oct 2013 B2
8578439 Mathias et al. Nov 2013 B1
8599266 Trivedi et al. Dec 2013 B2
8676581 Flaks et al. Mar 2014 B2
8769297 Rhoads Jul 2014 B2
8977250 Malamud et al. Mar 2015 B2
9563278 Xiang Feb 2017 B2
20010033666 Benz Oct 2001 A1
20020025026 Gerszberg et al. Feb 2002 A1
20020025048 Gustafsson Feb 2002 A1
20020028674 Slettengren et al. Mar 2002 A1
20020097842 Guedalia et al. Jul 2002 A1
20020113757 Hoisko Aug 2002 A1
20020116196 Tran Aug 2002 A1
20020116197 Erten Aug 2002 A1
20020119802 Hijii Aug 2002 A1
20020138587 Koehler Sep 2002 A1
20020155844 Rankin et al. Oct 2002 A1
20020161882 Chatani Oct 2002 A1
20020164013 Carter et al. Nov 2002 A1
20020176585 Egelmeers et al. Nov 2002 A1
20020180864 Nakamura et al. Dec 2002 A1
20020184505 Mihcak et al. Dec 2002 A1
20020191804 Luo et al. Dec 2002 A1
20030005462 Broadus Jan 2003 A1
20030007648 Currell Jan 2003 A1
20030009248 Wiser et al. Jan 2003 A1
20030035553 Baumgarte Feb 2003 A1
20030048880 Horvath et al. Mar 2003 A1
20030076293 Mattsson Apr 2003 A1
20030088397 Karas et al. May 2003 A1
20030093790 Logan et al. May 2003 A1
20030117987 Brebner Jun 2003 A1
20030187657 Erhart Oct 2003 A1
20030202780 Dumm et al. Oct 2003 A1
20030210800 Yamada et al. Nov 2003 A1
20040006767 Robson Jan 2004 A1
20040008423 Driscoll, Jr. et al. Jan 2004 A1
20040012613 Rast Jan 2004 A1
20040044777 Alkhatib et al. Mar 2004 A1
20040049780 Gee Mar 2004 A1
20040056857 Zhang et al. Mar 2004 A1
20040101212 Fedorovskaya et al. May 2004 A1
20040109023 Tsuchiya Jun 2004 A1
20040125877 Chang et al. Jul 2004 A1
20040127241 Shostak Jul 2004 A1
20040143636 Horvitz et al. Jul 2004 A1
20040148346 Weaver et al. Jul 2004 A1
20040193910 Moles Sep 2004 A1
20040204135 Zhao Oct 2004 A1
20040205775 Heikes et al. Oct 2004 A1
20040215731 Tzann-en Szeto Oct 2004 A1
20040215732 McKee et al. Oct 2004 A1
20040220812 Bellomo Nov 2004 A1
20040230659 Chase Nov 2004 A1
20040236836 Appelman et al. Nov 2004 A1
20040243682 Markki et al. Dec 2004 A1
20040252813 Rhemtulla Dec 2004 A1
20040261099 Durden et al. Dec 2004 A1
20040263914 Yule et al. Dec 2004 A1
20050010637 Dempski Jan 2005 A1
20050018925 Bhagavatula et al. Jan 2005 A1
20050028221 Liu et al. Feb 2005 A1
20050037742 Patton Feb 2005 A1
20050042591 Bloom et al. Feb 2005 A1
20050053356 Mate et al. Mar 2005 A1
20050064826 Bennetts Mar 2005 A1
20050073575 Thacher et al. Apr 2005 A1
20050083248 Biocca et al. Apr 2005 A1
20050125500 Wu Jun 2005 A1
20050131744 Brown Jun 2005 A1
20050262201 Rudolph Nov 2005 A1
20060004911 Becker et al. Jan 2006 A1
20060015560 MacAuley Jan 2006 A1
20060025220 Macauley Feb 2006 A1
20060056639 Ballas Mar 2006 A1
20060187305 Trivedi et al. Aug 2006 A1
20060224382 Taneda Oct 2006 A1
20070038455 Murzina et al. Feb 2007 A1
20070201731 Fedorovskaya et al. Aug 2007 A1
20070203911 Chiu Aug 2007 A1
20070211141 Christiansen Sep 2007 A1
20070280290 Hindus et al. Dec 2007 A1
20070288978 Pizzurro et al. Dec 2007 A1
20080037840 Steinberg et al. Feb 2008 A1
20080059530 Cohen et al. Mar 2008 A1
20080192983 Steffin Aug 2008 A1
20080235165 Movellan et al. Sep 2008 A1
20080247598 Movellan et al. Oct 2008 A1
20090147971 Kuhr et al. Jun 2009 A1
20090167839 Ottmar Jul 2009 A1
20100124363 Ek et al. May 2010 A1
20110228039 Hindus et al. Sep 2011 A1
20120135787 Kusunoki et al. May 2012 A1
Foreign Referenced Citations (1)
Number Date Country
WO 03058485 Jul 2003 WO
Non-Patent Literature Citations (4)
Entry
Rugaard, Peer; Sapaty, Peter; “Mobile Control of Mobile Communications”; pp. 1-2; located at: http://www-zorn.ira.uka.de/wave/abstract2.html; printed on Mar. 4, 2005.
PCT International Search Report; International App. No. PCT/US05/26428; Feb. 2, 2006.
PCT International Search Report; International App. No. PCT/US05/26429; Feb. 1, 2007.
PCT International Search Report; International App. No. PCT/US05/29768; Apr. 18, 2006.
Related Publications (1)
Number Date Country
20060026626 A1 Feb 2006 US