Various consumer electronic devices, such as televisions, set top boxes and media players, are configured to be remotely controlled by handheld remote control devices that transmit modulated infrared (IR) remote control signals. Such IR remote control signals typically have a wavelength of about 940 nm and a carrier frequency between 10 kHz and 100 kHz, and even more specifically between 30 kHz and 60 kHz. As an even more specific example, many IR remote control signals have a carrier frequency of about 36 kHz (not to be confused with the much higher frequency of the IR light itself).
A time-of-flight (TOF) camera, which can also be referred to as a TOF system, may be located in close proximity to (e.g., within the same room as) one or more of the aforementioned consumer electronic devices (e.g., a television, a set top box and/or a media player) that is/are configured to be remotely controlled by a handheld remote control device. For example, a TOF camera may be part of a gaming console that is within the same room as a television, a set top box and/or a DVD player, which can also be referred to as other systems. Such a TOF camera typically operates by illuminating a target with a modulated IR light source and detecting IR light that reflects off the target and is incident on an image pixel detector array of the TOF camera. The IR light source is usually modulated at a relatively high carrier frequency (e.g., about 100 MHz, which is within the radio frequency range) during integration, and is typically switched off between frames or captures and during readout. While the carrier frequency of the modulated IR light source is typically well above the carrier frequency of remote control signals, transitions from times during which the light source does not emit the RF modulated light to times during which the light source emits RF modulated light, and vice versa, can produce lower frequency content that can interfere with the remote control signals. Explained another way, a low frequency (LF) power envelope associated with the modulated IR light produced by the TOF camera may interfere with remote control signals intended to control another system (e.g., a television) within close proximity to the TOF camera. The vast majority of the interference produced by the TOF camera will not correspond to a valid remote control command, and thus will be rejected by an IR receiver of the other system (e.g., the television) that is intended to be controlled by remote control signals. However, the interference produced by the TOF camera may be significant enough to prevent a user from being able to actually remotely control the other system (e.g., the television) that is within close proximity to the TOF camera. This can be frustrating to the user, as they may not be able to adjust the volume, brightness, channel, and/or the like, of the other system (e.g., the television) using the remote control device. In other words, a TOF camera can render a remote control device inoperative. Due to the relatively poor optical bandpass characteristics of IR receivers of televisions, or other systems, such interference problems may occur even where the IR wavelength used by a TOF camera differs from the IR wavelength used by a remote control device, e.g., where the wavelength of the IR light used by the TOF camera is about 860 nm and the wavelength of the IR light used by a remote control device is about 940 nm. Further, it is noted that a TOF camera may also cause similar interference problems with other systems that receive and respond to wireless IR signals, such as, but not limited to, systems that include wireless IR headphones and three-dimensional (3D) television systems that include active shutter 3D glasses.
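To make the interference mechanism concrete, the following is a minimal sketch (our own illustration, with assumed timing values, not values from the source) that models the LF power envelope of a TOF camera's illumination as a square gating waveform — light on during integration bursts, off otherwise — and measures how much spectral energy the abrupt edges place in the 30-60 kHz band used by many remote controls:

```python
import numpy as np

fs = 1_000_000                       # 1 MHz sample rate for the LF envelope model
t = np.arange(0, 0.033, 1 / fs)      # one ~33 ms frame period

# Hypothetical timing: 2 ms integration bursts separated by 3 ms of readout.
burst_len, gap_len = 0.002, 0.003
period = burst_len + gap_len
envelope = ((t % period) < burst_len).astype(float)   # abrupt on/off transitions

spectrum = np.abs(np.fft.rfft(envelope)) / len(envelope)
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
band = (freqs >= 30e3) & (freqs <= 60e3)
print(f"energy in 30-60 kHz remote control band: {np.sum(spectrum[band] ** 2):.3e}")
```

Replacing the square gating with the ramped envelopes described below substantially reduces this in-band energy.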
Certain embodiments disclosed herein are directed to time-of-flight (TOF) systems, and methods for use therewith, that substantially reduce interference that the TOF system may cause to at least one other system that is configured to wirelessly receive and respond to IR light signals. Some such embodiments involve emitting IR light having a low frequency (LF) power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, wherein at least a portion of the IR light being emitted is radio frequency (RF) modulated IR light, and thus includes an RF component. Such embodiments can also involve detecting at least a portion of the emitted RF modulated IR light that has reflected off one or more objects. A TOF system can produce depth images in dependence on results of the detecting, as well as update an application in dependence on the depth images. An LF power envelope, as the term is used herein, is the LF average power delivered over time by a signal.
A TOF system can be configured to obtain a separate depth image corresponding to each of a plurality of frame periods, wherein each frame period is followed by an inter-frame period, each frame period includes at least two integration periods, and each integration period is followed by a readout period. IR light can be emitted during each of the integration periods to enable depth images to be produced. Additionally, to reduce how often there are transitions between times during which IR light is being emitted and times during which it is not, and thereby reduce the frequency content associated with such transitions, the IR light can also be emitted during the readout periods between pairs of the integration periods within each frame period.
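The frame structure just described can be sketched as follows (the durations and number of integration periods are illustrative assumptions, not values from the source); note how keeping the light source on through the readouts between integrations cuts the number of on/off transitions per frame:

```python
from dataclasses import dataclass

@dataclass
class FramePlan:
    integrations: int = 4          # e.g., four captures per depth frame (assumed)
    t_integration_us: int = 500    # all durations are hypothetical
    t_readout_us: int = 800
    t_interframe_us: int = 5000

def emission_segments(plan: FramePlan, light_during_readout: bool):
    """Return (duration_us, light_on) segments for one frame plus inter-frame period."""
    segments = []
    for i in range(plan.integrations):
        segments.append((plan.t_integration_us, True))
        is_last = i == plan.integrations - 1
        # In the improved scheme, readouts *between* integrations stay lit;
        # the final readout, which runs into the inter-frame period, does not.
        segments.append((plan.t_readout_us, light_during_readout and not is_last))
    segments.append((plan.t_interframe_us, False))
    return segments

def count_transitions(segments):
    states = [on for _, on in segments]
    return sum(a != b for a, b in zip([False] + states, states + [False]))

plan = FramePlan()
print(count_transitions(emission_segments(plan, False)))  # 8 on/off edges per frame
print(count_transitions(emission_segments(plan, True)))   # 2 on/off edges per frame
```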
In certain embodiments, in order to decrease a gain level of an automatic gain control (AGC) circuit for use with an IR light receiver of at least one other system configured to wirelessly receive and respond to IR light signals, and thereby make the IR light receiver of the at least one other system less sensitive to interference from the TOF system, IR light can be emitted during the readout periods between pairs of the integration periods within each frame period, as well as during at least a portion of the inter-frame periods between pairs of frames. This can be in addition to the IR light that is emitted during the integration periods.
IR light may be emitted by producing a drive signal including an RF component and having an LF power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, and driving at least one light source with the drive signal including the RF component.
In an embodiment, the LF power envelope can be shaped by ramping up pulse amplitudes of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse amplitudes of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
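A minimal sketch of this amplitude-ramping approach follows, under assumed carrier and ramp parameters (the linear ramp shape is one possible choice, not the only one):

```python
import numpy as np

def ramped_burst(fs, f_carrier, t_on, t_ramp):
    """RF pulse train whose pulse amplitudes ramp up, hold, then ramp down."""
    t = np.arange(0, t_on, 1 / fs)
    pulses = ((t * f_carrier) % 1.0 < 0.5).astype(float)   # 50% duty square drive pulses
    envelope = np.minimum(1.0, np.minimum(t / t_ramp, (t_on - t) / t_ramp))
    return pulses * envelope   # each pulse's amplitude follows the smooth LF envelope

# Assumed values: 100 MHz carrier, 500 us burst, 50 us linear ramps.
burst = ramped_burst(fs=1e9, f_carrier=100e6, t_on=500e-6, t_ramp=50e-6)
```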
In an embodiment, the LF power envelope can be shaped by ramping up pulse duty cycles of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse duty cycles of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
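A corresponding sketch for duty-cycle ramping (again with assumed parameters): every pulse keeps full amplitude, but pulse widths grow from near zero to the nominal duty cycle during ramp-up and shrink back during ramp-down, so the average power — the LF power envelope — changes gradually:

```python
import numpy as np

def duty_ramped_burst(fs, f_carrier, t_on, t_ramp, nominal_duty=0.5):
    t = np.arange(0, t_on, 1 / fs)
    phase = (t * f_carrier) % 1.0                      # position within each RF cycle
    duty = nominal_duty * np.minimum(1.0, np.minimum(t / t_ramp, (t_on - t) / t_ramp))
    return (phase < duty).astype(float)                # full-amplitude, variable-width pulses

burst = duty_ramped_burst(fs=1e9, f_carrier=100e6, t_on=500e-6, t_ramp=50e-6)
```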
In an embodiment, the LF power envelope can be shaped by ramping down temporal gaps between pulses or pulse trains of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping up temporal gaps between pulses or pulse trains of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
In an embodiment, the LF power envelope can be shaped by ramping down how often gaps occur between pulses or pulse trains of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping up how often gaps occur between pulses or pulse trains of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
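The two gap-based embodiments above can be sketched together (all numbers are illustrative assumptions): the drive signal is divided into short pulse trains, and the fraction of each train that is actually emitted follows the desired LF envelope, so gaps are long and/or frequent near the edges of a burst and vanish in the middle:

```python
import numpy as np

def gap_ramped_burst(fs, f_carrier, t_on, t_ramp, t_train=2e-6):
    t = np.arange(0, t_on, 1 / fs)
    pulses = ((t * f_carrier) % 1.0 < 0.5).astype(float)
    # Target emitted fraction of time (the LF envelope): 0 -> 1 -> 0.
    frac = np.minimum(1.0, np.minimum(t / t_ramp, (t_on - t) / t_ramp))
    # Blank out the tail of each short train so the emitted fraction tracks
    # `frac`: long/frequent gaps near the edges, no gaps mid-burst.
    pos_in_train = (t % t_train) / t_train
    return pulses * (pos_in_train < frac)

burst = gap_ramped_burst(fs=1e9, f_carrier=100e6, t_on=500e-6, t_ramp=50e-6)
```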
Any of the aforementioned ramping up preferably occurs over a time period of at least 50 μsec, and any of the aforementioned ramping down preferably also occurs over a time period of at least 50 μsec. Time permitting, the ramping up and ramping down may occur over longer periods of time.
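One way to see why ramps on the order of 50 μsec help — a back-of-the-envelope estimate added here for illustration, not taken from the source: a linear ramp of duration $T_r$ is an ideal step convolved with a rectangular kernel of width $T_r$, so the spectrum of each edge is multiplied by

$$\left|\operatorname{sinc}(f T_r)\right| = \left|\frac{\sin(\pi f T_r)}{\pi f T_r}\right|.$$

With $T_r = 50\ \mu s$ and $f = 36$ kHz, $\pi f T_r \approx 5.65$ and the factor is about $0.1$, i.e., roughly 20 dB of additional suppression at a typical remote control carrier frequency, with the suppression improving as $1/(f T_r)$ for longer ramps.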
More generally, embodiments of the present technology can be used to reduce the adverse effects that TOF systems may have on other systems that are configured to wirelessly receive and respond to IR light signals, while preserving correct TOF operation. Such embodiments preferably do not degrade, or only minimally degrade, the performance of TOF systems. Additionally, such embodiments preferably do not increase, or only minimally increase, power usage by the TOF system.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Certain embodiments of the present technology disclosed herein are directed to TOF systems, and methods for use therewith, that substantially reduce interference that a TOF system may cause to at least one other system (e.g., a television, a set top box, a DVD player, IR headphones and/or active 3D glasses) that is configured to wirelessly receive and respond to IR light signals. Before providing additional details of such embodiments of the present technology, exemplary details of systems with which embodiments of the present technology can be used will first be described.
The computing system 112 may be a computer, a gaming system or console, or the like. According to an example embodiment, the computing system 112 may include hardware components and/or software components such that computing system 112 may be used to execute applications such as gaming applications, non-gaming applications, or the like. In one embodiment, computing system 112 may include a processor such as a standardized processor, a specialized processor, a microprocessor, or the like that may execute instructions stored on a processor readable storage device for performing the processes described herein.
The capture device 120 may include, for example, a camera that may be used to visually monitor one or more users, such as the user 118, such that gestures and/or movements performed by the one or more users may be captured, analyzed, and tracked to perform one or more controls or actions within the application and/or animate an avatar or on-screen character, as will be described in more detail below.
According to one embodiment, the tracking system 100 may be connected to an audiovisual device 116 such as a television, a monitor, a high-definition television (HDTV), or the like that may provide game or application visuals and/or audio to a user such as the user 118. For example, the computing system 112 may include a video adapter such as a graphics card and/or an audio adapter such as a sound card that may provide audiovisual signals associated with the game application, non-game application, or the like. The audiovisual device 116 may receive the audiovisual signals from the computing system 112 and may then output the game or application visuals and/or audio associated with the audiovisual signals to the user 118. According to one embodiment, the audiovisual device 116 may be connected to the computing system 112 via, for example, an S-Video cable, a coaxial cable, an HDMI cable, a DVI cable, a VGA cable, a component video cable, or the like.
As shown in
In the example depicted in
Other movements by the user 118 may also be interpreted as other controls or actions and/or used to animate the player avatar, such as controls to bob, weave, shuffle, block, jab, or throw a variety of different power punches. Furthermore, some movements may be interpreted as controls that may correspond to actions other than controlling the player avatar 140. For example, in one embodiment, the player may use movements to end, pause, or save a game, select a level, view high scores, communicate with a friend, etc. According to another embodiment, the player may use movements to select the game or other application from a main user interface. Thus, in example embodiments, a full range of motion of the user 118 may be available, used, and analyzed in any suitable manner to interact with an application.
In example embodiments, the human target such as the user 118 may have an object. In such embodiments, the user of an electronic game may be holding the object such that the motions of the player and the object may be used to adjust and/or control parameters of the game. For example, the motion of a player holding a racket may be tracked and utilized for controlling an on-screen racket in an electronic sports game. In another example embodiment, the motion of a player holding an object may be tracked and utilized for controlling an on-screen weapon in an electronic combat game. Objects not held by the user can also be tracked, such as objects thrown, pushed or rolled by the user (or a different user) as well as self-propelled objects. In addition to boxing, other games can also be implemented.
According to other example embodiments, the tracking system 100 may further be used to interpret target movements as operating system and/or application controls that are outside the realm of games. For example, virtually any controllable aspect of an operating system and/or application may be controlled by movements of the target such as the user 118.
As shown in
As shown in
According to another example embodiment, TOF analysis may be used to indirectly determine a physical distance from the capture device 120 to a particular location on the targets or objects by analyzing the intensity of the reflected beam of light over time via various techniques including, for example, shuttered light pulse imaging.
In another example embodiment, the capture device 120 may use structured light to capture depth information. In such an analysis, patterned light (i.e., light displayed as a known pattern such as a grid pattern, a stripe pattern, or a different pattern) may be projected onto the scene via, for example, the IR light component 224. Upon striking the surface of one or more targets or objects in the scene, the pattern may become deformed in response. Such a deformation of the pattern may be captured by, for example, the 3-D camera 226 and/or the RGB camera 228 and may then be analyzed to determine a physical distance from the capture device to a particular location on the targets or objects. In some implementations, the IR light component 224 is displaced from the cameras 226 and 228 so that triangulation can be used to determine distance from the cameras 226 and 228. In some implementations, the capture device 120 will include a dedicated IR sensor to sense the IR light.
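Where the displaced IR light component permits triangulation, the depth recovery reduces to the classic pinhole-camera relation below (a hedged sketch; the names and example values are ours, not the source's):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic triangulation: Z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("pattern feature not displaced; depth unresolved")
    return focal_px * baseline_m / disparity_px

# e.g., 580 px focal length, 7.5 cm baseline, 29 px disparity -> ~1.5 m
z = depth_from_disparity(580.0, 0.075, 29.0)
```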
According to another embodiment, the capture device 120 may include two or more physically separated cameras that may view a scene from different angles to obtain visual stereo data that may be resolved to generate depth information. Other types of depth image sensors can also be used to create a depth image.
The capture device 120 may further include a microphone 230. The microphone 230 may include a transducer or sensor that may receive and convert sound into an electrical signal. According to one embodiment, the microphone 230 may be used to reduce feedback between the capture device 120 and the computing system 112 in the target recognition, analysis, and tracking system 100. Additionally, the microphone 230 may be used to receive audio signals (e.g., voice commands) that may also be provided by the user to control applications such as game applications, non-game applications, or the like that may be executed by the computing system 112.
In an example embodiment, the capture device 120 may further include a processor 232 that may be in operative communication with the image camera component 222. The processor 232 may include a standardized processor, a specialized processor, a microprocessor, or the like that may execute instructions including, for example, instructions for receiving a depth image, generating the appropriate data format (e.g., frame) and transmitting the data to computing system 112.
The capture device 120 may further include a memory component 234 that may store the instructions that may be executed by the processor 232, images or frames of images captured by the 3-D camera and/or RGB camera, or any other suitable information, images, or the like. According to an example embodiment, the memory component 234 may include random access memory (RAM), read only memory (ROM), cache, Flash memory, a hard disk, or any other suitable storage component. As shown in
As shown in
Computing system 112 includes gestures library 240, structure data 242, depth image processing and object reporting module 244 and application 246. Depth image processing and object reporting module 244 uses the depth images to track motion of objects, such as the user and other objects. To assist in the tracking of the objects, depth image processing and object reporting module 244 uses gestures library 240 and structure data 242.
Structure data 242 includes structural information about objects that may be tracked. For example, a skeletal model of a human may be stored to help understand movements of the user and recognize body parts. Structural information about inanimate objects may also be stored to help recognize those objects and help understand movement.
Gestures library 240 may include a collection of gesture filters, each comprising information concerning a gesture that may be performed by the skeletal model (as the user moves). The data captured by the cameras 226, 228 and the capture device 120, in the form of the skeletal model and movements associated with it, may be compared to the gesture filters in the gestures library 240 to identify when a user (as represented by the skeletal model) has performed one or more gestures. Those gestures may be associated with various controls of an application. Thus, the computing system 112 may use the gestures library 240 to interpret movements of the skeletal model and to control application 246 based on the movements. As such, the gestures library 240 may be used by the depth image processing and object reporting module 244 and the application 246.
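As one hypothetical illustration of such gesture filter matching (the filter representation and threshold are assumptions for illustration only, not the source's implementation):

```python
import numpy as np

def matches_gesture(joint_track: np.ndarray, filter_track: np.ndarray,
                    threshold: float = 0.15) -> bool:
    """Score an observed joint trajectory (N x 3 positions) against a gesture
    filter's reference trajectory of the same length; lower distance = better."""
    score = float(np.mean(np.linalg.norm(joint_track - filter_track, axis=1)))
    return score < threshold
```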
Application 246 can be a video game, productivity application, etc. In one embodiment, depth image processing and object reporting module 244 will report to application 246 an identification of each object detected and the location of the object for each frame. Application 246 will use that information to update the position or movement of an avatar or other images in the display.
The TOF system 226 is also shown as including a clock signal generator 262, which produces a clock signal that is provided to the driver 260. Additionally, the TOF system 226 is shown as including a microprocessor 264 that can control the clock signal generator 262 and/or the driver 260. The TOF system 226 is also shown as including an image pixel detector array 268, readout circuitry 270 and memory 266. The image pixel detector array 268 might include, e.g., 320×240 image pixel detectors, but is not limited thereto. Each image pixel detector can be, e.g., a complementary metal-oxide-semiconductor (CMOS) sensor or a charged coupled device (CCD) sensor, but is not limited thereto. Depending upon implementation, each image pixel detector can have its own dedicated readout circuit, or readout circuitry can be shared by many image pixel detectors. In accordance with certain embodiments, the components of the TOF system 226 shown within the block 280 are implemented in a single integrated circuit (IC), which can also be referred to as a single TOF chip.
The driver 260 can produce a radio frequency (RF) modulated drive signal in dependence on a clock signal received from clock signal generator 262. Accordingly, the driver 260 can include, for example, one or more buffers, amplifiers and/or modulators, but is not limited thereto. The clock signal generator 262 can include, for example, one or more reference clocks and/or voltage controlled oscillators, but is not limited thereto. The microprocessor 264, which can be part of a microcontroller unit, can be used to control the clock signal generator 262 and/or the driver 260. For example, the microprocessor 264 can access waveform information stored in the memory 266 in order to produce an RF modulated drive signal in accordance with various embodiments described herein. The TOF system 226 can include its own memory 266 and microprocessor 264, as shown in
In response to being driven by an RF modulated drive signal, the light source 250 emits RF modulated light, which can also be referred to as an RF modulated light signal. For example, a carrier frequency of the RF modulated drive signal and the RF modulated light can be in a range from about 5 MHz to many hundreds of MHz, but for illustrative purposes will be assumed to be about 100 MHz. The light emitted by the light source 250 is transmitted through an optional lens or light shaping diffuser 252 towards a target object (e.g., a user 118). Assuming that there is a target object within the field of view of the TOF camera, a portion of the RF modulated emitted light reflects off the target object, passes through an aperture field stop and lens (collectively 272), and is incident on the image pixel detector array 268 where an image is formed. In some implementations, each individual image pixel detector of the array 268 produces an integration value indicative of a magnitude and a phase of detected RF modulated light originating from the light source that has reflected off the object and is incident on the image pixel detector. Such integration values, or more generally TOF information, enable distances (Z) to be determined, and collectively, enable depth images to be produced. In certain embodiments, optical energy from the light source 250 and detected optical energy signals are synchronized to each other such that a phase difference, and thus a distance Z, can be measured from each image pixel detector. The readout circuitry 270 converts analog integration values generated by the image pixel detector array 268 into digital readout signals, which are provided to the microprocessor 264 and/or the memory 266, and which can be used to produce depth images.
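The phase-to-distance relationship implied above is the standard continuous-wave TOF formula; a short worked sketch (not code from the source) follows:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_rad: float, f_mod_hz: float = 100e6) -> float:
    """Z = c * dphi / (4 * pi * f_mod); the light travels out and back, hence 4*pi."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

# At 100 MHz the unambiguous range is c / (2 * f_mod) ~= 1.5 m, so a measured
# phase shift of pi radians corresponds to a distance of about 0.75 m.
print(round(distance_from_phase(math.pi), 3))  # 0.749
```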
A graphics processing unit (GPU) 308 and a video encoder/video codec (coder/decoder) 314 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the graphics processing unit 308 to the video encoder/video codec 314 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 340 for transmission to a television or other display. A memory controller 310 is connected to the GPU 308 to facilitate processor access to various types of memory 312, such as, but not limited to, a RAM (Random Access Memory).
The multimedia console 300 includes an I/O controller 320, a system management controller 322, an audio processing unit 323, a network interface 324, a first USB host controller 326, a second USB controller 328 and a front panel I/O subassembly 330 that are preferably implemented on a module 318. The USB controllers 326 and 328 serve as hosts for peripheral controllers 342(1)-342(2), a wireless adapter 348, and an external memory device 346 (e.g., flash memory, external CD/DVD ROM drive, removable media, etc.). The network interface 324 and/or wireless adapter 348 provide access to a network (e.g., the Internet, home network, etc.) and may be any of a wide variety of wired or wireless adapter components including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like.
System memory 343 is provided to store application data that is loaded during the boot process. A media drive 344 is provided and may comprise a DVD/CD drive, Blu-Ray drive, hard disk drive, or other removable media drive, etc. The media drive 344 may be internal or external to the multimedia console 300. Application data may be accessed via the media drive 344 for execution, playback, etc. by the multimedia console 300. The media drive 344 is connected to the I/O controller 320 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394).
The system management controller 322 provides a variety of service functions related to assuring availability of the multimedia console 300. The audio processing unit 323 and an audio codec 332 form a corresponding audio processing pipeline with high fidelity and stereo processing. Audio data is carried between the audio processing unit 323 and the audio codec 332 via a communication link. The audio processing pipeline outputs data to the A/V port 340 for reproduction by an external audio player or device having audio capabilities.
The front panel I/O subassembly 330 supports the functionality of the power button 350 and the eject button 352, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the multimedia console 300. A system power supply module 336 provides power to the components of the multimedia console 300. A fan 338 cools the circuitry within the multimedia console 300.
The CPU 301, GPU 308, memory controller 310, and various other components within the multimedia console 300 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can include a Peripheral Component Interconnects (PCI) bus, PCI-Express bus, etc.
When the multimedia console 300 is powered ON, application data may be loaded from the system memory 343 into memory 312 and/or caches 302, 304 and executed on the CPU 301. The application may present a graphical user interface that provides a consistent user experience when navigating to different media types available on the multimedia console 300. In operation, applications and/or other media contained within the media drive 344 may be launched or played from the media drive 344 to provide additional functionalities to the multimedia console 300.
The multimedia console 300 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the multimedia console 300 allows one or more users to interact with the system, watch movies, or listen to music. However, with the integration of broadband connectivity made available through the network interface 324 or the wireless adapter 348, the multimedia console 300 may further be operated as a participant in a larger network community.
When the multimedia console 300 is powered ON, a set amount of hardware resources are reserved for system use by the multimedia console operating system. These resources may include a reservation of memory (e.g., 16 MB), CPU and GPU cycles (e.g., 5%), networking bandwidth (e.g., 8 Kbps), etc. Because these resources are reserved at system boot time, the reserved resources do not exist from the application's view.
In particular, the memory reservation preferably is large enough to contain the launch kernel, concurrent system applications and drivers. The CPU reservation is preferably constant such that if the reserved CPU usage is not used by the system applications, an idle thread will consume any unused cycles.
With regard to the GPU reservation, lightweight messages generated by the system applications (e.g., popups) are displayed by using a GPU interrupt to schedule code to render a popup into an overlay. The amount of memory required for an overlay depends on the overlay area size, and the overlay preferably scales with screen resolution. Where a full user interface is used by the concurrent system application, it is preferable to use a resolution independent of the application resolution. A scaler may be used to set this resolution such that the need to change frequency and cause a TV resynch is eliminated.
After the multimedia console 300 boots and system resources are reserved, concurrent system applications execute to provide system functionalities. The system functionalities are encapsulated in a set of system applications that execute within the reserved system resources described above. The operating system kernel identifies threads that are system application threads versus gaming application threads. The system applications are preferably scheduled to run on the CPU 301 at predetermined times and intervals in order to provide a consistent system resource view to the application. The scheduling is to minimize cache disruption for the gaming application running on the console.
When a concurrent system application requires audio, audio processing is scheduled asynchronously to the gaming application due to time sensitivity. A multimedia console application manager (described below) controls the gaming application audio level (e.g., mute, attenuate) when system applications are active.
Input devices (e.g., controllers 342(1) and 342(2)) are shared by gaming applications and system applications. The input devices are not reserved resources, but are to be switched between system applications and the gaming application such that each will have a focus of the device. The application manager preferably controls the switching of the input stream without the gaming application's knowledge, and a driver maintains state information regarding focus switches. The cameras 226, 228 and capture device 120 may define additional input devices for the console 300 via USB controller 326 or other interface.
Computing system 420 comprises a computer 441, which typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 441 and includes both volatile and nonvolatile media, removable and non-removable media. The system memory 422 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 423 and random access memory (RAM) 460. A basic input/output system 424 (BIOS), containing the basic routines that help to transfer information between elements within computer 441, such as during start-up, is typically stored in ROM 423. RAM 460 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 459. By way of example, and not limitation,
The computer 441 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media discussed above and illustrated in
The computer 441 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 446. The remote computer 446 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 441, although only a memory storage device 447 has been illustrated in
When used in a LAN networking environment, the computer 441 is connected to the LAN 445 through a network interface 437. When used in a WAN networking environment, the computer 441 typically includes a modem 450 or other means for establishing communications over the WAN 449, such as the Internet. The modem 450, which may be internal or external, may be connected to the system bus 421 via the user input interface 436, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 441, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
As explained above, the capture device 120 provides RGB images (also known as color images) and depth images to the computing system 112. The depth image may be a plurality of observed pixels where each observed pixel has an observed depth value. For example, the depth image may include a two-dimensional (2-D) pixel area of the captured scene where each pixel in the 2-D pixel area may have a depth value such as a length or distance in, for example, centimeters, millimeters, or the like of an object in the captured scene from the capture device.
In one embodiment, the depth image may be colorized or grayscale such that different colors or shades of the pixels of the depth image correspond to and/or visually depict different distances of the targets from the capture device 120. Upon receiving the image, one or more high-variance and/or noisy depth values may be removed and/or smoothed from the depth image; portions of missing and/or removed depth information may be filled in and/or reconstructed; and/or any other suitable processing may be performed on the received depth image.
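A minimal sketch of such depth image clean-up (the specific filter choices here are our assumptions, not the source's method):

```python
import numpy as np
from scipy.ndimage import median_filter

def clean_depth(depth_mm: np.ndarray) -> np.ndarray:
    """Median-filter speckle noise, then fill missing (zero) depth values."""
    out = median_filter(depth_mm, size=3).astype(float)  # suppress high-variance values
    invalid = out == 0                                   # assume 0 marks missing depth
    if invalid.any():
        out[invalid] = np.median(out[~invalid])          # crude hole filling
    return out
```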
Techniques for Reducing IR Remote Control Interference Caused by TOF Systems
As mentioned above, a TOF system (e.g., 226) may be located in close proximity to (e.g., within the same room as) a consumer electronic device (e.g., a television, a set top box and/or a media player) that is configured to be remotely controlled by a handheld remote control device. For example, referring back to
Most IR remote control signals have a carrier frequency between 10 kHz and 100 kHz, and even more specifically between 30 kHz and 60 kHz. Certain remote control devices, for example, transmit IR remote control signals having a carrier frequency of about 36 kHz (not to be confused with the actual frequency of the IR light itself). There are also some systems that utilize IR remote control signals having a carrier frequency of about 455 kHz. Still other systems utilize IR remote control signals having a carrier frequency of about 1 MHz. A consumer electronic device (e.g., a television, set top box or media player) that is controllable by remote control signals includes a remote control receiver that is configured to receive and decode remote control signals within an expected frequency range, examples of which were discussed above.
Before describing various embodiments of the present technology,
As shown in
Still referring to
Certain embodiments of the present technology, which are described below, smooth out the edges of the LF power envelopes of the drive and IR light signals. This has the effect of substantially reducing frequency content within the frequency ranges known to be used by remote controlled devices and other systems configured to wirelessly receive and respond to IR light signals.
A first embodiment for smoothing out the edges of the LF power envelopes, which is illustrated in
There are various different ways to implement the embodiment described with reference to
A second embodiment for smoothing out the edges of the LF power envelopes, which is illustrated in
There are various different ways to implement the embodiment described with reference to
Another embodiment for smoothing out the edges of the LF power envelopes, which is illustrated in
There are various different ways to implement the embodiment described with reference to
As mentioned above, the abrupt transitions from times during which a light source emits IR light to times during which no light source emits IR light, and vice versa, produce frequency content that can interfere with IR remote control signals and/or other IR signals used by other systems. As also mentioned above, explained another way, the LF power envelopes associated with the IR light produced by the TOF camera may interfere with one or more other systems (configured to wirelessly receive and respond to IR light signals) that is/are within close proximity to the TOF camera. An embodiment, which will now be described with reference to
Typically, the RF modulated drive and RF modulated light signals are only produced during the integration periods of frame periods, but are not produced during the readout periods of frame periods, as was shown in
The rising and falling edges of the LF power envelope 1202 in
The embodiments described above can also be combined in other manners. For example, the embodiment described with reference to
In accordance with certain embodiments, where IR light signals are also emitted during readout periods, the readout circuitry (e.g., 270 in
Many systems (e.g., televisions or set top boxes) that are configured to wirelessly receive and respond to IR signals include a receiver that has an automatic gain control (AGC) circuit that adjusts the sensitivity of the receiver in dependence on ambient light conditions. More specifically, such AGC circuits usually decrease the gain of a receiver amplifier under high ambient light conditions, which makes the receiver less sensitive; and the AGC circuits usually increase the gain of the receiver amplifier under low ambient light conditions, which makes the receiver more sensitive. The more sensitive the receiver, the less need for a direct line of sight between a sub-system that transmits IR signals (e.g., a remote control device that transmits IR remote control signals) and the receiver, which, for example, can be built into a television or set top box. Conversely, the less sensitive the receiver, the greater the need for a direct line of sight between the sub-system (e.g., a remote control device that transmits IR remote control signals) and the receiver. The reason for reducing the amplifier gain during high ambient light conditions is that the reduction in gain makes the receiver less sensitive to interference resulting from ambient light. Experiments have shown that reducing the gain of a receiver amplifier also has the effect of making the receiver less sensitive to interference resulting from the RF component of IR light produced by a TOF system. An embodiment of the present technology, which shall now be described with reference to
The AGC circuit of a receiver may, for example, have about 50 dB of adjustable gain. Such an AGC circuit automatically varies the gain of a receiver amplifier between its minimum gain (in a bright environment) and its maximum gain (in a dark environment). In accordance with an embodiment, in order to decrease the sensitivity of an AGC circuit of another system (e.g., a system that is configured to wirelessly receive and respond to IR remote control signals), and thereby make the other system less susceptible to interference from the TOF system, the driver (e.g., 260) of a TOF system drives a light source (e.g., 250) of the TOF system to cause IR light to also be emitted during the readout periods between pairs of the integration periods within each frame period and during at least a portion of the inter-frame periods between pairs of frames. In other words, the TOF system purposely increases the percentage of each frame period during which IR light is emitted, as shown in
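A rough sketch, with assumed durations, of how much this scheme raises the fraction of each frame period during which IR light is present — the quantity that should drive a nearby receiver's AGC toward lower gain:

```python
def emission_fraction(n_int: int, t_int: float, t_read: float, t_inter: float,
                      light_in_readout: bool, light_in_interframe_frac: float) -> float:
    frame = n_int * (t_int + t_read) + t_inter
    lit = n_int * t_int
    if light_in_readout:
        lit += (n_int - 1) * t_read          # readouts between integrations
    lit += light_in_interframe_frac * t_inter
    return lit / frame

base = emission_fraction(4, 500e-6, 800e-6, 5e-3, False, 0.0)
boosted = emission_fraction(4, 500e-6, 800e-6, 5e-3, True, 0.5)
print(f"lit fraction: {base:.0%} -> {boosted:.0%}")   # ~20% -> ~68% with these values
```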
In
The high level flow diagram of
Referring to
Still referring to
As was discussed above with reference to
As was discussed above with reference to
As was discussed above with reference to
As was discussed above with reference to
Additional details of various methods of the present technology can be appreciated from the above discussion of
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. It is intended that the scope of the technology be defined by the claims appended hereto.
This application claims priority to U.S. Provisional Patent Application No. 61/822,873, filed May 13, 2013, which is incorporated herein by reference.
Number | Date | Country
---|---|---
61822873 | May 2013 | US