A number of different techniques are known for generating three-dimensional (3D) images of a spatial scene in real time. For example, 3D images of a spatial scene may be generated using triangulation based on multiple two-dimensional (2D) images. However, a significant drawback of such a technique is that it generally requires very intensive computations, and can therefore consume an excessive amount of the available computational resources of a computer or other processing device. Also, it can be difficult to generate an accurate 3D image under conditions involving insufficient ambient lighting when using such a technique.
Other known techniques include directly generating a 3D image using a depth imager such as a time of flight (ToF) camera. ToF cameras are usually compact, provide rapid image generation, and operate in the near-infrared part of the electromagnetic spectrum. As a result, ToF cameras are commonly used in machine vision applications such as gesture recognition in video gaming systems or other types of image processing systems implementing gesture-based human-machine interfaces. ToF cameras are also utilized in a wide variety of other machine vision applications, including, for example, face detection and single- or multiple-person tracking.
A typical conventional ToF camera includes an optical source comprising, for example, one or more light-emitting diodes (LEDs) or laser diodes. Each such LED or laser diode is controlled to produce continuous wave (CW) output light having substantially constant frequency and amplitude. The output light illuminates a scene to be imaged and is scattered or reflected by objects in the scene. The resulting return light is detected and utilized to create a depth map or other type of 3D image. This more particularly involves, for example, utilizing phase differences between the output light and the return light to determine distances to the objects in the scene. Also, the amplitude of the return light is used to determine intensity levels for the image.
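The phase-based distance determination described above can be sketched numerically. This is a minimal illustration of the underlying relationship, not circuitry from any embodiment; the function name and the example values are hypothetical.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_shift_rad, mod_freq_hz):
    """Distance implied by the phase difference between output and return light.

    The return light lags the output light by phase_shift_rad at modulation
    frequency mod_freq_hz; the round trip to the object covers 2*d, hence the
    factor of 4*pi in the denominator.
    """
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# Example: a pi/2 phase lag at 20 MHz modulation corresponds to about 1.87 m.
d = distance_from_phase(math.pi / 2, 20e6)
```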
However, the use of CW output light in a ToF camera has a number of significant drawbacks. For example, the frequency of the CW output light unduly restricts the maximum unambiguous range of the camera. More particularly, the maximum unambiguous range is generally given by c/2f, where f is the frequency of the CW output light and c is the speed of light. The maximum unambiguous range can be extended by decreasing the frequency f, but this approach also decreases the measurement precision.
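The range-versus-precision trade-off follows directly from the c/2f relation. A short worked sketch (frequencies chosen only for illustration):

```python
C = 299_792_458.0  # speed of light, m/s

def max_unambiguous_range(mod_freq_hz):
    # Distances beyond c / (2 f) alias back into the interval [0, c / (2 f)).
    return C / (2.0 * mod_freq_hz)

# Halving the modulation frequency doubles the unambiguous range,
# at the cost of measurement precision:
r_20mhz = max_unambiguous_range(20e6)  # about 7.5 m
r_10mhz = max_unambiguous_range(10e6)  # about 15 m
```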
In addition, when utilizing CW output light, image quality degrades as the length of an integration time window is decreased. As a result, a ToF camera often cannot support a frame rate that is sufficiently high to track dynamic objects in the scene. On the other hand, saturation of image pixels is observed as the length of the integration time window is increased. Conventional ToF cameras based on CW light are generally unable to provide suitable optimization of the integration time window.
Embodiments of the invention provide, by way of example, optical source driver circuits for ToF cameras and other types of depth imagers.
In one embodiment, a depth imager comprises a driver circuit and an optical source. The driver circuit comprises a frequency control module and a controllable oscillator having a control input coupled to an output of the frequency control module. An output of the controllable oscillator is coupled to an input of the optical source, and a driver signal provided by the driver circuit to the optical source utilizing the controllable oscillator varies in frequency under control of the frequency control module in accordance with a designated type of frequency variation, such as a ramped or stepped frequency variation.
The driver circuit in a given embodiment may additionally or alternatively comprise an amplitude control module, such that a driver signal provided to the optical source varies in amplitude under control of the amplitude control module in accordance with a designated type of amplitude variation, such as a ramped or stepped amplitude variation.
Other embodiments of the invention include but are not limited to methods, systems, integrated circuits, and computer-readable media storing program code which when executed causes a processing device to perform a sequence of image processing operations.
Embodiments of the invention will be illustrated herein in conjunction with exemplary image processing systems that include depth imagers having optical source driver circuits configured to provide at least one of frequency variation and amplitude variation in a given optical source driver signal. It should be understood, however, that embodiments of the invention are more generally applicable to any image processing system or associated depth imager in which it is desirable to provide improved quality for depth maps or other types of 3D images.
Although shown as being separate from the processing devices 102 in the present embodiment, the depth imager 101 may be at least partially combined with one or more of the processing devices. Thus, for example, the depth imager 101 may be implemented at least in part using a given one of the processing devices 102. By way of example, a computer may be configured to incorporate depth imager 101.
In a given embodiment, the image processing system 100 is implemented as a video gaming system or other type of gesture-based system that generates images in order to recognize user gestures. The disclosed imaging techniques can be similarly adapted for use in a wide variety of other systems requiring a gesture-based human-machine interface, and can also be applied to numerous applications other than gesture recognition, such as machine vision systems involving face detection, person tracking or other techniques that process depth images from a depth imager.
The depth imager 101 as shown in
The driver circuit 112 controls the LEDs 114 so as to generate output light having particular frequency and amplitude variations. Ramped and stepped examples of such variations provided by the driver circuit 112 can be seen in
The driver circuit 112 in a given embodiment may comprise a frequency control module, such that a driver signal provided to at least one of the LEDs 114 varies in frequency under control of the frequency control module in accordance with a designated type of frequency variation, such as a ramped or stepped frequency variation.
The ramped or stepped frequency variation can be configured to provide, for example, an increasing frequency as a function of time, a decreasing frequency as a function of time, or combinations of increasing and decreasing frequency. Also, the increasing or decreasing frequency may follow a linear function or a non-linear function, or combinations of linear and non-linear functions.
In an embodiment with ramped frequency variation, a frequency control module implemented in the driver circuit may be configured to permit user selection of one or more parameters of the ramped frequency variation including one or more of a start frequency, an end frequency and a duration for the ramped frequency variation.
Similarly, in an embodiment with stepped frequency variation, the frequency control module may be configured to permit user selection of one or more parameters of the stepped frequency variation including one or more of a start frequency, an end frequency, a frequency step size, a time step size and a duration for the stepped frequency variation.
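The ramped and stepped frequency variations with the parameters just listed can be sketched as simple profile functions. This is an illustrative software model only, assuming linear ramps and uniform steps; the actual frequency control module is circuitry, and the function and parameter names are hypothetical.

```python
def ramped_frequency(start_hz, end_hz, duration_s, t):
    """Linear frequency ramp: start_hz at t = 0, end_hz at t = duration_s."""
    if not 0.0 <= t <= duration_s:
        raise ValueError("t outside the ramp duration")
    return start_hz + (end_hz - start_hz) * t / duration_s

def stepped_frequency(start_hz, end_hz, freq_step_hz, time_step_s, t):
    """Stepped variation: hold each frequency for time_step_s, then step by
    freq_step_hz toward end_hz (a 'step-down' profile when end < start)."""
    n = int(t // time_step_s)  # number of completed time steps
    direction = 1.0 if end_hz >= start_hz else -1.0
    f = start_hz + direction * n * freq_step_hz
    # Clamp at the end frequency once it is reached
    return max(f, end_hz) if direction < 0 else min(f, end_hz)
```

For a ramp-down from 20 MHz to 10 MHz over 1 ms, `ramped_frequency` returns 15 MHz at the midpoint, while `stepped_frequency` with a 2 MHz frequency step and a 1 ms time step descends in discrete plateaus.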
The driver circuit 112 in a given embodiment may additionally or alternatively comprise an amplitude control module, such that a driver signal provided to at least one of the LEDs 114 varies in amplitude under control of the amplitude control module in accordance with a designated type of amplitude variation, such as a ramped or stepped amplitude variation. Like the ramped or stepped frequency variations noted above, the ramped or stepped amplitude variation can be configured to provide an increasing amplitude as a function of time, a decreasing amplitude as a function of time, or combinations of increasing and decreasing amplitude. Also, the increasing or decreasing amplitude may follow a linear function or a non-linear function, or combinations of linear and non-linear functions. Moreover, the amplitude variations may be synchronized with the frequency variations if the embodiment includes both an amplitude control module and a frequency control module.
In an embodiment with ramped amplitude variation, the amplitude control module may be configured to permit user selection of one or more parameters of the ramped amplitude variation including one or more of a start amplitude, an end amplitude, a bias amplitude and a duration for the ramped amplitude variation.
Similarly, in an embodiment with stepped amplitude variation, the amplitude control module may be configured to permit user selection of one or more parameters of the stepped amplitude variation including one or more of a start amplitude, an end amplitude, a bias amplitude, an amplitude step size, a time step size and a duration for the stepped amplitude variation.
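A ramped amplitude variation with the bias parameter noted above can be modeled in the same way; a stepped version would follow the stepped-frequency pattern. Again this is an illustrative sketch with hypothetical names, not the amplitude control module itself, and the chosen values are assumptions.

```python
def ramped_amplitude(start_a, end_a, bias_a, duration_s, t):
    """Linear amplitude ramp riding on a DC bias. In practice the bias and
    ramp levels should keep the drive above the optical source's threshold
    current, as noted in the description."""
    if not 0.0 <= t <= duration_s:
        raise ValueError("t outside the ramp duration")
    return bias_a + start_a + (end_a - start_a) * t / duration_s
```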
The driver circuit 112 can therefore be configured to generate driver signals having designated types of frequency and amplitude variations, in a manner that provides significantly improved performance in depth imager 101 relative to conventional depth imagers. For example, such an arrangement may be configured to allow particularly efficient optimization of not only driver signal frequency and amplitude, but also other parameters such as an integration time window.
The depth imager 101 in the present embodiment is assumed to be implemented using at least one processing device and comprises a processor 120 coupled to a memory 122. The processor 120 controls the driver circuit 112 and detector arrays 116 using software code stored in memory 122.
The processor 120 may comprise, for example, a microprocessor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a central processing unit (CPU), an arithmetic logic unit (ALU), a digital signal processor (DSP), or other similar processing device component, as well as other types and arrangements of image processing circuitry, in any combination.
The memory 122 stores software code for execution by the processor 120 in implementing portions of the functionality of depth imager 101, such as portions of the frequency and amplitude control modules described previously. A given such memory that stores software code for execution by a corresponding processor is an example of what is more generally referred to herein as a computer-readable medium or other type of computer program product having computer program code embodied therein, and may comprise, for example, electronic memory such as random access memory (RAM) or read-only memory (ROM), magnetic memory, optical memory, or other types of storage devices in any combination. As indicated above, the processor may comprise portions or combinations of a microprocessor, ASIC, FPGA, CPU, ALU, DSP or other image processing circuitry.
Also included in the depth imager 101 in the present embodiment is a parameter optimization module 125 that is illustratively configured to optimize the integration time window of the depth imager as well as the frequency and amplitude variations for a given imaging operation. For example, the parameter optimization module 125 may be configured to determine an appropriate set of parameters including integration time window, frequency variation and amplitude variation for the given imaging operation.
Such an arrangement allows the depth imager to be configured for optimal performance under a wide variety of different operating conditions, such as distance to objects in the scene, number and type of objects in the scene, and so on. Thus, for example, integration time window length of the depth imager 101 in the present embodiment can be determined in conjunction with selection of driver signal frequency and amplitude variations in a manner that optimizes overall performance under particular conditions. The parameter optimization module 125 may be implemented at least in part in the form of software stored in memory 122 and executed by processor 120. It should be noted that terms such as “optimal” and “optimization” as used in this context are intended to be broadly construed, and do not require minimization or maximization of any particular performance measure.
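One possible shape of such a parameter determination is sketched below. The selection criteria and numeric values here are assumptions for illustration only; they are not taken from the parameter optimization module 125, whose actual logic is not specified at this level of detail.

```python
def choose_parameters(scene_is_dynamic, max_range_m):
    """Illustrative heuristic: dynamic scenes get a shorter integration time
    window (supporting a higher frame rate), and the start frequency is set
    from the required unambiguous range via f = c / (2 * range)."""
    C = 299_792_458.0  # speed of light, m/s
    integration_window_s = 1e-3 if scene_is_dynamic else 5e-3
    start_freq_hz = C / (2.0 * max_range_m)
    return {"integration_window_s": integration_window_s,
            "start_freq_hz": start_freq_hz}
```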
The network 104 may comprise a wide area network (WAN) such as the Internet, a local area network (LAN), a cellular network, or any other type of network, as well as combinations of multiple networks. The depth imager 101 and each of the processing devices 102 may incorporate transceivers or other network interface circuitry to allow these devices to communicate with one another over the network 104.
It should also be appreciated that embodiments of the invention may be implemented in the form of integrated circuits. In a given such integrated circuit implementation, identical die are typically formed in a repeated pattern on a surface of a semiconductor wafer. Each die includes at least one driver circuit and possibly other image processing circuitry as described herein, and may further include other structures or circuits. The individual die are cut or diced from the wafer, then packaged as an integrated circuit. One skilled in the art would know how to dice wafers and package die to produce integrated circuits. Integrated circuits so manufactured are considered embodiments of the invention.
The particular configuration of image processing system 100 as shown in
The mixer 208 more particularly has a first input coupled to the output of the voltage controlled oscillator 206, a second input coupled to an output of the amplitude control module 207, and an output providing the driver signal for the optical source 204. In this embodiment, the mixer 208 serves to provide a single driver signal that combines the amplitude variations exhibited by an output signal of the amplitude control module 207 with the frequency variations exhibited by an output signal of the voltage controlled oscillator 206. In generating the driver signal, which is illustratively a current signal in the present embodiment, the mixer 208 performs a voltage to current (V→I) conversion.
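The mixer's combining of the amplitude envelope with the frequency-varying oscillator output can be modeled numerically. This is a software approximation of the analog signal path, with hypothetical names; note that when the frequency varies over time, the instantaneous phase is the integral of frequency, approximated here by a Riemann sum.

```python
import math

def driver_current(t, freq_of_t, amp_of_t, bias_a, transconductance):
    """Model of the mixer output at time t: the amplitude envelope amp_of_t
    multiplies the oscillator output, and a V->I stage (transconductance, in
    A/V) scales the result into a drive current."""
    n = 1000
    dt = t / n
    # Accumulate phase so a time-varying frequency yields a continuous waveform
    phase = sum(2.0 * math.pi * freq_of_t(k * dt) * dt for k in range(n))
    return transconductance * (bias_a + amp_of_t(t) * math.sin(phase))
```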
Although a voltage controlled oscillator 206 is utilized in driver circuit 202 in the present embodiment, other embodiments can utilize other types of oscillators, such as, for example, numerically controlled oscillators.
The driver circuit 202 is configured to generate a driver signal for application to the optical source 204 utilizing the voltage controlled oscillator 206. The frequency and amplitude of the driver signal are controlled by the respective frequency control and amplitude control modules 205 and 207 such that the driver signal exhibits designated types of frequency and amplitude variation.
The designated type of frequency variation in the present embodiment comprises a ramped frequency variation providing a decreasing frequency as a function of time. This is also referred to in the figure as a “ramp-down” frequency variation. The frequency control module 205 is configured to permit user selection of designated parameters of the ramped frequency variation, including in this embodiment a start frequency, an end frequency and a duration for the ramped frequency variation. The start and end frequencies are specified using corresponding input voltages in this embodiment.
It should be noted that the term “user” in this context is intended to be broadly construed, so as to encompass not only human users but also other types of users, including automated software or hardware entities of the image processing system that utilize the depth imager 101 to generate depth images of scenes. Thus, for example, a software program or other type of agent running on or otherwise associated with one of the processing devices 102 may be configured to interact with driver circuit 202 so as to select one or more parameters of at least one of a frequency variation provided by the frequency control module 205 and an amplitude variation provided by the amplitude control module 207.
The designated type of amplitude variation in the present embodiment comprises a ramped amplitude variation providing an increasing amplitude as a function of time. This is also referred to in the figure as a “ramp-up” amplitude variation. The amplitude control module 207 is configured to permit user selection of designated parameters of the ramped amplitude variation, including in this embodiment a start amplitude, an end amplitude, a bias amplitude and a duration for the ramped amplitude variation. The start, end and bias amplitudes are specified using corresponding input voltages in this embodiment. These amplitude parameters should be selected so as to be above a threshold current level of the optical source 204.
The corresponding input-output response of the optical source 204 is shown in
In other embodiments, other combinations of increasing or decreasing frequency and amplitude variations may be used. Also, although the frequency and amplitude variations are in the form of substantially linear ramps in this embodiment, other embodiments can utilize variations that follow non-linear functions, or multiple linear and non-linear functions, in any combination.
The driver circuit 202 synchronizes the frequency and amplitude variations by utilizing a common trigger signal for the frequency control module 205, voltage controlled oscillator 206 and amplitude control module 207. The trigger signal is generated by a falling edge trigger circuit 210 responsive to a signal provided by a gating circuit 212 illustratively implemented as an LED gate. The trigger signal may be a pulse signal having a designated pulse width. Although the trigger signal is falling edge triggered in this embodiment, other types of trigger circuitry and resulting trigger signals may be used.
The gating circuit 212 generates its output signal for application to an input of the trigger circuit 210 responsive to a gate voltage or other optical source control signal which may be provided by the processor 120 of depth imager 101. The trigger signal generated by trigger circuit 210 is subject to a predetermined delay in delay circuit 214 before being applied to respective trigger inputs of the frequency control module 205, the voltage controlled oscillator 206 and the amplitude control module 207. The predetermined delay in the present embodiment is an amount of delay that will allow the voltage controlled oscillator 206 to reach a stable output condition after being powered on.
Referring now to
The driver circuit 302 in this embodiment comprises a frequency control module 305, a voltage controlled oscillator 306 and an amplitude control module 307. The voltage controlled oscillator 306 has a control input coupled to an output of the frequency control module 305 and its output is coupled to an input of the optical source 304 via a mixer 308. The mixer 308 more particularly has a first input coupled to the output of the voltage controlled oscillator 306, a second input coupled to an output of the amplitude control module 307, and an output providing the driver signal for the optical source 304.
Again, although a voltage controlled oscillator 306 is utilized in driver circuit 302 in the present embodiment, other embodiments can utilize other types of oscillators, such as, for example, numerically controlled oscillators.
The driver circuit 302 is configured to generate a driver signal for application to the optical source 304 utilizing the voltage controlled oscillator 306. The frequency and amplitude of the driver signal are controlled by the respective frequency control and amplitude control modules 305 and 307 such that the driver signal exhibits designated types of frequency and amplitude variation.
The designated type of frequency variation in the present embodiment comprises a stepped frequency variation providing a decreasing frequency that follows downward steps as a function of time. This is also referred to in the figure as a “step-down” frequency variation. The frequency control module 305 is configured to permit user selection of designated parameters of the stepped frequency variation, including in this embodiment a start frequency, an end frequency, a frequency step size, a time step size and a duration for the stepped frequency variation.
The designated type of amplitude variation in the present embodiment comprises a stepped amplitude variation providing an increasing amplitude that follows upward steps as a function of time. This is also referred to in the figure as a “step-up” amplitude variation. The amplitude control module 307 is configured to permit user selection of designated parameters of the stepped amplitude variation, including in this embodiment a start amplitude, an end amplitude, a bias amplitude, an amplitude step size, a time step size and a duration for the stepped amplitude variation.
Again, in other embodiments, other combinations of increasing or decreasing frequency and amplitude variations may be used. Also, although the frequency and amplitude variations are in the form of substantially uniform steps in this embodiment, other embodiments can utilize variations that follow non-linear functions, or multiple linear and non-linear functions, in any combination.
As in the
The synchronized frequency and amplitude variations in the driver signals provided by driver circuits 202 and 302 in the embodiments of
In the
As another example, use of ramped or stepped amplitude with constant frequency may be beneficial in cases in which the scene to be imaged comprises a single primary object that is moving either toward or away from the depth imager, or moving from a periphery of the scene to a center of the scene or vice versa. In such arrangements, a decreasing amplitude driver signal is expected to be well suited for cases in which the primary object is moving toward the depth imager or from the periphery to the center, and an increasing amplitude driver signal is expected to be well suited for cases in which the primary object is moving away from the depth imager or from the center to the periphery. Similar considerations may be used in selecting the type of amplitude variation to be applied in embodiments that include both frequency and amplitude variations.
As noted above, a wide variety of different types and combinations of frequency and amplitude variations may be used in other embodiments, including variations following linear, exponential, quadratic or arbitrary functions.
It is to be appreciated that the particular driver circuitry arrangements, driver signals and output light waveforms shown in
Also, numerous other types of control modules may be used to establish different frequency and amplitude variations for a given driver signal waveform. For example, static frequency and amplitude control modules may be used, in which the respective frequency and amplitude variations are not dynamically variable by user selection in conjunction with operation of the depth imager but are instead fixed to particular configurations by design. Thus, for example, a particular type of frequency variation and a particular type of amplitude variation may be predetermined during a design phase and those predetermined variations may be made fixed rather than variable in the depth imager. Static circuitry arrangements of this type providing at least one of frequency variation and amplitude variation for an optical source driver signal are considered examples of “control modules” as that term is broadly utilized herein, and are distinct from conventional arrangements such as ToF cameras that generally utilize CW output light having substantially constant frequency and amplitude.
It should again be emphasized that the embodiments of the invention as described herein are intended to be illustrative only. For example, other embodiments of the invention can be implemented utilizing a wide variety of different types and arrangements of image processing systems, depth imagers, image processing circuitry, driver circuits, control modules, processing devices and processing operations than those utilized in the particular embodiments described herein. In addition, the particular assumptions made herein in the context of describing certain embodiments need not apply in other embodiments. These and numerous other alternative embodiments within the scope of the following claims will be readily apparent to those skilled in the art.