The field of invention pertains to camera systems generally, and, more specifically, to an integrated camera system having two-dimensional image capture and three-dimensional time-of-flight capture with a partitioned field of view.
Many existing computing systems include one or more traditional image capturing cameras as an integrated peripheral device. A current trend is to enhance computing system imaging capability by integrating depth capturing into its imaging components. Depth capturing may be used, for example, to perform various intelligent object recognition functions such as facial recognition (e.g., for secure system un-lock) or hand gesture recognition (e.g., for touchless user interface functions).
One depth information capturing approach, referred to as “time-of-flight” imaging, emits light from a system onto an object and measures, for each of multiple pixels of an image sensor, the time between the emission of the light and the reception of its reflected image upon the sensor. The image produced by the time-of-flight pixels corresponds to a three-dimensional profile of the object as characterized by a unique depth measurement (z) at each of the different (x,y) pixel locations.
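For completeness, the depth reported at each pixel of a direct time-of-flight system follows from the round-trip travel time of the emitted light according to the standard relation

z(x, y) = c · Δt(x, y) / 2

where c is the speed of light, Δt(x,y) is the measured interval between emission and reception at pixel location (x,y), and the factor of two accounts for the out-and-back path of the light.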
As many computing systems with imaging capability are mobile in nature (e.g., laptop computers, tablet computers, smartphones, etc.), the integration of a light source (“illuminator”) into the system to achieve time-of-flight operation presents a number of design challenges such as cost challenges, packaging challenges and/or power consumption challenges.
An apparatus is described that includes an integrated two-dimensional image capture and three-dimensional time-of-flight depth capture system. The three-dimensional time-of-flight depth capture system includes an illuminator to generate light. The illuminator includes arrays of light sources. Each of the arrays is dedicated to a particular different partition within a partitioned field of view of the illuminator.
An apparatus is described that includes means for receiving a command to illuminate a particular partition of a partitioned field of view of an illuminator. The apparatus additionally includes means for enabling an array of light sources that is dedicated to the particular partition. The apparatus additionally includes means for collecting light from the light source array and directing the collected light toward the partition to illuminate the partition. The apparatus additionally includes means for detecting at least a portion of the light after it has been reflected from an object of interest within the partition and comparing respective arrival times of the light against emission times of the light to generate depth information of the object of interest.
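A minimal software-level sketch of this sequence of operations is given below. All object and function names (illuminator, sensor, enable_array, emit_pulse, read_arrival_times) are hypothetical and are used only to illustrate the flow of receiving an illumination command, enabling the dedicated light source array, and converting arrival times into depth; they are not part of the disclosure.

```python
# Hypothetical sketch of the partitioned-illumination time-of-flight flow.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def capture_depth_for_partition(illuminator, sensor, partition_id):
    """Illuminate one partition of the field of view and return the
    per-pixel depth measured within that partition."""
    # Enable only the light source array dedicated to this partition.
    illuminator.enable_array(partition_id)
    emission_time = illuminator.emit_pulse()

    # Collect the reflected light at the time-of-flight pixels that
    # image the illuminated partition.
    arrival_times = sensor.read_arrival_times(partition_id)
    illuminator.disable_array(partition_id)

    # Depth is half the round-trip distance traveled by the light.
    return {
        pixel: SPEED_OF_LIGHT_M_PER_S * (arrival - emission_time) / 2.0
        for pixel, arrival in arrival_times.items()
    }
```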
The following description and accompanying drawings are used to illustrate embodiments of the invention. In the drawings:
A “smart illumination” time-of-flight system addresses some of the design challenges referred to in the Background section. As will be made clearer in the following discussion, a “smart illumination” time-of-flight system can emit light onto only a “region of interest” within the illuminator's field of view. Because the emitted light is concentrated on a smaller region, the intensity of the emitted optical signal is strong enough to generate a detectable signal at the image sensor while, at the same time, the illuminator's power consumption does not appreciably drain the computing system's power supply.
One smart illumination approach is to segment the illuminator's field of view into different partitions and to reserve a separate and distinct array of light sources for each different partition.
Referring to
The reservation of an entire light source array for only a distinct partition of the field of view 102 ensures that light of sufficient intensity is emitted from the illuminator 101, which, in turn, ensures that a signal of appreciable strength will be received at the image sensor. The use of an array of light sources is known in the art. However, a single array is typically used to illuminate an entire field of view rather than just a section of it.
In many use cases it is expected that only a portion of the field of view 102 will be “of interest” to the application that is using the time-of-flight system. For example, in the case of a system designed to recognize hand gestures, only the portion of the field of view consumed by the hand needs to be illuminated. Thus, the system has the ability to direct the full intensity of an entire light source array onto only a smaller region of interest.
In cases where the region of interest consumes more than one partitioned section, the sections can be illuminated in sequence to keep the power consumption of the overall system limited to no more than that of a single light source array. For example, referring to
That is, across times t1 through t4, different partitions are turned on and off in sequence to effectively “scan” an amount of light equal to one partition across the region of interest. A region of interest that is larger than any one partition has therefore been effectively illuminated. Importantly, at any one of the moments t1 through t4, only one light source array is “on”. As such, over the course of scanning the larger region of interest, the power consumption remains approximately that of only a single array. In various other use cases more than one light source array may be simultaneously enabled, with the understanding that power consumption will scale with the number of simultaneously enabled arrays. That is, there may be use cases in which the power consumption expense is permissible for a particular application that desires simultaneous illumination of multiple partitions.
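The sequential scanning case can be summarized by the following sketch, which reuses the hypothetical helper from the earlier example and assumes the region of interest is supplied as a list of partition identifiers; it is illustrative only and does not prescribe any particular implementation.

```python
def scan_region_of_interest(illuminator, sensor, roi_partitions):
    """Illuminate the partitions covering a region of interest one at a
    time (e.g., at t1, t2, t3, t4) so that only one light source array
    is on at any moment and peak power stays near that of one array."""
    depth_by_pixel = {}
    for partition_id in roi_partitions:
        depth_by_pixel.update(
            capture_depth_for_partition(illuminator, sensor, partition_id)
        )
    return depth_by_pixel
```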
As observed in
Each light source array 106_1 through 106_9 may be implemented, for example, as an array of light-emitting diodes (LEDs) or lasers such as vertical cavity surface emitting lasers (VCSELs). In a typical implementation the respective light sources of each array emit non-visible (e.g., infra-red (IR)) light so that the reflected time-of-flight signal does not interfere with the traditional visible light image capture function of the computing system. Additionally, in various embodiments, each of the light sources within a particular array may be connected to the same anode and same cathode so that all of the light sources within the array are either all on or all off (alternative embodiments could conceivably be designed to permit subsets of light sources within an array to be turned on/off together, e.g., to illuminate sub-regions within a partition).
An array of light sources permits, e.g., the entire illuminator power budget to be expended illuminating only a single partition. For example, in one mode of operation, a single light source array is on and all other light source arrays are off so that the entire power budget made available to the illuminator is expended illuminating only the light source array's particular partition. The ability to direct the illuminator's full power to only a single partition is useable, e.g., to ensure that any partition can receive light of sufficient intensity for a time-of-flight measurement. Other modes of operation may scale down accordingly (e.g., two partitions are simultaneously illuminated where the light source array for each consumes half of the illuminator's power budget by itself). That is, as the number of partitions that are simultaneously illuminated grows, the amount of optical intensity emitted towards each partition declines. Referring to
The micro-lens array 108 enhances optical efficiency by capturing most of the light emitted from the underlying light source array and forming a more concentrated beam. Here, the individual light sources of the various arrays typically have a wide emitted light divergence angle. The micro-lens array 108 is able to collect most/all of the diverging light from the light sources of an array and help form an emitted beam of light having a smaller divergence angle.
Collecting most/all of the light from the light source array and forming a beam of lower divergence angle essentially forms a higher optical power beam (that is, optical intensity per unit of surface area is increased), resulting in a stronger received signal at the sensor for the region of interest that is illuminated by the beam. According to one calculation, if the divergence angle from the light source array is 60°, reducing the emitted beam's divergence angle to 30° will increase the signal strength at the sensor by a factor of 4.6. Reducing the emitted beam's divergence angle to 20° will increase the signal strength at the sensor by a factor of 10.7.
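One simple geometric model that reproduces the quoted factors treats the illuminated spot on a flat target as having an area proportional to tan²(θ/2), where θ is the beam's full divergence angle, so that the intensity gain from narrowing the beam equals the ratio of spot areas. The short sketch below applies that assumed model only to check the stated numbers; it is not asserted to be the calculation used in the disclosure.

```python
import math

def intensity_gain(original_divergence_deg, reduced_divergence_deg):
    """Ratio of on-target optical intensity after narrowing a beam's full
    divergence angle, assuming the same total optical power spread over a
    spot whose area scales as tan^2(full_angle / 2)."""
    def spot_area(full_deg):
        return math.tan(math.radians(full_deg) / 2.0) ** 2
    return spot_area(original_divergence_deg) / spot_area(reduced_divergence_deg)

print(round(intensity_gain(60, 30), 1))  # ~4.6
print(round(intensity_gain(60, 20), 1))  # ~10.7
```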
Boosting received signal strength at the sensor through optical concentration of emitted light from the light source array, as opposed to simply emitting higher intensity light from the light source array, preserves battery life as the light source array will be able to sufficiently illuminate an object of interest without consuming significant amounts of power.
The design of optical element 107 as observed in
The embodiment of
It is pertinent to recognize that with any of the partition designs of
The system 300 has a connector 301 for making electrical contact, e.g., with a larger system/mother board, such as the system/mother board of a laptop computer, tablet computer or smartphone. Depending on layout and implementation, the connector 301 may connect to a flex cable that, e.g., makes actual connection to the system/mother board, or, the connector 301 may make contact to the system/mother board directly.
The connector 301 is affixed to a planar board 302 that may be implemented as a multi-layered structure of alternating conductive and insulating layers where the conductive layers are patterned to form electronic traces that support the internal electrical connections of the system 300. Through the connector 301 commands are received from the larger system to turn specific ones of the light source arrays on and turn specific ones of the light source arrays off.
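Purely as an illustration (the disclosure does not prescribe any particular command encoding), such on/off commands could be represented on the host side as a bitmask carrying one bit per light source array, as in the hypothetical sketch below.

```python
NUM_ARRAYS = 9  # e.g., nine arrays as in the 106_1 through 106_9 example above

def encode_enable_command(array_indices):
    """Pack the indices of the arrays to turn on into a single bitmask."""
    mask = 0
    for idx in array_indices:
        if not 0 <= idx < NUM_ARRAYS:
            raise ValueError(f"array index {idx} out of range")
        mask |= 1 << idx
    return mask

def decode_enable_command(mask):
    """Return the array indices that a received bitmask asks to turn on."""
    return [idx for idx in range(NUM_ARRAYS) if mask & (1 << idx)]

assert decode_enable_command(encode_enable_command([0, 4])) == [0, 4]
```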
An integrated “RGBZ” image sensor 303 is mounted to the planar board 302. The integrated RGBZ sensor includes different kinds of pixels, some of which are sensitive to visible light (specifically, a subset of R pixels that are sensitive to visible red light, a subset of G pixels that are sensitive to visible green light and a subset of B pixels that are sensitive to blue light) and others of which are sensitive to IR light. The RGB pixels are used to support traditional “2D” visible image capture (traditional picture taking) functions. The IR sensitive pixels are used to support 2D IR image capture and 3D depth profile imaging using time-of-flight techniques. Although a basic embodiment includes RGB pixels for the visible image capture, other embodiments may use different colored pixel schemes (e.g., Cyan, Magenta and Yellow).
The integrated image sensor 303 may also include, for the IR sensitive pixels, special signaling lines or other circuitry to support time-of-flight detection including, e.g., clocking signal lines and/or other signal lines that indicate the timing of the reception of IR light (in view of the timing of the emission of the IR light from the light source array 305).
The integrated image sensor 303 may also include a number of analog-to-digital converters (ADCs) to convert the analog signals received from the sensor's RGB pixels into digital data that is representative of the visible imagery in front of the camera lens module 304. The planar board 302 may likewise include signal traces to carry digital information provided by the ADCs to the connector 301 for processing by a higher end component of the computing system, such as an image signal processing pipeline (e.g., that is integrated on an applications processor).
A camera lens module 304 is integrated above the integrated RGBZ image sensor 303. The camera lens module 304 contains a system of one or more lenses to focus light received through an aperture onto the image sensor 303. As the camera lens module's reception of visible light may interfere with the reception of IR light by the image sensor's time-of-flight pixels, and, conversely, as the camera module's reception of IR light may interfere with the reception of visible light by the image sensor's RGB pixels, either or both of the image sensor 303 and lens module 304 may contain a system of filters (e.g., filter 310) arranged to substantially block IR light that is to be received by RGB pixels, and, substantially block visible light that is to be received by time-of-flight pixels.
An illuminator 307 composed of a plurality of light source arrays 305 beneath an optical element 306 that partitions the illuminator's field of view is also mounted on the planar board 302. The plurality of light source arrays 305 may be implemented on a semiconductor chip that is mounted to the planar board 302. Embodiments of the light source arrays 305 and partitioning of the optical element 306 have been discussed above with respect to
Notably, one or more supporting integrated circuits for the light source array (not shown in
In an embodiment, the integrated system 300 of
An applications processor or multi-core processor 450 may include one or more general purpose processing cores 415 within its CPU 401, one or more graphical processing units 416, a main memory controller 417, an I/O control function 418 and one or more image signal processor pipelines 419. The general purpose processing cores 415 typically execute the operating system and application software of the computing system. The graphics processing units 416 typically execute graphics intensive functions to, e.g., generate graphics information that is presented on the display 403. The memory control function 417 interfaces with the system memory 402. The image signal processing pipelines 419 receive image information from the camera and process the raw image information for downstream uses. The power management control unit 412 generally controls the power consumption of the system 400.
The touchscreen display 403, the communication interfaces 404-407, the GPS interface 408, the sensors 409, the camera 410, and the speaker/microphone codec 413, 414 can all be viewed as various forms of I/O (input and/or output) relative to the overall computing system, including, where appropriate, an integrated peripheral device as well (e.g., the one or more cameras 410). Depending on implementation, various ones of these I/O components may be integrated on the applications processor/multi-core processor 450 or may be located off the die or outside the package of the applications processor/multi-core processor 450.
In an embodiment, one or more cameras 410 include an integrated traditional visible image capture and time-of-flight depth measurement system such as the system 300 described above with respect to
In the case of commands, the commands may include entrance into or exit from any of the 2D, 3D or 2D/3D system states discussed above with respect to
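As an illustrative sketch only, a host-side driver might enumerate these commands and states as follows; the names are hypothetical and not drawn from the disclosure.

```python
from enum import Enum, auto

class SystemState(Enum):
    """Hypothetical capture states a configuration command may select."""
    STATE_2D = auto()       # traditional visible (RGB) image capture only
    STATE_3D = auto()       # time-of-flight depth capture only
    STATE_2D_3D = auto()    # both capture modes active simultaneously

class CommandType(Enum):
    """Hypothetical command categories sent to the camera system."""
    ENTER_STATE = auto()    # enter one of the SystemState values above
    EXIT_STATE = auto()     # exit the current state
    ENABLE_ARRAY = auto()   # turn a specific light source array on
    DISABLE_ARRAY = auto()  # turn a specific light source array off
```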
Embodiments of the invention may include various processes as set forth above. The processes may be embodied in machine-executable instructions. The instructions can be used to cause a general-purpose or special-purpose processor to perform certain processes. Alternatively, these processes may be performed by specific hardware components that contain hardwired logic for performing the processes, or by any combination of programmed computer components and custom hardware components.
Elements of the present invention may also be provided as a machine-readable medium for storing the machine-executable instructions. The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, CD-ROMs, and magneto-optical disks, FLASH memory, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, propagation media or other type of media/machine-readable medium suitable for storing electronic instructions. For example, the present invention may be downloaded as a computer program which may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
This application is a continuation of U.S. application Ser. No. 15/694,039, filed Sep. 1, 2017, which is a divisional of U.S. application Ser. No. 15/693,553, filed Sep. 1, 2017, which is a continuation of U.S. application Ser. No. 14/579,732, filed Dec. 22, 2014, the contents of which are hereby incorporated by reference.
| | Number | Date | Country |
|---|---|---|---|
| Parent | 15693553 | Sep 2017 | US |
| Child | 15694039 | | US |

| | Number | Date | Country |
|---|---|---|---|
| Parent | 15694039 | Sep 2017 | US |
| Child | 15695837 | | US |
| Parent | 14579732 | Dec 2014 | US |
| Child | 15693553 | | US |