The present disclosure relates to a system and, more particularly, to a depth information construction system.
Conventional devices for constructing depth information of an object usually require a large area and high computing power to build a three-dimensional (3D) image. Such high computing power, however, accordingly limits the battery life of such devices. Due to the growing demand for a light and thin device having long battery life, a novel design for generating the depth information is required to solve the aforementioned problem.
One of the objectives of the present disclosure is to provide a depth information construction system, associated electronic device and method.
According to an embodiment of the present disclosure, a depth information construction system is disclosed. The depth information construction system is configured to generate an output signal for a processing circuit to construct a depth information of an object according to the output signal. The depth information construction system includes a structured light generator, a diffuser lens assembly, and a sensor. The structured light generator is arranged to project a structured light onto the object to generate a reflected structured light. The diffuser lens assembly is disposed adjacent to the structured light generator, and arranged to receive the reflected structured light and generate a filtered light. The sensor is arranged to sense the filtered light to generate the output signal.
According to an embodiment of the present disclosure, an electronic device for constructing a depth information of an object is disclosed. The electronic device includes a structured light generator, a diffuser lens assembly, a sensor and a processor. The structured light generator is arranged to project a structured light onto the object to generate a reflected structured light. The diffuser lens assembly is disposed adjacent to the structured light generator, and arranged to receive the reflected structured light and generate a filtered light. The sensor is arranged to sense the filtered light to generate an output signal. The processor is arranged to construct the depth information according to the output signal.
According to an embodiment of the present disclosure, a depth information constructing method for constructing a depth information of an object is disclosed. The depth information constructing method comprises: projecting a structured light onto the object to generate a reflected structured light; receiving the reflected structured light; filtering the reflected structured light to generate a filtered light; sensing the filtered light to generate an output signal; and constructing the depth information according to the output signal.
Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures. It should be noted that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
The following disclosure provides many different embodiments, or examples, for implementing different features of the disclosure. Specific examples of components and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. For example, the formation of a first feature over or on a second feature in the description that follows may include embodiments in which the first and second features are formed in direct contact, and may also include embodiments in which additional features may be formed between the first and second features, such that the first and second features may not be in direct contact. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Further, spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The apparatus may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may likewise be interpreted accordingly.
Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the disclosure are approximations, the numerical values set forth in the specific examples are reported as precisely as possible. Any numerical value, however, inherently contains certain errors necessarily resulting from the standard deviation found in the respective testing measurements. Also, as used herein, the term “about” generally means within 10%, 5%, 1%, or 0.5% of a given value or range. Alternatively, the term “about” means within an acceptable standard error of the mean when considered by one of ordinary skill in the art. Other than in the operating/working examples, or unless otherwise expressly specified, all of the numerical ranges, amounts, values and percentages such as those for quantities of materials, durations of times, temperatures, operating conditions, ratios of amounts, and the like disclosed herein should be understood as modified in all instances by the term “about.” Accordingly, unless indicated to the contrary, the numerical parameters set forth in the present disclosure and attached claims are approximations that can vary as desired. At the very least, each numerical parameter should at least be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Ranges can be expressed herein as from one endpoint to another endpoint or between two endpoints. All ranges disclosed herein are inclusive of the endpoints, unless specified otherwise.
The depth information construction system disclosed by the present disclosure for generating the depth information of an object does not require a processor having high computing power, and consumes less area on a device. In addition, the electronic device applying the depth information construction system disclosed by the present disclosure consumes less battery power, and battery life is extended accordingly.
Referring to
When the monochromatic light L1 emitted by the plurality of light source units 111, 112 and 113 reaches the collimating lens 12, the collimating lens 12 collimates the monochromatic light L1 in parallel, forming the collimated monochromatic light L2. The collimating lens 12 can be optional in the present disclosure. The collimated monochromatic light L2 is projected toward the DOE 13. With the DOE 13, the collimated monochromatic light L2 is diffracted as the structured light L3.
In this embodiment, the DOE 13 has a pattern which is pseudorandomly arranged. For example, the pattern of the DOE 13 is a pseudorandom optical imprint 131 as shown in
In other embodiments, the DOE 13 has a pattern uniformly arranged, such as an n*n array, wherein n is a positive integer. Those skilled in the art should readily understand the detail of this alternative design. The detailed description is omitted here for brevity.
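As an illustration of the two pattern styles described above, the following Python sketch generates both a uniform n*n dot grid and a pseudorandomly arranged dot set of the same size. The function names and the unit-square coordinate convention are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def uniform_pattern(n):
    """Uniform n*n array of dot positions on a unit square (assumed layout)."""
    coords = (np.arange(n) + 0.5) / n          # n evenly spaced grid lines
    xx, yy = np.meshgrid(coords, coords)
    return np.stack([xx.ravel(), yy.ravel()], axis=1)

def pseudorandom_pattern(n_dots, seed=0):
    """Pseudorandomly arranged dot positions; reproducible given the seed,
    mimicking a fixed pseudorandom optical imprint."""
    rng = np.random.default_rng(seed)
    return rng.random((n_dots, 2))

uniform = uniform_pattern(100)          # 100*100 = 10,000 dots
pseudo = pseudorandom_pattern(10_000)   # same dot count, pseudorandom layout
print(uniform.shape, pseudo.shape)      # (10000, 2) (10000, 2)
```

Both layouts project the same number of dots; the pseudorandom imprint simply makes each local neighborhood of dots distinguishable.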
Referring back to
In conventional devices applying a structured light generator, the sensor must be placed far from the structured light generator to obtain the depth information of the object accurately. In contrast, with the help of the diffuser lens assemblies 21, more accurate depth information is obtained when the sensors 20 are closer to the structured light generator 10. Therefore, the depth information construction system 1 provided by the present disclosure can be much smaller than the conventional structured light 3D devices, and the occupied area is reduced as a result.
Conventional devices adopting a diffuser lens usually cooperate with an illuminating system using uniform light to construct the depth information. For example, a diffuser lens is placed in front of a sensor, and the device encodes a 3D scene into a 2D image on the sensor. A one-time calibration consists of axially scanning a point source over an object while capturing images. The point source can be formed by reflecting uniform light (e.g., natural light). The images are reconstructed computationally by solving a nonlinear inverse problem with a sparsity prior. Since the object is composed of effectively infinite point sources, the backend processing circuit has to process the light information reflected from all of them, which greatly increases the burden of the processing circuit.
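The sparsity-prior reconstruction mentioned above can be sketched, under simplifying assumptions, as an l1-regularized least-squares problem solved by iterative soft thresholding (ISTA). The matrix A below is a random toy stand-in for the calibrated diffuser response, not a real calibration, and the function names are illustrative.

```python
import numpy as np

def ista(A, b, lam=0.1, step=None, iters=500):
    """Minimal ISTA solver for min ||Ax - b||^2 / 2 + lam * ||x||_1."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x - step * (A.T @ (A @ x - b))      # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft threshold
    return x

# Tiny demo: recover a 3-sparse scene from 40 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 80))               # toy measurement matrix
x_true = np.zeros(80)
x_true[[3, 17, 42]] = [1.0, -2.0, 0.5]          # sparse ground truth
b = A @ x_true
x_hat = ista(A, b, lam=0.01)
print(np.argsort(-np.abs(x_hat))[:3])           # indices of the largest recoveries
```

Even this toy solver makes the contrast clear: the cost of the reconstruction grows with the number of unknowns the scene is allowed to contain, which is exactly the burden the structured-light design below avoids.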
In contrast, since the diffuser lens assemblies 21 filter out most of the natural light reflected by the object 30 and allow only the light with 940 nm wavelength to enter, the resolution of the output signal OUT generated by the sensors 20 is defined by the pattern on the DOE 13. For example, when the DOE 13 has a pattern with a 100*100 array and the light source 11 has only one light source unit, the structured light L3 with 940 nm wavelength includes information of only 10,000 light points projected on the object 30. The output signal OUT includes information of the 10,000 light points reflected by the object 30 and filtered by the diffuser lens assembly 21, which defines the resolution of the output signal OUT. Therefore, the processing circuit 22 needs to process information of only 10,000 light points to generate the depth information. As a result, the computing power of the processing circuit 22 does not need to be very high to generate the depth information. Therefore, for an electronic device applying the processing circuit 22, the battery life of the electronic device can be extended.
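The point-count arithmetic above can be written out as a trivial helper (the function name is illustrative only, not part of the disclosure):

```python
def projected_points(n, num_sources=1):
    """Number of structured-light points an n*n DOE pattern projects,
    per monochromatic light source unit (illustrative helper)."""
    return n * n * num_sources

# A 100*100 DOE with a single light source unit yields 10,000 points,
# so the processing circuit handles a fixed, small set of points rather
# than a continuum of reflected natural light.
print(projected_points(100))  # 10000
```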
Compared to the distance between the sensors 20 and the structured light generator 10, the distance between the depth information construction system 1 and the object 30 is much greater. As a result, an angle between the structured light L3 and the reflected structured light L4 is approximately 0 degrees.
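A hypothetical worked example of this small-angle observation: with a baseline of a few millimetres between sensor and generator and an object about half a metre away, the angle stays well under one degree. The baseline and distance values below are illustrative assumptions, not figures from the disclosure.

```python
import math

def ray_angle_deg(baseline_mm, object_distance_mm):
    """Angle (degrees) between the projected ray and the reflected ray for a
    sensor placed baseline_mm from the structured light generator."""
    return math.degrees(math.atan2(baseline_mm, object_distance_mm))

# 5 mm baseline, object 500 mm away -> angle is nearly 0 degrees.
print(round(ray_angle_deg(5, 500), 2))  # 0.57
```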
The arrangement of the structured light generator 10 and the sensors 20 is not limited to that shown in
Step 601: a structured light is projected onto an object to generate a reflected structured light.
Step 602: the reflected structured light is received.
Step 603: the reflected structured light is filtered to generate a filtered light.
Step 604: the filtered light is sensed to generate an output signal.
Step 605: the depth information is constructed according to the output signal.
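The five steps above can be sketched as a toy end-to-end pipeline. The linear shift-versus-depth model and every function name below are illustrative assumptions standing in for the actual optics, not the disclosed implementation.

```python
import numpy as np

def project_structured_light(dots, depth_map):
    """Step 601 (toy model): each projected dot returns with a shift
    proportional to the local depth of the object."""
    return dots + 0.001 * depth_map           # reflected structured light

def filter_light(reflected):
    """Steps 602-603 (toy model): receive the reflection and keep only the
    structured component, modelled here as an identity pass."""
    return reflected

def sense(filtered, noise=0.0):
    """Step 604 (toy model): the sensor converts filtered light into an
    output signal."""
    return filtered + noise

def construct_depth(output, dots):
    """Step 605 (toy model): recover depth from the measured per-dot shift."""
    return (output - dots) / 0.001

rng = np.random.default_rng(1)
dots = rng.random(10_000)                     # 100*100 DOE -> 10,000 points
depth = rng.uniform(300, 600, 10_000)         # ground-truth depth in mm
signal = sense(filter_light(project_structured_light(dots, depth)))
recovered = construct_depth(signal, dots)
print(np.allclose(recovered, depth))          # True
```

The noiseless toy model recovers the depth exactly; a real system would replace the linear shift model with calibrated triangulation.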
Those skilled in the art should readily understand the detail of the depth information constructing method 600 after reading the abovementioned embodiments. The detailed description is omitted here for brevity.
This application claims the benefit of U.S. provisional application 62/732,935, filed on Sep. 18, 2018, which is incorporated by reference in its entirety.