The disclosure relates to the field of consumer electronics technologies, and more particularly, to a method for acquiring an image, an apparatus for acquiring an image, a structured light assembly, and an electronic device.
Mobile terminals may be equipped with depth cameras and display screens. The depth cameras may be configured to acquire depth information of objects. The display screens may be configured to display text, patterns and other content. An opening, such as a notch, is usually formed in the display screen, which causes the display area of the display screen to be staggered with respect to the position of the depth camera.
Embodiments of the disclosure provide a method for acquiring an image, an apparatus for acquiring an image, a structured light assembly, and an electronic device.
The method for acquiring the image in embodiments of the disclosure may include: controlling a structured light camera to receive structured light that is diffracted by a display area of a display screen when exiting, then reflected by a target object, and diffracted by the display area again when entering to acquire a speckle image, in which the speckle image may include a plurality of measurement spots, and the plurality of measurement spots may include first measurement spots formed when laser light is diffracted by a diffractive optical element of a structured light projector and reflected by the target object, second measurement spots formed when the laser light is diffracted by the diffractive optical element, then diffracted by the display screen, and reflected by the target object, and third measurement spots formed when the laser light is diffracted by the diffractive optical element, diffracted by the display screen, then reflected by the target object, and diffracted by the display screen again; filtering out the second measurement spots and the third measurement spots from the speckle image to acquire the first measurement spots; and acquiring a depth image based on the first measurement spots and reference spots in a reference image.
The structured light assembly in embodiments of the disclosure may include a structured light projector, a structured light camera, and a processor. The processor is configured to: control the structured light camera to receive structured light that is diffracted by a display area of a display screen when exiting, then reflected by a target object, and diffracted by the display area again when entering to acquire a speckle image, in which the speckle image may include a plurality of measurement spots, and the plurality of measurement spots may include first measurement spots formed when laser light is diffracted by a diffractive optical element of the structured light projector and reflected by the target object, second measurement spots formed when the laser light is diffracted by the diffractive optical element, then diffracted by the display screen, and reflected by the target object, and third measurement spots formed when the laser light is diffracted by the diffractive optical element, diffracted by the display screen, then reflected by the target object, and diffracted by the display screen again; filter out the second measurement spots and the third measurement spots from the speckle image to acquire the first measurement spots; and acquire a depth image based on the first measurement spots and reference spots in a reference image.
The electronic device in embodiments of the disclosure may include a housing, a display screen, and a structured light assembly. The display screen is arranged on the housing. The structured light assembly is arranged on the housing. The structured light assembly may include a structured light projector, a structured light camera, and a processor. The processor is configured to: control the structured light camera to receive structured light that is diffracted by a display area of the display screen when exiting, then reflected by a target object, and diffracted by the display area again when entering to acquire a speckle image, in which the speckle image may include a plurality of measurement spots, and the plurality of measurement spots may include first measurement spots formed when laser light is diffracted by a diffractive optical element of the structured light projector and reflected by the target object, second measurement spots formed when the laser light is diffracted by the diffractive optical element, then diffracted by the display screen, and reflected by the target object, and third measurement spots formed when the laser light is diffracted by the diffractive optical element, diffracted by the display screen, then reflected by the target object, and diffracted by the display screen again; filter out the second measurement spots and the third measurement spots from the speckle image to acquire the first measurement spots; and acquire a depth image based on the first measurement spots and reference spots in a reference image.
Additional aspects and advantages of embodiments of the disclosure will be given in part in the following descriptions, become apparent in part from the following descriptions, or be learned from the practice of the embodiments of the disclosure.
The above and/or additional aspects and advantages of the disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
Description will be made in detail below to embodiments of the disclosure with reference to the drawings. The same or similar reference numerals in the drawings indicate the same or similar elements or elements with the same or similar functions throughout.
In addition, the embodiments described herein with reference to the drawings are explanatory, illustrative, and only used to explain the embodiments of the disclosure, and cannot be understood as a limitation of the disclosure.
In the description of the disclosure, unless specified or limited otherwise, a first characteristic being “on” or “under” a second characteristic means that the first characteristic and the second characteristic are in contact with each other directly, or indirectly via an intermediate medium. Moreover, the first characteristic being “on”, “above”, or “over” the second characteristic may mean that the first characteristic is directly above or obliquely above the second characteristic, or may simply mean that the horizontal level of the first characteristic is higher than that of the second characteristic. The first characteristic being “below” or “under” the second characteristic may mean that the first characteristic is directly below or obliquely below the second characteristic, or may simply mean that the horizontal level of the first characteristic is lower than that of the second characteristic.
With reference to
The display screen 10 may be arranged on the housing 30. In detail, the display screen 10 may be arranged on one surface of the housing 30 or on two opposite surfaces of the housing 30 at the same time. In an example illustrated in
With reference to
In some examples, the display screen 10 may further include a non-display area. The non-display area may be formed on the periphery of the display area 11. The non-display area does not display content, and may be configured to be combined with the housing 30 or configured for wiring. For example, the non-display area may be combined with the housing 30 with glue without affecting the display function of the display area 11. The display screen 10 may also be a touch display screen integrated with a touch function. After acquiring the image information displayed on the display screen 10, the user may perform touch controls on the display screen 10 to implement predetermined interactive operations.
The structured light assembly 20 may employ structured light to acquire depth information of a target object, for three-dimensional (3D) modeling, generating 3D images, ranging and the like. The structured light assembly 20 may be arranged in the housing 30 of the electronic device 1000. In detail, the structured light assembly 20 may be arranged on a bracket, and the structured light assembly 20 and the bracket together may be arranged in the housing 30. The structured light assembly 20 may include a structured light projector 21, a structured light camera 22 and a floodlight 23.
With reference to
The structured light passing through the display area 11 and entering the external environment may simultaneously include a pattern diffracted by the diffractive optical element 213 (the pattern may include a plurality of spots diffracted by the diffractive optical element 213) and a pattern diffracted by microscopic gaps of the display screen 10 (the pattern may include a plurality of spots diffracted by the diffractive optical element 213 and then diffracted by the display screen 10). Therefore, the speckle pattern after passing through the display area 11 has a higher degree of irrelevance (that is, a lower correlation among the spots), which is beneficial to the subsequent processing of the acquired speckle pattern. In an example, the transmittance of the display area 11 may reach 60% or above, so that the structured light emitted by the structured light projector 21 passes through the display area 11 with less loss.
The structured light camera 22 may be an infrared camera. The structured light is emitted to the target object. After being modulated by the target object, the structured light may be acquired by the structured light camera 22. The structured light camera 22 receives the modulated structured light to acquire a speckle image. The depth data of the target object is obtained after the speckle image is processed. The structured light camera 22 may also be arranged on the side where the rear surface 13 of the display screen 10 is located, that is, under the display screen 10. In detail, the structured light camera 22 may be arranged on the same bracket as the structured light projector 21, or the structured light camera 22 may be directly arranged on the housing 30. At this time, a light incident surface of the structured light camera 22 may be aligned with the display area 11. The structured light modulated by the target object passes through the display area 11 and is received by the structured light camera 22. In detail, the structured light modulated by the target object may be diffracted by the microscopic gaps of the display screen 10, and received by the structured light camera 22.
The floodlight 23 may be configured to emit supplementary light outward. The supplementary light may be configured to supplement the light intensity in the environment when the ambient light is weak. In an example, the supplementary light may be infrared light. After the supplementary light is emitted to the target object and reflected by the target object, it may be acquired by the structured light camera 22 to acquire a two-dimensional (2D) image of the target object. The 2D image information may be configured for identity recognition. The floodlight 23 may also be arranged on the side where the rear surface 13 of the display screen 10 is located, that is, under the display screen 10. In detail, the floodlight 23 may be arranged on the same bracket as the structured light projector 21 and the structured light camera 22. In addition, the supplementary light emitted by the floodlight 23 enters the external environment after passing through the microscopic gaps of the display area 11, and the reflected supplementary light may pass through the microscopic gaps again to be received by the structured light camera 22.
In summary, since the structured light projector 21 is arranged on the side where the rear surface 13 of the display screen 10 is located, and the structured light emitted by the structured light projector 21 passes through the display area 11 and enters the external environment, there is no need to set an opening in the display screen 10 for aligning with the structured light projector 21. Therefore, the electronic device 1000 has a relatively high screen-to-body ratio.
With reference to
At this time, the light incident surface of the structured light camera 22 may be aligned with the through slot 14. The structured light modulated by the target object passes through the through slot 14 and is received by the structured light camera 22. In embodiments, since the modulated structured light does not need to pass through the microscopic gaps of the display area 11, the modulated structured light may not be diffracted by the microscopic gaps again, and the speckle image acquired by the structured light camera 22 is the speckle image modulated by the target object. The processing difficulty of subsequently calculating the depth image based on the speckle image may be reduced.
In detail, in an example illustrated in
In an example illustrated in
In some embodiments, the through slot 14 may also include the above-mentioned opening 141 and through hole 142 at the same time. The number of openings 141 and the number of through holes 142 may be equal or not equal.
Referring to
In addition, the supplementary light is directly emitted to the outside after passing through the through slot 14. The supplementary light will not be weakened during the process of passing through the display area 11, ensuring that the target object receives more supplementary light.
Similar to the structured light camera 22, as illustrated in
Alternatively, as illustrated in
In addition, in examples illustrated in
Referring to
The cover plate 40 may be made of a material with good light transmission properties such as glass or sapphire. The infrared transmission layer 50 may be an infrared transmission ink or an infrared transmission film. The infrared transmission layer 50 may have a high transmittance to infrared light (for example, light with a wavelength of 940 nm), for example, the transmittance may reach 85% or more, but may have a low transmittance to light other than the infrared light, or may not transmit the light other than the infrared light at all. Therefore, it is difficult for the user to see the structured light camera 22 or the floodlight 23 aligned with the through slot 14 through the cover plate 40, and the appearance of the electronic device 1000 is more beautiful.
With reference to
The pixel density of the first display subarea 111 is less than the pixel density of the second display subarea 112, that is, the microscopic gaps of the first display subarea 111 are larger than the microscopic gaps of the second display subarea 112. The first display subarea 111 blocks less light, and the transmittance of light passing through the first display subarea 111 is high. Therefore, the transmittance of the structured light emitted by the structured light projector 21 through the first display subarea 111 is relatively high.
In an example, the first display subarea 111 may be configured to display state icons of the electronic device 1000, for example, display the battery power, the network connection state, the system time, etc. of the electronic device 1000. The first display subarea 111 may be located near the edge of the display area 11, and the second display subarea 112 may be located in the middle of the display area 11.
With reference to
Different display states may include powering on, powering off, displaying with different brightness, displaying with different refresh frequencies, and the like. The display states of the first display subarea 111 and the second display subarea 112 may be independently controlled. The user may control the second display subarea 112 to display normally according to actual needs, and control the first display subarea 111 to cooperate with the structured light projector 21. For example, when the structured light projector 21 emits the structured light, the first display subarea 111 may be powered off, the display brightness of the first display subarea 111 may be lowered, or the refresh frequency of the first display subarea 111 may be adjusted so that the powering time of the first display subarea 111 and the powering time of the structured light projector 21 are staggered, to reduce the influence of the display of the first display subarea 111 on the speckle pattern projected into the scene by the structured light projector 21. When the structured light projector 21 is not activated, the first display subarea 111 and the second display subarea 112 may both be powered on and displayed at the same refresh frequency.
Referring to
The infrared antireflection film 60 may increase the transmittance of infrared light. When the structured light projector 21 projects infrared laser light, the infrared antireflection film 60 may increase the transmittance of the infrared laser light passing through the cover plate 40 to reduce the loss when the infrared laser light passes through the cover plate 40 and further reduce the power consumption of the electronic device 1000. In detail, the infrared antireflection film 60 may be plated on the upper surface, or the lower surface of the cover plate 40, or both.
Of course, an area on the cover plate 40 corresponding to the structured light camera 22 may also be formed with an infrared antireflection film 60 to reduce the loss when the external infrared light passes through the cover plate 40 before reaching the structured light camera 22. An area on the cover plate 40 corresponding to the floodlight 23 may also be formed with an infrared antireflection film 60 to reduce the loss when the supplementary light emitted by the floodlight 23 passes through the cover plate 40. At this time, an area on the cover plate 40, which does not correspond to the structured light projector 21, the structured light camera 22, and the floodlight 23, may be formed with a visible light antireflection film 80 to improve the transmittance of the visible light emitted by the display screen 10 when passing through the cover plate 40.
Referring to
The infrared antireflection film 60 may increase the transmittance of infrared light. When the structured light projector 21 projects infrared laser light, the infrared antireflection film 60 may increase the transmittance of the infrared laser light passing through the display screen 10 to reduce the loss when the infrared laser light passes through the display screen 10 and further reduce the power consumption of the electronic device 1000. In detail, the infrared antireflection film 60 may be formed on the front surface 12 or the rear surface 13 of the display area 11, or on both the front surface 12 and the rear surface 13 at the same time. In an example, the infrared antireflection film 60 may also be formed inside the display screen 10. For example, when the display screen 10 is a liquid crystal display, the infrared antireflection film 60 may be formed on a polarizer in the display screen 10, or formed on an electrode plate of the display screen 10, and so on.
Of course, when the position of the display screen 10 corresponding to the structured light camera 22 is not provided with the through slot 14, an area of the display screen 10 corresponding to the structured light camera 22 may also form an infrared antireflection film 60. When the position of the display screen 10 corresponding to the floodlight 23 is not provided with the through slot 14, an area of the display screen 10 corresponding to the floodlight 23 may also form an infrared antireflection film 60.
Referring to
At the same time, when the position of the display screen 10 corresponding to the structured light camera 22 is not provided with the through slot 14, an area of the display screen 10 corresponding to the structured light camera 22 may also form an infrared transmission layer 50 to reduce the influence of the light other than the infrared light, which passes through the display screen 10, to the structured light camera 22. When the position of the display screen 10 corresponding to the floodlight 23 is not provided with the through slot 14, an area of the display screen 10 corresponding to the floodlight 23 may also form an infrared transmission layer 50.
With reference to
The visible light camera 70 may be configured to receive the visible light passing through the cover plate 40 and the through slot 14 to obtain images. The visible light antireflection film 80, formed on the area of the cover plate 40 corresponding to the through slot 14, may increase the transmittance of the visible light passing through the cover plate 40 so as to improve the imaging quality of the visible light camera 70. The infrared cutoff film 90, formed on the area of the cover plate 40 corresponding to the through slot 14, may reduce the transmittance of the infrared light passing through the cover plate 40, or completely prevent the infrared light from entering the visible light camera 70, so as to reduce the influence of the infrared light on the imaging effect of the visible light camera 70.
With reference to
00: the structured light projector 21 is controlled to emit structured light toward the display area 11 of the display screen 10.
01: the structured light camera 22 is controlled to capture a speckle image generated by the structured light.
02: a depth image is acquired based on measurement spots in the speckle image and reference spots in a reference image.
With reference to
With reference to
In detail, the structured light projector 21 is powered on and projects the structured light into the scene. The structured light projected into the scene may form a speckle pattern with a plurality of spots. Since a plurality of target objects in the scene may be at different distances from the structured light projector 21, the speckle pattern projected on the target objects may be modulated due to differences in the surface height of the target objects, and the spots in the speckle pattern may be shifted to different degrees. The shifted spots may be collected by the structured light camera 22 to form the speckle image including the measurement spots. The processor 200 may acquire the speckle image and calculate depth data of a plurality of pixels based on the offsets of the measurement spots in the speckle image relative to the reference spots in the reference image. The plurality of pixels with depth data may form the depth image. The reference image may be acquired by pre-calibration.
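The conversion from spot offset to depth described above follows the usual structured-light triangulation relation. The disclosure does not give the formula, so the sketch below is an illustrative assumption: a pinhole camera model with a reference plane calibrated at distance `z0_mm`, projector-camera baseline `b_mm`, and focal length `f_px` in pixels, all of which are hypothetical parameter names and values.

```python
def depth_from_offset(d_px, z0_mm=600.0, f_px=500.0, b_mm=50.0):
    """Convert a measurement spot's offset (disparity, in pixels)
    relative to its reference spot into a depth value.

    Derived from similar triangles for a projector/camera pair:
        1/Z = 1/Z0 + d / (f * b)
    where Z0 is the calibrated reference-plane distance. A zero offset
    means the point lies on the reference plane; the sign of d depends
    on the chosen offset direction convention.
    """
    return 1.0 / (1.0 / z0_mm + d_px / (f_px * b_mm))
```

Applying this per pixel to the offsets of all matched spots yields the depth image described in action 02.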
In the method for acquiring the image and the electronic device 1000 according to embodiments of the disclosure, the structured light projector 21 may be arranged on the side where the rear surface 13 of the display screen 10 is located, that is, the structured light projector 21 may be arranged under the display screen 10. The display screen 10 may not need to set the through slot 14 to which the structured light projector 21 is aligned. Therefore, the electronic device 1000 may have a relatively high screen-to-body ratio while the acquisition of the depth image may not be affected.
With reference to
011: the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, then reflected by the target object, and enters directly, to acquire a speckle image. The speckle image may include a plurality of measurement spots. The plurality of measurement spots may include first measurement spots formed when the laser light is diffracted by the diffractive optical element 213 (as illustrated in
The action 02 of acquiring the depth image based on the measurement spots in the speckle image and the reference spots in the reference image may include the following.
021: the depth image is acquired based on the first measurement spots and second measurement spots in the speckle image and the reference spots in the reference image.
With reference to
With reference to
In detail, with reference to
LCD (Liquid Crystal Display) screens, OLED screens, Micro LED screens and other types of display screens 10 may usually have fixed pixel arrangement structures formed on the display areas 11. The microscopic gap may be formed between adjacent pixels. When single-point laser light passes through these microscopic gaps, it will be diffracted to produce a series of spots. When the pixel arrangement structures in the display areas 11 are different, the spot arrangements of the speckle patterns formed after the single-point laser light passes through the display areas 11 may also be different. The structured light emitted by the structured light projector 21 is usually infrared laser light. In this way, when the structured light projector 21 is arranged on the side where the rear surface 13 of the display screen 10 is located, that is, under the display screen 10, the infrared laser light emitted by the structured light projector 21 may also be diffracted by the microscopic gaps of the display area 11 when it passes through the display area 11 to produce the speckle pattern with the plurality of spots. Thus, the plurality of spots in the speckle pattern projected by the structured light projector 21 into the space may simultaneously include first spots formed by the laser light being diffracted by the diffractive optical element 213 and second spots formed by the laser light being diffracted by the diffractive optical element 213 and then diffracted by the display screen 10.
When the structured light camera 22 is imaging, the structured light camera 22 may receive the structured light reflected by the target object in the scene to form the speckle image. In embodiments of the disclosure, since the display screen 10 is provided with the through slot 14, the light incident surface of the structured light camera 22 is aligned with the through slot 14, and the through slot 14 does not have microscopic gaps, the laser light that is diffracted by the diffractive optical element 213, then diffracted by the display screen 10, and modulated by the target object and reflected, will not be diffracted when it passes through the through slot 14. The structured light camera 22 may receive the structured light that is diffracted by the display area 11 and reflected by the target object and directly enters. The plurality of measurement spots in the formed speckle image also may include the first measurement spots formed by the laser light being diffracted by the diffractive optical element 213 and reflected by the target object, and the second measurement spots formed by the laser light being diffracted by the diffractive optical element 213, then diffracted by the display screen 10, and reflected by the target object.
After the structured light camera 22 captures the speckle image, the processor 200 may directly calculate the depth image based on the first measurement spots and second measurement spots in the speckle image and the reference spots in the reference image. The manner of calculating the depth image may include the two following manners.
With reference to
0211: offsets of all measurement spots relative to all reference spots are calculated.
0212: depth data is calculated based on the offsets to acquire the depth image.
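Action 0211 requires pairing each measurement spot with its counterpart reference spot before an offset can be measured. The disclosure does not specify the matching procedure, so the following is a simplified nearest-row sketch under assumed conditions (rectified images, offsets along the x axis); a practical implementation would typically match local speckle windows by correlation instead. All names and tolerances here are illustrative.

```python
import numpy as np

def spot_offsets(measure_xy, reference_xy, max_shift=30.0):
    """For each measurement spot, find the reference spot on (roughly)
    the same image row and return the horizontal offset between them.

    measure_xy, reference_xy: (N, 2) arrays of (x, y) spot centers.
    Matches farther than max_shift pixels are discarded as spurious.
    """
    offsets = []
    for mx, my in measure_xy:
        # candidate reference spots on approximately the same scanline
        rows = reference_xy[np.abs(reference_xy[:, 1] - my) < 2.0]
        if len(rows) == 0:
            continue
        j = np.argmin(np.abs(rows[:, 0] - mx))  # nearest candidate in x
        dx = mx - rows[j, 0]
        if abs(dx) <= max_shift:
            offsets.append(dx)
    return np.array(offsets)
```

The resulting per-spot offsets feed directly into the depth calculation of action 0212.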
Correspondingly, the method for acquiring the image may further include the following.
031: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, then reflected by a calibration object, and enters directly, to acquire the reference image. The reference image may include a plurality of reference spots.
With reference to
With reference to
In detail, with reference to
With reference to
0213: offsets of the first measurement spots relative to the first reference spots and offsets of the second measurement spots relative to the second reference spots are calculated.
0214: depth data is calculated based on the offsets to acquire the depth image.
At this time, the method for acquiring the image may further include the following.
032: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is emitted from the structured light projector 21, directly reflected by a calibration object, and enters directly, to acquire a first reference image. The first reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed by the laser light being diffracted by the diffractive optical element 213 and reflected by the calibration object.
033: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, then reflected by a calibration object, and enters directly, to acquire a second reference image. The second reference image may include a plurality of reference spots. The plurality of reference spots may include the first reference spots formed by the laser light being diffracted by the diffractive optical element 213 and reflected by the calibration object, and second reference spots formed by the laser light being diffracted by the diffractive optical element 213, then diffracted by the display screen 10, and reflected by the calibration object.
041: the first reference image is compared with the second reference image to acquire the second reference spots.
051: a ratio between an average brightness value of the second reference spots and an average brightness value of the first reference spots is calculated as the preset ratio, and the average brightness value of the first reference spots is used as the preset brightness.
061: an actual ratio between the brightness of each measurement spot and the preset brightness is calculated.
071: the measurement spots with the actual ratio greater than the preset ratio are classified as the first measurement spots, and the measurement spots with the actual ratio less than the preset ratio are classified as the second measurement spots.
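Actions 061 and 071 amount to a simple per-spot threshold on brightness. A minimal sketch, assuming each spot's brightness has already been measured (the function and parameter names are illustrative, not from the disclosure):

```python
import numpy as np

def classify_spots(spot_brightness, preset_brightness, preset_ratio):
    """Split measurement spots into first/second groups by brightness.

    Second measurement spots are diffracted one extra time by the
    display screen, lose more energy, and so appear dimmer. A spot
    whose ratio of brightness to the preset brightness exceeds the
    preset ratio is classified as a first measurement spot; otherwise
    it is classified as a second measurement spot. Both thresholds
    come from the calibration in action 051.
    """
    actual_ratio = np.asarray(spot_brightness, dtype=float) / preset_brightness
    first_mask = actual_ratio > preset_ratio
    return first_mask, ~first_mask
```

The two masks then select which offsets are compared against the first reference spots and which against the second reference spots in action 0213.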
Referring to
Referring to
In this calculation method, the processor 200 may need to calibrate the first reference image and the second reference image. In detail, the processor 200 first controls the structured light projector 21 to emit the structured light to the calibration board in a scene without the obstruction of the display screen 10, and then controls the structured light camera 22 to receive the structured light that directly enters after being reflected by the calibration board to acquire the first reference image. The reference spots included in the first reference image may be the first reference spots. The first reference spots may be formed when the laser light is diffracted by the diffractive optical element 213 when it passes through the diffractive optical element 213, directly emitted to the calibration board, and modulated and reflected by the calibration board. Subsequently, the processor 200 calibrates the second reference image based on the first calculation method, that is, the calibration method of the reference image in the action 031. At this time, the second reference image may include both first reference spots corresponding to the first measurement spots and second reference spots corresponding to the second measurement spots. In the calibration scene of the first reference image and the second reference image, the relative position between the calibration board and the structured light projector 21 or between the calibration board and the structured light camera 22 remains unchanged. The relative position between the structured light projector 21 and the structured light camera 22 also remains unchanged. Subsequently, the processor 200 may mark coordinates of the first reference spots in the first reference image, and filter out the first reference spots in the second reference image based on the coordinates of the first reference spots. The remaining reference spots in the second reference image may be the second reference spots. 
In this way, the processor 200 may distinguish the first reference spots and the second reference spots among all the reference spots in the second reference image.
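For illustration only, the coordinate-based distinction described above may be sketched in Python as follows. The function name, the representation of spots as (x, y) pixel coordinates, and the matching tolerance are hypothetical assumptions and not part of the disclosure:

```python
def split_reference_spots(first_ref_coords, second_ref_coords, tol=1.0):
    """Classify the spots of the second reference image: spots whose
    coordinates match a spot of the first reference image (within tol
    pixels) are first reference spots; the remainder are second
    reference spots."""
    first, second = [], []
    for (x, y) in second_ref_coords:
        if any(abs(x - fx) <= tol and abs(y - fy) <= tol
               for (fx, fy) in first_ref_coords):
            first.append((x, y))
        else:
            second.append((x, y))
    return first, second
```

In practice the tolerance would depend on calibration accuracy and sensor resolution; the sketch only shows the set-difference idea.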
In the subsequent calculation of the depth data, the measurement spots in the speckle image also need to be distinguished. In detail, the first measurement spots and the second measurement spots may be distinguished by brightness. It may be understood that the first measurement spots are formed when the laser light is diffracted by the diffractive optical element 213, and the second measurement spots are formed when the laser light is diffracted by the diffractive optical element 213 and then diffracted by the display screen 10. The laser light forming the second measurement spots is diffracted more times than the laser light forming the first measurement spots. Therefore, the energy loss of the laser light forming the first measurement spots is smaller, and the energy loss of the laser light forming the second measurement spots is larger. The brightness of the second measurement spots is therefore lower than the brightness of the first measurement spots. In this way, it is feasible to distinguish the first measurement spots and the second measurement spots based on brightness. After the calibration of the reference images is completed, it is necessary to further calibrate the preset brightness and the preset ratio for distinguishing the first measurement spots from the second measurement spots. In detail, after the processor 200 distinguishes the first reference spots and the second reference spots, the processor 200 may calculate the average brightness value of the first reference spots in the second reference image, and the average brightness value of the second reference spots in the second reference image.
Subsequently, the processor 200 may use the average brightness value of the first reference spots as the preset brightness, and calculate the ratio between the average brightness value of the second reference spots and the average brightness value of the first reference spots as the preset ratio.
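The calibration of the preset brightness and the preset ratio described above may be sketched as follows. This is an illustrative Python sketch; the function and argument names are assumptions, and spot brightness values are taken as plain numbers:

```python
def calibrate_thresholds(first_ref_brightness, second_ref_brightness):
    """Derive the preset brightness (average brightness of the first
    reference spots) and the preset ratio (average brightness of the
    second reference spots divided by that of the first)."""
    preset_brightness = sum(first_ref_brightness) / len(first_ref_brightness)
    second_avg = sum(second_ref_brightness) / len(second_ref_brightness)
    preset_ratio = second_avg / preset_brightness
    return preset_brightness, preset_ratio
```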
In the subsequent depth data calculation, the processor 200 may first calculate the brightness of each measurement spot. Subsequently, the processor 200 may calculate the actual ratio between the brightness of each measurement spot and the preset brightness, classify the measurement spots with the actual ratio greater than or equal to the preset ratio as the first measurement spots, and classify the measurement spots with the actual ratio less than the preset ratio as the second measurement spots. Therefore, the first measurement spots and the second measurement spots are distinguished. For example, as illustrated in
After the processor 200 distinguishes the first measurement spots and the second measurement spots, since the first reference spots and the second reference spots in the second reference image have also been distinguished, the processor 200 may use the speckle image and the second reference image to calculate the depth data. In detail, the processor 200 first may calculate the offsets of the first measurement spots relative to the first reference spots, and the offsets of the second measurement spots relative to the second reference spots. Subsequently, the processor 200 may calculate pieces of depth data based on the offsets. The pieces of depth data may form the depth image.
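The disclosure does not specify the triangulation model used to convert the offsets into depth data. As one hedged illustration only, a reference-plane model commonly used in structured light systems, 1/Z = 1/Z0 + d/(f·b) with reference distance Z0, focal length f in pixels, baseline b, and offset d, may be sketched as:

```python
def depth_from_offset(offset_px, z0_mm, focal_px, baseline_mm):
    """Triangulate depth from the offset of a measurement spot relative
    to its reference spot, using the reference-plane model
    1/Z = 1/Z0 + d/(f*b). The exact model and sign convention used by
    the disclosure are not specified; this is only an illustration."""
    return 1.0 / (1.0 / z0_mm + offset_px / (focal_px * baseline_mm))
```

Under this model a zero offset reproduces the reference distance, and the sign of the offset depends on the geometry of the projector and camera.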
Compared with the first calculation method, the second calculation method distinguishes the first measurement spots from the second measurement spots, and the first reference spots from the second reference spots. Therefore, more accurate offsets may be acquired based on the correspondence between the first measurement spots and the first reference spots, and the correspondence between the second measurement spots and the second reference spots, and in turn more accurate depth data may be acquired. Therefore, the accuracy of the acquired depth image may be improved.
In some embodiments, the preset brightness and the preset ratio may be determined by the ambient brightness of the scene and the luminous power of the structured light projector 21. It may be understood that there may be an infrared light component in the ambient light. This infrared light component in the ambient light may be superimposed on the measurement spots to increase the brightness of the measurement spots. The luminous power of the structured light projector 21 is closely related to the brightness of the measurement spots. When the luminous power of the structured light projector 21 is larger, the brightness of the measurement spots may be correspondingly higher. When the luminous power of the structured light projector 21 is smaller, the brightness of the measurement spots may be correspondingly lower. Therefore, different ambient brightness and luminous power should have different preset brightness and preset ratio. The preset brightness and preset ratio under different ambient brightness and different luminous power may also be calibrated based on the calibration process of the action 032 and the action 033. In the calibration process, the ambient brightness of the calibration scene and the luminous power of the structured light projector 21 are changed to acquire the preset brightness and the preset ratio corresponding to the ambient brightness and luminous power. The luminous power of the structured light projector 21 is changed by changing the driving current of the light source 211. The correspondence among the ambient brightness, the luminous power, the preset brightness, and the preset ratio may be stored in the memory 300 (illustrated in
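If the correspondence among the ambient brightness, the luminous power, the preset brightness, and the preset ratio is stored as a table in the memory 300, looking up the entry for the current operating condition might look like the following sketch. The nearest-neighbor selection and all names are illustrative assumptions; the disclosure only states that the correspondence is stored:

```python
def lookup_thresholds(table, ambient, power):
    """Pick the calibrated (preset_brightness, preset_ratio) entry whose
    (ambient brightness, luminous power) key is nearest to the current
    operating condition. The table would be produced by repeating the
    calibration of actions 032/033 under each condition."""
    key = min(table, key=lambda k: (k[0] - ambient) ** 2 + (k[1] - power) ** 2)
    return table[key]
```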
In some embodiments, the diffractive optical element 213 may be configured to diffract the laser light emitted by the light source 211 of the structured light projector 21 to increase the number of measurement spots or reference spots, and may also be configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. Therefore, the brightness uniformity of the spots in the speckle pattern projected into the scene may be better, which is beneficial to improving the accuracy of acquiring the depth image. In detail, the convex or concave structures in the diffractive optical element 213 may be arranged densely in the middle and sparsely at both edges (sides). Therefore, the diffraction effect of the middle part of the diffractive optical element 213 is stronger than that of the edge part of the diffractive optical element 213. In this way, the laser light that enters the middle part of the diffractive optical element 213 may be diffracted into more light beams, and the laser light that enters the edge part of the diffractive optical element 213 may be diffracted into fewer light beams, thereby causing the brightness of the speckle pattern projected on the scene to have high uniformity.
In summary, in the method for acquiring the image according to embodiments of the disclosure, the structured light projector 21 and the structured light camera 22 are both located on the side of the rear surface 13 of the display screen 10, and the processor 200 may directly calculate the depth image based on the first measurement spots and the second measurement spots when the structured light received by the structured light camera 22 is the modulated structured light passing through the through slot 14. Compared with the method of calculating the depth image using only the first measurement spots, the diffraction effect of the display screen 10 increases the number of the measurement spots and the randomness of the arrangement of the measurement spots, which is helpful to improve the accuracy of acquiring the depth image. Further, in the method for acquiring the image according to embodiments of the disclosure, the structure of the diffraction grating in the diffractive optical element 213 may be appropriately simplified, using the diffraction effect of the display screen 10 instead to increase the number of the measurement spots and the randomness of the arrangement of the measurement spots, which ensures the accuracy of acquiring the depth image while simplifying the manufacturing process of the structured light projector 21.
With reference to
011: the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, then reflected by the target object, and enters directly, to acquire a speckle image. The speckle image may include a plurality of measurement spots. The plurality of measurement spots may include first measurement spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the target object, and second measurement spots formed when the laser light is diffracted by the diffractive optical element 213, then diffracted by the display screen 10, and reflected by the target object. In detail, the first measurement spots may be formed when the laser light is diffracted by the diffractive optical element 213, not diffracted by the display screen 10 when passing through the display screen 10 (that is, the laser light is directly projected to the target object without encountering the microscopic gaps), and modulated and reflected by the target object. The second measurement spots may be formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10 when passing through the display screen 10 (that is, the laser light is projected to the target object after encountering the microscopic gaps), and modulated and reflected by the target object.
The action 02 may include the following.
022: the second measurement spots are filtered out from the speckle image to acquire the first measurement spots.
023: the depth image is acquired based on the first measurement spots and the reference spots in the reference image.
With reference to
With reference to
In detail, when the structured light projector 21 and the structured light camera 22 are arranged on the side of the rear surface 13 of the display screen 10, and the display screen 10 is provided with the through slot 14 that is aligned with the light incident surface of the structured light camera 22, the structured light camera 22 may capture the speckle image including the first measurement spots and the second measurement spots. In the subsequent calculation of the depth image, the processor 200 may filter out the second measurement spots from the speckle image, and calculate the depth image with the reference spots in the reference image based on the remaining first measurement spots. At this time, the reference spots in the reference image should include the first reference spots formed by the laser light being diffracted by the diffractive optical element 213 and reflected by the calibration object. Therefore, the influence of the display screen 10 on the structured light may be eliminated by filtering out the second measurement spots from the speckle image. In this way, it may be ensured that the depth image acquired by the electronic device 1000 has relatively high accuracy under the condition that the electronic device 1000 has a relatively high screen-to-body ratio.
In other words, with reference to
032: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is emitted from the structured light projector 21, directly reflected by a calibration object, and enters directly, to acquire a first reference image. The first reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed by the laser light being diffracted by the diffractive optical element 213 and reflected by the calibration object.
The action 023 may include the following.
0231: offsets of the first measurement spots relative to the first reference spots are calculated.
0232: depth data is calculated based on the offsets to acquire the depth image.
With reference to
With reference to
In detail, after the processor 200 filters out the second measurement spots, the first measurement spots remain in the speckle image. The depth image should be calculated based on the speckle image and the first reference image containing the first reference spots corresponding to the first measurement spots. The calibration process of the first reference image is consistent with the calibration process of placing the structured light projector 21 in a scene without the obstruction of the display screen 10, which is not repeated herein. The reference spots in the first reference image captured by the structured light camera 22 may be the first reference spots formed by the laser light being diffracted by the diffractive optical element 213 and reflected by the calibration object. In this way, the processor 200 may calculate the offsets of the first measurement spots relative to the first reference spots, and then calculate pieces of depth data based on the offsets to obtain the depth image.
The processor 200 may filter out the second measurement spots based on brightness. In other words, with reference to
032: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is emitted from the structured light projector 21, directly reflected by a calibration object, and enters directly, to acquire a first reference image. The first reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed by the laser light being diffracted by the diffractive optical element 213 and reflected by the calibration object.
033: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, then reflected by a calibration object, and enters directly, to acquire a second reference image. The second reference image may include a plurality of reference spots. The plurality of reference spots may include the first reference spots formed by the laser light being diffracted by the diffractive optical element 213 and reflected by the calibration object, and second reference spots formed by the laser light being diffracted by the diffractive optical element 213, then diffracted by the display screen 10, and reflected by the calibration object.
041: the first reference image is compared with the second reference image to acquire the second reference spots.
051: a ratio between an average brightness value of the second reference spots and an average brightness value of the first reference spots is calculated as the preset ratio, and an average brightness value of the first reference spots is calculated as the preset brightness.
The action 022 may include the following.
0221: an actual ratio between the brightness of each measurement spot and the preset brightness is calculated.
0222: the measurement spots with the actual ratio greater than the preset ratio are classified as the first measurement spots, and the measurement spots with the actual ratio less than the preset ratio are classified as the second measurement spots.
0223: the second measurement spots are filtered out from all measurement spots to acquire the first measurement spots.
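Actions 0221 to 0223 may be summarized in a short sketch. This is illustrative Python; the greater-than-or-equal boundary follows the detailed description of the classification, and the function and argument names are assumptions:

```python
def filter_first_spots(spot_brightness, preset_brightness, preset_ratio):
    """Compute each spot's actual ratio to the preset brightness,
    classify the spots, and keep only the first measurement spots
    (ratio at or above the preset ratio)."""
    return [b for b in spot_brightness
            if b / preset_brightness >= preset_ratio]
```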
With reference to
With reference to
The process described in the action 032 for calibrating the first reference image is consistent with the calibration process of the scene where the structured light projector 21 is arranged without the obstruction of the display screen 10 in the aforementioned action 032, which is not repeated herein. The process described in the action 033 for calibrating the second reference image is consistent with the calibration process of the scene where the structured light projector 21 and the structured light camera 22 are arranged on the side of the rear surface 13 of the display screen 10, and the light incident surface of the structured light camera 22 is aligned with the through slot 14 of the display screen 10, in the aforementioned action 033, which is not repeated herein.
After obtaining the first reference image and the second reference image, the processor 200 may determine the first reference spots in the second reference image using the same manner as the foregoing action 041, that is, based on the coordinates of the first reference spots in the first reference image, and the remaining reference spots are the second reference spots, so that the first reference spots and the second reference spots may be distinguished. Subsequently, the processor 200 may calibrate and calculate the preset brightness and the preset ratio based on the distinguished first reference spots and second reference spots by the same method as the aforementioned action 051.
Similarly, in the subsequent calculation of the depth image, the processor 200 may employ the same manner as the foregoing action 061 and the foregoing action 071, that is, distinguish the first measurement spots from the second measurement spots based on the calibrated preset ratio and preset brightness. Subsequently, the processor 200 may filter out the second measurement spots so that the first measurement spots remain, calculate the offsets of the first measurement spots relative to the first reference spots, and finally calculate the depth data based on the offsets to acquire the depth image.
In some embodiments, the preset brightness and the preset ratio may also be determined by the ambient brightness of the scene and the luminous power of the structured light projector 21. In this way, the accuracy of filtering out the second measurement spots may be improved.
In some embodiments, the diffractive optical element 213 may be configured to diffract the laser light emitted by the light source 211 of the structured light projector 21 to increase the number of measurement spots or reference spots, and may also be configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. Therefore, the brightness uniformity of the spots in the speckle pattern projected into the scene may be better, which is beneficial to improve the accuracy of acquiring the depth image.
In summary, in the method for acquiring the image according to embodiments of the disclosure, when the structured light projector 21 and the structured light camera 22 are both arranged under the display screen 10, and the structured light camera 22 receives the modulated structured light passing through the through slot 14, the second measurement spots may be filtered out first, and the depth image may be calculated based on the remaining first measurement spots. Therefore, the amount of data processing of the processor 200 may be reduced and it is beneficial to speed up the process of acquiring the depth image.
With reference to
012: the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, then reflected by a target object, and diffracted by the display area 11 again when entering to acquire a speckle image. The speckle image may include a plurality of measurement spots. The plurality of measurement spots may include first measurement spots formed when laser light is diffracted by the diffractive optical element 213 (illustrated in
The action 02 may include the following.
024: the depth image is calculated based on the first measurement spots, the second measurement spots, and the third measurement spots in the speckle image, and reference spots in a reference image.
With reference to
With reference to
In detail, with reference to
When the structured light camera 22 is imaging, the structured light camera 22 may receive the structured light reflected by the target object in the scene to form the speckle image. In some embodiments of the disclosure, since the display screen 10 is not provided with the through slot 14, the laser light that is diffracted by the diffractive optical element 213, then by the display screen 10, and modulated and reflected back by the target object, may be diffracted again by the display area 11 of the display screen 10 when passing through the display screen 10. The structured light camera 22 may receive the structured light that is diffracted by the display area 11 when exiting through the display area 11, then reflected by the target object, and diffracted by the display area 11 again when passing through the display area 11 again. The speckle image thus formed may include a plurality of measurement spots. The plurality of measurement spots may include the first measurement spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the target object, the second measurement spots formed when the laser light is diffracted by the diffractive optical element 213, then diffracted by the display screen 10, and reflected by the target object, and the third measurement spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10, then reflected by the target object, and diffracted by the display screen 10 again.
After the structured light camera 22 captures the speckle image, the processor 200 may directly calculate the depth image based on the first measurement spots, second measurement spots, and third measurement spots in the speckle image and the reference image. At this time, the reference spots in the reference image need to include first reference spots, second reference spots, and third reference spots. The depth image calculation method may include the following two manners.
With reference to
0241: offsets of all measurement spots relative to all reference spots are calculated.
0242: depth data is calculated based on the offsets to acquire the depth image.
Correspondingly, the method for acquiring the image may further include the following.
034: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, then reflected by a calibration object, and diffracted by the display area 11 again when entering to acquire the reference image. The reference image may include reference spots.
With reference to
With reference to
In detail, with reference to
Although the speckle image may include the first measurement spots, the second measurement spots, and the third measurement spots at the same time, and the reference image includes the first reference spots, the second reference spots, and the third reference spots at the same time, in this calculation method the processor 200 may neither distinguish the first measurement spots, the second measurement spots, and the third measurement spots in the speckle image, nor distinguish the first reference spots, the second reference spots, and the third reference spots in the reference image. The processor 200 may calculate the depth image based on all the measurement spots and all the reference spots. In detail, the processor 200 first calculates the offsets of all measurement spots relative to all reference spots, and then calculates pieces of depth data based on the offsets, so as to acquire the depth image.
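This first calculation method, which matches all measurement spots against all reference spots without distinguishing them, may be sketched as follows. The row tolerance and search window are illustrative assumptions, since the disclosure does not specify how spots are paired:

```python
def match_offsets(measure_spots, reference_spots, max_search=40.0, row_tol=2.0):
    """For each measurement spot, find the reference spot on
    (approximately) the same row within a search window and record the
    horizontal offset. No attempt is made to distinguish first, second,
    and third spots."""
    offsets = []
    for (mx, my) in measure_spots:
        candidates = [(rx, ry) for (rx, ry) in reference_spots
                      if abs(ry - my) <= row_tol and abs(rx - mx) <= max_search]
        if candidates:
            rx, ry = min(candidates, key=lambda r: abs(r[0] - mx))
            offsets.append(mx - rx)
    return offsets
```

Each returned offset would then be converted into a piece of depth data by triangulation.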
With reference to
0243: offsets of the first measurement spots relative to the first reference spots are calculated, offsets of the second measurement spots relative to the second reference spots are calculated, and offsets of the third measurement spots relative to the third reference spots are calculated.
0244: depth data is calculated based on the offsets to acquire the depth image.
Correspondingly, the method for acquiring the image may further include the following.
035: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is directly reflected by a calibration object after exiting from the structured light projector 21 and directly enters to acquire a first reference image. The first reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object.
036: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting and directly enters after being reflected by the calibration object to acquire a second reference image. The second reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object, and second reference spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display area 11, and reflected by the calibration object.
037: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, reflected by the calibration object, and diffracted by the display area 11 again when entering through the display area 11 to acquire a third reference image. The third reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object, second reference spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display area 11, and reflected by the calibration object, and third reference spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display area 11, reflected by the calibration object, and diffracted by the display area 11 again.
042: the first reference image is compared with the second reference image to acquire the second reference spots, and the third reference image is compared with the second reference image to acquire the third reference spots.
052: a ratio between an average brightness value of the second reference spots and an average brightness value of the first reference spots is calculated as the first preset ratio, a ratio between an average brightness value of the third reference spots and an average brightness value of the first reference spots is calculated as the second preset ratio, and an average brightness value of the first reference spots is calculated as the preset brightness.
062: an actual ratio between the brightness of each of the plurality of measurement spots and a preset brightness is calculated.
072: measurement spots whose actual ratios are greater than the first preset ratio are classified as the first measurement spots, measurement spots whose actual ratios are less than the first preset ratio and greater than the second preset ratio are classified as the second measurement spots, and measurement spots whose actual ratios are less than the second preset ratio are classified as the third measurement spots.
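The three-way classification of actions 062 and 072 may be sketched as follows. This is illustrative Python; the greater-than-or-equal boundary follows the detailed description of the classification, and the function and argument names are assumptions:

```python
def classify_spots(spot_brightness, preset_brightness, ratio1, ratio2):
    """Classify measurement spots by the ratio of their brightness to
    the preset brightness: at or above ratio1 -> first spots; between
    ratio2 and ratio1 -> second spots; below ratio2 -> third spots
    (ratio1 > ratio2)."""
    first, second, third = [], [], []
    for b in spot_brightness:
        r = b / preset_brightness
        if r >= ratio1:
            first.append(b)
        elif r > ratio2:
            second.append(b)
        else:
            third.append(b)
    return first, second, third
```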
With reference to
With reference to
In this calculation method, the processor 200 needs to calibrate the first reference image, the second reference image, and the third reference image.
In detail, the processor 200 first controls the structured light projector 21 to emit structured light to the calibration board in a scene without the display screen 10, and then controls the structured light camera 22 to receive the structured light that directly enters after being reflected by the calibration board to acquire the first reference image. The reference spots included in the first reference image are the first reference spots. The first reference spots may be formed when the laser light is diffracted by the diffractive optical element 213 when passing through the diffractive optical element 213, directly emitted to the calibration board, and modulated and reflected by the calibration board.
Subsequently, the structured light projector 21 and the structured light camera 22 are both arranged on the side of the rear surface 13 of the display screen 10, and the display screen 10 is provided with the through slot 14 aligned with the light incident surface of the structured light camera 22. In a scene where the structured light camera 22 may receive the modulated structured light passing through the through slot 14, the processor 200 controls the structured light projector 21 to emit structured light. The structured light passes through the display area 11 and is projected to the calibration board separated from the structured light assembly 20 by a predetermined distance. The structured light reflected by the calibration board passes through the through slot 14 and is received by the structured light camera 22 to acquire the second reference image. The reference spots in the second reference image may include the first reference spots and the second reference spots at the same time. The first reference spots may be formed when the laser light is diffracted by the diffractive optical element 213 when it passes through the diffractive optical element 213, not diffracted by the display screen 10 when it passes through the display screen 10, and modulated and reflected by the calibration board. The second reference spots may be formed when the laser light is diffracted a first time by the diffractive optical element 213 when it passes through the diffractive optical element 213, diffracted a second time by the display screen 10 when it passes through the display screen 10, and modulated and reflected by the calibration board.
Subsequently, the processor 200 calibrates the third reference image based on the first calculation method, that is, the method for calibrating the reference image described in the action 034. At this time, the third reference image may include the first reference spots corresponding to the first measurement spots, the second reference spots corresponding to the second measurement spots, and the third reference spots corresponding to the third measurement spots.
In the calibration scenes of the first reference image, the second reference image, and the third reference image, the position of the calibration board relative to the structured light projector 21 and the structured light camera 22 remains unchanged. The relative position between the structured light projector 21 and the structured light camera 22 also remains unchanged.
Subsequently, the processor 200 marks the first coordinates of the first reference spots in the first reference image, and then filters out the first reference spots from the second reference image based on the first coordinates. The remaining reference spots in the second reference image may be the second reference spots. The processor 200 then marks the second coordinates of the second reference spots in the second reference image. The processor 200 filters out the first reference spots and the second reference spots from the third reference image based on the first coordinates and the second coordinates, respectively. The remaining reference spots in the third reference image may be the third reference spots. In this way, the processor 200 may distinguish the first reference spots, the second reference spots, and the third reference spots among all the reference spots in the third reference image.
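The two-stage coordinate comparison described above may be sketched as follows. This is an illustrative Python sketch; the tolerance value and all names are assumptions:

```python
def distinguish_three(first_img, second_img, third_img, tol=1.0):
    """Distinguish first/second/third reference spots in the third
    reference image: coordinates present in the first reference image
    mark first spots, remaining coordinates present in the second
    reference image mark second spots, and what is left are third
    spots."""
    def near(p, pts):
        return any(abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol
                   for q in pts)
    # Spots of the second reference image that are not first reference spots.
    second_only = [p for p in second_img if not near(p, first_img)]
    first, second, third = [], [], []
    for p in third_img:
        if near(p, first_img):
            first.append(p)
        elif near(p, second_only):
            second.append(p)
        else:
            third.append(p)
    return first, second, third
```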
The measurement spots in the speckle image also need to be distinguished in the subsequent calculation of the depth data. In detail, the first measurement spots, the second measurement spots, and the third measurement spots may be distinguished by brightness. It may be understood that the first measurement spots may be formed when the laser light is diffracted only by the diffractive optical element 213, the second measurement spots may be formed when the laser light is diffracted by the diffractive optical element 213 and then diffracted a second time by the display screen 10, and the third measurement spots may be formed when the laser light is diffracted by the diffractive optical element 213 and then diffracted by the display screen 10 a second time when exiting and a third time when entering. The laser light forming the second measurement spots is diffracted more times than the laser light forming the first measurement spots, and the laser light forming the third measurement spots is diffracted more times than the laser light forming the second measurement spots. Therefore, the energy loss of the laser light forming the first measurement spots is the smallest, and the energy loss of the laser light forming the third measurement spots is the largest. The brightness of the second measurement spots is lower than the brightness of the first measurement spots, and the brightness of the third measurement spots is lower than the brightness of the second measurement spots. In this way, it is feasible to distinguish the first measurement spots, the second measurement spots, and the third measurement spots based on brightness.
After the calibration of the reference images is completed, it is necessary to further calibrate the preset brightness and the preset ratio for distinguishing the first measurement spots, the second measurement spots, and the third measurement spots. In detail, after the processor 200 distinguishes the first reference spots, the second reference spots, and the third reference spots, the processor 200 may calculate the average brightness value of the first reference spots in the third reference image, the average brightness value of the second reference spots in the third reference image, and the average brightness value of the third reference spots in the third reference image. The processor 200 may use the average brightness value of the first reference spots as the preset brightness, calculate the ratio between the average brightness value of the second reference spots and the average brightness value of the first reference spots as the first preset ratio, and calculate the ratio between the average brightness value of the third reference spots and the average brightness value of the first reference spots as the second preset ratio.
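The calibration arithmetic in this paragraph reduces to averages and ratios, and may be sketched as follows. The function name and the brightness values are illustrative only; they are not taken from the disclosure.

```python
def calibrate_thresholds(first_vals, second_vals, third_vals):
    """Derive the preset brightness and the two preset ratios from the
    brightness values of the distinguished reference spots."""
    mean = lambda vals: sum(vals) / len(vals)
    preset_brightness = mean(first_vals)            # average brightness of first reference spots
    first_preset_ratio = mean(second_vals) / preset_brightness
    second_preset_ratio = mean(third_vals) / preset_brightness
    return preset_brightness, first_preset_ratio, second_preset_ratio

b, r1, r2 = calibrate_thresholds([200, 220], [105, 105], [40, 44])
# b == 210.0, r1 == 0.5, r2 == 0.2
```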
In the subsequent depth data calculation, the processor 200 first calculates the brightness of each measurement spot. Subsequently, the processor 200 calculates the actual ratio between the brightness of each measurement spot and the preset brightness. The measurement spots whose actual ratio is greater than or equal to the first preset ratio are classified as the first measurement spots. The measurement spots whose actual ratio is less than the first preset ratio and greater than the second preset ratio are classified as the second measurement spots. The measurement spots whose actual ratio is less than the second preset ratio are classified as the third measurement spots. Therefore, the first measurement spots, the second measurement spots, and the third measurement spots may be distinguished. For example, as illustrated in
After the processor 200 distinguishes the first measurement spots, the second measurement spots and the third measurement spots, the processor 200 may calculate the depth data based on the speckle image and the third reference image because the first reference spots, the second reference spots, and the third reference spots in the third reference image have also been distinguished. In detail, the processor 200 first calculates the offsets of the first measurement spots relative to the first reference spots, the offsets of the second measurement spots relative to the second reference spots, and the offsets of the third measurement spots relative to the third reference spots. The processor 200 calculates pieces of depth data based on the offsets. The pieces of depth data may form the depth image.
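The step from offsets to depth data can be illustrated with a common reference-plane triangulation model. This model (and the parameter names `z0_mm`, `focal_px`, `baseline_mm`) is an assumption for illustration, not necessarily the exact formula used in the disclosure: for a reference plane calibrated at distance Z0, a spot shifted by an offset d pixels lies at depth Z = f·b·Z0 / (f·b + Z0·d).

```python
def depth_from_offset(offset_px, z0_mm, focal_px, baseline_mm):
    """Generic structured-light triangulation against a reference plane:
    converts a spot's pixel offset relative to its reference spot into a
    depth value. A zero offset reproduces the reference distance."""
    return (focal_px * baseline_mm * z0_mm) / (focal_px * baseline_mm + z0_mm * offset_px)

# A spot with no offset lies on the reference plane.
assert depth_from_offset(0, 600.0, 500.0, 50.0) == 600.0
# A positive offset (in this sign convention) corresponds to a closer object.
assert depth_from_offset(10, 600.0, 500.0, 50.0) < 600.0
```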
Compared with the first calculation method, the second calculation method distinguishes the first measurement spots, the second measurement spots, and the third measurement spots, and distinguishes the first reference spots, the second reference spots, and the third reference spots. Based on the more accurate correspondence between the first measurement spots and the first reference spots, the correspondence between the second measurement spots and the second reference spots, and the correspondence between the third measurement spots and the third reference spots, more accurate offsets may be acquired. More accurate depth data may be further acquired based on the more accurate offsets. Therefore, the accuracy of the acquired depth image may be improved.
In some embodiments, the preset brightness, the first preset ratio, and the second preset ratio may be determined by the ambient brightness of the scene and the luminous power of the structured light projector 21. In this way, the accuracy of the distinction among the first measurement spots, the second measurement spots, and the third measurement spots may be improved.
In some embodiments, the diffractive optical element 213 may be configured to diffract the laser light emitted by the light source 211 of the structured light projector 21 to increase the number of measurement spots or reference spots, and may also be configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. Therefore, the brightness uniformity of the spots in the speckle pattern projected into the scene may be better, which is beneficial to improve the accuracy of acquiring the depth image.
In summary, in the method for acquiring the image according to embodiments of the disclosure, the structured light projector 21 and the structured light camera 22 are both arranged on the side of the rear surface 13 of the display screen 10, and the structured light camera 22 receives the modulated structured light that passes through the display area 11 twice. Therefore, the processor 200 may directly calculate the depth image based on the first measurement spots, the second measurement spots, and the third measurement spots. Compared with the method of calculating the depth image using only the first measurement spots, the diffraction effect of the display screen 10 increases the number of the measurement spots and the randomness of the arrangement of the measurement spots, which is helpful to improve the accuracy of acquiring the depth image. Further, in the method for acquiring the image according to embodiments of the disclosure, the structure of the diffraction grating in the diffractive optical element 213 may be appropriately simplified, and the diffraction effect of the display screen 10 may instead be used to increase the number of the measurement spots and the randomness of the arrangement of the measurement spots, which ensures the accuracy of acquiring the depth image while simplifying the manufacturing process of the structured light projector 21.
With reference to
012: the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, then reflected by a target object, and diffracted by the display area 11 again when entering to acquire a speckle image. The speckle image may include a plurality of measurement spots. The plurality of measurement spots may include first measurement spots formed when laser light is diffracted by the diffractive optical element 213 and reflected by the target object, second measurement spots formed when the laser light is diffracted by the diffractive optical element 213, then diffracted by the display screen 10, and reflected by the target object, and third measurement spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10, then reflected by the target object, and diffracted by the display screen 10 again. In detail, the first measurement spots may be formed when the laser light is diffracted by the diffractive optical element 213, not diffracted by the display screen 10 when passing through the display screen 10 (that is, it is directly projected to the target object without encountering the microscopic gaps), and modulated and reflected by the target object. The second measurement spots may be formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10 when passing through the display screen 10 (that is, it is projected to the target object after encountering the microscopic gaps), modulated and reflected by the target object, and not diffracted by the display screen 10 when passing through the display screen 10 again. 
The third measurement spots may be formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10 when passing through the display screen 10 (that is, it is projected to the target object after encountering the microscopic gaps), modulated and reflected by the target object, and diffracted by the microscopic gaps in the display screen 10 again when passing through the display screen 10 again.
The action 02 may include the following.
025: the second measurement spots and the third measurement spots are filtered out from the speckle image to acquire the first measurement spots.
026: the depth image is acquired based on the first measurement spots and reference spots in a reference image.
With reference to
With reference to
In detail, when the structured light projector 21 and the structured light camera 22 are arranged on the side of the rear surface 13 of the display screen 10 together, and the display screen 10 is not provided with the through slot 14, the structured light camera 22 may capture the speckle image including the first measurement spots, the second measurement spots, and the third measurement spots. In the subsequent calculation of the depth image, the processor 200 may filter out the second measurement spots and the third measurement spots from the speckle image, and employ the remaining first measurement spots and the reference spots in the reference image to calculate the depth image. At this time, the reference spots in the reference image should include the first reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object. Therefore, the influence of the display screen 10 on the structured light may be eliminated by filtering out the second measurement spots and the third measurement spots from the speckle image, so that the accuracy of the acquired depth image is higher under the condition that the electronic device 1000 has a relatively high screen-to-body ratio.
In other words, with reference to
035: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is directly reflected by a calibration object after exiting from the structured light projector 21 and directly enters to acquire a first reference image. The first reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object.
The action 026 may include the following.
0261: offsets of the first measurement spots relative to the first reference spots are calculated.
0262: depth data is calculated based on the offsets to acquire the depth image.
With reference to
With reference to
In detail, after the processor 200 filters out the second measurement spots and the third measurement spots, the first measurement spots remain in the speckle image. At this time, the speckle image should be calculated with the first reference image including the first reference spots corresponding to the first measurement spots to calculate the depth image. The process for calibrating the first reference image is consistent with the calibration process of the scene described in the action 035 where the structured light projector 21 is arranged without the obstruction of the display screen 10, which is not repeated herein. The reference spots in the first reference image captured by the structured light camera are the first reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object. In this way, the processor 200 may calculate the offsets of the first measurement spots relative to the first reference spots and calculate pieces of depth data based on the offsets to acquire the depth image.
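The offset computation in this paragraph requires matching each first measurement spot to its corresponding first reference spot. The following toy sketch matches by nearest row and takes the column difference as the offset; this nearest-neighbor scheme and the (row, col) tuples are illustrative assumptions, as a real implementation would typically use block matching over a search window.

```python
def spot_offsets(measurement_spots, reference_spots):
    """For each measurement spot, find the nearest reference spot
    (row distance first, then column distance) and return the column
    shift, i.e. the disparity offset used for depth calculation."""
    offsets = []
    for m_row, m_col in measurement_spots:
        ref = min(reference_spots,
                  key=lambda rc: (abs(rc[0] - m_row), abs(rc[1] - m_col)))
        offsets.append(m_col - ref[1])  # column shift of the spot
    return offsets

# Two spots shifted by 4 and 7 columns relative to their reference spots.
assert spot_offsets([(10, 34), (20, 57)], [(10, 30), (20, 50)]) == [4, 7]
```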
The processor 200 may filter out the second measurement spots and the third measurement spots by brightness. In other words, with reference to
035: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is directly reflected by a calibration object after exiting from the structured light projector 21 and directly enters to acquire a first reference image. The first reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object.
036: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting and directly enters after being reflected by the calibration object to acquire a second reference image. The second reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object, and second reference spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display area 11, and reflected by the calibration object.
037: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, reflected by the calibration object, and diffracted by the display area 11 when entering through the display area 11 to acquire a third reference image. The third reference image may include a plurality of reference spots. The plurality of reference spots may include first reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object, second reference spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display area 11, and reflected by the calibration object, and third reference spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display area 11, reflected by the calibration object, and diffracted by the display area 11 again.
042: the first reference image is compared with the second reference image to acquire the second reference spots, and the third reference image is compared with the second reference image to acquire the third reference spots.
052: a ratio between an average brightness value of the second reference spots and an average brightness value of the first reference spots is calculated as the first preset ratio, a ratio between an average brightness value of the third reference spots and an average brightness value of the first reference spots is calculated as the second preset ratio, and an average brightness value of the first reference spots is calculated as the preset brightness.
The action 025 may include the following.
0251: an actual ratio between each of the plurality of measurement spots and the preset brightness is calculated.
0252: measurement spots whose actual ratios are greater than the first preset ratio are classified as the first measurement spots, measurement spots whose actual ratios are less than the first preset ratio and greater than the second preset ratio are classified as the second measurement spots, and measurement spots whose actual ratios are less than the second preset ratio are classified as the third measurement spots.
0253: the second measurement spots and the third measurement spots are filtered out from all measurement spots to acquire the first measurement spots.
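The classification rule of actions 0251 and 0252 may be sketched as follows. The function name is hypothetical, and the handling of a ratio exactly equal to the second preset ratio (classified here as a third measurement spot) is an assumption, since the description leaves that boundary case open.

```python
def classify_measurement_spot(brightness, preset_brightness,
                              first_preset_ratio, second_preset_ratio):
    """Classify one measurement spot by the ratio of its brightness to
    the calibrated preset brightness."""
    actual_ratio = brightness / preset_brightness
    if actual_ratio >= first_preset_ratio:
        return "first"          # bright enough: diffracted only by the DOE
    if actual_ratio > second_preset_ratio:
        return "second"         # dimmer: one extra diffraction by the screen
    return "third"              # dimmest: two extra diffractions by the screen

# Illustrative thresholds: preset brightness 210, ratios 0.5 and 0.2.
assert classify_measurement_spot(190, 210.0, 0.5, 0.2) == "first"
assert classify_measurement_spot(80, 210.0, 0.5, 0.2) == "second"
assert classify_measurement_spot(30, 210.0, 0.5, 0.2) == "third"
```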
With reference to
With reference to
The process for calibrating the first reference image described in the action 035 is consistent with the calibration process of the scene described in the above-mentioned action 035 where the structured light projector 21 is arranged without the obstruction of the display screen 10. The process for calibrating the second reference image described in the action 036 is consistent with the calibration process of the scene described in the above-mentioned action 036 where the structured light projector 21 and the structured light camera 22 are arranged on the side of the rear surface 13 of the display screen 10, and the light incident surface of the structured light camera 22 is aligned with the through slot 14 of the display screen 10. The process for calibrating the third reference image described in the action 037 is consistent with the calibration process of the scene described in the above-mentioned action 037 where the structured light projector 21 and the structured light camera 22 are arranged on the side of the rear surface 13 of the display screen 10, and the display screen 10 is not provided with the through slot 14. All of the above is not repeated herein.
After acquiring the first reference image, the second reference image, and the third reference image, the processor 200 may employ the same manner as the foregoing action 042. That is, the first reference spots in the second reference image may be determined based on the first coordinates of the first reference spots in the first reference image, and the remaining reference spots in the second reference image are the second reference spots. The second coordinates of the second reference spots are marked, so that the first reference spots and the second reference spots in the second reference image are distinguished. Subsequently, the processor 200 may determine the first reference spots and the second reference spots in the third reference image based on the first coordinates and the second coordinates, and the remaining reference spots in the third reference image are the third reference spots. Therefore, the first reference spots, the second reference spots, and the third reference spots in the third reference image may be distinguished. Subsequently, the processor 200 may employ the same manner as the aforementioned action 052 to calibrate based on the distinguished first reference spots, second reference spots, and third reference spots to acquire the preset brightness, the first preset ratio, and the second preset ratio.
Similarly, in the subsequent calculation of the depth image, the processor 200 may employ the same manners as the foregoing action 062 and the foregoing action 072. That is, the first measurement spots, the second measurement spots, and the third measurement spots may be distinguished based on the calibrated first preset ratio, second preset ratio, and preset brightness. The second measurement spots and the third measurement spots may be filtered out, leaving the first measurement spots. Offsets of the first measurement spots relative to the first reference spots may be calculated, and finally the depth data may be calculated based on the offsets to acquire the depth image.
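The filter-then-calculate flow just described can be combined into one compact, self-contained sketch. Everything here is illustrative: spots are hypothetical (brightness, offset) pairs, and the depth conversion uses a generic reference-plane triangulation model rather than a formula stated in the disclosure.

```python
def depth_values_from_spots(spots, preset_brightness, first_preset_ratio,
                            z0, focal, baseline):
    """Keep only spots whose brightness ratio reaches the first preset
    ratio (the first measurement spots), then triangulate a depth value
    from each retained spot's offset against the reference plane."""
    depths = []
    for brightness, offset in spots:
        if brightness / preset_brightness < first_preset_ratio:
            continue            # second or third measurement spot: filtered out
        depths.append(focal * baseline * z0 / (focal * baseline + z0 * offset))
    return depths

d = depth_values_from_spots([(200, 0), (90, 3), (35, 5)],
                            210.0, 0.5, 600.0, 500.0, 50.0)
# Only the first spot survives the brightness filter; its zero offset
# places it on the reference plane at 600.0.
assert d == [600.0]
```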
In some embodiments, the preset brightness, the first preset ratio, and the second preset ratio may also be determined by the ambient brightness of the scene and the luminous power of the structured light projector 21. In this way, the accuracy of filtering out the second measurement spots and the third measurement spots may be improved.
In some embodiments, the diffractive optical element 213 may be configured to diffract the laser light emitted by the light source 211 of the structured light projector 21 to increase the number of measurement spots or reference spots, and may also be configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. Therefore, the brightness uniformity of the spots in the speckle pattern projected into the scene may be better, which is beneficial to improve the accuracy of acquiring the depth image.
In summary, in the method for acquiring the image according to embodiments of the disclosure, the structured light projector 21 and the structured light camera 22 are both located under the display screen 10, and the display screen 10 is not provided with the through slot 14. The second measurement spots and the third measurement spots are filtered out first. The depth image may be calculated based on the remaining first measurement spots, to reduce the amount of data processing of the processor 200, which is beneficial to speed up the process of acquiring the depth image.
With reference to
The action 01 may include the following.
013: the structured light camera 22 is controlled to receive structured light that passes through the compensation optical element 500 and the display area 11 of the display screen 10 sequentially when exiting and is reflected by the target object to acquire a speckle image. The compensation optical element 500 is configured to counteract the diffraction effect of the display screen 10. The speckle image may include measurement spots. The measurement spots may include measurement spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the target object.
With reference to
With reference to
In the method for acquiring the image according to embodiments of the disclosure, the compensation optical element 500 may be arranged between the structured light projector 21 and the display screen 10 to counteract the diffraction effect of the display screen 10. The compensation optical element 500 and the display screen 10 may be arranged at a certain distance (as illustrated in
In detail, with reference to
0131: the structured light camera 22 is controlled to receive structured light that passes through the compensation optical element 500 and the display area 11 when exiting, is reflected by the target object, and enters directly, to acquire a speckle image.
The method for acquiring the image may include the following.
038: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that passes through the compensation optical element 500 and the display area 11 when exiting, is reflected by the calibration object, and enters directly, to acquire the reference image. The reference image may include reference spots. The reference spots may include reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object.
The action 02 may include the following.
0271: offsets of the measurement spots relative to the reference spots are calculated.
0272: depth data is calculated based on the offsets to acquire the depth image.
With reference to
With reference to
An area of the compensation optical element 500 should be slightly greater than or equal to a divergence area formed by the structured light emitted by the structured light projector 21. In this way, the structured light emitted by the structured light projector 21 may all pass through the compensation optical element 500, which may counteract the diffraction effect of the display screen 10. In addition, the compensation optical element 500 may not block the light incident surface of the structured light camera 22, that is, the compensation optical element 500 may not overlap with the through slot 14. It may be understood that the through slot 14 may not have the diffraction effect. The structured light reflected by the target object may not be diffracted when passing through the through slot 14. Therefore, there is no need to arrange the compensation optical element 500 at the position of the through slot 14 to counteract the diffraction effect of the display area 11. Conversely, if the compensation optical element 500 is provided at the position of the through slot 14, the structured light passing through the compensation optical element 500 may be diffracted by the compensation optical element 500, with the result that the speckle image received by the structured light camera 22 may include measurement spots formed when the laser light is diffracted by the diffractive optical element 213, passes through the flat mirror formed by the compensation optical element 500 and the part of the display screen 10 corresponding to the compensation optical element 500, and is diffracted by the compensation optical element 500 again.
When the structured light camera 22 is arranged on the side of the rear surface of the display screen 10, and the display screen 10 is provided with the through slot 14 aligned with the light incident surface of the structured light camera 22, the laser light emitted by the light source 211 of the structured light projector 21 may pass through the compensation optical element 500 and the display area 11 in turn. The structured light camera 22 may receive the structured light that passes through the flat mirror including the compensation optical element 500 and the display screen 10 when emitting, is modulated and reflected by the target object, and passes through the through slot 14. Since the compensation optical element 500 may counteract the diffraction effect of the display area 11, the speckle image captured by the structured light camera 22 may include the measurement spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the target object, but not include the measurement spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10, and reflected by the target object.
Correspondingly, the reference spots in the reference image should also include reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object. The calibration scene should be as follows. The structured light projector 21 and the structured light camera 22 may be arranged on the side of the rear surface 13 of the display screen 10 provided with the compensation optical element 500, and the light incident surface of the structured light camera 22 is aligned with the through slot 14 of the display screen 10. In this way, in the calibration scene and the actual use scene, the positions of the structured light projector 21 and the structured light camera 22 relative to the display screen 10 are consistent. The processor 200 controls the structured light projector 21 to emit structured light. The structured light passes through the compensation optical element 500 and the display screen 10 in turn, and then is projected to a calibration board separated from the structured light assembly 20 by a predetermined distance. The structured light reflected by the calibration board passes through the through slot 14 and is received by the structured light camera 22. At this time, the structured light camera 22 receives the laser light that is emitted by the light source 211, diffracted by the diffractive optical element 213, and then directly enters through the through slot 14 after being reflected by the calibration board. The reference spots included in the formed reference image may be formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object.
When calculating the depth image, the processor 200 does not need to filter out the measurement spots formed when the laser light is diffracted twice. The processor 200 may directly calculate the depth image based on the measurement spots formed when the laser light is diffracted once and the reference spots in the reference image. In detail, the processor 200 may calculate offsets between the measurement spots and the reference spots, and calculate the depth data based on the offsets, thereby acquiring the depth image.
Similarly, with reference to
0132: the structured light camera 22 is controlled to receive structured light that sequentially passes through the compensation optical element 500 and the display area 11 when emitting, is reflected by the target object, and sequentially passes through the display area 11 and the compensation optical element 500 when entering, to acquire a speckle image.
The method for acquiring the image may include the following.
039: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that sequentially passes through the compensation optical element 500 and the display area 11 when emitting, is reflected by the calibration object, and sequentially passes through the display area 11 and the compensation optical element 500 when entering, to acquire the reference image. The reference image may include reference spots. The reference spots may include reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object.
The action 02 may include the following.
0271: offsets of the measurement spots relative to the reference spots are calculated.
0272: depth data is calculated based on the offsets to acquire the depth image.
With reference to
With reference to
The compensation optical element 500 should completely cover the structured light projector 21 and the structured light camera 22 at the same time. In this way, on the one hand, the structured light emitted by the structured light projector 21 may all pass through the compensation optical element 500, which may counteract the diffraction effect of the display screen 10; on the other hand, the structured light reflected by the target object may also all pass through the compensation optical element 500 to counteract the diffraction effect of the display screen 10, so that the speckle image captured by the structured light camera 22 may include the measurement spot formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the target object.
In detail, when the structured light camera 22 is arranged on the side of the rear surface 13 of the display screen 10, and the display screen 10 is not provided with the through slot 14, the laser light emitted by the light source 211 of the structured light projector 21 may sequentially pass through the compensation optical element 500 and the display area 11. The structured light camera 22 may receive the structured light that passes through the flat mirror including the compensation optical element 500 and the display screen 10 when exiting, is reflected by the target object, and passes through the flat mirror including the compensation optical element 500 and the display screen 10 when entering. Since the compensation optical element 500 counteracts the diffraction effect of the display area 11, the speckle image captured by the structured light camera 22 may include the measurement spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the target object, not include the measurement spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10, and reflected by the target object, and not include the measurement spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10, reflected by the target object, and diffracted by the display screen 10 again.
Correspondingly, the reference spots in the reference image should also include reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object. The calibration scene should be as follows. The structured light projector 21 and the structured light camera 22 may be arranged on the side of the rear surface 13 of the display screen 10 having the compensation optical element 500, and the display screen 10 may be not provided with the through slot 14. In this way, in the calibration scene and the actual use scene, the positions of the structured light projector 21 and the structured light camera 22 relative to the display screen 10 are consistent. The processor 200 may control the structured light projector 21 to emit structured light. The structured light passes through the compensation optical element 500 and the display screen 10 in turn, and is projected to the calibration board separated from the structured light assembly 20 by a predetermined distance. The structured light reflected by the calibration board passes through the display screen 10 and the compensation optical element 500 in turn, and is received by the structured light camera 22. The reference spots included in the formed reference image are the reference spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the calibration object.
When calculating the depth image, the processor 200 may not need to filter out the measurement spots formed when the laser light is diffracted multiple times. The processor 200 may directly calculate the depth image based on the measurement spots formed when the laser light is diffracted once and the reference spots in the reference image. In detail, the processor 200 may calculate the offsets between the measurement spots and the reference spots, and calculate the depth data based on the offsets, thereby acquiring the depth image.
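The disclosure does not specify the formula by which the offsets are converted into depth data. For illustration only, the following is a minimal sketch assuming the standard reference-plane triangulation model commonly used in structured-light systems; the function name and parameters (reference-plane distance, focal length in pixels, projector-camera baseline) are assumptions and do not appear in the disclosure.

```python
def depth_from_offset(offset_px, z_ref_mm, focal_px, baseline_mm):
    """Convert a measurement-spot offset (disparity) into depth.

    Assumed reference-plane model: 1/Z = 1/Z_ref + d / (f * b),
    where d is the offset in pixels, f the focal length in pixels,
    and b the projector-camera baseline. Sign conventions depend on
    the actual geometry of the structured light assembly.
    """
    return 1.0 / (1.0 / z_ref_mm + offset_px / (focal_px * baseline_mm))
```

Under this model, a zero offset maps back to the calibration distance, and a positive offset (with this sign convention) maps to a depth closer than the reference plane.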
In summary, in the method for acquiring the image according to embodiments of the disclosure, the diffraction effect of the display screen 10 may be counteracted by setting the compensation optical element 500. In this way, the speckle image captured by the structured light camera 22 may include the measurement spots formed when the laser light is diffracted by the diffractive optical element 213 and reflected by the target object, but not include the measurement spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10, and reflected by the target object, nor the measurement spots formed when the laser light is diffracted by the diffractive optical element 213, diffracted by the display screen 10, reflected by the target object, and diffracted by the display screen 10 again.
With reference to
014: the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 of the display screen 10 when exiting and reflected by the target object to acquire a speckle image. The optical element 214 in the structured light projector 21 is configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. The speckle image may include measurement spots.
With reference to
With reference to
In detail, microscopic gaps may be formed between adjacent pixels in the display area 11 of the display screen 10. The structured light emitted by the structured light projector 21 passes through the display area 11 and is diffracted by the display area 11 to form spots. However, the brightness distribution of the diffracted spots in the display area 11 is not uniform.
In the method for acquiring the image according to embodiments of the disclosure, the diffraction effect of the display screen 10 may be employed to form the spots. The diffractive optical element 213 in the structured light projector 21 is replaced with the optical element 214. The optical element 214 may compensate the brightness uniformity of the structured light diffracted by the display screen 10. That is, the laser light emitted by the light source 211 of the structured light projector 21 passes through the optical element 214 and the display screen 10 in turn and is projected to the scene. The speckle pattern projected onto the scene contains spots of relatively uniform brightness: the spots are formed by the diffraction of the display screen 10, and the brightness uniformity of the spots is compensated by the optical element 214.
In this way, the measurement spots in the speckle image captured by the structured light camera 22 are directly formed by the diffraction effect of the display screen 10, and the processor 200 may calculate the depth image based on these measurement spots. The optical element 214 compensates the brightness uniformity of the structured light diffracted by the display screen 10 to improve the accuracy of the acquired depth image.
With reference to
0141: the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 and directly enters after being reflected by the target object to acquire a speckle image. The optical element 214 is configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. The speckle image may include measurement spots. The measurement spots may include the first measurement spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the target object.
The method for acquiring the image may include the following.
091: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11, reflected by a calibration object, and enters directly to acquire the reference image. The optical element 214 is configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. The reference image may include reference spots. The reference spots may include first reference spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the calibration object.
The action 02 may include the following.
0281: offsets of the first measurement spots relative to the first reference spots are calculated.
0282: depth data is calculated based on the offsets to acquire the depth image.
With reference to
With reference to
In detail, when the structured light camera 22 is arranged on the side of the rear surface 13 of the display screen 10, and the display screen 10 is provided with the through slot 14 aligned with the light incident surface of the structured light camera 22, the laser light emitted by the light source 211 of the structured light projector 21 sequentially passes through the optical element 214 and the display area 11 of the display screen 10 to form the structured light and exit into the scene. The structured light is reflected by the target object and enters through the through slot 14 to be received by the structured light camera 22. Since the optical element 214 is configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10 without increasing the number of measurement spots, and the through slot 14 does not have microscopic gaps and does not diffract the reflected structured light, the speckle image captured by the structured light camera 22 may include the first measurement spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the target object.
Correspondingly, the reference spots in the reference image should also include the first reference spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the calibration object. The calibration scene should be as follows. The structured light projector 21 with the optical element 214, and the structured light camera 22 may be arranged on the side of the rear surface 13 of the display screen 10, and the light incident surface of the structured light camera 22 is aligned with the through slot 14 of the display screen 10. In this way, in the calibration scene and the actual use scene, the positions of the structured light projector 21 and the structured light camera 22 relative to the display screen 10 are consistent. The processor 200 controls the light source 211 of the structured light projector 21 to emit laser light, and the laser light passes through the optical element 214 and the display screen 10 in turn to form the structured light. The structured light is projected to the calibration board separated by a predetermined distance from the structured light assembly 20, and is reflected by the calibration board. The structured light reflected by the calibration board passes through the through slot 14 and is received by the structured light camera 22. At this time, the structured light camera 22 receives the laser light that is emitted by the light source 211, diffused by the optical element 214, diffracted by the display screen 10, reflected by the calibration board, and directly enters through the through slot 14. The reference spots in the formed reference image may include the first reference spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the calibration object.
When calculating the depth image, the processor 200 directly calculates the depth image based on the first measurement spots and the first reference spots in the reference image. In detail, the processor 200 calculates the offsets between the first measurement spots and the first reference spots, and calculates the depth data based on the offsets to acquire the depth image.
Similarly, with reference to
0142: the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, reflected by the target object, and diffracted by the display area 11 when entering to acquire a speckle image. The optical element 214 is configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. The speckle image may include measurement spots. The measurement spots may include first measurement spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the target object, and second measurement spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, reflected by the target object, and diffracted by the display screen 10 again.
With reference to
With reference to
In detail, when the structured light camera 22 is arranged on the side of the rear surface 13 of the display screen 10, and the display screen 10 is not provided with the through slot 14, the laser light emitted by the light source 211 of the structured light projector 21 passes through the optical element 214 and the display area 11 of the display screen 10 to form the structured light to exit into the scene. The structured light may be reflected by the target object and enter through the display screen 10 to be received by the structured light camera 22. Since the optical element 214 is configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10 without increasing the number of measurement spots, and the display screen 10 has microscopic gaps that diffract the reflected structured light, the speckle image captured by the structured light camera 22 may include first measurement spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the target object, and second measurement spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, reflected by the target object, and diffracted by the display screen 10 again.
After the structured light camera 22 captures the speckle image, the processor 200 may directly calculate the depth image based on the first measurement spots and the second measurement spots in the speckle image and the reference spots in the reference image. The depth image may be calculated in the following two manners.
With reference to
0283: offsets of all measurement spots relative to all reference spots are calculated.
0284: depth data is calculated based on the offsets to acquire a depth image.
Correspondingly, the method for acquiring the image may include the following.
092: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when exiting, reflected by a calibration object, and is diffracted by the display area 11 again when entering to acquire the reference image. The optical element 214 is configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. The reference image may include reference spots. The reference spots may include first reference spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the calibration object, and second reference spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, reflected by the calibration object, and diffracted by the display screen 10 again.
With reference to
With reference to
In detail, in the process of calibrating the reference image, the structured light projector 21 provided with the optical element 214, and the structured light camera 22 may be arranged on the side of the rear surface 13 of the display screen 10, and the display screen 10 is not provided with the through slot 14. In this way, in the calibration scene and the actual use scene, the positions of the structured light projector 21 and the structured light camera 22 relative to the display screen 10 are consistent. The processor 200 controls the light source 211 of the structured light projector 21 to emit laser light. The laser light passes through the optical element 214 and the display screen 10 in turn to form the structured light. The structured light is projected to a calibration board separated by a predetermined distance from the structured light assembly 20, and is reflected by the calibration board. The structured light reflected by the calibration board passes through the display screen 10, and is received by the structured light camera 22. At this time, the reference image captured by the structured light camera 22 includes first reference spots and second reference spots at the same time. The first reference spots may be formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the calibration object. The second reference spots may be formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, reflected by the calibration object, and diffracted by the display screen 10 again. 
Although the speckle image includes both the first measurement spots and the second measurement spots, and the reference image includes both the first reference spots and the second reference spots, in this calculation method, the processor 200 does not distinguish the first measurement spots and the second measurement spots in the speckle image, and also does not distinguish the first reference spots and the second reference spots in the reference image. The processor 200 may directly calculate the depth image based on all the measurement spots and reference spots. In detail, the processor 200 first calculates the offsets of all measurement spots relative to all reference spots, and calculates pieces of depth data based on the offsets, so as to acquire the depth image.
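The disclosure does not detail how the offset of a measurement spot relative to a reference spot is computed. As an illustrative sketch only, the matching can be modeled as pairing each measurement-spot center with the nearest reference-spot center in a narrow row band (an assumption reflecting the roughly horizontal disparity of a projector-camera pair); the function and parameter names below are hypothetical and not part of the disclosure.

```python
def match_offsets(measure_spots, reference_spots, row_tol=3.0):
    """Pair each measurement spot with the nearest reference spot
    whose vertical coordinate lies within row_tol pixels, and return
    the horizontal offsets (disparities) in pixels.

    measure_spots, reference_spots: lists of (x, y) spot centers.
    """
    offsets = []
    for mx, my in measure_spots:
        # Restrict candidates to the same row band (assumed epipolar constraint).
        candidates = [(rx, ry) for rx, ry in reference_spots
                      if abs(ry - my) <= row_tol]
        if not candidates:
            continue  # no plausible match; skip this spot
        rx, _ = min(candidates, key=lambda p: abs(p[0] - mx))
        offsets.append(mx - rx)
    return offsets
```

In the first calculation manner described above, this matching would be run over all measurement spots and all reference spots without distinguishing first from second spots.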
With reference to
0285: offsets of the first measurement spots relative to the first reference spots are calculated, and offsets of the second measurement spots relative to the second reference spots are calculated.
0286: depth data is calculated based on the offsets to acquire the depth image.
Correspondingly, the method for acquiring the image may include the following.
091: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11, reflected by a calibration object, and enters directly to acquire the reference image. The optical element 214 is configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. The reference image may include reference spots. The reference spots may include first reference spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the calibration object.
092: when calibrating the reference image, the structured light camera 22 is controlled to receive structured light that is diffracted by the display area 11 when existing, reflected by a calibration object, and is diffracted by the display area 11 again when entering to acquire the reference image. The optical element 214 is configured to compensate the brightness uniformity of the structured light diffracted by the display screen 10. The reference image may include reference spots. The reference spots may include first reference spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, and reflected by the calibration object, and second reference spots formed when the laser light is diffused by the optical element 214, diffracted by the display screen 10, reflected by the calibration object, and diffracted by the display screen 10 again.
043: the first reference image is compared with the second reference image to acquire the second reference spots.
053: a ratio between an average brightness value of the second reference spots and an average brightness value of the first reference spots is calculated as a preset ratio, and an average brightness value of the first reference spots is calculated as a preset brightness.
063: an actual ratio between each measurement spot and the preset brightness is calculated.
073: measurement spots with the actual ratio greater than or equal to the preset ratio are classified as the first measurement spots, and measurement spots with the actual ratio less than the preset ratio are classified as the second measurement spots.
With reference to
With reference to
In this calculation method, the processor 200 needs to calibrate the first reference image and the second reference image. The process for calibrating the first reference image is consistent with the calibration process of the scene described in the action 091 where the structured light projector 21 provided with the optical element 214, and the structured light camera 22 may be arranged on the side of the rear surface 13 of the display screen 10, and the light incident surface of the structured light camera 22 is aligned with the through slot 14 of the display screen 10. The process for calibrating the second reference image is consistent with the calibration process of the scene described in the action 092 where the structured light projector 21 provided with the optical element 214, and the structured light camera 22 may be arranged on the side of the rear surface 13 of the display screen 10, and the display screen 10 does not have the through slot 14. The details are not repeated herein.
After the processor 200 calibrates the first reference image and the second reference image, the processor 200 marks the coordinates of the first reference spots in the first reference image, and filters out the first reference spots in the second reference image based on the coordinates of the first reference spots. The remaining reference spots in the second reference image are the second reference spots. In this way, the processor may distinguish the first reference spots and the second reference spots among all the reference spots in the second reference image.
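The coordinate-based filtering described above can be sketched as follows. This is an illustrative implementation under the assumption that spots whose centers fall within a small tolerance of a calibrated first-reference-spot coordinate are the first reference spots; the function name and tolerance parameter are assumptions, not part of the disclosure.

```python
def split_reference_spots(second_ref_spots, first_ref_coords, tol_px=2.0):
    """Split the spots of the second reference image into first and
    second reference spots, using the coordinates calibrated from the
    first reference image.

    second_ref_spots: (x, y) centers of all spots in the second reference image.
    first_ref_coords: (x, y) centers calibrated in the first reference image.
    """
    first, second = [], []
    for x, y in second_ref_spots:
        # A spot close to a calibrated coordinate is a first reference spot.
        near = any((x - fx) ** 2 + (y - fy) ** 2 <= tol_px ** 2
                   for fx, fy in first_ref_coords)
        (first if near else second).append((x, y))
    return first, second
```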
In the subsequent calculation of the depth data, the measurement spots in the speckle image also need to be distinguished. In detail, the first measurement spots and the second measurement spots may be distinguished by brightness. It may be understood that the first measurement spots may be formed when the laser light is diffused by the optical element 214 and diffracted by the display screen 10 once, while the second measurement spots may be formed when the laser light is diffused by the optical element 214 and diffracted by the display screen 10 twice. Since the laser light forming the second measurement spots is diffracted more times than the laser light forming the first measurement spots, its energy loss is larger, and the brightness of the second measurement spots will therefore be lower than the brightness of the first measurement spots. In this way, it is feasible to distinguish the first measurement spots and the second measurement spots based on the brightness. After the calibration of the reference images is completed, it is necessary to further calibrate the preset brightness for distinguishing the first measurement spots from the second measurement spots. In detail, after the processor 200 distinguishes the first reference spots and the second reference spots, the processor 200 may calculate the average brightness value of the first reference spots in the second reference image, and calculate the average brightness value of the second reference spots in the second reference image.
Subsequently, the processor 200 may use the average brightness value of the first reference spots as the preset brightness, and calculate the ratio between the average brightness value of the second reference spots and the average brightness value of the first reference spots as the preset ratio.
In the subsequent depth data calculation, the processor 200 may first calculate the brightness of each measurement spot. Subsequently, the processor 200 may calculate the actual ratio between each measurement spot and the preset brightness, and classify the measurement spots with the actual ratio greater than or equal to the preset ratio as the first measurement spots, and classify the measurement spots with the actual ratio less than the preset ratio as the second measurement spots. Therefore, the first measurement spots and the second measurement spots are distinguished.
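The brightness-ratio classification described in the actions 063 and 073 can be sketched as follows; the function and parameter names are illustrative assumptions. The preset ratio is the calibrated ratio of the second-reference-spot average brightness to the first-reference-spot average brightness (and is therefore below 1), and the preset brightness is the first-reference-spot average brightness.

```python
def classify_spots(spot_brightness, preset_brightness, preset_ratio):
    """Classify measurement spots by brightness.

    spot_brightness: brightness of each measurement spot.
    preset_brightness: calibrated average brightness of the first reference spots.
    preset_ratio: calibrated ratio of second- to first-reference-spot brightness.
    Returns the indices of the first and second measurement spots.
    """
    first, second = [], []
    for idx, b in enumerate(spot_brightness):
        actual_ratio = b / preset_brightness
        # Spots at or above the preset ratio lost less energy (diffracted once);
        # dimmer spots were diffracted twice.
        (first if actual_ratio >= preset_ratio else second).append(idx)
    return first, second
```

For example, with a preset brightness of 100 and a preset ratio of 0.6, a spot of brightness 95 would be classified as a first measurement spot and a spot of brightness 50 as a second measurement spot.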
After the processor 200 distinguishes the first measurement spots and the second measurement spots, since the first reference spots and the second reference spots in the second reference image have also been distinguished, the processor 200 may use the speckle image and the second reference image to calculate the depth data. In detail, the processor 200 first may calculate the offsets of the first measurement spots relative to the first reference spots, and the offsets of the second measurement spots relative to the second reference spots. Subsequently, the processor 200 may calculate pieces of depth data based on the offsets. The pieces of depth data may form the depth image.
Compared with the first calculation method, the second calculation method distinguishes the first measurement spots from the second measurement spots, and distinguishes the first reference spots from the second reference spots. Therefore, more accurate offsets may be acquired based on the more accurate correspondence relationship between the first measurement spots and the first reference spots, and the more accurate correspondence relationship between the second measurement spots and the second reference spots. Furthermore, more accurate depth data may be acquired. Therefore, the accuracy of the acquired depth image may be improved.
In some embodiments, the preset brightness and the preset ratio may be determined by the ambient brightness of the scene and the luminous power of the structured light projector 21. In this way, the accuracy of distinguishing the first measurement spots and the second measurement spots may be improved.
In summary, in the method for acquiring the image according to embodiments of the disclosure, the measurement spots in the speckle image captured by the structured light camera 22 are directly formed by the diffraction effect of the display screen 10. The processor 200 may calculate the depth image based on these measurement spots. The optical element 214 compensates the brightness uniformity of the structured light diffracted by the display screen 10, which is beneficial to improve the acquisition accuracy of the depth image.
In the description of the present disclosure, reference throughout this specification to “an embodiment,” “some embodiments,” “an example,” “a specific example,” or “some examples,” means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the phrases in various places throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples. Without a contradiction, the different embodiments or examples and the features of the different embodiments or examples can be combined by those skilled in the art.
In addition, terms such as “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance. Furthermore, the feature defined with “first” and “second” may comprise one or more this feature distinctly or implicitly. In the description of the present disclosure, “a plurality of” means two or more than two, unless specified otherwise.
Although explanatory embodiments have been shown and described, it would be appreciated by those skilled in the art that the above embodiments cannot be construed to limit the present disclosure, and changes, alternatives, and modifications can be made in the embodiments without departing from scope of the present disclosure.
Number | Date | Country | Kind |
---|---|---|---|
201811287250.2 | Oct 2018 | CN | national |
This application is a continuation of International Application No. PCT/CN2019/101431, filed on Aug. 19, 2019, which claims priority to and benefits of Chinese Patent Application No. 201811287250.2 filed on Oct. 31, 2018, the entire contents of both of which are incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/CN2019/101431 | Aug 2019 | US |
Child | 17243431 | US |