The embodiments of the disclosure relate to a time-of-flight camera module and a display device.
The Time-of-Flight (ToF) method is a method of calculating a distance by measuring flight time, that is, the time required for emitted light to be reflected and return. A ToF camera is a camera configured to use the ToF method to capture depth information about an object. According to different measurement principles, the time-of-flight method can be divided into D-ToF (Direct-ToF) direct measurement and I-ToF (Indirect-ToF) indirect measurement. The D-ToF direct measurement measures the time difference between emission and return of the light, then multiplies it by the speed of light and divides by 2 to obtain the depth information; the I-ToF indirect measurement measures the phase difference between the emitted light wave and the returned light wave to derive the depth information.
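As a rough illustration of the two principles, a minimal Python sketch is given below; the function names, the 20 MHz modulation frequency and the sample inputs are assumptions for illustration and are not part of the disclosure.

```python
import math

C = 299_792_458.0  # speed of light in m/s


def depth_from_time(round_trip_time_s: float) -> float:
    """D-ToF: depth = round-trip time * speed of light / 2."""
    return round_trip_time_s * C / 2.0


def depth_from_phase(phase_rad: float, modulation_freq_hz: float) -> float:
    """I-ToF: depth derived from the phase shift of the modulated light wave.

    The phase shift (0..2*pi) gives the fraction of one modulation wavelength
    travelled; dividing by 2 accounts for the out-and-back path.
    """
    wavelength_m = C / modulation_freq_hz
    return (phase_rad / (2.0 * math.pi)) * wavelength_m / 2.0


print(depth_from_time(20e-9))               # 20 ns round trip -> ~3.0 m
print(depth_from_phase(math.pi / 2, 20e6))  # quarter-cycle shift at 20 MHz -> ~1.87 m
```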
Initially, TOF technology was mainly used in the field of distance measurement. With improvements in measurement accuracy, in algorithms and computing power such as artificial intelligence, and especially in the TOF CCD/CMOS chip process, the resolution of TOF cameras has been improved, power consumption has been reduced, and more miniaturized and microminiaturized designs have emerged. TOF cameras are now widely used on mobile terminals to provide auxiliary imaging for mobile devices such as a mobile phone.
At least some embodiments of the disclosure provide a time-of-flight camera module. The time-of-flight camera module includes: a first carrier board, a second carrier board, a structure support, an infrared laser emission unit, a lens and a time-of-flight sensor, wherein the first carrier board, the structure support and the second carrier board are stacked, and the first carrier board and the second carrier board are fixed on both sides of the structure support, respectively, the infrared laser emission unit is arranged on a side of the first carrier board away from the structure support, the first carrier board comprises a lens via hole located on a side of the infrared laser emission unit, the structure support comprises a support via hole corresponding to the lens via hole, and the lens passes through the support via hole, the time-of-flight sensor is arranged on the second carrier board and is configured to sense infrared light collected by the lens to generate depth information.
For example, the time-of-flight camera module provided by some embodiments of the disclosure further comprises: a housing, wherein the structure support comprises a plurality of wedges, the housing comprises wedge mounting portions corresponding to the plurality of wedges one to one, and the structure support is fixed to the housing by matching of the plurality of wedges and the plurality of wedge mounting portions.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, the infrared laser emission unit comprises a first infrared laser emission unit and a second infrared laser emission unit, the first infrared laser emission unit and the second infrared laser emission unit are arranged side by side on the side of the first carrier board away from the structure support, the lens via hole is located on a side of the second infrared laser emission unit away from the first infrared laser emission unit.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, an orthographic projection of a center of the first infrared laser emission unit on the first carrier board, an orthographic projection of a center of the second infrared laser emission unit on the first carrier board and an orthographic projection of a center of the lens on the first carrier board are located on a same straight line.
For example, the time-of-flight camera module provided by some embodiments of the disclosure further comprises: a front housing, wherein the front housing comprises a front surface, the front surface comprises a first groove and a second groove recessed in a direction towards the structure support; the first groove comprises a first opening corresponding to the first infrared laser emission unit and a second opening corresponding to the second infrared laser emission unit, and the second groove comprises a third opening corresponding to the lens; the time-of-flight camera module further comprises a first filter covering the second groove, and the first filter is configured to allow infrared light having the same wavelength as the infrared light emitted by the first infrared laser emission unit and the second infrared laser emission unit to pass through.
For example, the time-of-flight camera module provided by some embodiments of the disclosure further comprises: a second filter covering the first groove, and the second filter is configured to allow infrared light having the same wavelength as the infrared light emitted by the first infrared laser emission unit and the second infrared laser emission unit to pass through.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, a distance between centers of the first infrared laser emission unit and the second infrared laser emission unit is d, a distance from the first infrared laser emission unit to the second filter and a distance from the second infrared laser emission unit to the second filter are both h, a horizontal emission angle of the first infrared laser emission unit and a horizontal emission angle of the second infrared laser emission unit are both a, and a vertical emission angle of the first infrared laser emission unit and a vertical emission angle of the second infrared laser emission unit are both b; a size c1 of the second filter in a direction parallel to a line connecting the centers of the first infrared laser emission unit and the second infrared laser emission unit and a size c2 of the second filter in a direction perpendicular to the line connecting the centers of the first infrared laser emission unit and the second infrared laser emission unit satisfy the following formulas, respectively: c1>2h*tan(a/2)+d, c2>2h*tan(b/2).
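These conditions can be checked numerically. The following is a minimal Python sketch under the reading that c1 and c2 are sizes of the second filter; all sample values (in particular the emitter-to-filter distance h = 5 mm) are illustrative assumptions rather than values from the disclosure.

```python
import math


def second_filter_is_large_enough(c1_mm, c2_mm, d_mm, h_mm, a_deg, b_deg):
    """Check c1 > 2*h*tan(a/2) + d and c2 > 2*h*tan(b/2).

    c1/c2: filter sizes parallel/perpendicular to the line joining the two
    emitter centers; d: center-to-center distance between the emitters;
    h: emitter-to-filter distance; a/b: horizontal/vertical emission angles.
    """
    min_c1 = 2.0 * h_mm * math.tan(math.radians(a_deg) / 2.0) + d_mm
    min_c2 = 2.0 * h_mm * math.tan(math.radians(b_deg) / 2.0)
    return c1_mm > min_c1 and c2_mm > min_c2


# Sample values: d = 20 mm, h = 5 mm (assumed), a = 100 deg, b = 80 deg.
print(second_filter_is_large_enough(35.0, 10.0, 20.0, 5.0, 100.0, 80.0))  # True
```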
For example, the time-of-flight camera module provided by some embodiments of the disclosure further comprises: a connection board, a second flexible circuit board and a cable, wherein the connection board is electrically connected with the first carrier board by the second flexible circuit board, the cable is electrically connected with the connection board, and the cable is configured to be connected with at least one of an external power supply and an external device.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, the structure support further comprises a fourth opening and a limit ring, the limit ring is located on a side of the structure support away from the first carrier board, the second flexible circuit board bends and passes through the fourth opening and the limit ring.
For example, the time-of-flight camera module provided by some embodiments of the disclosure further comprises: a front housing and a back housing, wherein the back housing comprises a back surface and is fixed with the front housing by the back surface so as to achieve coverage of the whole machine; the front housing comprises a front surface which is inclined relative to the back surface, and a value range of an inclination angle of the front surface relative to the back surface is [20°, 40°].
For example, the time-of-flight camera module provided by some embodiments of the disclosure further comprises: a back housing and a back housing blocking cover, wherein the back housing comprises a fifth opening, the connection board is fixed to the back housing and corresponds to the fifth opening; the back housing blocking cover is configured to block the fifth opening, the back housing blocking cover is a hollow structure, and the cable passes through the hollow structure to be connected with the connection board.
For example, the time-of-flight camera module provided by some embodiments of the disclosure further comprises: an infrared laser circuit module and a housing, wherein the infrared laser circuit module is arranged on the side of the first carrier board away from the structure support; thermal adhesive is provided on a side of the infrared laser circuit module away from the first carrier board, and is configured to be in contact with the housing, and a material of the housing comprises metal.
For example, the time-of-flight camera module provided by some embodiments of the disclosure further comprises: a first flexible circuit board, wherein the first carrier board and the second carrier board are both printed circuit boards, and the first carrier board is electrically connected with the second carrier board by the first flexible circuit board.
For example, the time-of-flight camera module provided by some embodiments of the disclosure further comprises: a computational component, wherein the computational component is arranged on the first carrier board, and the computational component is configured to determine, according to the depth information, at least one of a number, a distance, a total activity time, and a dwelling time of a human body in a field of view of the time-of-flight camera module, and to provide an identification for the human body in the field of view of the time-of-flight camera module.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, a distance between the centers of the first infrared laser emission unit and the second infrared laser emission unit is in a range from 15 to 30 mm.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, a distance between the centers of the second infrared laser emission unit and the lens is in a range from 20 to 60 mm.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, a horizontal field of view of the lens is greater than 100°, and a vertical field of view of the lens is greater than 80°.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, the first infrared laser emission unit and the second infrared laser emission unit each comprise a vertical cavity surface emitting laser and a beam expander, the vertical cavity surface emitting laser has an emission power in a range from 1 W to 1.4 W, a horizontal emission angle of the vertical cavity surface emitting laser is greater than or equal to 95°, and a vertical emission angle of the vertical cavity surface emitting laser is greater than or equal to 75°; and a maximum detection distance of the time-of-flight camera module is in a range from 5 m to 7 m.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, a mounting height of the time-of-flight camera module is H, a vertical field of view of the lens is α, a pitch angle of the lens of the time-of-flight camera module is θ, a detection distance range of the time-of-flight camera module is [L1, L2], a detection height range of the time-of-flight camera module at a detection distance L1 is [h1, h2], and the time-of-flight camera module satisfies the following relationship formula: tan(α/2−θ)*L1+H>h2, H−tan(α/2+θ)*L1<h1.
For example, in the time-of-flight camera module provided by some embodiments of the disclosure, L1 is in a range from 0.3 m to 0.7 m, L2 is in a range from 5 m to 7 m, h1 is in a range from 1 m to 1.2 m, and h2 is in a range from 1.9 m to 2.3 m.
At least some embodiments of the disclosure provide an application method of the time-of-flight camera module according to any of the embodiments as mentioned above. The application method includes: determining a detection distance range [L1, L2] and a detection height range [h1, h2] at a detection distance L1 required by an application scenario; determining a mounting height H of the time-of-flight camera module, a vertical field of view α of the lens and a lens pitch angle θ of the time-of-flight camera module, according to formulas: tan(α/2−θ)*L1+H>h2, H−tan(α/2+θ)*L1<h1; and determining an emission power of the first infrared laser emission unit and the second infrared laser emission unit according to a detection distance L2.
For example, in the application method provided by some embodiments of the disclosure, L1 is in a range from 0.3 m to 0.7 m, L2 is in a range from 5 m to 7 m, h1 is in a range from 1 m to 1.2 m, and h2 is in a range from 1.9 m to 2.3 m.
At least some embodiments of the disclosure provide a display device comprising: a device body and the time-of-flight camera module according to any embodiments as mentioned above, wherein the time-of-flight camera module is arranged on a top of the device body, and the device body comprises a display screen.
For example, in the display device provided by some embodiments of the disclosure, the time-of-flight camera module is configured to detect whether a person is in a predetermined detection region, the display screen is configured to display advertising content in the case that no person is in the predetermined detection region, and display product information in the case that a person is in the predetermined detection region.
For example, in the display device provided by some embodiments of the disclosure, the display screen is configured to display advertising content; the time-of-flight camera module is configured to: detect whether there is a person in the predetermined detection region; in the case that a person is in the predetermined detection region, provide an independent identification for each person in the predetermined detection region, track each person, and perform statistics on the dwelling time of each person in the predetermined detection region; in the case that the dwelling time is greater than a threshold, increase an advertising reading amount by 1; and in the case that the independent identification disappears and reappears repeatedly, perform no repeated counting of the advertising reading amount.
In order to clearly illustrate the technical solution of the embodiments of the disclosure, the drawings of the embodiments will be briefly described in the following; it is obvious that the described drawings are only related to some embodiments of the disclosure and thus are not limitative of the disclosure.
In order to make objects, technical details and advantages of the embodiments of the disclosure apparent, the technical solutions of the embodiment will be described in a clearly and fully understandable way in connection with the drawings related to the embodiments of the disclosure. It is obvious that the described embodiments are just a part but not all of the embodiments of the disclosure. Based on the described embodiments herein, those skilled in the art can obtain other embodiment(s), without any inventive work, which should be within the scope of the disclosure.
Unless otherwise defined, the technical terminology or scientific terminology used herein should have the general meanings understood by those skilled in the art to which the present disclosure belongs. The words “first”, “second” and similar words used in the specification and claims of the present application do not denote any order, quantity or importance, but are merely used to distinguish different components. Likewise, “a”, “an” or similar words do not denote a limitation on quantity, but rather refer to at least one. The words “comprise”, “include” or the like only indicate that the element or component preceding the word contains the elements or components listed after the word and equivalents thereof, without excluding other elements or components. The words “connection”, “connected” and the like are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The words “on”, “beneath”, “left”, “right” and the like only indicate relative position relationships, which change correspondingly when the absolute position of a described object changes.
With the development of Internet of Things (IoT) technology, new retail models based on big data and artificial intelligence are gradually emerging around the world. While companies provide digital services to the general population, they further want to be able to access consumer information. For example, by collecting feedback from viewers while playing an advertisement, companies can obtain an effective reading volume of the advertisement, so that the advertising strategy can be adjusted to achieve a better commercial layout. Although human body recognition based on face recognition can obtain detailed attributes of a crowd through a camera, this method is bound to produce records of personal attribute information including facial features while obtaining the information. In recent years, the issue of privacy leakage has attracted increasing concern. In order to take the protection of personal privacy into consideration, the only solution is to avoid collecting any personal information at the hardware level.
In recent years, more and more scenarios use 3D depth information for object recognition and behavior detection, and 3D depth cameras are widely used in fields such as machine vision. Common 3D depth cameras include a 3D camera based on the binocular stereo vision principle, a 3D camera based on the structured light principle, and a 3D camera based on the time-of-flight principle. Both the binocular stereo vision principle and the structured light principle obtain depth information through additional projection calculation on RGB/grayscale images, and the device itself processes high-resolution images in the process of acquiring the depth information; therefore, they cannot meet the needs of privacy protection. The TOF technology directly measures the time/phase difference between the emitted light and the received light to obtain a point cloud map, and the acquisition of the depth information does not rely on a two-dimensional image; therefore, it can meet the needs of privacy protection.
At least some embodiments of the present disclosure provide a time-of-flight camera module. The time-of-flight camera module includes: a first carrier board, a second carrier board, a structure support, an infrared laser emission unit, a lens and a time-of-flight sensor; wherein the first carrier board, the structure support and the second carrier board are stacked, and the first carrier board and the second carrier board are fixed on both sides of the structure support, respectively. The infrared laser emission unit is arranged on a side of the first carrier board away from the structure support, the first carrier board comprises a lens via hole located on a side of the infrared laser emission unit, the structure support comprises a support via hole corresponding to the lens via hole, the lens passes through the support via hole, the time-of-flight sensor is arranged on the second carrier board and is configured to sense infrared light collected by the lens to generate depth information.
Some embodiments of the present disclosure further provide an application method and a display device corresponding to the time-of-flight camera module described above.
In the time-of-flight camera module provided by the embodiments of the present disclosure, the triple-layer-stacking design of the first carrier board, the structure support and the second carrier board allows each component (device) in the camera module to be arranged reasonably, which is conducive to the compactness, stability and heat dissipation of the overall structure.
The embodiments of the present disclosure are described in detail below in conjunction with the accompanying drawings. It should be noted that, in order to maintain clarity and brevity of the description of the embodiments of the present disclosure, detailed descriptions of known functions and known parts (elements) may be omitted. In the case that any one of the parts (elements) of the embodiments of the present disclosure appears in more than one of the drawings, the part (element) is represented by the same or similar reference number in each drawing.
For example, as shown in
For example, as shown in
For example, as shown in
For example, as shown in
For example, in some examples, an orthographic projection of a center of the first infrared laser emission unit 201 on the first carrier board 203, an orthographic projection of a center of the second infrared laser emission unit 202 on the first carrier board 203 and an orthographic projection of a center of the lens 205 (for example, a center of the lens via hole 204) on the first carrier board 203 are located on a same straight line, so that a laser coverage of the first infrared laser emission unit 201 and the second infrared laser emission unit 202 overlaps or roughly overlaps with a field of view of the lens 205, which is beneficial to meeting the requirements of ultra-wide-angle and long-distance-range detection. It should be noted that the orthographic projections of the three on the first carrier board 203 being located on the same straight line includes the case that the orthographic projection of the center of one of them on the first carrier board 203 deviates slightly from the straight line on which the orthographic projections of the centers of the other two on the first carrier board 203 are located. In the case that a distance between the orthographic projection of the center of one of them on the first carrier board 203 and the straight line on which the orthographic projections of the centers of the other two on the first carrier board 203 are located is less than or equal to 10 mm, it may be considered that the orthographic projection of the center of the first infrared laser emission unit 201 on the first carrier board 203, the orthographic projection of the center of the second infrared laser emission unit 202 on the first carrier board 203 and the orthographic projection of the center of the lens 205 (that is, the center of the lens via hole 204) on the first carrier board 203 are located on the same straight line.
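As an illustration of this tolerance rule, the following minimal Python sketch checks whether one projected center lies within 10 mm of the straight line through the other two; the 2-D board coordinates and sample values are assumptions for illustration.

```python
import math


def effectively_collinear(p1, p2, p3, tol_mm=10.0):
    """Return True if p3 lies within tol_mm of the straight line through p1 and p2."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    line_len = math.hypot(x2 - x1, y2 - y1)
    if line_len == 0.0:
        return math.hypot(x3 - x1, y3 - y1) <= tol_mm
    # Perpendicular distance from p3 to the line p1-p2 (cross-product formula).
    dist = abs((x2 - x1) * (y1 - y3) - (x1 - x3) * (y2 - y1)) / line_len
    return dist <= tol_mm


# Projected centers of emitter 201, emitter 202 and lens 205 (mm, assumed values);
# the lens center deviates from the emitters' line by 6 mm, which is within 10 mm.
print(effectively_collinear((0.0, 0.0), (20.0, 0.0), (55.0, 6.0)))  # True
```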
For example, as shown in
For example, the time-of-flight camera module may further include a computational component. For example, as shown in
For example, as shown in
That is, in the time-of-flight camera module provided in the embodiments of the present disclosure, the emission units (for example, the first infrared laser emission unit 201 and the second infrared laser emission unit 202) and the sensing unit (for example, the time-of-flight sensor 206) are integrated on two different printed circuit boards. By stacking an upper layer and a lower layer of printed circuit boards, and providing the lens via hole 204 for mounting the lens 205 in the upper-layer printed circuit board (i.e., the first carrier board 203; accordingly, the lower-layer printed circuit board is the second carrier board), the thickness of the internal structure of the time-of-flight camera module can be reduced, which at the same time is conducive to the compactness of the overall structure of the time-of-flight camera module. In addition, the structure support 304 is provided between the upper layer and the lower layer of the printed circuit boards to lock the upper layer and the lower layer of the printed circuit boards, which is conducive to the stability of the internal structure of the time-of-flight camera module.
For example, as shown in
For example, as shown in
For example, in some examples, as shown in
For example, as shown in
For example, as shown in
For example, as shown in
For example, as shown in
For example, as shown in FIG. JA, the time-of-flight camera module may further include an infrared laser circuit module 212; for example, the infrared laser circuit module 212 may provide power supply for the infrared laser emission unit or be configured to control the infrared laser emission unit. For example, as shown in
For example, as shown in
For example, as shown in
For example, as shown in
For example, as shown in
For example, as shown in
For example, as shown in
For example, as shown in
For example, in some embodiments, the first infrared laser emission unit and the second infrared laser emission unit both include a vertical cavity surface emitting laser (Vertical-Cavity Surface-Emitting Laser, VCSEL) and a beam expander. A vertical cavity surface emitting laser is a semiconductor laser that emits laser light from its top surface. For example, common infrared laser emission wavelengths are 850 nm, 940 nm, and so on. For example, the beam expander generally includes a beam expansion lens or a diffuser.
In order to meet the requirements of ultra-wide-angle and long-distance-range detection, the infrared laser emission units of the time-of-flight camera module need to cover a wide-angle range. For example, in some embodiments, a horizontal emission angle of the vertical cavity surface emitting laser is greater than or equal to 95°, for example, the horizontal emission angle is 100°, 110° or 120° and so on; a vertical emission angle of the vertical cavity surface emitting laser is greater than or equal to 75°, for example, the vertical emission angle is 80°, 85° or 90° and so on. Correspondingly, in order to meet the requirements of ultra-wide-angle and long-distance-range detection, the lens 205 of the time-of-flight camera module needs to be an ultra-wide-angle lens for collecting infrared light. For example, in some embodiments, a horizontal field of view of the lens 205 is greater than 100° and a vertical field of view of the lens 205 is greater than 80°. It should be understood that, in order to achieve better performance, the lens 205 may have the characteristics of a large relative aperture, low distortion and high resolution, so as to meet the Modulation Transfer Function (MTF) requirement of a high-resolution ToF Sensor. Further, in practical applications, the emission angle of the infrared laser emission unit is generally close to the field of view of the lens, that is, the horizontal emission angle of the vertical cavity surface emitting laser is close to the horizontal field of view of the lens 205, and the vertical emission angle of the vertical cavity surface emitting laser is close to the vertical field of view of the lens 205. It should be further understood that the ratio of the horizontal field of view of the lens 205 to the vertical field of view is generally similar to the horizontal/vertical ratio of the TOF Sensor resolution.
In practical applications, the laser intensity of a single infrared laser emission unit is generally insufficient to meet the detection needs of a medium- and long-distance target object (the energy density of the infrared light reflected by the medium- and long-distance target object is too low), with the result that the detection of the medium- and long-distance target object is susceptible to interference and the accuracy of information collection is poor. In this case, enhancing the emission power of the single infrared laser emission unit to meet the detection needs of the medium- and long-distance target object can be considered, but the practicality of this scheme is poor. On the one hand, in the case that the emission power of the single infrared laser emission unit is too high (for example, beyond its conventional working capacity), its heat generation is generally severe and concentrated, and the resulting heat dissipation problem is not conducive to a compact design of the time-of-flight camera module; on the other hand, an emission power of the single infrared laser emission unit that is too high may cause harm to human eyes in practical applications, so it is likely that it will not pass a human eye safety certification. To solve the above problems, in the time-of-flight camera module provided in the embodiments of the present disclosure, two or more infrared laser emission units may be used. It should be noted that only a time-of-flight camera module comprising two infrared laser emission units is shown in the drawings by way of example, which should not be considered as a limitation on the present disclosure.
In the case of using a plurality of infrared laser emission units to enhance the laser intensity, it is necessary to ensure that laser projection surfaces of a plurality of infrared laser emission units cover the field of view of the lens 205 (for example, a corresponding region of the angle of view of the lens 205). For example, an effective detection region is an overlapping region of the laser projection surfaces of the infrared laser emission units (as shown in
For example, in some embodiments, as shown in
In the case of using a plurality of infrared laser emission units, a very small phase difference may exist between the infrared light emitted by each of the infrared laser emission units, therefore a distance between the infrared laser emission units should be as small as possible under a premise of ensuring good heat dissipation and not affecting a routing layout of the printed circuit board. For example, in some embodiments, the distance d between the centers of the first infrared laser emission unit 201 and the second infrared laser emission unit 202 is 15-30 mm, for example, 20 mm and the like. For example, in some embodiments, the distance between the centers of the second infrared laser emission unit 202 and the lens 205 is 20-60 mm. For example, the size of the first infrared laser emission unit 201 and the second infrared laser emission unit 202 (for example, the vertical cavity surface emitting laser) is about 3-5 mm, and the diameter of the lens 205 is about 10-20 mm.
It should be understood that, in the embodiments of the present disclosure, in a comprehensive consideration of the size of the first infrared laser emission unit 201 and the second infrared laser emission unit 202, the diameter of the lens 205, the heat dissipation and the compact design and other factors, the distance between the centers of the first infrared laser emission unit 201 and the second infrared laser emission unit 202 and the distance between the centers of the second infrared laser emission unit 202 and the lens 205 may be set according to actual needs.
For example, in some embodiments, the maximum detection distance of the time-of-flight camera module may be 5-7 m. For example, in order to achieve the maximum detection distance described above, as well as take into account the human eye safety certification, the emission power of the first infrared laser emission unit 201 and the second infrared laser emission unit 202 (for example, vertical cavity surface emitting lasers) may be set to, for example, 1 W to 1.4 W, and embodiments of the present disclosure include but are not limited to this.
For example, in some embodiments, a mounting height of the time-of-flight camera module is H, the vertical field of view of the lens is α, the lens pitch angle of the time-of-flight camera module is θ, the detection distance range of the time-of-flight camera module is [L1, L2], the detection height range of the time-of-flight camera module at a detection distance L1 is [h1, h2], and the following relationships are satisfied: tan(α/2−θ)*L1+H>h2, H−tan(α/2+θ)*L1<h1. For example, in some examples, L1 is in a range from 0.3 m to 0.7 m, L2 is in a range from 5 m to 7 m, h1 is in a range from 1 m to 1.2 m, and h2 is in a range from 1.9 m to 2.3 m. It should be noted that embodiments of the present disclosure include, but are not limited to this. That is, the mounting height H of the time-of-flight camera module, the vertical field of view α of the lens, and the lens pitch angle θ of the time-of-flight camera module can be set according to actual needs, so as to adjust the values of L1, h1 and h2. For example, the value of L2 is generally preset according to the detection needs, and then an infrared laser emission unit with a suitable emission power can be selected according to the value of L2. It should be noted that, at the detection distance L1, in the case that the time-of-flight camera module is able to detect an object (including a human body, etc.) within the detection height range [h1, h2] or a part of the object within the detection height range [h1, h2], it should be considered that the time-of-flight camera module can detect the presence of the object, but whether to perform counting (for example, counting the number of people) and other functions needs to be determined according to the algorithm design, and the present disclosure does not limit this. It should be noted that the detection distances L1 and L2 are both horizontal distances.
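As a numerical illustration of these relationships, the following minimal Python sketch checks whether a given configuration covers the detection height range at the near distance L1; the mounting height of 2.0 m, the 30° pitch angle and the other sample values are assumptions for illustration.

```python
import math


def covers_height_range(H, alpha_deg, theta_deg, L1, h1, h2):
    """Check tan(alpha/2 - theta)*L1 + H > h2 and H - tan(alpha/2 + theta)*L1 < h1.

    H: mounting height (m); alpha_deg: vertical field of view of the lens;
    theta_deg: downward pitch angle of the lens; L1: near detection distance (m);
    [h1, h2]: required detection height range at L1 (m).
    """
    half = math.radians(alpha_deg) / 2.0
    theta = math.radians(theta_deg)
    upper_reach = math.tan(half - theta) * L1 + H
    lower_reach = H - math.tan(half + theta) * L1
    return upper_reach > h2 and lower_reach < h1


# Sample configuration (H = 2.0 m and theta = 30 deg are assumptions):
print(covers_height_range(H=2.0, alpha_deg=90.0, theta_deg=30.0,
                          L1=0.5, h1=1.1, h2=2.1))  # True
```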
For example, a working principle of the time-of-flight camera module provided in the embodiments of the present disclosure is roughly as follows: the infrared laser emission units (the first infrared laser emission unit 201 and the second infrared laser emission unit 202) emit laser pulses according to a predetermined modulation frequency, the laser pulses are projected through the infrared filter (for example, the first cover board 2 provided with an infrared coating) to the external environment, and after being reflected by a target object in the external environment, the pulsed light enters the lens 205 through the infrared filter (for example, the second cover board 3 provided with an infrared coating) and reaches the TOF Sensor through the lens 205; the depth information generated by the TOF Sensor is digitally processed and reaches the computational component, which can further process the depth information and interact with external devices.

Some embodiments of the present disclosure further provide an application method of the time-of-flight camera module. For example, the time-of-flight camera module is the time-of-flight camera module provided by any one of the embodiments of the present disclosure, and the time-of-flight camera module may be widely used in a variety of scenarios requiring object recognition and/or behavior detection while ensuring privacy protection.
For example, as shown in
Step S10: determining a detection distance range [L1, L2] and a detection height range [h1, h2] at a detection distance L1 required by an application scenario.
For example, in some examples, L1 is in a range from 0.3 m to 0.7 m, L2 is in a range from 5 m to 7 m, h1 is in a range from 1 m to 1.2 m, and h2 is in a range from 1.9 m to 2.3 m. It should be noted that embodiments of the present disclosure include, but are not limited to this. That is, the values of L1, L2, h1, h2 can be set according to the actual needs.
Step S20: determining a mounting height H of the time-of-flight camera module, a vertical field of view α of the lens and a lens pitch angle θ of the time-of-flight camera module, according to formulas: tan(α/2−θ)*L1+H>h2, H−tan(α/2+θ)*L1<h1.
For example, the mounting height H of the time-of-flight camera module, the vertical field of view α of the lens and the lens pitch angle θ of the time-of-flight camera module can refer to
It should be understood that the lens pitch angle θ of the time-of-flight camera module is the pitch angle of the front surface mentioned above, that is, the pitch angle of the stacking structure mentioned above.
The mounting height H of the time-of-flight camera module, the vertical field of view α of the lens and the lens pitch angle θ of the time-of-flight camera module determined in step S20 can ensure that, at the detection distance L1, the time-of-flight camera module can detect an object (including a human body, etc.) located in the detection height range [h1, h2] or a part of the object located within the detection height range [h1, h2].
Step S30: determining an emission power of the first infrared laser emission unit and the second infrared laser emission unit according to a detection distance L2.
It should be understood that the emission power of the first infrared laser emission unit and the second infrared laser emission unit should meet the needs of the detection distance range reaching the detection distance L2, and at the same time further taking into account the human eye safety certification. Therefore, the detection distance L2 is generally and reasonably set according to the actual application scenarios. For example, an infrared laser emission unit with a suitable emission power can be selected according to the value of L2 after setting the value of L2 according to the actual application scenarios.
For example, as shown in
In practical applications, suitable time-of-flight camera modules can be manufactured or selected according to parameters determined in step S20 and step S30 above to adapt to a variety of different application scenarios.
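For instance, the parameter determination of step S20 and the power selection of step S30 can be sketched as follows. This is a minimal Python illustration only: the 0.5° search step, the candidate pitch-angle range (taken from the [20°, 40°] inclination range mentioned above), the 2.0 m mounting height and the power table are all assumptions for illustration, not values prescribed by the disclosure.

```python
import math


def find_pitch_angle(H, alpha_deg, L1, h1, h2, theta_range=(20.0, 40.0), step=0.5):
    """Step S20: return a pitch angle theta (deg) satisfying both inequalities, or None."""
    half = math.radians(alpha_deg) / 2.0
    theta = theta_range[0]
    while theta <= theta_range[1]:
        t = math.radians(theta)
        if (math.tan(half - t) * L1 + H > h2 and
                H - math.tan(half + t) * L1 < h1):
            return theta
        theta += step
    return None  # no angle in the candidate range satisfies the scenario


def select_emission_power_w(L2):
    """Step S30 (hypothetical mapping): pick an emission power in watts for distance L2."""
    return 1.0 if L2 <= 5.0 else 1.2 if L2 <= 6.0 else 1.4


print(find_pitch_angle(H=2.0, alpha_deg=90.0, L1=0.5, h1=1.1, h2=2.1))  # e.g. 20.0
print(select_emission_power_w(6.5))                                     # 1.4
```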
Some embodiments of the present disclosure further provide a display device.
For example, as shown in
For example, in some embodiments, the time-of-flight camera module is configured to detect whether a person is in a predetermined detection region; the display screen is configured to display advertising content in the case that no person is in the predetermined detection region, and to display product information in the case that a person is in the predetermined detection region. In this case, the display device may be, for example, a container machine and the like, and the time-of-flight camera module realizes the function of a distance sensor. Because the placement position of the container machine is fixed, the field of view of a front-placed time-of-flight camera module is determined, and the predetermined detection region can be freely set in the sector region corresponding to the detection distance L2 mentioned above (hereinafter referred to as a “maximum detection region”). Because the TOF Sensor can directly output distance information, a human body profile (such as a whole body profile or an upper body profile of the human body, and so on, but not limited to this) may be taken as a basis for judgment; with the ground as a reference, the human body behavior trajectory may be tracked after the human body is identified in the maximum detection region. In the case that the human body enters the predetermined detection region, the picture displayed on the display screen is switched, for example, from the advertising content to the product information; in the case that the human body leaves the predetermined detection region, the picture displayed on the display screen may be switched, for example, from the product information back to the advertising content. It should be understood that, because the upper body profile can be used as the basis for judgment, for an adult with a normal height (for example, greater than the aforementioned h1), even if the adult is located in the aforementioned blind region (as shown in
For example, in other embodiments, the display screen is configured to display advertising content; the time-of-flight camera module is configured to: detect whether a person is in the predetermined detection region; in the case that a person is in the predetermined detection region, provide an independent identification for each person in the predetermined detection region, track each person, and perform statistics on the dwelling time of each person in the predetermined detection region; in the case that the dwelling time is greater than a threshold, increase the advertising reading amount by 1; and further, perform no double counting of the advertising reading amount in the case that the independent identification disappears and reappears repeatedly within a specified time interval. In this case, the display device may be, for example, an advertising machine and the like.
For example, a human body detection logic can be as follows: 1) in the case that the display device is in operation, a background depth image is firstly acquired through the time-of-flight camera module; of course, the background depth image can also be taken in advance; 2) then, depth images are continuously acquired through the time-of-flight camera module and compared with the background depth image to filter out newly emerging individuals in the predetermined detection region, wherein human body characteristics (for example, the human body profile, such as the whole body profile or the upper body profile, and so on, but not limited to this) can be used as sifting criteria; 3) the individual is marked and tracked in the case of determining that it meets the human body characteristics. For example, a counting logic can be as follows: determining whether a count has been made for the individual; in the case that the individual has been counted, the individual is continuously tracked until it walks out of the predetermined detection region (for example, it disappears from the predetermined detection region for more than a certain period of time, for example, for more than 1 s, but not limited to this), and the recorded information of the individual is removed; in the case that the individual has not been counted, the individual is continuously tracked, timing is started when the individual stays in the predetermined detection region, and in the case that the dwelling time of the individual is greater than the threshold (for example, 3 s, but not limited to this), the count is increased by 1 (for example, the advertising reading amount is increased by 1), and the individual is continuously tracked until it walks out of the predetermined detection region.
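As an illustration of step 2) above, a minimal Python sketch of comparing depth frames with the background depth image is given below; NumPy is assumed to be available, and the frame size, depth values and difference threshold are illustrative assumptions.

```python
import numpy as np


def foreground_mask(depth_frame, background_depth, min_diff_m=0.2):
    """Pixels significantly closer than the background are candidate new objects."""
    valid = (depth_frame > 0) & (background_depth > 0)   # 0 means no depth return
    return valid & (background_depth - depth_frame > min_diff_m)


background = np.full((240, 320), 5.0)        # background depth image of the empty scene (m)
frame = background.copy()
frame[80:200, 140:180] = 2.5                 # a newly appearing object about 2.5 m away
mask = foreground_mask(frame, background)
print(int(mask.sum()))                       # number of candidate foreground pixels (4800)
```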
In practice, a definition of an effective advertising reading amount may be based on customer needs. For example, in one specific example, an effective advertising reading may be an event that meets the following conditions: 1) a person is in the predetermined detection region; 2) the person stays in the predetermined detection region for at least 3 s; 3) no repeated counting is performed in the case that the person moves within the predetermined detection region. To avoid repeated counting within a short period of time, it is necessary to track the behavior trajectory of a person entering the predetermined detection region.
For example, in practice, a detection method may include: starting tracking after detecting a human body in the predetermined detection region, then performing statistics on the individual's dwelling time in the predetermined detection region, and counting the individual in the case that the dwelling time in the region exceeds 3 s; after counting the individual, tracking the behavior trajectory continuously, and performing no repeated counting in the case that a counted individual moves within the predetermined detection region. According to the working principle of ToF, a human body which is shielded cannot be recognized, but in the case that the human body re-enters the field of view of the time-of-flight camera module from a shielded state, the detection algorithm may re-perform the counting operation, and thus may cause some wrong counting. In order to improve the detection accuracy, each individual detected in the field of view of the time-of-flight camera module can be provided with an independent identification (for example, an independent ID), and an individual who suddenly disappears and reappears in the field of view of the time-of-flight camera module can be determined to be the same person, so that the counting is not repeated at least within a certain time range. In particular, for individuals located in the blind region and near the blind region of the time-of-flight camera module, the upper body can generally be detected; therefore, in order to increase the detection range, at the algorithm level, the human body judgment standard for an individual in the blind region and near the blind region can be set to the upper body profile, so that the depth information obtained by the time-of-flight camera module is sufficient for human body detection. In this case, a person who enters the close-distance detection range from a side of the display device can be counted.
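As an illustration of this counting scheme, the following minimal Python sketch accumulates dwelling time per independent identification, counts each individual once, and does not recount an individual that briefly disappears and reappears. The class name, the 3 s dwell threshold, the 1 s forget interval and the per-frame update interface are assumptions for illustration; the independent IDs are assumed to be provided by an upstream detection and re-identification step.

```python
class AdReadingCounter:
    def __init__(self, dwell_threshold_s=3.0, forget_after_s=1.0):
        self.dwell_threshold = dwell_threshold_s
        self.forget_after = forget_after_s  # drop a track only after it has been gone this long
        self.tracks = {}                    # id -> {"dwell": seconds, "counted": bool, "last_seen": time}
        self.reading_amount = 0

    def update(self, visible_ids, dt_s, now_s):
        """visible_ids: independent IDs detected in the predetermined detection region this frame."""
        for pid in visible_ids:
            track = self.tracks.setdefault(pid, {"dwell": 0.0, "counted": False, "last_seen": now_s})
            track["dwell"] += dt_s
            track["last_seen"] = now_s
            if not track["counted"] and track["dwell"] >= self.dwell_threshold:
                track["counted"] = True     # count each individual once; later movement is not recounted
                self.reading_amount += 1
        # Remove only tracks that have been out of the region long enough; a brief
        # disappearance keeps the ID, so a reappearing individual is not counted again.
        gone = [pid for pid, t in self.tracks.items()
                if pid not in visible_ids and now_s - t["last_seen"] > self.forget_after]
        for pid in gone:
            del self.tracks[pid]


counter = AdReadingCounter()
for frame in range(40):                     # individual "7" stays about 4 s at 10 frames per second
    counter.update({7}, dt_s=0.1, now_s=frame * 0.1)
print(counter.reading_amount)               # 1
```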
For example, the predetermined detection region may be freely set within the sector region corresponding to the aforementioned detection distance L2 (hereinafter referred to as the “maximum detection region”). For example, the threshold and the specified time interval mentioned above may be set according to actual needs, embodiments of the present disclosure are not limited to this. For example, a mounting height of the time-of-flight camera module and the pitch angle of the lens can also be set according to actual needs to adapt to a variety of different application scenarios.
The time-of-flight camera module in the display device provided by the embodiments of the present disclosure can meet the requirement of reducing blind regions in complex and diverse environments, and can achieve coverage and detection over a wider range. The time-of-flight camera module is used in the display device instead of an RGB camera, so that human body tracking statistics and real-time distance feedback can be effectively carried out through the detection algorithm on the premise of meeting privacy protection.
It should be noted that the display device provided by the embodiments of the present disclosure is exemplary and not restrictive; according to the needs of actual applications, the display device may further include other conventional components or structures, for example, in order to achieve necessary functions of the display device, those skilled in the art may provide other conventional components or structures according to the specific application scenario, and embodiments of the present disclosure are not limited to this.
The above are only the specific embodiments of the disclosure, but the scope of protection of the disclosure is not limited thereto. Any changes or replacements that can be easily conceived by a person skilled in the technical field within the scope of the disclosure shall be covered by the protection scope of the disclosure. Therefore, the protection scope of the disclosure shall be subject to the protection scope of the claims.