This application claims priority to Chinese Patent Application 202311760929.X, filed on Dec. 19, 2023, the entire disclosure of which is incorporated herein by reference.
The present application belongs to the field of three-dimensional imaging technology, and in particular relates to a three-dimensional imaging device and method.
With the development of camera imaging technology, imaging is no longer limited to the two-dimensional plane, and three-dimensional stereoscopic imaging technology has gradually developed. In current three-dimensional stereoscopic imaging technology, active imaging techniques have difficulty reconstructing transparent and smooth (highly reflective) object surfaces, and are susceptible to interference from changes in the external environment and light source.
It is to be noted that the information disclosed in the above Background section is only intended to enhance understanding of the background of the present application, and may therefore include information that does not constitute related art known to those skilled in the art.
The present application provides a three-dimensional imaging device and method to solve the technical problem, in the related art, of low accuracy in reconstructing the surface of an object. The technical solution is as follows:
According to an aspect of the embodiments of the present application, there is provided a three-dimensional imaging device, which includes:
According to an aspect of the present application, there is provided a three-dimensional imaging method, which includes:
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present application.
The accompanying drawings herein are incorporated into and form a part of the specification, illustrate embodiments in accordance with the present application, and are used in conjunction with the specification to explain the principles of the present application. It will be apparent that the accompanying drawings in the following description are only some of the embodiments of the present application, and that other drawings may be obtained from these drawings by those skilled in the art without creative effort.
Embodiments will now be described more fully with reference to the accompanying drawings. However, the embodiments can be implemented in a variety of forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that the present application will be more thorough and complete, and will fully convey the concept of the embodiments to those skilled in the art.
The following describes an embodiment of a three-dimensional imaging device of the present application, as shown in
The lens group 101 includes a first lens 101a and a second lens 101b. The lens group 101 is used to gather light beams to output a first light beam and a second light beam in a first direction. In
The beam-splitting prism group 120 is provided on a side of the lens group 101 that outputs light beams. The beam-splitting prism group 120 includes a first beam-splitting prism 102a and a second beam-splitting prism 102b. As shown in
The polarizer group 130 includes at least three polarizers provided on a side of the beam-splitting prism group 120 that outputs a light beam. At least one of the polarizers is provided on a side of the beam-splitting prism group 120 that outputs a transmitted light beam, and at least one of the polarizers is provided on a side of the beam-splitting prism group 120 that outputs a reflected light beam. The transmitted light beam includes a first transmitted light beam and a second transmitted light beam, and the reflected light beam includes a first reflected light beam and a second reflected light beam. The polarizer group 130 is used to convert the transmitted light beam or the reflected light beam into a polarized light of a preset polarization angle.
Specifically, an asymmetry of the vibration direction relative to the propagation direction is called polarization, and polarized light is thus light whose vibration direction is asymmetric relative to its propagation direction. A polarizer is a basic optical element that can transform incident light in any polarization state into polarized light. The preset polarization angles may be any three of 0°, 45°, 90° and 135°, or polarizers with all four of the above polarization angles may be provided; for example, the first polarizer 103 has a polarization angle of 0°, the second polarizer 105 has a polarization angle of 45°, the third polarizer 107 has a polarization angle of 90°, and the fourth polarizer 109 has a polarization angle of 135°. It should be understood that the preset polarization angles are not limited to the angles enumerated herein. Since the direction of the normal of each pixel point needs to be calculated on the basis of polarized light of at least three different polarization angles when reconstructing the surface of the object, the transmitted light beam or the reflected light beam is converted into polarized light of the preset polarization angles by the polarizer group 130. In addition, polarized light of at most two polarization angles can be obtained in the first direction or the second direction alone; therefore, at least one polarizer is provided on a side of the beam-splitting prism group 120 that outputs the transmitted light beam, and at least one polarizer is provided on a side of the beam-splitting prism group 120 that outputs the reflected light beam.
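As a supplementary illustration (not part of the original text), the need for at least three polarization angles follows from a standard relation in polarization imaging: the intensity measured behind an ideal polarizer at angle α depends on three unknown Stokes components,

$$I(\alpha)=\tfrac{1}{2}\left(S_0+S_1\cos 2\alpha+S_2\sin 2\alpha\right),$$

so measurements at three distinct angles (e.g., 0°, 45° and 90°) yield a solvable linear system for (S0, S1, S2), and a fourth angle (135°) adds redundancy.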
The sensor group 140 includes at least three sensors. A position of each sensor corresponds to that of each polarizer. Each sensor is configured to obtain the polarized light with the preset polarization angle output from each polarizer to form an image.
Specifically, the sensor is used to obtain the light intensities of the polarized lights of different preset polarization angles to provide data in a subsequent image reconstruction process. As shown in
In the technical solution of the present application, the light beam reflected from the surface of the object is gathered by the lens group to obtain the first light beam and the second light beam. The beam-splitting prism group transmits and reflects the first light beam and the second light beam respectively, so as to obtain the first transmitted light beam, the first reflected light beam, the second transmitted light beam, and the second reflected light beam. The transmitted and reflected light beams are then converted by the polarizer group into polarized lights with the preset polarization angles, and finally at least three polarized lights with different preset polarization angles are obtained by the sensor group to form an image. With the three-dimensional imaging device provided by the present application, the number of lenses can be reduced and the structure of the three-dimensional imaging device can be optimized. Moreover, the present application belongs to passive imaging technology, which is capable of reconstructing the surface of an object even when that surface is transparent or highly reflective, thereby improving the accuracy of the three-dimensional imaging.
In one embodiment of the present application, as shown in
Specifically, the beam-splitting surface 1021 is neither perpendicular nor parallel to the first light beam and the second light beam; otherwise, a first transmitted light beam in the first direction and a first reflected light beam in the second direction cannot be obtained after the first light beam passes through the beam-splitting prism group, and a second transmitted light beam in the first direction and a second reflected light beam in the second direction cannot be obtained after the second light beam passes through the beam-splitting prism group. The specific values of the first preset proportion and the second preset proportion depend on factors such as the angles between the beam-splitting surface 1021 and the first and second light beams, and the material or manufacturing process of the beam-splitting surface 1021. For example, a ratio of the first preset proportion to the second preset proportion may be 50%:50%, 30%:70%, or 20%:80%. If the ratio of the first preset proportion to the second preset proportion is 30%:70%, then 30% of the first light beam is transmitted by the beam-splitting surface 1021 and 70% of the first light beam is reflected by it; likewise, 30% of the second light beam is transmitted by the beam-splitting surface 1021 and 70% of the second light beam is reflected by it.
In one embodiment of the present application, the beam-splitting surface 1021 is coated with a multi-layer dielectric film by evaporation coating, such that the beam-splitting surface 1021 can both transmit and reflect the first light beam and the second light beam.
Specifically, coating the beam-splitting surface 1021 with the multi-layer dielectric film by evaporation coating means that the material is evaporated and condensed into a film on the beam-splitting surface 1021 under vacuum conditions, and then, after high-temperature heat treatment, a film layer with strong adhesion is formed on the beam-splitting surface 1021. A dielectric film layer is a film layer of an insulating nature, and different dielectric film layers, when combined based on certain optical thicknesses, can form an optical film; for example, the combination of Ta2O5 and SiO2 can increase the transmittance (decrease the reflectivity) or provide high reflectivity for emergent light at a certain wavelength. Therefore, coating the multi-layer dielectric film on the beam-splitting surface 1021 enables the beam-splitting surface 1021 to transmit and reflect the first light beam and the second light beam. By combining different dielectric film layers, the first preset proportion and the second preset proportion of the beam-splitting surface 1021 can be adjusted.
In one embodiment of the present application, as shown in
Specifically, this embodiment is a case where the polarizer group has two polarizers in the first direction and only one polarizer in the second direction, and accordingly, the sensor group also has two sensors in the first direction and only one sensor in the second direction. As shown in
In one embodiment of the present application, the lens group includes a first lens 101a and a second lens 101b; the polarizer group includes a first polarizer 103, a second polarizer 105, and a third polarizer 107; the sensor group includes a first sensor 104, a second sensor 106, and a third sensor 108. Along the first direction, centers of the first lens 101a, the first beam-splitting prism 102a, the first polarizer 103 and the first sensor 104 are located on the same axis; along the second direction, centers of the first beam-splitting prism 102a, the second polarizer 105 and the second sensor 106 are located on the same axis, and centers of the second beam-splitting prism 102b, the third polarizer 107 and the third sensor 108 are located on the same axis. Alternatively, the polarizer group includes a first polarizer 103, a second polarizer 105 and a third polarizer 107; the sensor group includes a first sensor 104, a second sensor 106 and a third sensor 108; along the first direction, centers of the second lens 101b, the second beam-splitting prism 102b, the first polarizer 103 and the first sensor 104 are located on the same axis; along the second direction, centers of the first beam-splitting prism 102a, the second polarizer 105 and the second sensor 106 are located on the same axis, and centers of the second beam-splitting prism 102b, the third polarizer 107 and the third sensor 108 are located on the same axis.
Specifically, this embodiment is a case where the polarizer group has only one polarizer in the first direction and two polarizers in the second direction, and accordingly, the sensor group also has only one sensor in the first direction and two sensors in the second direction. As shown in
In another embodiment, the direction in which the first light beam and the second light beam are reflected by the beam-splitting prism group can be adjusted by adjusting the specific position of the beam-splitting surface 1021, i.e., the first light beam and the second light beam can be reflected by the beam-splitting prism group in the negative direction of the y-axis shown in
In one embodiment of the present application, as shown in
In one embodiment of the present application, the polarizer group in the three-dimensional imaging device of the present application includes four polarizers, and the light intensities of the polarized lights output from the four polarizers include a first polarized light intensity I1, a second polarized light intensity I2, a third polarized light intensity I3, and a fourth polarized light intensity I4. The first polarized light intensity I1 is a light intensity of the first polarized light obtained by the first sensor 104, the second polarized light intensity I2 is a light intensity of the second polarized light obtained by the second sensor 106, the third polarized light intensity I3 is a light intensity of the third polarized light obtained by the third sensor 108, and the fourth polarized light intensity I4 is a light intensity of the fourth polarized light obtained by the fourth sensor 110. An incident light intensity S0 of the device is half of a total light intensity of the polarized lights of the plurality of preset polarization angles, where the total light intensity is a sum of the first polarized light intensity I1, the second polarized light intensity I2, the third polarized light intensity I3 and the fourth polarized light intensity I4. A light intensity difference S1 of the device in a first incident light polarization direction is the difference (I1−I3) between the first polarized light intensity I1 and the third polarized light intensity I3, and a light intensity difference S2 of the device in a second incident light polarization direction is the difference (I2−I4) between the second polarized light intensity I2 and the fourth polarized light intensity I4.
Specifically, circular polarization does not exist in the present application, and thus the polarization state of the polarized light at each of the preset polarization angles in the present device can be described by a Stokes vector (S0, S1, S2). Stokes vectors are a set of parameters describing the polarization state of an electromagnetic wave. After the light intensities of the polarized lights of different preset polarization angles are obtained by the sensor group, the relationships between the polarized lights of different preset polarization angles are expressed by the Stokes vectors, so as to reconstruct the surface of the object. For example, the first polarized light is the polarized light corresponding to the 0° polarizer, the second polarized light is the polarized light corresponding to the 45° polarizer, the third polarized light is the polarized light corresponding to the 90° polarizer, and the fourth polarized light is the polarized light corresponding to the 135° polarizer. Then the incident light intensity is S0=(I(0°)+I(45°)+I(90°)+I(135°))/2. In this device, the light intensity difference in the first incident light polarization direction is S1=I(0°)−I(90°), and the light intensity difference in the second incident light polarization direction is S2=I(45°)−I(135°).
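As an illustrative sketch (not part of the original disclosure), the Stokes components described above can be computed per pixel from the four sensor images; the array names I0, I45, I90 and I135 are hypothetical placeholders for the images captured by the first through fourth sensors:

```python
import numpy as np

def stokes_from_intensities(I0, I45, I90, I135):
    """Per-pixel Stokes vector (S0, S1, S2) from four polarized images.

    I0..I135 are 2-D arrays of light intensities behind the 0°, 45°, 90°
    and 135° polarizers (hypothetical names; they correspond to the first
    through fourth sensors of the device described above).
    """
    S0 = (I0 + I45 + I90 + I135) / 2.0  # incident light intensity
    S1 = I0 - I90                       # difference in the first polarization direction
    S2 = I45 - I135                     # difference in the second polarization direction
    return S0, S1, S2
```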
The following describes an embodiment of the three-dimensional imaging method of the present application, which can be applied to the three-dimensional imaging device in the above embodiment of the present application. As shown in
S710, obtaining pictures formed by at least three polarization angles of a to-be-measured object, where the pictures are generated by the three-dimensional imaging device in any one of the embodiments provided by the present application.
Specifically, the three-dimensional imaging device of the present application can generate the pictures of the to-be-measured object corresponding to different preset polarization angles in the polarizer group, and then obtain the pictures and carry out subsequent steps to reconstruct a three-dimensional surface of the to-be-measured object.
S720, analyzing the pictures of different polarization angles to determine an azimuth angle and a zenith angle of a normal of each pixel point in the pictures, where the azimuth angle is an angle between the normal of each pixel point and a horizontal direction, and the zenith angle is an angle between the normal of each pixel point and a vertical direction. Specifically, as shown in
However, when calculating the azimuth angle φ, there exist two solutions differing by π: one solution indicates that the reflected light at the pixel point O is dominated by diffuse reflection, and the other indicates that the reflected light at the pixel point O is dominated by mirror reflection. As shown in
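The two solutions can be written in terms of the Stokes components. The sketch below assumes the relation φ = ½·arctan(S2/S1), which is standard in shape-from-polarization and consistent with the Stokes definitions above, though the patent's own formula is not reproduced here:

```python
import numpy as np

def azimuth_candidates(S1, S2):
    """Two candidate azimuth angles per pixel, differing by pi: one for
    diffuse-reflection dominance, the other for mirror-reflection
    dominance."""
    phi = 0.5 * np.arctan2(S2, S1)  # first solution
    return phi, phi + np.pi         # second solution, offset by pi
```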
As shown in the accompanying drawings, the angle between the normal and the z-axis is the zenith angle θ, and different azimuth angles φ correspond to different formulas for calculating the zenith angle θ: the zenith angle θ is calculated according to Eq. 2 when diffuse reflection is dominant, and according to Eq. 3 when mirror reflection is dominant.
ρd and ρs both denote the polarization degree, and
S0 is half of a total light intensity of the polarized lights of the plurality of preset polarization angles, and the total light intensity is a sum of the first polarized light intensity I1, the second polarized light intensity I2, the third polarized light intensity I3 and the fourth polarized light intensity I4; a light intensity difference S1 in the first incident light polarization direction is the difference (I1−I3) between the first polarized light intensity I1 and the third polarized light intensity I3; and a light intensity difference S2 in the second incident light polarization direction is the difference (I2−I4) between the second polarized light intensity I2 and the fourth polarized light intensity I4. The above parameters can be obtained from the three-dimensional imaging device of the present application. f denotes the refractive index of the material, i.e., the refractive index of the material of the lens group and the beam-splitting prism group, which is usually 1.5.
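The patent's Eq. 2 and Eq. 3 themselves are not reproduced in the text above. As a hedged illustration only, the sketch below computes the measured degree of polarization from the Stokes components (ρ = √(S1² + S2²)/S0, a standard definition) and numerically inverts the diffuse-reflection polarization model that is common in the shape-from-polarization literature and is one plausible form of Eq. 2; the refractive index defaults to the value of 1.5 mentioned above:

```python
import numpy as np

def degree_of_polarization(S0, S1, S2):
    """Measured degree of polarization per pixel (standard definition)."""
    return np.sqrt(S1**2 + S2**2) / S0

def diffuse_dop(theta, n=1.5):
    """Degree of polarization predicted by the standard diffuse-reflection
    model at zenith angle theta, for material refractive index n."""
    s2 = np.sin(theta) ** 2
    num = (n - 1.0 / n) ** 2 * s2
    den = (2 + 2 * n**2 - (n + 1.0 / n) ** 2 * s2
           + 4 * np.cos(theta) * np.sqrt(n**2 - s2))
    return num / den

def zenith_from_dop(rho, n=1.5, samples=2048):
    """Invert the diffuse model numerically: for each measured degree of
    polarization rho, look up the zenith angle whose prediction is
    closest. (For large images an interpolated lookup would be cheaper.)"""
    thetas = np.linspace(0.0, np.pi / 2 - 1e-6, samples)
    table = diffuse_dop(thetas, n)
    flat = np.asarray(rho, dtype=float).ravel()
    idx = np.argmin(np.abs(table[None, :] - flat[:, None]), axis=1)
    return thetas[idx].reshape(np.shape(rho))
```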
S730, forming a normal vector gradient field of each pixel point based on the azimuth angle and the zenith angle of the normal of each pixel point.
Specifically, based on the azimuth and zenith angles of the normal of the pixel point O, the normal of the pixel point O can be determined; then, based on the azimuth and zenith angles of the normal of each pixel point, the normal of each pixel point can be determined, so as to obtain the normal vector gradient field (p,q) of the to-be-measured object.
S740, integrating the normal vector gradient field of each pixel point to obtain depth information of each pixel point.
Specifically, after the normal vector gradient field (p,q) comprising the normal of each pixel point is calculated, the normal vector gradient field (p,q) is globally integrated according to the Frankot-Chellappa algorithm, and the minimum value W of the difference between the surface gradients (Zx, Zy) of the to-be-measured object and the normal vector gradient field (p,q) is determined, as specified in the following formula:
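The formula itself is not reproduced in the text above. In the standard Frankot-Chellappa formulation, which the surrounding definitions appear to follow, Eq. 4 would presumably take the least-squares form

$$W=\min_{Z}\iint\left[\left(Z_x-p\right)^2+\left(Z_y-q\right)^2\right]\,dx\,dy. \tag{4}$$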
Then, a Fourier transform of the above Eq. 4 is performed to solve for the surface Z(x,y) of the to-be-measured object, and the following Eq. 5 can be obtained:
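Eq. 5 is likewise not reproduced above; in the standard Frankot-Chellappa solution it is the closed-form Fourier-domain expression

$$\hat{Z}(\omega_x,\omega_y)=\frac{-j\,\omega_x\,\hat{p}-j\,\omega_y\,\hat{q}}{\omega_x^{2}+\omega_y^{2}},\qquad Z(x,y)=\mathcal{F}^{-1}\!\left[\hat{Z}\right], \tag{5}$$

where p̂ and q̂ are the Fourier transforms of the gradient field components. A minimal numpy sketch of this global integration (an illustrative implementation, not the patent's own code) follows:

```python
import numpy as np

def frankot_chellappa(p, q):
    """Integrate a gradient field (p, q) into a depth map Z using the
    Frankot-Chellappa algorithm (global least-squares integration in the
    Fourier domain)."""
    rows, cols = p.shape
    # Angular frequency grids, zero frequency at the corner to match fft2.
    wx = np.fft.fftfreq(cols) * 2 * np.pi
    wy = np.fft.fftfreq(rows) * 2 * np.pi
    WX, WY = np.meshgrid(wx, wy)

    P = np.fft.fft2(p)
    Q = np.fft.fft2(q)

    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                        # avoid division by zero at DC
    Z_hat = (-1j * WX * P - 1j * WY * Q) / denom
    Z_hat[0, 0] = 0.0                        # depth is recovered up to an offset

    return np.real(np.fft.ifft2(Z_hat))
```

Calling frankot_chellappa(p, q) on the gradient field recovered in S730 yields the relative depth map Z, determined up to an additive constant.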
S750, generating a three-dimensional image of the to-be-measured object based on the depth information of each pixel point.
Specifically, according to the depth information of each pixel point, the specific position of each pixel point in the three-dimensional coordinate system can be determined, so that the three-dimensional image of the to-be-measured object can be generated.
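As a minimal sketch of this step (assuming a simple pinhole camera model with hypothetical intrinsic parameters fx, fy, cx, cy, which are not specified in the original text), the depth map can be lifted into three-dimensional coordinates:

```python
import numpy as np

def depth_to_points(Z, fx, fy, cx, cy):
    """Back-project a depth map Z into a grid of 3-D points under a pinhole
    camera model; fx, fy, cx, cy are hypothetical camera intrinsics."""
    rows, cols = Z.shape
    u, v = np.meshgrid(np.arange(cols), np.arange(rows))  # pixel coordinates
    X = (u - cx) * Z / fx
    Y = (v - cy) * Z / fy
    return np.stack([X, Y, Z], axis=-1)  # shape (rows, cols, 3)
```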
In one embodiment of the present application, after analyzing the pictures of different polarization angles to determine the azimuth angle and the zenith angle of the normal of each pixel point in the pictures, the method provided by the present application includes: constructing an initial surface curve of the to-be-measured object based on a parallax error formed by the lens group in the three-dimensional imaging device; and verifying the azimuth angle corresponding to each pixel point based on a result of multiplying a curvature of the initial surface curve and the normal of each pixel point. The forming the normal vector gradient field of each pixel point based on the azimuth angle and the zenith angle of the normal of each pixel point includes: forming the normal vector gradient field of each pixel point based on the azimuth angle and the zenith angle of the normal of each pixel point when the corresponding azimuth angle of each pixel point is verified to be correct.
Specifically, based on the parallax error formed by the lens group, i.e., a binocular visual parallax error, a distance between each pixel point and the lens group can be determined, and thus initial depth information of the pixel point can be obtained. Based on the distances between the plurality of pixel points and the lens group, an initial surface curve of the to-be-measured object can then be constructed. The initial surface curve can be regarded as a curve passing through one or more pixel points, so that a curvature k of each initial surface curve is calculated; then a normal vector n⃗ of one of the pixel points on the initial surface curve is obtained, and the curvature k of each initial surface curve is multiplied by the vector n⃗ of that pixel point. Based on the multiplicative result, it can be determined whether the vector n⃗ is correct, and thus whether the azimuth angle corresponding to the vector n⃗ of the pixel point is correct; the verification operation is performed for each initial surface curve to obtain the correct azimuth angle of each pixel point.
In one embodiment of the present application, the verifying the azimuth angle corresponding to the pixel point based on the result of multiplying the curvature of the initial surface curve and the normal of the pixel point includes: when the curvature of the initial surface curve is greater than 0, if a multiplicative result is greater than 0, the azimuth angle corresponding to the pixel point is correct, and if the multiplicative result is less than 0, then the azimuth angle corresponding to the pixel point is incorrect; and when the curvature of the initial surface curve is less than 0, if the multiplicative result is greater than 0, the azimuth angle corresponding to the pixel point is correct, if the multiplicative result is less than 0, the azimuth angle corresponding to the pixel point is incorrect.
Specifically, a positive or negative curvature of the initial surface curve indicates a convex or concave condition of the initial surface curve. When the curvature of the initial surface curve is greater than 0, the constructed initial surface curve is a convex curve. If the multiplicative result of the curvature of the initial surface curve and the normal of the pixel point is greater than 0, the direction of the normal of the pixel point is the same as the direction of the convexity of the initial surface curve, and the azimuth angle corresponding to the pixel point is correct. If the multiplicative result is less than 0, the direction of the normal of the pixel point is not consistent with the initial surface curve, and the azimuth angle corresponding to the pixel point is incorrect. Similarly, when the curvature of the constructed initial surface curve is less than 0, the constructed initial surface curve is a concave curve. If the multiplicative result of the curvature of the initial surface curve and the normal of the pixel point is greater than 0, the direction of the normal of the pixel point is also concave, i.e., consistent with the initial surface curve, and the azimuth angle corresponding to the pixel point is correct. If the multiplicative result is less than 0, the direction of the normal of the pixel point is not concave, i.e., inconsistent with the initial surface curve, and the azimuth angle corresponding to the pixel point is incorrect. By the verification step of this embodiment, the correct azimuth angle can be determined from the two solutions of the azimuth angle.
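Read literally, the rule above accepts an azimuth angle whenever the multiplicative result is greater than 0, whichever sign the curvature has. A minimal sketch of the disambiguation under that stated rule follows; the input names (the per-pixel multiplicative result for the diffuse-dominant candidate, and the two candidate azimuth angles) are hypothetical, not identifiers from the original disclosure:

```python
import numpy as np

def verify_azimuth(mult_result):
    """Verification rule as stated above: the product of the curve curvature
    and the pixel normal must be positive, for convex and concave curves
    alike."""
    return mult_result > 0

def choose_azimuth(phi_diffuse, phi_mirror, mult_result_diffuse):
    """Keep the diffuse-dominant solution where it passes the check,
    otherwise fall back to the mirror-dominant solution (the two solutions
    differ by pi, so exactly one should be consistent with the curve)."""
    return np.where(verify_azimuth(mult_result_diffuse), phi_diffuse, phi_mirror)
```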
In another embodiment, the azimuth angle of the pixel point includes a first azimuth angle and a second azimuth angle; the zenith angle of the pixel point includes a first zenith angle and a second zenith angle; after verifying that the azimuth angle is incorrect, the method provided by the present application further includes determining whether the azimuth angle that is verified to be incorrect is a first azimuth angle; and when the azimuth angle that is verified to be incorrect is the first azimuth angle, determining a second zenith angle corresponding to the pixel point based on the second azimuth angle.
Specifically, since the solutions of the azimuth angle include a diffuse-reflection-dominant solution and a mirror-reflection-dominant solution, if one of the solutions is verified to be incorrect, the other solution is the correct azimuth angle. For example, when the diffuse-reflection-dominant solution φd is verified to be incorrect, the mirror-reflection-dominant solution φs is the correct azimuth angle. Based on the azimuth angle φs, the correct second zenith angle θs is calculated, so that the correct normal of the pixel point can be determined based on the second azimuth angle φs and the second zenith angle θs.
S1010, analyzing pictures of different polarization angles to determine the azimuth and zenith angles of the normal of each pixel point in the pictures; the azimuth angle includes a first azimuth angle and a second azimuth angle; and the zenith angle includes a first zenith angle and a second zenith angle.
Specifically, as previously described, there are two solutions of the azimuth angle: the first azimuth angle may be the diffuse-reflection-dominant solution, in which case the first zenith angle is the zenith angle corresponding to the diffuse-reflection-dominant azimuth angle; the second azimuth angle may be the mirror-reflection-dominant solution, in which case the second zenith angle is the zenith angle corresponding to the mirror-reflection-dominant azimuth angle.
S1020, constructing an initial surface curve of the to-be-measured object based on the parallax error formed by the lens group in the three-dimensional imaging device.
Specifically, as previously described, the depth information of the pixel points can be roughly determined according to the parallax error formed by the lens group in the three-dimensional imaging device, thereby constructing the initial surface curve.
S1030, determining whether a multiplicative result of a curvature of the initial surface curve and a normal of the pixel point is greater than 0.
Specifically, according to the judgement result of S1030, a corresponding step is executed. If the judgement result is yes, S1040 is executed; if the judgement result is no, S1050 is executed and then S1040 is executed.
S1040, forming a normal vector gradient field of each pixel point based on the azimuth and zenith angles of the normal of each pixel point.
Specifically, based on the azimuth and zenith angles of the normal of each pixel point, the normal of the pixel point can be determined, then after determining the normals of all pixel points, the normals of these pixel points form a normal vector gradient field.
S1050, when the azimuth angle verified to be incorrect is the first azimuth angle, determining a second zenith angle corresponding to the pixel point based on the second azimuth angle.
Specifically, if the first azimuth angle is the solution corresponding to the diffuse reflection dominant and is verified to be incorrect, the second azimuth angle is substituted into Eq. 3 to obtain the second zenith angle corresponding to the second azimuth angle.
The following describes an embodiment of a device of the present application for realizing a three-dimensional imaging method, as shown in
In one embodiment of the present application, the device further includes a curve verification module configured for constructing an initial surface curve of the to-be-measured object based on a parallax error formed by the lens group in the three-dimensional imaging device before forming the normal vector gradient field of each pixel point based on the azimuth and zenith angles of the normal of each pixel point; verifying the initial surface curve based on a multiplicative result of the curvature of the initial surface curve and the normal of the pixel points; and the integrating the normal vector gradient field of each pixel point to obtain depth information of each pixel point includes integrating the normal vector gradient field of each pixel point to obtain depth information of each pixel point when the initial surface curve is verified to be correct.
In one embodiment of the present application, the curve verification module is configured to resolve azimuth ambiguity. When the curvature of the initial surface curve is greater than 0, it determines that the azimuth angle corresponding to the pixel point is correct if the multiplicative result of the curvature of the initial surface curve and the normal of the pixel point is greater than 0; conversely, if the multiplicative result is less than 0, the azimuth angle corresponding to the pixel point is incorrect. Likewise, when the curvature of the initial surface curve is less than 0, the curve verification module determines that the azimuth angle corresponding to the pixel point is correct if the multiplicative result of the curvature of the initial surface curve and the normal of the pixel point is greater than 0; conversely, if the multiplicative result is less than 0, the azimuth angle corresponding to the pixel point is incorrect.
It should be noted that specific embodiments of the three-dimensional imaging device provided by the present application have been disclosed in the method embodiments and will not be repeated herein.
As shown in
Specifically, the three-dimensional imaging method provided in the present application is stored in the memory 1220 of the electronic device 1200. After a picture of the to-be-measured object is obtained by the sensor group in the three-dimensional imaging device, the corresponding three-dimensional imaging method is executed to construct and verify an initial surface curve of the to-be-measured object, and to perform a global integration based on the normal vector gradient field of each pixel point, so as to restore the surface of the to-be-measured object.
It should be noted that specific embodiments of the electronic device provided by the present application have been disclosed in the method embodiments and will not be repeated herein.
It is to be noted that the computer system 1300 of the electronic device illustrated in
As shown in
The following components are connected to the input/output interface 1305: an input portion 1306 including a keyboard, a mouse, etc.; an output portion 1307 including, for example, a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), speakers, etc.; a storage portion 1308 including a hard disk, etc.; and a communication portion 1309 including a network interface card such as a LAN card, a modem, etc. The communication portion 1309 performs communication processing via a network such as the Internet. A driver 1310 is also connected to the input/output interface 1305 as needed. A removable medium 1311, such as a magnetic disk, an optical disc, a magneto-optical disk, a semiconductor memory, etc., is mounted on the driver 1310 as needed, so that a computer program read therefrom is installed into the storage portion 1308 as needed.
In particular, according to embodiments of the present application, the processes depicted in the flowchart of each method may be implemented as computer software programs. For example, embodiments of the present application include a computer program product, which includes a computer program carried on a computer-readable medium, the computer program including program code for performing the method shown in the flowchart. In such embodiments, the computer program may be downloaded and installed from a network via the communication portion 1309, and/or installed from the removable medium 1311. When the computer program is executed by the processor 1301, various functions defined in the system of the present application are performed.
It is noted that the computer-readable medium shown in embodiments of the present application may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard drive, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a flash memory, an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer-readable storage medium may be any tangible medium containing or storing a program that may be used by or in combination with an instruction execution system, apparatus, or device. In the present application, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such propagated data signals may take a variety of forms, including, but not limited to, electromagnetic signals, optical signals, or any suitable combination of the foregoing. The computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that sends, propagates, or transmits a program for use by, or in combination with, an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted using any suitable medium, including, but not limited to, wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the accompanying drawings illustrate the architecture, functionality, and operation of systems, methods, and computer program products that may be implemented in accordance with various embodiments of the present application. In this regard, each box in the flowcharts or block diagrams may represent a module, program segment, or portion of code, and the module, program segment, or portion of code contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions indicated in the boxes may occur in a different order than that indicated in the accompanying drawings. For example, two consecutively represented boxes can actually be executed substantially in parallel, and they can sometimes be executed in reverse order, depending on the function involved. It should also be noted that each box in a block diagram or flowchart, and combinations of boxes in a block diagram or flowchart, may be implemented with a dedicated hardware-based system that performs the specified function or operation, or may be implemented with a combination of dedicated hardware and computer instructions.
It should be noted that although a number of modules or units of the apparatus for action execution are mentioned in the detailed description above, this division is not mandatory. Indeed, according to embodiments of the present application, the features and functions of two or more modules or units described above may be embodied in a single module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied in more than one module or unit.
By the above description of the embodiments, it is readily understood by those skilled in the art that the embodiments described herein can be implemented by means of software, or by means of software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard drive, etc.) or on a network, and which includes a number of instructions to cause a computing device (which may be a personal computer, a server, a touch terminal, or a network device, etc.) to execute the method according to the embodiments of the present application.
Other embodiments of the present application will readily occur to those skilled in the art upon consideration of the specification and practice of the invention disclosed herein.
The present application is intended to cover any variations, uses, or adaptations of the present application which follow the general principles of the present application and include common knowledge or customary technical means in the art not disclosed herein.
It is to be understood that this application is not limited to the precise construction which has been described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present application is limited only by the appended claims.
Number | Date | Country | Kind |
---|---|---|---
202311760929.X | Dec 2023 | CN | national |