This application relates to the image processing field, and specifically, to an image processing method and a related apparatus.
In recent years, with rapid development of smartphones, surveillance cameras, autonomous driving, and the like, people have an increasingly strong need for high-quality images and videos. In addition, with continuous improvement of image processing technologies, different image processing standards are gradually proposed.
In the conventional technology, a photographing apparatus captures an image to obtain relative red, green, and blue (RGB) data of the image, and a color gamut of the RGB data is mapped to an sRGB color gamut. Then the photographing apparatus sends the RGB data of the image meeting the sRGB color gamut to a display apparatus, and the display apparatus performs color mapping or transparently transmits the image of the sRGB color gamut based on a display capability of the display apparatus. However, the RGB data of the image captured by the photographing apparatus is quantized relative data, and maximum luminance of the image is unclear. Consequently, luminance corresponding to the image sent to the display apparatus is a relative concept, and absolute luminance of the image is lost. As a result, the display apparatus cannot accurately display real luminance of the image. In addition, the photographing apparatus directly limits a color gamut range of the image to the sRGB color gamut. Consequently, color information of the image is lost, and the display apparatus cannot display a lost color of the image.
Therefore, when a user has an increasingly strong need for a high dynamic range (HDR) image, the conventional technology is limited to the sRGB color gamut, and display of an HDR image cannot be supported. Consequently, no more dynamic ranges or image details can be displayed for the user, causing poor user experience.
Embodiments of this application provide an image processing method and a related apparatus, to support display of an HDR image, display more dynamic ranges and image details, provide richer colors, and improve user experience.
In view of this, embodiments of this application provide the following solutions.
According to a first aspect, an embodiment of this application provides an image processing method. The method is applied to a photographing apparatus, and the method may include: obtaining a first RGB image, where the first RGB image is an RGB image sensed by a photographing apparatus; converting the first RGB image into a second RGB image, where the second RGB image is an image meeting a BT2020 color gamut range; determining a first luminance value corresponding to the second RGB image, where the first luminance value indicates maximum luminance of each frame of image in the second RGB image; determining a target image based on the first luminance value and the second RGB image; and sending the target image to a display apparatus, so that the display apparatus displays the target image. In the foregoing manner, in the photographing apparatus, the first RGB image is converted into the second RGB image meeting the BT2020 color gamut range, the first luminance value of the second RGB image is determined, and then the target image matching an actual luminance status is re-determined based on the second RGB image, so that the display apparatus can finally display the target image. The described target image may be an HDR image that displays more image details, richer colors, and the like.
In some embodiments, the converting the first RGB image into a second RGB image may include: determining a first conversion matrix, where the first conversion matrix is a conversion matrix meeting the BT2020 color gamut range; and determining the second RGB image based on the first conversion matrix and the first RGB image.
In some embodiments, before the determining a first luminance value corresponding to the second RGB image, the method may further include: obtaining first information, where the first information indicates an aperture value, an exposure time, a gain value, or optical-to-electrical conversion efficiency of the photographing apparatus; and correspondingly, the determining a first luminance value corresponding to the second RGB image includes: determining a second conversion matrix and a white balance matrix, where the second conversion matrix is used to indicate conversion between a luminance value of an RGB image sensed by the photographing apparatus and a luminance value of an image in preset observer space; determining a second luminance value based on the second RGB image, the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information, where the second luminance value indicates luminance of each frame of image in the second RGB image; and determining the first luminance value based on the second luminance value, the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information.
In some embodiments, the determining a second conversion matrix and a white balance matrix may include: determining a third luminance value based on the second RGB image and the first information, where the third luminance value indicates luminance of each frame of image in the first RGB image; obtaining a luminance value of an image in preset observer space; determining the second conversion matrix based on the third luminance value and the luminance value of the image in the preset observer space; and performing white balance on the luminance value of the image in the preset observer space to obtain the white balance matrix.
According to a second aspect, an embodiment of this application provides another image processing method. The method is applied to a display apparatus, and the method may include: receiving a target image sent by a photographing apparatus; and displaying the target image.
In some embodiments, after the receiving a target image sent by a photographing apparatus, the method may further include: receiving a first luminance value sent by the photographing apparatus, where the first luminance value indicates maximum luminance of each frame of image in a second RGB image, and the second RGB image is an image meeting a BT2020 color gamut range; and obtaining display information, where the display information indicates a display capability of the display apparatus; and correspondingly, the displaying the target image includes: performing gamut mapping on the target image based on the first luminance value, the display information, and a preset gamut mapping policy, to display a target image obtained through gamut mapping. In the foregoing manner, after the display apparatus receives the target image and the first luminance value, in combination with the display information of the display apparatus that can indicate the display capability, the display apparatus performs gamut remapping on the target image based on the display information, the first luminance value, and the preset gamut mapping policy, for example, performs luminance and color remapping, and then displays a target image obtained through gamut remapping, so that the finally displayed target image can display richer image details and colors.
In some embodiments, the method may further include: adjusting a dynamic range of the target image based on the first luminance value and the display information. In the foregoing manner, the display apparatus may further adjust the dynamic range of the target image based on the first luminance value and the display information, so that the target image can display image details, colors, and the like to the maximum extent.
According to a third aspect, an embodiment of this application provides a photographing apparatus. The photographing apparatus may include: an obtaining unit, configured to obtain a first RGB image, where the first RGB image is an RGB image sensed by the photographing apparatus; a processing unit, configured to: convert the first RGB image into a second RGB image, where the second RGB image is an image meeting a BT2020 color gamut range; determine a first luminance value corresponding to the second RGB image, where the first luminance value indicates maximum luminance of each frame of image in the second RGB image; and determine a target image based on the first luminance value and the second RGB image; and a sending unit, configured to send the target image to a display apparatus, so that the display apparatus displays the target image.
In some embodiments, the processing unit is specifically configured to: determine a first conversion matrix, where the first conversion matrix is a conversion matrix meeting the BT2020 color gamut range; and determine the second RGB image based on the first conversion matrix and the first RGB image.
In some embodiments, the obtaining unit is further specifically configured to: before the first luminance value corresponding to the second RGB image is determined, obtain first information, where the first information indicates an aperture value, an exposure time, a gain value, or optical-to-electrical conversion efficiency of the photographing apparatus; and
the processing unit is specifically configured to: determine a second conversion matrix and a white balance matrix, where the second conversion matrix is used to indicate conversion between a luminance value of an RGB image sensed by the photographing apparatus and a luminance value of an image in preset observer space; determine a second luminance value based on the second RGB image, the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information, where the second luminance value indicates luminance of each frame of image in the second RGB image; and determine the first luminance value based on the second luminance value, the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information.
In some embodiments, the processing unit is specifically configured to: determine a third luminance value based on the second RGB image and the first information, where the third luminance value indicates luminance of each frame of image in the first RGB image; obtain a luminance value of an image in preset observer space; determine the second conversion matrix based on the third luminance value and the luminance value of the image in the preset observer space; and perform white balance on the luminance value of the image in the preset observer space to obtain the white balance matrix.
According to a fourth aspect, an embodiment of this application provides a display apparatus, and the display apparatus may include: a receiving module, configured to receive a target image sent by a photographing apparatus; and a display module, configured to display the target image.
In some embodiments, the receiving module is further specifically configured to: receive a first luminance value sent by the photographing apparatus, where the first luminance value indicates maximum luminance of each frame of image in a second RGB image, and the second RGB image is an image meeting a BT2020 color gamut range; and obtain display information, where the display information indicates a display capability of the display apparatus; and
correspondingly, the display module is configured to: perform gamut mapping on the target image based on the first luminance value, the display information, and a preset gamut mapping policy, to display a target image obtained through gamut mapping.
In some embodiments, the display apparatus may further include a module configured to adjust a dynamic range of the target image based on the first luminance value and the display information.
According to a fifth aspect, an embodiment of this application provides an image processing system. The image processing system may include the photographing apparatus according to any one of the first aspect or the possible implementations of the first aspect, and the display apparatus according to any one of the second aspect or the possible implementations of the second aspect.
According to a sixth aspect, an embodiment of this application provides a photographing apparatus. The photographing apparatus may include a processor and a memory, or include an input/output (I/O) interface, a processor, and a memory. The memory stores program instructions. The processor is configured to execute the program instructions stored in the memory, so that the photographing apparatus performs the image processing method according to any one of the first aspect or the possible implementations of the first aspect.
According to a seventh aspect, an embodiment of this application provides a display apparatus. The display apparatus may include a processor and a memory, or include an input/output (I/O) interface, a processor, and a memory. The memory stores program instructions. The processor is configured to execute the program instructions stored in the memory, so that the display apparatus performs the image processing method according to any one of the second aspect or the possible implementations of the second aspect.
According to an eighth aspect, this application provides a computer-readable storage medium. The computer-readable storage medium stores instructions. When the instructions run on a photographing apparatus, the photographing apparatus is enabled to perform the image processing method according to any one of the first aspect or the possible implementations of the first aspect.
According to a ninth aspect, this application provides a computer-readable storage medium. The computer-readable storage medium stores instructions. When the instructions run on a display apparatus, the display apparatus is enabled to perform the image processing method according to any one of the second aspect or the possible implementations of the second aspect.
According to a tenth aspect, this application provides a computer program product including instructions. When the computer program product runs on a photographing apparatus, the photographing apparatus is enabled to perform the image processing method according to any one of the first aspect or the possible implementations of the first aspect.
According to an eleventh aspect, this application provides a computer program product including instructions. When the computer program product runs on a display apparatus, the display apparatus is enabled to perform the image processing method according to any one of the second aspect or the possible implementations of the second aspect.
According to a twelfth aspect, this application provides a chip system. The chip system includes a processor, configured to support a photographing apparatus in implementing the functions according to any one of the first aspect or the possible implementations of the first aspect. In a possible design, the chip system further includes a memory, and the memory is configured to store program instructions and data that are necessary for the photographing apparatus. The chip system may include a chip, or may include a chip and another discrete component.
According to a thirteenth aspect, this application provides a chip system. The chip system includes a processor, configured to support a display apparatus in implementing the functions according to any one of the second aspect or the possible implementations of the second aspect. In a possible design, the chip system further includes a memory, and the memory is configured to store program instructions and data that are necessary for the display apparatus. The chip system may include a chip, or may include a chip and another discrete component.
For technical effects of any one of the implementations of the third aspect, the sixth aspect, the eighth aspect, the tenth aspect, and the twelfth aspect, refer to technical effects of different implementations of the first aspect. Details are not described herein again.
For technical effects of any one of the implementations of the fourth aspect, the seventh aspect, the ninth aspect, the eleventh aspect, and the thirteenth aspect, refer to technical effects of different implementations of the second aspect. Details are not described herein again.
It can be learned from the foregoing technical solutions that embodiments of this application have the following advantages:
In embodiments of this application, after obtaining the first RGB image, the photographing apparatus converts the first RGB image into an image meeting the BT2020 color gamut range, and then determines the first luminance value corresponding to the second RGB image, and determines the target image based on the first luminance value and the second RGB image. Further, the display apparatus performs luminance and color remapping on the target image based on the first luminance value, the display information, and the preset gamut mapping policy, so that the finally displayed target image can display richer image details and image colors. This improves image display effects and further improves user experience.
Embodiments of this application provide an image processing method and a related apparatus, to support display of an HDR image, display more dynamic ranges and image details, provide richer colors, and improve user experience.
The following clearly describes technical solutions in embodiments of this application with reference to accompanying drawings in embodiments of this application. Clearly, the described embodiments are some but not all of embodiments of this application. All other embodiments obtained by persons of ordinary skill in the art based on embodiments of this application without creative efforts shall fall within the protection scope of this application.
In the specification, claims, and accompanying drawings of this application, the terms “first”, “second”, “third”, “fourth”, and the like (if existent) are intended to distinguish between similar objects but do not necessarily indicate a specific order or sequence. It should be understood that data used in this way is interchangeable in appropriate circumstances, so that embodiments of this application described herein can be implemented in orders other than the order illustrated or described herein. In addition, the terms “include”, “have”, and any variants thereof are intended to cover non-exclusive inclusion.
Compared with a common image, an HDR image can provide more dynamic ranges and image details, and display richer colors. Therefore, a user has an increasingly strong need for an HDR image.
However, in the conventional technology, a photographing apparatus captures an image to obtain relative RGB data of the image, and a color gamut of the RGB data is mapped to an sRGB color gamut. Then the photographing apparatus sends the RGB data of the image meeting the sRGB color gamut to a display apparatus, and the display apparatus performs color mapping or transparently transmits the image of the sRGB color gamut based on a display capability of the display apparatus. However, the RGB data of the image captured by the photographing apparatus is quantized relative data, and maximum luminance of the image is unclear. Consequently, luminance corresponding to the image sent to the display apparatus is a relative concept, and absolute luminance of the image is lost. As a result, the display apparatus cannot accurately display real luminance of the image. In addition, the photographing apparatus directly limits a color gamut range of the image to the sRGB color gamut. Consequently, color information of the image is lost, and the display apparatus cannot display a lost color of the image. As a result, the user cannot see more information about details, colors, and the like of the image.
Therefore, to resolve the foregoing problems, embodiments of this application provide an image processing method, so that an image captured by a photographing apparatus can be converted into an RGB image meeting an HDR standard. Then maximum luminance of the RGB image meeting the HDR standard is determined, and a dynamic range of the RGB image is dynamically adjusted based on the maximum luminance. In this way, a display apparatus can display a real color and real luminance of an image to the maximum extent, to display more image details and rich colors.
The following describes in detail the image processing method provided in embodiments of this application.
301: A photographing apparatus obtains a first RGB image, where the first RGB image is an RGB image sensed by the photographing apparatus.
In this embodiment, after photographing a photographed object or the like, the photographing apparatus may obtain a first RGB image of the photographed object. It may be understood that the first RGB image is an RGB image that can be sensed by the photographing apparatus and that has not undergone color gamut space conversion.
302: The photographing apparatus converts the first RGB image into a second RGB image, where the second RGB image is an image meeting a BT2020 color gamut range.
In this embodiment, after generating the first RGB image, the photographing apparatus may convert the first RGB image into the second RGB image that can meet the color gamut range in the HDR standard. It should be noted that the color gamut range in the HDR standard is BT2020, and therefore the second RGB image is an image meeting the BT2020 color gamut range. The described BT2020 color gamut range may be understood as the color gamut range Rec.2020 & Rec.2100 in the HDR standard.
In some examples, the photographing apparatus may convert the first RGB image into the second RGB image in the following manner:
That is, the photographing apparatus needs to first determine a conversion matrix meeting the BT2020 color gamut range, that is, the first conversion matrix; and then convert the first RGB image based on the first conversion matrix to generate the corresponding second RGB image.
Specifically, refer to the following formula for understanding:

\[
\begin{bmatrix} R \\ G \\ B \end{bmatrix} = M_{BT2020} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
\]

where MBT2020 indicates the first conversion matrix, and the second RGB image is expressed as (R, G, B).
In addition, R, G, and B in the second RGB image respectively indicate red, green, and blue pixel values in the second RGB image, and X, Y, and Z in the first RGB image respectively indicate components of pixel values of the first RGB image in three-dimensional coordinates X, Y, and Z.
The first conversion matrix MBT2020 may be obtained based on the following formula:

\[
M_{BT2020} = \begin{bmatrix} S_r X_r & S_g X_g & S_b X_b \\ S_r Y_r & S_g Y_g & S_b Y_b \\ S_r Z_r & S_g Z_g & S_b Z_b \end{bmatrix}^{-1},
\qquad
\begin{bmatrix} S_r \\ S_g \\ S_b \end{bmatrix} = \begin{bmatrix} X_r & X_g & X_b \\ Y_r & Y_g & Y_b \\ Z_r & Z_g & Z_b \end{bmatrix}^{-1} \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix}
\]

Sr, Sg, and Sb are coefficients, Xr+Yr+Zr=1, Xg+Yg+Zg=1, and Xb+Yb+Zb=1. Xr, Yr, and Zr respectively indicate red pixel values of the first RGB image in an x-axis direction, a y-axis direction, and a z-axis direction; Xw, Yw, and Zw respectively indicate white pixel values of the first RGB image in the x-axis direction, the y-axis direction, and the z-axis direction; and Xg, Yg, and Zg, as well as Xb, Yb, and Zb, may be understood with reference to the descriptions of Xr, Yr, and Zr.
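As an illustration, the derivation above can be sketched in pure Python. The BT.2020 primary chromaticities and the D65 white point used below are the standard published values; everything else (the helper names, the normalization of the white point to Yw = 1) is an illustrative assumption rather than text taken from this application.

```python
def inv3(m):
    """Invert a 3x3 matrix (list of rows) via the adjugate formula."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [
        [(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
        [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
        [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det],
    ]

def matvec(m, v):
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

def xyz_to_rgb_matrix(primaries, white_xy):
    """Build the XYZ-to-RGB conversion matrix from primary and white
    chromaticities, following the Sr/Sg/Sb construction shown above."""
    # Chromaticity triples (X, Y, Z) with X + Y + Z = 1, as in the text.
    cols = [(x, y, 1.0 - x - y) for x, y in primaries]
    P = [[cols[j][r] for j in range(3)] for r in range(3)]
    xw, yw = white_xy
    W = [xw / yw, 1.0, (1.0 - xw - yw) / yw]  # white point, Yw normalized to 1
    S = matvec(inv3(P), W)                    # the coefficients Sr, Sg, Sb
    rgb_to_xyz = [[S[j] * P[r][j] for j in range(3)] for r in range(3)]
    return inv3(rgb_to_xyz)

# BT.2020 primaries and the D65 white point (standard values).
M_BT2020 = xyz_to_rgb_matrix(
    [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)], (0.3127, 0.3290))
```

Applying `matvec(M_BT2020, (X, Y, Z))` to each pixel then yields the second RGB image; by construction, the white point maps to R = G = B = 1.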
303: The photographing apparatus determines a first luminance value corresponding to the second RGB image, where the first luminance value indicates maximum luminance of each frame of image in the second RGB image.
In this embodiment, after converting the first RGB image into the second RGB image meeting the BT2020 color gamut range, the photographing apparatus may further determine the maximum luminance of each frame of image corresponding to the second RGB image, so that a dynamic range of the second RGB image can be adjusted by using the maximum luminance, to determine a target image.
Specifically, the photographing apparatus may determine the first luminance value corresponding to the second RGB image in the following manner:
S3021: Before the first luminance value corresponding to the second RGB image is determined, obtain first information, where the first information indicates an aperture value, an exposure time, a gain value, or optical-to-electrical conversion efficiency of the photographing apparatus.
S3022: Determine a second conversion matrix and a white balance matrix, where the second conversion matrix is used to indicate conversion between a luminance value of an RGB image sensed by the photographing apparatus and a luminance value of an image in preset observer space.
S3023: Determine a second luminance value based on the second RGB image, the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information, where the second luminance value indicates luminance of each frame of image in the second RGB image.
S3024: Determine the first luminance value based on the second luminance value, the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information.
The determining a second conversion matrix and a white balance matrix in S3022 may be further understood in the following manner: determining a third luminance value based on the second RGB image and the first information, where the third luminance value indicates luminance of each frame of image in the first RGB image; obtaining a luminance value of an image in preset observer space; determining the second conversion matrix based on the third luminance value and the luminance value of the image in the preset observer space; and performing white balance on the luminance value of the image in the preset observer space to obtain the white balance matrix.
That is, this is understood as follows: When determining the first luminance value corresponding to the second RGB image, the photographing apparatus first needs to obtain the first information, that is, an aperture value, an exposure time, a gain value, or optical-to-electrical conversion efficiency of the photographing apparatus in a current frame of image.
Then the third luminance value is determined based on the second RGB image and the first information, so that the third luminance value indicates the luminance of each frame of image in the first RGB image. That is, the third luminance value is an actual luminance value that can be sensed by the photographing apparatus in the first RGB image.
Refer to the following relationship for understanding: A indicates the aperture value, T indicates the exposure time, ISO indicates the gain value, Q indicates the optical-to-electrical conversion efficiency, and a value obtained through integration indicates a user-defined luminance value; the third luminance value is computed from these quantities together with the pixel values of the image.
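The exact formula is not reproduced here, but a common photometric model consistent with the quantities listed above can be sketched as follows. The linear relation and the calibration constant k are illustrative assumptions, not this application's exact formula.

```python
def third_luminance(pixel, A, T, iso, Q, k=1.0):
    """Hypothetical estimate of absolute scene luminance from a linear
    pixel value and the first information. Assumption: the sensor
    response is proportional to luminance * T * ISO * Q / A^2, so
    inverting gives L ~ k * pixel * A^2 / (T * ISO * Q), where k is a
    calibration constant."""
    return k * pixel * (A ** 2) / (T * iso * Q)
```

Halving the exposure time, for instance, doubles the luminance inferred from the same pixel value, which is why the first information is needed to recover absolute rather than relative luminance.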
Then the luminance value of the image in the preset observer space is obtained, and the second conversion matrix is determined based on the third luminance value and the luminance value of the image in the preset observer space.
That is, the luminance value of the image in the preset observer space can be learned by photographing the photographed object, where the luminance value represents how a user's eyes perceive color luminance. Therefore, after the third luminance value is obtained, the second conversion matrix used for converting the third luminance value into the luminance value of the image in the preset observer space needs to be calculated. For example, this may be understood with reference to the schematic diagram in FIG. 5.
Then white balance is performed on the luminance value of the image in the preset observer space to obtain the white balance matrix, which is denoted as Mwb.
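The white balance matrix Mwb can be illustrated as a diagonal gain matrix. Normalizing the gains to the green channel is an assumption made here for the sketch; the text does not specify the normalization.

```python
def white_balance_matrix(white_rgb):
    """Diagonal gains that map the measured white (Rw, Gw, Bw) to equal
    channels. Sketch only: green-channel normalization is an assumption."""
    rw, gw, bw = white_rgb
    return [[gw / rw, 0.0, 0.0],
            [0.0, 1.0, 0.0],
            [0.0, 0.0, gw / bw]]
```

Multiplying a pixel by this matrix equalizes the channels of the measured white, so a neutral object renders as neutral.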
Further, the second luminance value is determined based on the second RGB image, the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information, where the second luminance value indicates the luminance of each frame of image in the second RGB image.
Specifically, the second luminance value may be obtained by converting the second RGB image by using the first conversion matrix, the second conversion matrix, and the white balance matrix, and scaling the result based on the first information.
Finally, the first luminance value is determined based on the second luminance value, the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information.
Specifically, the first luminance value may be obtained by processing the second luminance value by using the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information, and taking the maximum luminance of each frame of image.
In this way, the first luminance value can be determined based on the second luminance value, the first conversion matrix, the second conversion matrix, the white balance matrix, and the first information, and then the maximum luminance corresponding to each frame of image in the second RGB image can be indicated by using the first luminance value.
304: The photographing apparatus determines a target image based on the first luminance value and the second RGB image.
In this embodiment, after the maximum luminance calculation module calculates the first luminance value that can indicate the maximum luminance of each frame of image in the second RGB image, the photographing apparatus may further process the first luminance value and the second RGB image by using the optical-to-electrical conversion module to obtain the target image. Specifically, the optical-to-electrical conversion module can perform optical-to-electrical conversion on the linear second RGB image based on the PQ curve in the HDR standard and with reference to the first luminance value, to generate a non-linear RGB image, that is, determine the target image. For example, the HDR standard is HDR10. The determining the target image may be specifically understood with reference to the PQ curve (SMPTE ST 2084) formula:

\[
E' = \left( \frac{c_1 + c_2 (L/10000)^{m_1}}{1 + c_3 (L/10000)^{m_1}} \right)^{m_2}
\]

where L indicates linear luminance in cd/m², E' indicates the non-linear signal of the target image, m1 = 2610/16384, m2 = 2523/4096 × 128, c1 = 3424/4096, c2 = 2413/4096 × 32, and c3 = 2392/4096 × 32.
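The optical-to-electrical conversion of step 304 can be sketched with the standard SMPTE ST 2084 (PQ) constants adopted by HDR10; applying the function per channel to the linear second RGB image, scaled by the first luminance value, yields the non-linear target image.

```python
def pq_encode(luminance):
    """SMPTE ST 2084 (PQ) optical-to-electrical transfer function, as
    used by HDR10: maps absolute luminance in cd/m^2 (0..10000) to a
    non-linear signal in [0, 1]."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = max(luminance, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2
```

For reference, 100 cd/m² (typical SDR peak) encodes to roughly half of the PQ signal range, which is how PQ spreads code values across absolute luminances.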
305: The photographing apparatus sends the target image to a display apparatus.
In this embodiment, after obtaining the target image, the photographing apparatus may send the target image to the display apparatus for processing by the display apparatus. The described target image may be understood as an HDR image meeting the HDR standard.
306: The display apparatus receives the first luminance value sent by the photographing apparatus, where the first luminance value indicates the maximum luminance of each frame of image in the second RGB image, and the second RGB image is an image meeting the BT2020 color gamut range.
Similarly, the objective is to enable the display apparatus to display image details, colors, and the like more clearly.
307: The display apparatus obtains display information, where the display information indicates the display capability of the display apparatus.
In this embodiment, the display information can indicate the display capability of the display apparatus, that is, a degree of displaying, by the display apparatus, a color and luminance of the target image when displaying the target image. The described display capability includes a capability of displaying a color of an image and a capability of displaying luminance of the image.
308: The display apparatus performs gamut mapping on the target image based on the first luminance value, the display information, and a preset gamut mapping policy, to display a target image obtained through gamut mapping.
In this embodiment, after obtaining the first luminance value and the display information, the display apparatus may further perform luminance and color remapping on the target image with reference to the preset gamut mapping policy, and then display a target image obtained through gamut remapping, so that the finally displayed target image can display richer image details and colors.
The preset gamut mapping policy may include but is not limited to the following five policies.
First policy: a minimum distance policy
Second policy: a maximum color value policy
Third policy: a contrast ratio policy
Fourth policy: a maximum luminance value policy
Fifth policy: an adaptation policy
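As an illustration of how such a policy might operate, a minimal sketch of the first (minimum distance) policy is given below. It assumes linear-light values and per-channel clipping to the display's reproducible range; the actual policy in this application may differ.

```python
def min_distance_map(rgb, display_peak):
    """Map an out-of-range linear value to the nearest point the display
    can reproduce: clip each channel to [0, display_peak]. Sketch of the
    'minimum distance' idea, not this application's exact policy."""
    return [min(max(c, 0.0), float(display_peak)) for c in rgb]
```

Values already inside the display's range pass through unchanged, so only the out-of-range parts of the image are altered.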
In some other examples, the method may further include: displaying the target image based on the first luminance value and the display information.
In this embodiment, after obtaining the first luminance value and the display information, the display apparatus may further dynamically adjust a dynamic range of the target image based on the first luminance value and the display information, so that the target image can display image details, colors, and the like to the maximum extent. It may be understood that the described dynamic range is the ratio of the brightest part of the image to the darkest part.
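A minimal sketch of such an adjustment is given below, assuming a simple linear compression when the frame's maximum luminance (the first luminance value) exceeds the display's peak luminance taken from the display information; real mappings are often non-linear, so this is illustrative only.

```python
def adjust_dynamic_range(luminance, frame_peak, display_peak):
    """If the frame's peak luminance exceeds the display's peak,
    compress luminance linearly so the peak just fits; otherwise pass
    it through unchanged. (Assumption: linear scaling.)"""
    if frame_peak <= display_peak:
        return luminance
    return luminance * display_peak / frame_peak
```

A 4000-nit frame shown on a 1000-nit display is thus scaled by 1/4, while a frame already within the display's range is left untouched.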
Therefore, in this embodiment of this application, after obtaining the first RGB image, the photographing apparatus converts the first RGB image into an image meeting the BT2020 color gamut range, and then determines the first luminance value corresponding to the second RGB image, and determines the target image based on the first luminance value and the second RGB image. Further, the display apparatus performs luminance and color remapping on the target image based on the first luminance value, the display information, and the preset gamut mapping policy, so that the finally displayed target image can display richer image details and image colors. This improves image display effects and further improves user experience.
The foregoing mainly describes solutions provided in embodiments of this application from a perspective of interaction. It may be understood that, to implement the foregoing functions, the photographing apparatus and the display apparatus include corresponding hardware structures and/or software modules for performing the functions. Persons of ordinary skill in the art should easily be aware that modules, algorithms, and steps in examples described with reference to embodiments disclosed in this specification can be implemented by hardware or a combination of hardware and computer software in this application. Whether a function is performed by hardware or hardware driven by computer software depends on particular applications and design constraints of technical solutions. Persons skilled in the art may use different methods to implement the described functions for each particular application, but it should not be considered that the implementation goes beyond the scope of this application.
In embodiments of this application, the photographing apparatus and the display apparatus may be divided into functional modules based on the method examples. For example, the functional modules may be obtained through division based on corresponding functions, or two or more functions may be integrated into one processing module. The integrated module may be implemented in a form of hardware, or may be implemented in a form of a software functional module. It should be noted that, in embodiments of this application, division into the modules is an example and is merely logical function division, and may be other division in an actual implementation.
The following describes in detail a photographing apparatus in this application.
In some examples, the processing unit 1002 may be specifically configured to:
In some examples, the obtaining unit 1001 may be further specifically configured to: before the first luminance value corresponding to the second RGB image is determined, obtain first information, where the first information indicates an aperture value, an exposure time, a gain value, or optical-to-electrical conversion efficiency of the photographing apparatus; and
the processing unit 1002 may be specifically configured to:
In some examples, the processing unit 1002 is specifically configured to:
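The first information mentioned above (aperture value, exposure time, gain) is the usual input for recovering absolute scene luminance, which the conventional sRGB pipeline described earlier loses. A minimal sketch under the assumption that the standard reflected-light exposure equation L = K * N^2 / (t * S) is used; the function name and the calibration constant K are illustrative assumptions, not taken from this application:

```python
def estimate_scene_luminance(aperture_f, exposure_s, iso_gain, k=12.5):
    """Estimate average absolute scene luminance in cd/m^2 from exposure settings,
    using the reflected-light meter equation L = K * N^2 / (t * S).
    K ~= 12.5 is a common meter calibration constant (an assumption here)."""
    return k * aperture_f ** 2 / (exposure_s * iso_gain)

# Example: f/2.0, 1/100 s, ISO 100 -> 12.5 * 4 / (0.01 * 100) = 50 cd/m^2
avg_luminance = estimate_scene_luminance(2.0, 0.01, 100)
```

A value like this, carried alongside the second RGB image as the first luminance value, is what would let the display apparatus reconstruct absolute rather than relative luminance.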
The foregoing mainly describes the photographing apparatus in this application. The following describes in detail a display apparatus in this application.
In some examples, the receiving module 1101 is further specifically configured to:
In some examples, the display apparatus may further include:
It should be noted that content such as information exchange between the modules/units of the apparatus and the execution processes thereof is based on the same concept as the method embodiments of this application, and produces the same technical effects as the method embodiments of this application. For specific content, refer to the descriptions in the method embodiments of this application. Details are not described herein again.
The foregoing describes the photographing apparatus and the display apparatus in embodiments of this application from a perspective of a modular functional entity. The following describes the photographing apparatus and the display apparatus in embodiments of this application from a perspective of hardware processing.
The processor 1201 may be a general-purpose central processing unit (central processing unit, CPU), a microprocessor, an application-specific integrated circuit (application-specific integrated circuit, ASIC), or one or more integrated circuits configured to control program execution of solutions in this application.
The communication line 1207 may include a channel for transmitting information between the foregoing components.
The communication interface 1204 is an apparatus such as a transceiver, and is configured to communicate with another apparatus or a communication network, for example, Ethernet.
The memory 1203 may be a read-only memory (read-only memory, ROM), another type of static storage apparatus capable of storing static information and instructions, a random access memory (random access memory, RAM), or another type of dynamic storage apparatus capable of storing information and instructions. The memory may exist independently, and is connected to the processor 1201 through the communication line 1207. The memory may alternatively be integrated with the processor 1201.
The memory 1203 is configured to store computer-executable instructions for performing solutions in this application, and the processor 1201 controls execution of the computer-executable instructions. The processor 1201 is configured to execute the computer-executable instructions stored in the memory 1203, to implement the image processing method provided in the foregoing embodiments of this application.
Optionally, the computer-executable instructions in this embodiment of this application may also be referred to as application program code. This is not specifically limited in this embodiment of this application.
During specific implementation, in an embodiment, the communication apparatus may include a plurality of processors, for example, a processor 1201 and a processor 1202.
During specific implementation, in an embodiment, the communication apparatus may further include an output apparatus 1205 and an input apparatus 1206. The output apparatus 1205 communicates with the processor 1201, and may display information in a plurality of manners. The input apparatus 1206 communicates with the processor 1201, and may receive a user input in a plurality of manners. For example, the input apparatus 1206 may be a mouse, a touchscreen apparatus, or a sensing apparatus.
The communication apparatus may be a general-purpose apparatus or a dedicated apparatus. During specific implementation, the communication apparatus may be a portable computer, a mobile terminal, an apparatus with a structure similar to that in
All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used to implement the embodiments, all or some of the embodiments may be implemented in a form of a computer program product.
It may be clearly understood by persons skilled in the art that, for the purpose of convenient and brief description, for specific operating processes of the photographing apparatus, the display apparatus, the units, and the modules described above, reference may be made to corresponding processes in the method embodiments. Details are not described herein again.
In several embodiments provided in this application, it should be understood that the disclosed device and method may be implemented in other manners. For example, the described embodiments of the photographing apparatus and the display apparatus are merely examples. For example, division into the units is merely logical function division and may be other division during actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the shown or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces. The indirect couplings or communication connections between the modules or units may be implemented in electronic, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual requirements to achieve objectives of solutions of embodiments.
In addition, functional units in embodiments of this application may be integrated into one processing unit, each of the units may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in a form of hardware, or may be implemented in a form of a software functional unit.
When the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, the integrated unit may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or a part contributing to the conventional technology, or all or some of the technical solutions may be implemented in a form of a software product. The computer software product is stored in a storage medium and includes several instructions for instructing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods in embodiments of this application. The storage medium includes any medium that can store program code, for example, a USB flash drive, a removable hard disk, a read-only memory (read-only memory, ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disc.
The foregoing embodiments are merely intended to describe the technical solutions of this application, but not to limit this application. Although this application is described in detail with reference to the foregoing embodiments, persons of ordinary skill in the art should understand that they may still make modifications to the technical solutions described in the foregoing embodiments or make equivalent replacements to some technical features thereof, without departing from the scope of the technical solutions of embodiments of this application.
Number | Date | Country | Kind
---|---|---|---
202011473651.4 | Dec 2020 | CN | national
This application is a continuation of International Application No. PCT/CN2021/132247, filed on Nov. 23, 2021, which claims priority to Chinese Patent Application No. 202011473651.4, filed on Dec. 15, 2020. The disclosures of the aforementioned applications are hereby incorporated by reference in their entireties.
 | Number | Date | Country
---|---|---|---
Parent | PCT/CN2021/132247 | Nov 2021 | US
Child | 18334453 | | US