This application relates to the field of computer technologies, and specifically, to a filming parameter configuration method and apparatus, a device, a storage medium, and a product.
With the progress of scientific and technological research, virtual film production technologies are widely applied to processes such as video photographing, film production, and advertisement photographing. A virtual film production technology usually makes a photographed video or movie more realistic by combining a real scene and a virtual scene (e.g., combining a depth of field effect of the real scene and a depth of field effect of the virtual scene).
Examples of this application provide a parameter configuration method and apparatus, a device, a storage medium, and a product, to improve configuration efficiency of virtual scene parameters.
According to an aspect, an example of this application provides a parameter configuration method, comprising:
In addition, this application provides a computing device, comprising:
Also, this application provides a non-transitory computer-readable storage medium storing instructions that, when executed, implement the foregoing parameter configuration method.
In addition, this application provides a computer program product or a computer program. The computer program product or the computer program includes computer instructions. The computer instructions are stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, to cause the computer device to perform the foregoing parameter configuration method.
In the examples of this application, by determining, according to the photographing parameters of the target camera device and a preset reference dataset, target virtual scene parameters corresponding to virtual content to be photographed by the target camera device, and displaying the to-be-photographed virtual content according to the target virtual scene parameters, the target virtual scene parameters may be changed in real time according to the photographing parameters of the target camera device, to display the to-be-photographed virtual content. Therefore, the target camera device can almost simultaneously photograph a real scene and virtual content that corresponds to the real scene and that is displayed on a display device, which has high real-time performance. In addition, a transition between the real scene and the virtual content that is displayed according to the virtual scene parameters can be made smooth in an image photographed by the target camera device according to the photographing parameters of the target camera device.
To describe the technical solutions in the examples of this application or in the related art more clearly, the following briefly introduces the accompanying drawings for describing the examples or the related art. Apparently, the accompanying drawings in the following description show merely some examples of this application, and a person of ordinary skill in the art may still derive other drawings from the accompanying drawings without creative efforts.
The following clearly and completely describes the technical solutions in the examples of this application with reference to the accompanying drawings in the examples of this application. Apparently, the described examples are some of the examples of this application rather than all of the examples. All other examples obtained by a person of ordinary skill in the art based on the examples of this application without creative efforts shall fall within the protection scope of this application.
In a process of virtual film production, on a photographing site, a film producer usually adjusts a virtual scene parameter (e.g., a photographing parameter of a virtual camera device) according to a depth of field effect in a real scene (or a real space) presented in a camera lens as seen by an unaided eye, so that the depth of field effect of the real scene and a depth of field effect of a virtual scene (or a virtual space) in a light-emitting diode (LED) curtain wall remain connected. During this adjustment, real-time performance may be poor, and a depth of field transition between the real scene and the virtual scene may be unsmooth.
This application relates to a virtual film production technology implemented through a computer technology. The following briefly describes related terms:
Virtual film production: It refers to a series of computer-assisted film production and visual film production methods. Virtual film production combines virtual reality and augmented reality with a computer-generated imagery (CGI) technology and a game engine technology, so that producers can see scenes that are obtained by fusing a virtual scene and a real scene and that are spread out in front of the producers, as if the scenes were photographed in the real scene.
LED curtain wall: A large LED screen configured to display virtual content on a photographing site of virtual film production.
Screen front scene: An actual photographing prop or set placed in front of the LED curtain wall.
On-site photographing camera: A camera on the photographing site of virtual film production that simultaneously captures a fused picture of the LED screen and the screen front scene.
Virtual scene: It is a digital scene made in a game engine according to an artist requirement or a real scene.
Real scene (on-site scene): It is a real prop or scene built in virtual film production.
In addition, the examples of this application further relate to Artificial Intelligence (AI). The following briefly describes related terms and concepts of AI:
AI involves a theory, a method, a technology, and an application system that use a digital computer or a machine controlled by the digital computer to simulate, extend, and expand human intelligence, perceive an environment, obtain knowledge, and use knowledge to obtain an optimal result. In other words, AI is a comprehensive technology in computer science and attempts to understand the essence of intelligence and produce a new intelligent machine that can react in a manner similar to human intelligence. AI is to study the design principles and implementation methods of various intelligent machines, to enable the machines to have the functions of perception, reasoning, and decision-making.
The AI technology is a comprehensive discipline, and relates to a wide range of fields including both hardware-level technologies and software-level technologies. The basic AI technologies generally include technologies such as a sensor, a dedicated AI chip, cloud computing, distributed storage, a big data processing technology, an operating/interaction system, and electromechanical integration. AI software technologies mainly include several major directions such as a computer vision technology, a speech processing technology, a natural language processing technology, and machine learning/deep learning.
Machine learning is a multi-field interdiscipline, and relates to a plurality of disciplines such as the probability theory, statistics, the approximation theory, convex analysis, and the algorithm complexity theory. Machine learning specializes in studying how a computer simulates or implements a human learning behavior to obtain new knowledge or skills, and reorganize an existing knowledge structure, to keep improving its performance. Machine learning is the core of AI, is a basic way to make the computer intelligent, and is applied to various fields of AI. Machine learning/deep learning generally includes technologies such as an artificial neural network, a belief network, reinforcement learning, transfer learning, inductive learning, and learning from demonstrations.
Deep learning: The concept of deep learning originated from research on artificial neural networks. A multi-layer perceptron including a plurality of hidden layers is one of deep learning structures. During deep learning, an attribute type or feature is represented by combining lower-layer features into a more abstract higher layer, to find a distributed feature representation of data. The examples of this application mainly relate to training an initial model by using a reference dataset to obtain a virtual scene parameter prediction model. The virtual scene parameter prediction model may be configured to analyze photographing parameters of a target camera device, and output target virtual scene parameters corresponding to virtual content to be photographed by the target camera device.
The following briefly describes a parameter configuration solution provided in the examples of this application. Through the parameter configuration solution, parameter configuration efficiency and accuracy can be improved.
The camera device 101 and the display device 102 in
A general principle of the parameter configuration solution is as follows:
In an implementation, the reference camera device includes a plurality of groups of photographing parameters. A manner of determining a correspondence between reference photographing parameters and virtual scene parameters is as follows. The reference camera device is enabled to photograph a reference object according to the first group of reference photographing parameters to obtain a real image. A virtual object corresponding to the reference object is adjusted by adjusting the virtual scene parameters, and the reference camera device is enabled to photograph the virtual object according to the first group of reference photographing parameters to obtain a virtual image. When the real image matches the virtual image (for example, a depth of field effect of the virtual image matches a depth of field effect of the real image), the virtual scene parameters are recorded, and a correspondence between the virtual scene parameters and the first group of reference photographing parameters is established. The first group of reference photographing parameters may be any one of the plurality of groups of photographing parameters.
In an example, in a real scene, N reference objects are arranged in ascending order of distances from the reference camera device, where N is an integer greater than 1. The N reference objects are located on the same straight line and are at equal intervals from each other. Images obtained by photographing the N reference objects by the reference camera device according to the first group of reference photographing parameters are referred to as real images. When a real image is collected, a distance between the reference camera device and a closest reference object is the same as the interval between the reference objects, and the reference camera device and the N reference objects are located on the same straight line. The focal point of the reference camera device may be on one of N−x reference objects at distances from the reference camera device less than a distance threshold among the N reference objects.
After the real image is obtained, x reference objects at distances from the reference camera device greater than the distance threshold among the N reference objects are removed, and the x removed reference objects are simulated in the display device in the real scene. For example, x virtual objects are displayed on the display device, the x virtual objects are located on the same straight line as the N−x reference objects that are not removed, and display effects of the x virtual objects may be adjusted through the virtual scene parameters. An image obtained by photographing, by the reference camera device according to the first group of reference photographing parameters (at the same position at which the real image is photographed), the N−x reference objects that remain and the x virtual objects that are simulated in the display device is referred to as a virtual image. A focal point of photographing a virtual image by the reference camera device is the same as the foregoing focal point of photographing the N reference objects. In the virtual scene, a focal length of the virtual camera device is the same as that of the reference camera device. When the real image matches the virtual image (for example, when a size or diameter of the reference object in the real image matches that of the virtual object in the corresponding virtual image), virtual scene parameters are recorded, and a correspondence between the virtual scene parameters and the first group of reference photographing parameters is established.
After the correspondence between the virtual scene parameters and the first group of reference photographing parameters is established, the aperture of the reference camera device may be changed a plurality of times, and the foregoing process is repeated, to establish a correspondence between a plurality of groups of virtual scene parameters and the reference photographing parameters.
Then, the focal point of the reference camera device may be changed (for example, to be on another one of the N−x reference objects at the distances from the reference camera device less than the distance threshold among the N reference objects), and the foregoing process is repeated. A correspondence between more groups of virtual scene parameters and the reference photographing parameters is established. For the foregoing process, reference may be made to
For example, in a real scene, 10 reference objects (for example, 10 balls) are arranged in ascending order of distances from the reference camera device. The 10 reference objects are located on the same straight line and are at equal intervals from each other (for example, the interval is 2 meters). The reference camera device is enabled to photograph the 10 reference objects according to the first group of reference photographing parameters to obtain a real image. When a real image is collected, if the interval between the reference objects is 2 meters, the distance between the reference camera device and the closest reference object is also 2 meters, and the reference camera device and the 10 reference objects are located on the same straight line. After the real image is obtained, reference objects at distances from the reference camera device greater than the distance threshold among the 10 reference objects are removed (for example, 5 reference objects farthest from the reference camera device are removed), and the 5 removed reference objects are simulated in the display device in the real scene. That is to say, 5 virtual objects are displayed on the display device, the 5 virtual objects are located on the same straight line as the 5 reference objects that are not removed, and display effects of the 5 virtual objects may be adjusted through the virtual scene parameters. The reference camera device photographs, according to the first group of reference photographing parameters (at the same position at which the real image is photographed), the 5 reference objects that remain and the 5 virtual objects that are simulated in the display device to obtain a virtual image. When the real image matches the virtual image (for example, a depth of field effect of the virtual image matches a depth of field effect of the real image, or a size or diameter of the reference object in the real image matches that of the virtual object in the corresponding virtual image), virtual scene parameters are recorded, and a correspondence between the virtual scene parameters and the first group of reference photographing parameters is established.
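For illustration only, the following Python sketch shows one way such a matched correspondence could be recorded into the reference dataset; the field names, the `reference_dataset` container, and all numeric values are assumptions introduced here and are not limiting.

```python
# Illustrative sketch only: field names, container, and values are assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class CorrespondenceEntry:
    # Reference photographing parameters (real camera) used for both images.
    focal_length: float       # e.g., in millimeters
    aperture_value: float     # f-number
    focal_point_value: float  # distance to the focused subject, e.g., in meters
    # Virtual scene parameters recorded when the virtual image matches the real image.
    virtual_aperture_value: float
    virtual_focal_point_value: float

reference_dataset: List[CorrespondenceEntry] = []

def record_match(reference_params, virtual_params):
    """Add one correspondence to the reference dataset after a match is confirmed."""
    reference_dataset.append(CorrespondenceEntry(*reference_params, *virtual_params))

# First group of reference photographing parameters and the matching virtual
# scene parameters (placeholder numbers).
record_match((16.0, 2.8, 2.0), (2.5, 2.1))
```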
For example, the photographing parameters of the camera device 101 obtained in real time include: a focal point, a focal length, and an aperture value. It is determined whether the real-time aperture value of the camera device 101 is consistent with a recorded aperture value of the reference camera device in the reference dataset, to obtain two results:
Focal point value: The position (as a percentage) of the focal point value of the camera device 101 within the interval La→Lb is calculated. This percentage is multiplied by the difference between the recorded focal points of the reference camera device corresponding to La and Lb, and the product is added to the recorded focal point corresponding to La.
Aperture value: The position (as a percentage) of the aperture value of the camera device 101 within the interval Fa→Fb is calculated. This percentage is multiplied by the difference between the recorded apertures of the reference camera device corresponding to Fa and Fb, and the product is added to the recorded aperture corresponding to Fa.
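As a hedged, concrete illustration of the interpolation just described, the Python sketch below computes the percentage position of a real-time value within the interval La→Lb (or Fa→Fb) and maps it onto the corresponding recorded reference values; the helper name and the numeric values are assumptions.

```python
def interpolate(value, lower, upper, recorded_lower, recorded_upper):
    """Map the percentage position of `value` within [lower, upper] onto the
    recorded reference values (simple linear interpolation, as described above)."""
    if upper == lower:
        return recorded_lower
    percentage = (value - lower) / (upper - lower)
    return recorded_lower + percentage * (recorded_upper - recorded_lower)

# Focal point value: the real-time value 3.4 lies between La = 3.0 and Lb = 4.0,
# whose recorded focal points in the reference dataset are 2.9 and 3.8 (placeholders).
scene_focal_point = interpolate(3.4, 3.0, 4.0, 2.9, 3.8)

# Aperture value: the real-time value 3.2 lies between Fa = 2.8 and Fb = 4.0,
# whose recorded apertures are 2.6 and 3.9 (placeholders).
scene_aperture = interpolate(3.2, 2.8, 4.0, 2.6, 3.9)
```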
The foregoing example is also applicable to operation S304 in
Based on the foregoing parameter configuration solution, the examples of this application provide a more detailed parameter configuration method. The parameter configuration method provided in the examples of this application is described in detail below with reference to the accompanying drawings.
S201. Obtain photographing parameters of a target camera device.
The photographing parameters of the target camera device are parameters used by the target camera device to photograph an image. In an implementation, the photographing parameters include lens parameters of the target camera device. Specifically, the lens parameters of the target camera device may include a focal length, an aperture value, and a focal point value. The focal point value refers to a distance between the target camera device and a to-be-photographed subject of the device. The photographing parameters of the target camera device are, for example, obtained by the computer device from the target camera device after the photographing parameters of the target camera device are manually or automatically set in real time on the target camera device. The photographing parameters of the target camera device are photographing parameters that are configured for photographing a fused picture of a real scene of a photographing site and virtual content displayed on a display device of the photographing site.
S202. Determine, according to the photographing parameters of the target camera device and a reference dataset, target virtual scene parameters corresponding to virtual content to be photographed by the target camera device.
The computer device may obtain the reference dataset locally or from another device via a network. The reference dataset includes a correspondence between reference photographing parameters and virtual scene parameters. The reference photographing parameters are configured for describing photographing parameters of the reference camera device (for example, a real camera device used when the reference dataset is configured). The reference photographing parameters include lens parameters of the reference camera device. The virtual scene parameters are configured for describing photographing parameters of a virtual camera device in a virtual scene. The photographing parameters of the virtual camera device include lens parameters of the virtual camera device. By adjusting the photographing parameters of the virtual camera device, presentation of the virtual scene may be adjusted, for example, a depth of field effect of the virtual scene is adjusted.
In an implementation, the computer device determines, according to a relationship between the photographing parameters of the target camera device and the photographing parameters of the reference camera device and a correspondence between the photographing parameters of the reference camera device and the photographing parameters of the virtual camera device, the target virtual scene parameters corresponding to the virtual content to be photographed by the camera device 101 (namely, the photographing parameters of the virtual camera device that correspond to the photographing parameters of the target camera device).
In an example, the reference dataset includes M*N*P groups of photographing parameters. Each group of photographing parameters includes a focal length, an aperture value, and a focal point value, M is a quantity of candidate focal lengths, N is a quantity of candidate aperture values, P is a quantity of candidate focal point values, and M, N, and P are all positive integers. The photographing parameters of the target camera device include an actual focal length, an actual aperture value, and an actual focal point value. A specific implementation in which the computer device determines, according to the photographing parameters of the target camera device and a preset reference dataset, target virtual scene parameters corresponding to virtual content to be photographed by the target camera device is: determining the actual focal length as the scene focal length of the target virtual scene, and determining whether there is a candidate focal length that is consistent with the actual focal length in the M candidate focal lengths of the reference camera device; and determining, if the actual focal length is consistent with an ith candidate focal length of the reference camera device, a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the ith candidate focal length; or determining, if the actual focal length is between an ith candidate focal length and an (i+1)th candidate focal length of the reference camera device, a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the ith candidate focal length and a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the (i+1)th candidate focal length, where i is a positive integer less than M.
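The case analysis above may be pictured with the following hedged Python sketch, which decides whether the actual focal length coincides with a candidate focal length or falls between two neighboring candidates; the helper name and the candidate values are illustrative assumptions.

```python
def find_bracket(candidates, actual):
    """Return (i, i) on an exact match, or (i, i + 1) when `actual` lies between
    the i-th and (i+1)-th candidates; `candidates` is assumed to be ascending."""
    for i, value in enumerate(candidates):
        if actual == value:
            return i, i
        if i + 1 < len(candidates) and value < actual < candidates[i + 1]:
            return i, i + 1
    raise ValueError("actual value is outside the range covered by the reference dataset")

candidate_focal_lengths = [16.0, 24.0, 35.0, 50.0]  # the M candidate focal lengths (placeholders)

# Exact match: only the parameters associated with the i-th candidate focal length are used.
exact = find_bracket(candidate_focal_lengths, 24.0)    # -> (1, 1)

# In-between: the parameters associated with both the i-th and (i+1)-th candidate
# focal lengths contribute to the scene aperture value and scene focal point value.
between = find_bracket(candidate_focal_lengths, 30.0)  # -> (1, 2)
```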
In another implementation, the computer device trains an initial model by using a reference dataset to obtain a virtual scene parameter prediction model. The virtual scene parameter prediction model may be configured to analyze photographing parameters of a target camera device, and output target virtual scene parameters corresponding to virtual content to be photographed by the target camera device. A process in which the computer device trains an initial model by using a reference dataset to obtain a virtual scene parameter prediction model is: analyzing photographing parameters of the reference camera device by using the initial model, to predict photographing parameters of the virtual camera device; and calculating, by using a loss function, a loss value between the predicted photographing parameters of the virtual camera device and the photographing parameters of the virtual camera device that are recorded in the reference dataset in correspondence with the photographing parameters of the reference camera device, and adjusting parameters in the initial model based on the loss value, to obtain the virtual scene parameter prediction model.
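As one possible, non-limiting instantiation of this training process, the sketch below regresses virtual scene parameters from reference photographing parameters with a small fully connected network in PyTorch and a mean squared error loss; the architecture, optimizer, loss function, and data are assumptions made here for illustration.

```python
import torch
from torch import nn

# Hypothetical training data: each row of X holds reference photographing parameters
# (focal length, aperture value, focal point value); each row of Y holds the recorded
# virtual scene parameters (virtual aperture value, virtual focal point value).
X = torch.tensor([[16.0, 2.8, 2.0], [16.0, 4.0, 2.0], [24.0, 2.8, 4.0]])
Y = torch.tensor([[2.5, 2.1], [3.6, 2.2], [2.4, 4.3]])

model = nn.Sequential(nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 2))  # initial model
loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for _ in range(1000):
    optimizer.zero_grad()
    predicted = model(X)          # predicted photographing parameters of the virtual camera device
    loss = loss_fn(predicted, Y)  # loss against the recorded virtual scene parameters
    loss.backward()
    optimizer.step()              # adjust parameters in the model based on the loss value
```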
S203. Display the virtual content according to the target virtual scene parameters.
The computer device displays the virtual content according to the target virtual scene parameters, and the target camera device photographs the virtual content displayed by the computer device to obtain a virtual film production image.
Operation S203 may specifically include: displaying the to-be-photographed virtual content on the display device according to the target virtual scene parameters, to cause the target camera device to photograph a fused picture of the real scene and the to-be-photographed virtual content displayed on the display device.
In the examples of this application, by determining, according to the photographing parameters of the target camera device and a preset reference dataset, target virtual scene parameters corresponding to virtual content to be photographed by the target camera device, and then displaying the to-be-photographed virtual content according to the target virtual scene parameters, the target virtual scene parameters may be changed in real time according to the photographing parameters of the target camera device, to display the to-be-photographed virtual content. Therefore, the target camera device can almost simultaneously photograph a real scene and virtual content that corresponds to the real scene and that is displayed on a display device, which has high real-time performance. In addition, a transition between the real scene and the virtual content that is displayed according to the virtual scene parameters can be made smooth in an image photographed by the target camera device according to the photographing parameters of the target camera device.
S301. Configure a reference dataset according to photographing parameters of a reference camera device and photographing parameters of a virtual camera device.
The reference camera device includes a plurality of groups of photographing parameters. A configuration process in which the computer device configures the reference dataset is as follows. The computer device obtains a first group of photographing parameters of the reference camera device, and photographs a real image of a reference object based on the first group of photographing parameters by using the reference camera device. The first group of photographing parameters is any one of the plurality of groups of photographing parameters. In an aspect, the reference object is simulated to obtain a virtual object, and the virtual object is adjusted by adjusting virtual scene parameters. The reference object is simulated, for example, by a game engine. In another aspect, the virtual object is photographed based on the first group of photographing parameters by using the reference camera device, to obtain a virtual image. The virtual image is compared with the real image, and target virtual scene parameters corresponding to a case that the virtual image matches the real image are recorded. A correspondence between the first group of photographing parameters and the target virtual scene parameters is established, and the correspondence is added to the reference dataset.
The photographing a real image of a reference object based on the first group of photographing parameters by using the reference camera device specifically includes the following content. Images of N reference objects in a real scene are photographed by using the reference camera device. The N reference objects are arranged in ascending order of distances from the reference camera device, are located on the same straight line, and are at equal intervals from each other. A distance between the reference camera device and a closest reference object is the same as the interval between the reference objects. The first group of photographing parameters of the reference camera device includes a focal length, an aperture value, and a focal point value. The focal point corresponding to the focal point value is on one of N−x reference objects at distances from the reference camera device less than a distance threshold among the N reference objects, where N is an integer greater than 1, and x<N.
The photographing the virtual object based on the first group of photographing parameters by using the reference camera device, to obtain a virtual image specifically includes the following content. Images of the N−x reference objects at the distances from the reference camera device less than the distance threshold among the N reference objects, and x virtual objects in the virtual scene displayed on a display device at the interval from an (N−x)th reference object are photographed by the reference camera device based on the first group of photographing parameters. The x virtual objects correspond to the x reference objects at distances from the reference camera device greater than the distance threshold among the N reference objects.
The comparing the virtual image with the real image, and recording target virtual scene parameters corresponding to a case that the virtual image matches the real image includes: comparing sizes of the x virtual objects in the virtual image with sizes of the x reference objects in the corresponding real image, adjusting the virtual scene parameters according to a comparison result, to cause the sizes of the x virtual objects in the virtual image to match the sizes of the x reference objects in the corresponding real image, and recording virtual scene parameters adjusted during matching as the target virtual scene parameters.
In an implementation, the reference dataset includes M*N*P groups of photographing parameters. Each group of photographing parameters includes a focal length, an aperture value, and a focal point value, M is a quantity of candidate focal lengths, N is a quantity of candidate aperture values, P is a quantity of candidate focal point values, and M, N, and P are all positive integers. The first group of photographing parameters includes an ith candidate focal length, a jth candidate aperture value, and a kth candidate focal point value, where i is a positive integer less than M, j is a positive integer less than N, and k is a positive integer less than P.
In an implementation, a specific manner in which the computer device adjusts the virtual object by adjusting the virtual scene parameters is: determining the ith candidate focal length as a virtual focal length of the virtual camera device, and configuring a virtual aperture value and a virtual focal point value of the virtual camera device. When the real image collected by the reference camera device in the manner in
Table 1 is a schematic table for recording virtual scene parameters corresponding to various groups of photographing parameters according to an example of this application.
As shown in Table 1, when the candidate focal length is A1, the candidate aperture value is B1, and the candidate focal point value is C1, the virtual scene parameters are: A1, D1, and E1, where A1 is a virtual focal length, D1 is a virtual aperture value, and E1 is a virtual focal point value. One candidate focal length, one candidate aperture value, and one candidate focal point value form a group of photographing parameters of the reference camera device. Each group of photographing parameters of the reference camera device serves as an index. That is to say, each group of photographing parameters of the reference camera device corresponds to unique virtual scene parameters. For example, the virtual scene parameters corresponding to A1, B1, and C1 are A1, D1, and E1.
In Table 1, the order of the candidate focal length, the candidate aperture value, and the candidate focal point value may be changed, for example, by swapping the candidate aperture value and the candidate focal point value. The order in Table 1 represents the virtual scene parameters corresponding to different candidate focal point values under each candidate aperture value, while the swapped order represents the virtual scene parameters corresponding to different candidate aperture values under each candidate focal point value. Since the candidate focal length is consistent with the virtual focal length, the virtual focal length need not be recorded in the virtual scene parameters.
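A hedged Python sketch of the index structure implied by Table 1 is shown below: a nested mapping keyed by candidate focal length, candidate aperture value, and candidate focal point value that returns the recorded virtual aperture value and virtual focal point value. The keys and values are placeholders standing in for A1, B1, C1, D1, and E1.

```python
# Nested lookup table in the spirit of Table 1; all numbers are placeholders.
reference_dataset = {
    16.0: {                    # candidate focal length (A1)
        2.8: {                 # candidate aperture value (B1)
            2.0: (2.5, 2.1),   # candidate focal point (C1) -> (virtual aperture D1, virtual focal point E1)
            4.0: (2.7, 4.2),
        },
        4.0: {
            2.0: (3.6, 2.2),
        },
    },
}

def lookup(focal_length, aperture_value, focal_point_value):
    """Exact-match lookup: each group of reference photographing parameters indexes
    a unique pair of virtual scene parameters."""
    virtual_aperture, virtual_focal_point = reference_dataset[focal_length][aperture_value][focal_point_value]
    # The virtual focal length equals the candidate focal length, so it is not stored separately.
    return focal_length, virtual_aperture, virtual_focal_point
```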
S302. Obtain photographing parameters of a target camera device.
S303. Obtain a reference dataset.
For specific implementations of operation S302 and operation S303, reference may be made to the implementation of operation S202 in
S304. Determine, according to a relationship between the photographing parameters of the target camera device and the photographing parameters of the reference camera device and a correspondence between the photographing parameters of the reference camera device and the photographing parameters of the virtual camera device, the target virtual scene parameters corresponding to the virtual content to be photographed by the target camera device.
The computer device determines the actual focal length as a scene focal length of a target virtual scene corresponding to the virtual content to be photographed by the target camera device. For example, assuming that an actual focal length of the target camera device is 16 mm, the computer device configures a scene focal length of a target virtual scene corresponding to virtual content to be photographed by the target camera device to be 16 mm.
S11: The computer device determines whether there is a candidate focal length that is the same as the actual focal length of the target camera device in the M candidate focal lengths of the reference camera device.
The computer device determines, if the actual focal length of the target camera device is consistent with an ith candidate focal length of the reference camera device, a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the ith candidate focal length, where i is a positive integer less than M. For details, refer to operation S12.
The computer device determines, if the actual focal length of the target camera device is between an ith candidate focal length and an (i+1)th candidate focal length of the reference camera device (for example, the ith candidate focal length < the actual focal length < the (i+1)th candidate focal length), a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the ith candidate focal length and a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the (i+1)th candidate focal length, where i is a positive integer less than M. For details, refer to operation S15.
S12: When the actual focal length of the target camera device is consistent with the ith candidate focal length of the reference camera device, the computer device determines whether there is a candidate aperture value that is the same as the actual aperture value of the target camera device in the N candidate aperture values of the reference camera device.
The computer device determines, if the actual aperture value of the target camera device is consistent with a jth candidate aperture value associated with an ith candidate focal length of the reference camera device, a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value of the target camera device and the candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, where j is a positive integer less than N. For details, refer to operation S13.
The computer device determines, if the actual aperture value of the target camera device is between the jth candidate aperture value associated with the ith candidate focal length of the reference camera device and the (j+1)th candidate aperture value associated with the ith candidate focal length of the reference camera device (for example, the jth candidate aperture value < the actual aperture value < the (j+1)th candidate aperture value), a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value of the target camera device and the candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device and a relationship between the actual focal point value of the target camera device and the candidate focal point value associated with the (j+1)th candidate aperture value associated with the ith candidate focal length of the reference camera device, where j is a positive integer less than N. For details, refer to operation S14.
S13: When the actual focal length of the target camera device is consistent with the ith candidate focal length of the reference camera device, and the actual aperture value of the target camera device is consistent with the jth candidate aperture value of the reference camera device, the computer device determines whether there is a candidate focal point value that is the same as the actual focal point value of the target camera device in the P candidate focal point values of the reference camera device.
If the actual focal point value of the target camera device is consistent with a kth candidate focal point value associated with a jth candidate aperture value associated with an ith candidate focal length of the reference camera device, where k is a positive integer less than P, the computer device determines a virtual aperture value and a virtual focal point value that correspond to the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length as a scene aperture value and a scene focal point value of the target virtual scene respectively. Specifically, referring to the foregoing Table 1, for the reference camera device, an ith candidate focal length is Ai, a jth candidate aperture value is Bj, and a kth candidate focal point value is Ck. Then, the computer device determines, based on Ai, Bj, and Ck, a scene aperture value and a scene focal point value of a virtual scene corresponding thereto from Table 1, and determines the scene aperture value and the scene focal point value of the virtual scene and the actual focal length of the target camera device as target virtual scene parameters.
If the actual focal point value of the target camera device is between the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device and the (k+1)th candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device (for example, the kth candidate focal point value < the actual focal point value < the (k+1)th candidate focal point value), the computer device calculates a scene aperture value and a scene focal point value of the target virtual scene according to the virtual aperture value and the virtual focal point value corresponding to the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, and a virtual aperture value and a virtual focal point value corresponding to the (k+1)th candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device. Specifically, referring to the foregoing Table 1, for the reference camera device, an ith candidate focal length is Ai, a jth candidate aperture value is Bj, a kth candidate focal point value is Ck, and a (k+1)th candidate focal point value is Ck+1 (assuming that Ck+1>Ck). In a specific implementation, assuming that an actual focal point value of the target camera device is R1, the computer device determines, based on Ai, Bj, and Ck, a scene aperture value Tk and a scene focal point value Wk of a virtual scene corresponding thereto from Table 1; and determines, based on Ai, Bj, and Ck+1, a scene aperture value Tk+1 and a scene focal point value Wk+1 of a virtual scene corresponding thereto from Table 1 (assuming that Wk+1>Wk). Then, the target virtual scene parameters are as follows:
In addition, the target scene focal length is an actual focal length of the target camera device.
S14: When the actual focal length of the target camera device is consistent with the ith candidate focal length of the reference camera device, and the actual aperture value of the target camera device is between the jth candidate aperture value and the (j+1)th candidate aperture value of the reference camera device, the computer device determines whether there is a candidate focal point value that is the same as the actual focal point value of the target camera device in the P candidate focal point values of the reference camera device.
If the actual focal point value of the target camera device is consistent with a kth candidate focal point value associated with a jth candidate aperture value associated with an ith candidate focal length of the reference camera device, a scene aperture value and a scene focal point value of the target virtual scene are calculated according to a virtual aperture value and a virtual focal point value that correspond to the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, and a virtual aperture value and a virtual focal point value that correspond to a kth candidate focal point value associated with the (j+1)th candidate aperture value associated with the ith candidate focal length of the reference camera device. Specifically, referring to the foregoing Table 1, for the reference camera device, an ith candidate focal length is Ai, a jth candidate aperture value is Bj, a (j+1)th candidate aperture value is Bj+1, and a kth candidate focal point value is Ck. In a specific implementation, assuming that an actual aperture value of the target camera device is Q1, the computer device determines, based on Ai, Bj, and Ck, a scene aperture value Tk and a scene focal point value Wk of a virtual scene corresponding thereto from Table 1; and determines, based on Ai, Bj+1 (assuming that Bj+1>Bj), and Ck, a scene aperture value Uk (assuming that Uk>Tk) and a scene focal point value Vk of a virtual scene corresponding thereto from Table 1. Then, the target virtual scene parameters are as follows:
In addition, the target scene focal length is an actual focal length of the target camera device.
If the actual focal point value of the target camera device is between the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device and the (k+1)th candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, a scene aperture value and a scene focal point value of the target virtual scene are calculated according to the virtual aperture value and the virtual focal point value corresponding to the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, a virtual aperture value and a virtual focal point value corresponding to the (k+1)th candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, a virtual aperture value and a virtual focal point value that correspond to a kth candidate focal point value associated with the (j+1)th candidate aperture value associated with the ith candidate focal length of the reference camera device, and a virtual aperture value and a virtual focal point value that correspond to a (k+1)th candidate focal point value associated with the (j+1)th candidate aperture value associated with the ith candidate focal length of the reference camera device. Specifically, referring to the foregoing Table 1, for the reference camera device, an ith candidate focal length is Ai, a jth candidate aperture value is Bj, a (j+1)th candidate aperture value is Bj+1, a kth candidate focal point value is Ck, and a (k+1)th candidate focal point value is Ck+1.
In a specific implementation, it is assumed that for the target camera device, an actual focal point value is R2 and an actual aperture value is Q2. The computer device determines, based on Ai, Bj, and Ck, a scene aperture value Tk and a scene focal point value Wk of a virtual scene corresponding thereto from Table 1; determines, based on Ai, Bj, and Ck+1 (assuming that Ck+1>Ck), a scene aperture value Tk+1 and a scene focal point value Wk+1 of a virtual scene corresponding thereto from Table 1 (assuming that Wk+1>Wk); determines, based on Ai, Bj+1 (assuming that Bj+1>Bj), and Ck, a scene aperture value Uk and a scene focal point value Vk of a virtual scene corresponding thereto from Table 1; and determines, based on Ai, Bj+1, and Ck+1, a scene aperture value Uk+1 and a scene focal point value Vk+1 of a virtual scene corresponding thereto from Table 1 (assuming that Vk+1>Vk). A specific correspondence is shown in Table 2.
Based on the foregoing Table 2, the target virtual scene parameters are as follows:
where Z1 is the smaller one of (Uk+Uk+1)/2 and (Tk+Tk+1)/2, and Z2 is the larger one of (Uk+Uk+1)/2 and (Tk+Tk+1)/2.
where Z3=[(R2−Ck)/(Ck+1−Ck)]*(Wk+1−Wk)+Wk, and Z4=[(R2−Ck)/(Ck+1−Ck)]*(Vk+1−Vk)+Vk. In addition, the target scene focal length is an actual focal length of the target camera device.
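For readers following the arithmetic, the Python sketch below evaluates the intermediate quantities Z1 to Z4 exactly as defined above; the recorded values and the actual focal point value R2 are placeholder assumptions, and the display formulas that combine these quantities into the final scene aperture value and scene focal point value are not reproduced here.

```python
# Placeholder recorded values from Table 2 (assumptions for illustration).
Tk, Tk1 = 2.5, 2.7    # scene apertures at (Ai, Bj, Ck) and (Ai, Bj, Ck+1)
Uk, Uk1 = 3.6, 3.8    # scene apertures at (Ai, Bj+1, Ck) and (Ai, Bj+1, Ck+1)
Wk, Wk1 = 2.0, 4.1    # scene focal points at (Ai, Bj, Ck) and (Ai, Bj, Ck+1)
Vk, Vk1 = 2.1, 4.2    # scene focal points at (Ai, Bj+1, Ck) and (Ai, Bj+1, Ck+1)
Ck, Ck1 = 2.0, 4.0    # k-th and (k+1)-th candidate focal point values
R2 = 3.0              # actual focal point value of the target camera device

Z1 = min((Uk + Uk1) / 2, (Tk + Tk1) / 2)
Z2 = max((Uk + Uk1) / 2, (Tk + Tk1) / 2)
Z3 = (R2 - Ck) / (Ck1 - Ck) * (Wk1 - Wk) + Wk
Z4 = (R2 - Ck) / (Ck1 - Ck) * (Vk1 - Vk) + Vk
```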
S15: When the actual focal length of the target camera device is between an ith candidate focal length and an (i+1)th candidate focal length of the reference camera device, the computer device determines whether there is a candidate aperture value that is the same as the actual aperture value of the target camera device in the N candidate aperture values of the reference camera device.
The computer device determines, if the actual aperture value of the target camera device is consistent with a jth candidate aperture value associated with an ith candidate focal length of the reference camera device, a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value of the target camera device and the candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, and a relationship between the actual focal point value of the target camera device and the candidate focal point value associated with the jth candidate aperture value associated with the (i+1)th candidate focal length of the reference camera device, where j is a positive integer less than N. For details, refer to operation S16.
The computer device determines, if the actual aperture value of the target camera device is between the jth candidate aperture value associated with the ith candidate focal length of the reference camera device and the (j+1)th candidate aperture value associated with the ith candidate focal length, a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value of the target camera device and the candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, a relationship between the actual focal point value of the target camera device and the candidate focal point value associated with the (j+1)th candidate aperture value associated with the ith candidate focal length of the reference camera device, a relationship between the actual focal point value of the target camera device and the candidate focal point value associated with the jth candidate aperture value associated with the (i+1)th candidate focal length of the reference camera device, and a relationship between the actual focal point value of the target camera device and the candidate focal point value associated with the (j+1)th candidate aperture value associated with the (i+1)th candidate focal length of the reference camera device, where j is a positive integer less than N. For details, refer to operation S17.
S16: When the actual focal length of the target camera device is between an ith candidate focal length and an (i+1)th candidate focal length of the reference camera device, and the actual aperture value of the target camera device is consistent with the jth candidate aperture value of the reference camera device, the computer device determines whether there is a candidate focal point value that is the same as the actual focal point value of the target camera device in the P candidate focal point values of the reference camera device.
The computer device calculates, if the actual focal point value of the target camera device is consistent with a kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, a scene aperture value and a scene focal point value of the target virtual scene according to a virtual aperture value and a virtual focal point value that correspond to the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length, and a virtual aperture value and a virtual focal point value that correspond to a kth candidate focal point value associated with the jth candidate aperture value associated with the (i+1)th candidate focal length. Specifically, referring to the foregoing Table 1, for the reference camera device, an ith candidate focal length is Ai, an (i+1)th candidate focal length is Ai+1, a jth candidate aperture value is Bj, and a kth candidate focal point value is Ck. In a specific implementation, assuming that an actual focal length of the target camera device is FS1, the computer device determines, based on Ai, Bj, and Ck, a scene aperture value Tk and a scene focal point value Wk of a virtual scene corresponding thereto from Table 1; and determines, based on Ai+1, Bj, and Ck, a scene aperture value HAk and a scene focal point value HBk of a virtual scene corresponding thereto from Table 1. Then, the target virtual scene parameters are as follows:
In addition, the target scene focal length is an actual focal length (that is, FS1) of the target camera device.
If the actual focal point value of the target camera device is between the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device and the (k+1)th candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length, the computer device calculates a scene aperture value and a scene focal point value of the target virtual scene according to the virtual aperture value and the virtual focal point value corresponding to the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length, a virtual aperture value and a virtual focal point value corresponding to the (k+1)th candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length, a virtual aperture value and a virtual focal point value that correspond to a kth candidate focal point value associated with the jth candidate aperture value associated with the (i+1)th candidate focal length, and a virtual aperture value and a virtual focal point value that correspond to a (k+1)th candidate focal point value associated with the jth candidate aperture value associated with the (i+1)th candidate focal length. Specifically, referring to the foregoing Table 1, for the reference camera device, an ith candidate focal length is Ai, an (i+1)th candidate focal length is Ai+1, a jth candidate aperture value is Bj, a kth candidate focal point value is Ck, and a (k+1)th candidate focal point value is Ck+1.
In a specific implementation, assuming that for the target camera device, an actual focal length is FS2 and an actual focal point value is R3, the computer device determines, based on Ai, Bj, and Ck, a scene aperture value Tk and a scene focal point value Wk of a virtual scene corresponding thereto from Table 1; determines, based on Ai, Bj, and Ck+1 (assuming that Ck+1>Ck), a scene aperture value Tk+1 and a scene focal point value Wk+1 of a virtual scene corresponding thereto from Table 1 (assuming that Wk+1>Wk); determines, based on Ai+1, Bj, and Ck, a scene aperture value HAk and a scene focal point value HBk of a virtual scene corresponding thereto from Table 1; and determines, based on Ai+1, Bj, and Ck+1, a scene aperture value HAk+1 and a scene focal point value HBk+1 of a virtual scene corresponding thereto from Table 1 (assuming that HBk+1>HBk). A specific correspondence is shown in Table 3.
Based on the foregoing Table 3, the target virtual scene parameters are as follows:
where Z5=(Tk+Tk+1)/2, and Z6=(HAk+HAk+1)/2.
where Z7=[(R3−Ck)/(Ck+1−Ck)]*(Wk+1−Wk)+Wk, and Z8=[(R3−Ck)/(Ck+1−Ck)]*(HBk+1−HBk)+HBk. In addition, the target scene focal length is an actual focal length (that is, FS2) of the target camera device.
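Likewise, the quantities Z5 to Z8 defined above can be computed directly, as in the short sketch below; all recorded values and the actual focal point value R3 are placeholder assumptions.

```python
# Placeholder recorded values from Table 3 (assumptions for illustration).
Tk, Tk1 = 2.5, 2.7      # scene apertures at (Ai, Bj, Ck) and (Ai, Bj, Ck+1)
HAk, HAk1 = 2.6, 2.8    # scene apertures at (Ai+1, Bj, Ck) and (Ai+1, Bj, Ck+1)
Wk, Wk1 = 2.0, 4.1      # scene focal points at (Ai, Bj, Ck) and (Ai, Bj, Ck+1)
HBk, HBk1 = 2.2, 4.3    # scene focal points at (Ai+1, Bj, Ck) and (Ai+1, Bj, Ck+1)
Ck, Ck1 = 2.0, 4.0      # k-th and (k+1)-th candidate focal point values
R3 = 3.0                # actual focal point value of the target camera device

Z5 = (Tk + Tk1) / 2
Z6 = (HAk + HAk1) / 2
Z7 = (R3 - Ck) / (Ck1 - Ck) * (Wk1 - Wk) + Wk
Z8 = (R3 - Ck) / (Ck1 - Ck) * (HBk1 - HBk) + HBk
```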
S17: When the actual focal length of the target camera device is between an ith candidate focal length and an (i+1)th candidate focal length of the reference camera device, and the actual aperture value of the target camera device is between the jth candidate aperture value associated with the ith candidate focal length of the reference camera device and the (j+1)th candidate aperture value associated with the ith candidate focal length, the computer device determines whether there is a candidate focal point value that is the same as the actual focal point value of the target camera device in the P candidate focal point values of the reference camera device.
If the actual focal point value of the target camera device is consistent with a kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, the computer device calculates a scene aperture value and a scene focal point value of the target virtual scene according to the virtual aperture value and the virtual focal point value corresponding to the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length, a virtual aperture value and a virtual focal point value corresponding to the kth candidate focal point value associated with the (j+1)th candidate aperture value associated with the ith candidate focal length, a virtual aperture value and a virtual focal point value that correspond to a kth candidate focal point value associated with the jth candidate aperture value associated with the (i+1)th candidate focal length, and a virtual aperture value and a virtual focal point value that correspond to a kth candidate focal point value associated with the (j+1)th candidate aperture value associated with the (i+1)th candidate focal length. Specifically, referring to the foregoing Table 1, for the reference camera device, an ith candidate focal length is Ai, an (i+1)th candidate focal length is Ai+1, a jth candidate aperture value is Bj, a (j+1)th candidate aperture value is Bj+1, and a kth candidate focal point value is Ck.
In a specific implementation, assuming that for the target camera device, an actual focal length is FS3 and an actual aperture value is Q3, the computer device determines, based on Ai, Bj, and Ck, a scene aperture value Tk and a scene focal point value Wk of a virtual scene corresponding thereto from Table 1; determines, based on Ai, Bj+1 (assuming that Bj+1>Bj), and Ck, a scene aperture value Uk (assuming that Uk>Tk) and a scene focal point value Vk of a virtual scene corresponding thereto from Table 1; determines, based on Ai+1, Bj, and Ck, a scene aperture value HAk and a scene focal point value HBk of a virtual scene corresponding thereto from Table 1; and determines, based on Ai+1, Bj+1, and Ck, a scene aperture value GAk (assuming that GAk>HAk) and a scene focal point value GBk of a virtual scene corresponding thereto from Table 1. A specific correspondence is shown in Table 4.
Based on the foregoing Table 4, the target virtual scene parameters are as follows:
where Z9=[(Q3−Bj)/(Bj+1−Bj)]*(Uk−Tk)+Tk, and Z10=[(Q3−Bj)/(Bj+1−Bj)]*(GAk−HAk)+HAk.
where Z11=(Wk+Vk)/2, and Z12=(HBk+GBk)/2. In addition, the target scene focal length is an actual focal length (that is, FS3) of the target camera device.
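The quantities Z9 to Z12 follow the same pattern; the sketch below evaluates them as defined above, with placeholder recorded values and a placeholder actual aperture value Q3.

```python
# Placeholder recorded values from Table 4 (assumptions for illustration).
Tk, Uk = 2.5, 3.6      # scene apertures at (Ai, Bj, Ck) and (Ai, Bj+1, Ck)
HAk, GAk = 2.6, 3.7    # scene apertures at (Ai+1, Bj, Ck) and (Ai+1, Bj+1, Ck)
Wk, Vk = 2.0, 2.1      # scene focal points at (Ai, Bj, Ck) and (Ai, Bj+1, Ck)
HBk, GBk = 2.2, 2.3    # scene focal points at (Ai+1, Bj, Ck) and (Ai+1, Bj+1, Ck)
Bj, Bj1 = 2.8, 4.0     # j-th and (j+1)-th candidate aperture values
Q3 = 3.2               # actual aperture value of the target camera device

Z9 = (Q3 - Bj) / (Bj1 - Bj) * (Uk - Tk) + Tk
Z10 = (Q3 - Bj) / (Bj1 - Bj) * (GAk - HAk) + HAk
Z11 = (Wk + Vk) / 2
Z12 = (HBk + GBk) / 2
```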
If the actual focal point value of the target camera device is between the kth candidate focal point value and the (k+1)th candidate focal point value that are associated with the jth candidate aperture value associated with the ith candidate focal length of the reference camera device, the computer device calculates a scene aperture value and a scene focal point value of the target virtual scene according to the virtual aperture value and the virtual focal point value corresponding to the kth candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length, a virtual aperture value and a virtual focal point value corresponding to the (k+1)th candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length, a virtual aperture value and a virtual focal point value corresponding to the kth candidate focal point value associated with the (j+1)th candidate aperture value associated with the ith candidate focal length, a virtual aperture value and a virtual focal point value corresponding to the (k+1)th candidate focal point value associated with the (j+1)th candidate aperture value associated with the ith candidate focal length, a virtual aperture value and a virtual focal point value that correspond to a kth candidate focal point value associated with the jth candidate aperture value associated with the (i+1)th candidate focal length, a virtual aperture value and a virtual focal point value that correspond to a (k+1)th candidate focal point value associated with the jth candidate aperture value associated with the (i+1)th candidate focal length, a virtual aperture value and a virtual focal point value that correspond to a kth candidate focal point value associated with the (j+1)th candidate aperture value associated with the (i+1)th candidate focal length, and a virtual aperture value and a virtual focal point value that correspond to a (k+1)th candidate focal point value associated with the (j+1)th candidate aperture value associated with the (i+1)th candidate focal length. Specifically, referring to the foregoing Table 1, for the reference camera device, an ith candidate focal length is Ai, an (i+1)th candidate focal length is Ai+1, a jth candidate aperture value is Bj, a (j+1)th candidate aperture value is Bj+1, a kth candidate focal point value is Ck, and a (k+1)th candidate focal point value is Ck+1.
In a specific implementation, it is assumed that for the target camera device, an actual focal length is FS4, an actual aperture value is Q4, and an actual focal point value is R4. The computer device determines, based on Ai, Bj, and Ck, a scene aperture value Tk and a scene focal point value Wk of a virtual scene corresponding thereto from Table 1; determines, based on Ai, Bj, and Ck+1 (assuming that Ck+1>Ck), a scene aperture value Tk+1 and a scene focal point value Wk+1 of a virtual scene corresponding thereto from Table 1 (assuming that Wk+1>Wk); determines, based on Ai, Bj+1 (assuming that Bj+1>Bj), and Ck, a scene aperture value Uk and a scene focal point value Vk of a virtual scene corresponding thereto from Table 1; determines, based on Ai, Bj+1, and Ck+1, a scene aperture value Uk+1 and a scene focal point value Vk+1 of a virtual scene corresponding thereto from Table 1 (assuming that Vk+1>Vk); determines, based on Ai+1, Bj, and Ck, a scene aperture value HAk and a scene focal point value HBk of a virtual scene corresponding thereto from Table 1; determines, based on Ai+1, Bj, and Ck+1, a scene aperture value HAk+1 and a scene focal point value HBk+1 of a virtual scene corresponding thereto from Table 1 (assuming that HBk+1>HBk); determines, based on Ai+1, Bj+1, and Ck, a scene aperture value GAk and a scene focal point value GBk of a virtual scene corresponding thereto from Table 1; and determines, based on Ai+1, Bj+1, and Ck+1, a scene aperture value GAk+1 and a scene focal point value GBk+1 of a virtual scene corresponding thereto from Table 1 (assuming that GBk+1>GBk). A specific correspondence is shown in Table 5.
Based on the foregoing Table 5, the target virtual scene parameters are as follows:
where Z13=[(Q4−Bj)/(Bj+1−Bj)]*(Z2−Z1)+Z1; Z1 is the smaller one of (Uk+Uk+1)/2 and (Tk+Tk+1)/2, and Z2 is the larger one of (Uk+Uk+1)/2 and (Tk+Tk+1)/2; Z14=[(Q4−Bj)/(Bj+1−Bj)]*(Z16−Z15)+Z15; Z15 is the smaller one of (HAk+HAk+1)/2 and (GAk+GAk+1)/2, and Z16 is the larger one of (HAk+HAk+1)/2 and (GAk+GAk+1)/2.
where Z17=[(R4−Ck)/(Ck+1−Ck)]*(Wk+1−Wk)+Wk, Z18=[(R4−Ck)/(Ck+1−Ck)]*(Vk+1−Vk)+Vk, Z19=[(R4−Ck)/(Ck+1−Ck)]*(HBk+1−HBk)+HBk, and Z20=[(R4−Ck)/(Ck+1−Ck)]*(GBk+1−GBk)+GBk. In addition, the target scene focal length is an actual focal length (that is, FS4) of the target camera device.
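For this case, the Z13 to Z20 values can likewise be written as a short routine. The sketch below assumes the same hypothetical table mapping used in the earlier sketch and simply restates the formulas above; it is illustrative only and is not part of this application:

def z13_to_z20(table, a_i, a_i1, b_j, b_j1, c_k, c_k1, q4, r4):
    # Look up the eight entries of Table 5 (notation follows the text).
    t_k, w_k = table[(a_i, b_j, c_k)]
    t_k1, w_k1 = table[(a_i, b_j, c_k1)]
    u_k, v_k = table[(a_i, b_j1, c_k)]
    u_k1, v_k1 = table[(a_i, b_j1, c_k1)]
    ha_k, hb_k = table[(a_i1, b_j, c_k)]
    ha_k1, hb_k1 = table[(a_i1, b_j, c_k1)]
    ga_k, gb_k = table[(a_i1, b_j1, c_k)]
    ga_k1, gb_k1 = table[(a_i1, b_j1, c_k1)]

    # Scene aperture values: average the two focal-point entries, then
    # interpolate from the smaller average to the larger one over Q4.
    frac_q = (q4 - b_j) / (b_j1 - b_j)
    z1, z2 = sorted(((t_k + t_k1) / 2, (u_k + u_k1) / 2))
    z13 = frac_q * (z2 - z1) + z1                 # at focal length Ai
    z15, z16 = sorted(((ha_k + ha_k1) / 2, (ga_k + ga_k1) / 2))
    z14 = frac_q * (z16 - z15) + z15              # at focal length Ai+1

    # Scene focal point values: interpolate over the actual focal point R4.
    frac_r = (r4 - c_k) / (c_k1 - c_k)
    z17 = frac_r * (w_k1 - w_k) + w_k             # Ai, Bj
    z18 = frac_r * (v_k1 - v_k) + v_k             # Ai, Bj+1
    z19 = frac_r * (hb_k1 - hb_k) + hb_k          # Ai+1, Bj
    z20 = frac_r * (gb_k1 - gb_k) + gb_k          # Ai+1, Bj+1
    return z13, z14, z17, z18, z19, z20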
S305. Display the virtual content according to the target virtual scene parameters.
For a specific implementation of operation S305, reference may be made to the implementation of operation S203 in
In the examples of this application, a correspondence between reference photographing parameters and virtual scene parameters is determined by comparing a virtual image with a real image, and a reference dataset is then configured. Photographing parameters of a target camera device and the reference dataset are obtained, target virtual scene parameters corresponding to virtual content to be photographed by the target camera device are determined according to the photographing parameters of the target camera device and the preset reference dataset, and the virtual content is displayed according to the target virtual scene parameters, to cause the target camera device to photograph the virtual content. It can be seen that the target virtual scene parameters corresponding to the virtual content to be photographed by the target camera device are determined through the reference dataset, so that the efficiency of configuring the virtual scene parameters and the real-time performance of displaying the virtual content can be improved. In addition, because the target virtual scene parameters are calculated based on the reference dataset, and the reference dataset is determined when the real image matches the virtual image, the transition between the virtual content displayed according to the target virtual scene parameters and the real scene is smoother in the photographed content, so that the photographed content appears more realistic to a viewer.
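As one possible illustration of this summary, the reference dataset can be thought of as a lookup table keyed by the reference photographing parameters. The sketch below is hypothetical: the type names and placeholder entries are not taken from this application and only indicate one way such a dataset might be held in memory once the real image and the virtual image have been matched:

from typing import Dict, Tuple

# (focal length, aperture value, focal point value) of the reference camera device
ReferenceParams = Tuple[float, float, float]
# (scene aperture value, scene focal point value) of the virtual camera device
SceneParams = Tuple[float, float]

# Illustrative placeholder entries only; real entries would be recorded when the
# virtual image matches the real image for each group of reference parameters.
reference_dataset: Dict[ReferenceParams, SceneParams] = {
    (35.0, 2.8, 1.5): (2.6, 1.4),
    (35.0, 4.0, 1.5): (3.7, 1.5),
}

def scene_params_for(focal_length: float, aperture: float, focal_point: float) -> SceneParams:
    # Exact-match lookup; when the actual photographing parameters fall between
    # candidate values, the interpolation described above is applied instead.
    return reference_dataset[(focal_length, aperture, focal_point)]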
The method in the examples of this application is described in detail above. For ease of better implementing the foregoing solutions in the examples of this application, an apparatus in an example of this application is correspondingly provided below.
An obtaining unit 501 is configured to obtain photographing parameters of a target camera device, and is further configured to obtain a reference dataset, the reference dataset including a correspondence between reference photographing parameters and virtual scene parameters; and
In an implementation, the reference photographing parameters are configured for describing photographing parameters of a reference camera device, the reference camera device is a real camera device, and the virtual scene parameters are configured for describing photographing parameters of a virtual camera device in a virtual scene. The processing unit 502 is further configured to: configure a reference dataset according to photographing parameters of a reference camera device and photographing parameters of a virtual camera device.
In an implementation, the reference camera device includes a plurality of groups of photographing parameters. A configuration process of the reference dataset includes:
In an implementation, the reference dataset includes M*N*P groups of photographing parameters, each group of photographing parameters includes a focal length, an aperture value, and a focal point value, M is a quantity of candidate focal lengths, N is a quantity of candidate aperture values, P is a quantity of candidate focal point values, and M, N, and P are all positive integers; and the first group of photographing parameters includes an ith candidate focal length, a jth candidate aperture value, and a kth candidate focal point value, where i is a positive integer less than M, j is a positive integer less than N, and k is a positive integer less than P; and
In an implementation, the processing unit 502 is configured to determine, according to the photographing parameters of the target camera device and a reference dataset, target virtual scene parameters corresponding to virtual content to be photographed by the target camera device, and is specifically configured to:
In an implementation, the reference dataset includes M*N*P groups of photographing parameters, each group of photographing parameters includes a focal length, an aperture value, and a focal point value, M is a quantity of candidate focal lengths, N is a quantity of candidate aperture values, P is a quantity of candidate focal point values, and M, N, and P are all positive integers; and the photographing parameters of the target camera device include an actual focal length, an actual aperture value, and an actual focal point value; and
In an implementation, the processing unit 502 is configured to determine a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the ith candidate focal length, and is specifically configured to:
In an implementation, the processing unit 502 is configured to determine a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value and a candidate focal point value associated with the jth candidate aperture value, and is specifically configured to:
In an implementation, the processing unit 502 is configured to determine a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value and a candidate focal point value associated with the jth candidate aperture value and a relationship between the actual focal point value and a candidate focal point value associated with the (j+1)th candidate aperture value, and is specifically configured to:
In an implementation, the processing unit 502 is configured to determine a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the ith candidate focal length and a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the (i+1)th candidate focal length, and is specifically configured to:
In an implementation, the processing unit 502 is configured to determine a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value and a candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length and a relationship between the actual focal point value and a candidate focal point value associated with a jth candidate aperture value associated with the (i+1)th candidate focal length, and is specifically configured to:
In an implementation, the processing unit 502 is configured to determine a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value and candidate focal point values associated with the jth candidate aperture value and the (j+1)th candidate aperture value that are associated with the ith candidate focal length and a relationship between the actual focal point value and candidate focal point values associated with a jth candidate aperture value and a (j+1)th candidate aperture value that are associated with the (i+1)th candidate focal length, and is specifically configured to:
According to an example of this application, some of the operations involved in the parameter configuration methods shown in
According to another example of this application, a computer program (including program code) that can perform the operations involved in the corresponding methods shown in
A principle and beneficial effects of resolving problems by the parameter configuration apparatus provided in the examples of this application are similar to those of the parameter configuration method in the method example of this application. Reference may be made to the principle and beneficial effects of implementation of the method. For concise descriptions, details are not described herein again.
An example of this application further provides a computer-readable storage medium (memory). The computer-readable storage medium is a memory device in the terminal and is configured to store programs and data. The computer-readable storage medium herein may include a built-in storage medium of the terminal, and may certainly further include an expanded storage medium supported by the terminal. The computer-readable storage medium provides a storage space, storing a processing system of the terminal. In addition, the storage space further stores one or more instructions suitable for being loaded and executed by the processor 601. The instructions may be one or more computer programs (including program code). The computer-readable storage medium herein may be a high-speed RAM memory or a non-volatile memory, for example, at least one magnetic disk memory; or may be at least one computer-readable storage medium located away from the foregoing processor.
In an implementation, the computer device may be specifically the display device 102 shown in
In an exemplary example, the reference photographing parameters are configured for describing photographing parameters of a reference camera device, and the virtual scene parameters are configured for describing photographing parameters of a virtual camera device. The processor 601 further performs the following operations by running the executable program code in the memory 603:
Configure a reference dataset according to photographing parameters of a reference camera device and photographing parameters of a virtual camera device.
In an exemplary example, the reference camera device includes a plurality of groups of photographing parameters. A specific example of the configuration process of the reference dataset is:
In an exemplary example, the reference dataset includes M*N*P groups of photographing parameters, each group of photographing parameters includes a focal length, an aperture value, and a focal point value, M is a quantity of candidate focal lengths, N is a quantity of candidate aperture values, P is a quantity of candidate focal point values, and M, N, and P are all positive integers; and the first group of photographing parameters includes an ith candidate focal length, a jth candidate aperture value, and a kth candidate focal point value, where i is a positive integer less than M, j is a positive integer less than N, and k is a positive integer less than P; and
In an exemplary example, a specific example in which the processor 601 determines, according to the photographing parameters of the target camera device and a reference dataset, target virtual scene parameters corresponding to virtual content to be photographed by the target camera device is:
In an exemplary example, the reference dataset includes M*N*P groups of photographing parameters, each group of photographing parameters includes a focal length, an aperture value, and a focal point value, M is a quantity of candidate focal lengths, N is a quantity of candidate aperture values, P is a quantity of candidate focal point values, and M, N, and P are all positive integers; and the photographing parameters of the target camera device include an actual focal length, an actual aperture value, and an actual focal point value; and
In an exemplary example, a specific example in which the processor 601 determines a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the ith candidate focal length is:
In an exemplary example, a specific example in which the processor 601 determines a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value and a candidate focal point value associated with the jth candidate aperture value is:
In an exemplary example, a specific example in which the processor 601 determines a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value and a candidate focal point value associated with the jth candidate aperture value and a relationship between the actual focal point value and a candidate focal point value associated with the (j+1)th candidate aperture value is:
In an exemplary example, a specific example in which the processor 601 determines a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the ith candidate focal length and a relationship between the actual aperture value and the actual focal point value and a candidate aperture value and a candidate focal point value that are associated with the (i+1)th candidate focal length is:
In an exemplary example, a specific example in which the processor 601 determines a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value and a candidate focal point value associated with the jth candidate aperture value associated with the ith candidate focal length and a relationship between the actual focal point value and a candidate focal point value associated with a jth candidate aperture value associated with the (i+1)th candidate focal length is:
In an exemplary example, a specific example in which the processor 601 determines a scene aperture value and a scene focal point value of the target virtual scene according to a relationship between the actual focal point value and candidate focal point values associated with the jth candidate aperture value and the (j+1)th candidate aperture value that are associated with the ith candidate focal length and a relationship between the actual focal point value and candidate focal point values associated with a jth candidate aperture value and a (j+1)th candidate aperture value that are associated with the (i+1)th candidate focal length is:
Based on a same inventive idea, a principle and beneficial effects of resolving problems by the computer device provided in the examples of this application are similar to those of the parameter configuration method in the method examples of this application. Reference may be made to the principle and beneficial effects of implementation of the method. For concise descriptions, details are not described herein again.
An example of this application further provides a computer-readable storage medium. The computer-readable storage medium has one or more instructions stored therein. The one or more instructions are adapted to be loaded and executed by a processor to implement the parameter configuration method according to the foregoing method examples.
An example of this application further provides a computer program product including instructions, and the computer program product, when run on a computer, causes the computer to perform the parameter configuration method according to the foregoing method examples.
An example of this application further provides a computer program product or a computer program. The computer program product or the computer program includes computer instructions. The computer instructions are stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, to cause the computer device to perform the foregoing parameter configuration method.
The operations of the methods of the examples of this application may be reordered, combined, or deleted according to an actual requirement.
The modules of the apparatuses of the examples of this application may be combined, divided, or deleted according to an actual requirement.
Persons of ordinary skill in the art may understand that all or some of the operations of the methods in the foregoing examples may be implemented by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium. The readable storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The contents disclosed above are merely examples of this application, but not intended to limit the scope of this application. Persons of ordinary skill in the art can understand all or a part of the procedures for implementing the foregoing examples, and any equivalent variation made by them according to the claims of this application shall still fall within the scope of this application.
Number | Date | Country | Kind
---|---|---|---
202210525942.6 | May 2022 | CN | national
This application is a continuation application of PCT Application PCT/CN2023/092982, filed May 9, 2023, which claims priority to Chinese Patent Application No. 202210525942.6 filed on May 13, 2022, each entitled “PARAMETER CONFIGURATION METHOD AND APPARATUS, DEVICE, STORAGE MEDIUM, AND PRODUCT”, and each is incorporated herein by reference in its entirety.
 | Number | Date | Country
---|---|---|---
Parent | PCT/CN2023/092982 | May 2023 | WO
Child | 18779673 | | US