The present application relates to the technical field of image processing, and in particular to a shooting calibration method, a system, equipment, and a storage medium.
Virtual reality (VR) shooting is a shooting method that synthesizes a target with a virtual scene. When performing VR shooting, an electronic equipment and a head-mounted equipment are required. After the electronic equipment shoots a first target image, it is necessary to remove the background from the first target image and synthesize it with the virtual background of the head-mounted equipment. However, the first target image shot through the electronic equipment may not match the virtual background in the head-mounted equipment. For example, the size of the first target image is much larger or smaller than the size of the virtual background, resulting in a huge difference between the background and the target, which in turn leads to a poor user experience.
The main purpose of the present application is to provide a shooting calibration method, a system, equipment, and a storage medium, aiming to solve the technical problem in the related art that the first target image shot through the electronic equipment may not match the virtual background in the head-mounted equipment, resulting in a poor user experience.
In order to achieve the above objectives, the present application provides a shooting calibration method, applied to a shooting calibration system. The shooting calibration system includes an electronic equipment and a wearable equipment, and the electronic equipment is communicated with the wearable equipment. The shooting calibration method includes:
In an embodiment, a shooting angle corresponding to a camera of the electronic equipment is consistent with a shooting angle corresponding to the wearable equipment, and a shooting sight corresponding to the camera of the electronic equipment is consistent with a shooting sight corresponding to the wearable equipment.
In an embodiment, the shooting, by the electronic equipment, the first target image, and sending the first target image to the wearable equipment includes:
In an embodiment, after the receiving, by the wearable equipment, the first target image sent by the electronic equipment, the shooting calibration method further includes:
In an embodiment, the in response to that the first target image does not overlap with the second target image observed through the wearable equipment, adjusting the image size parameter of the first target image, and returning to execute the step of sending the first target image to the wearable equipment until the first target image overlaps with the second target image, and obtaining the target adjustment parameter includes:
In an embodiment, the in response to that the first target image does not overlap with the second target image observed through the wearable equipment, adjusting the image size parameter of the first target image, and returning to execute the step of sending the first target image to the wearable equipment until the first target image overlaps with the second target image, and obtaining the target adjustment parameter includes:
In an embodiment, after the in response to that the first target image does not overlap with the second target image observed through the wearable equipment, adjusting the image size parameter of the first target image, and returning to execute the step of sending the first target image to the wearable equipment until the first target image overlaps with the second target image, and obtaining the target adjustment parameter, the shooting calibration method further includes:
The present application further provides a shooting calibration system, which is a virtual system and includes:
The present application further provides an electronic equipment, which is a physical equipment and includes a camera, a memory, a processor, and a shooting calibration program stored in the memory. When the shooting calibration program is executed by the processor, the shooting calibration method as mentioned above is implemented.
The present application further provides a storage medium. The storage medium is a non-transitory computer-readable storage medium, a shooting calibration program is stored on the non-transitory computer-readable storage medium, and when the shooting calibration program is executed by a processor, the shooting calibration method as mentioned above is implemented.
The present application provides a shooting calibration method, a system, equipment, and a storage medium. First, the electronic equipment shoots a first target image and sends the first target image to the wearable equipment. The wearable equipment receives the first target image sent by the electronic equipment. If the first target image does not overlap with a second target image observed through the wearable equipment, an image size parameter of the first target image is adjusted, and the step of sending the first target image to the wearable equipment is executed again until the first target image overlaps with the second target image, at which point a target adjustment parameter is obtained. The second target image observed through the lens of the wearable equipment is compared with the first target image shot through the electronic equipment. When there is no overlap between the second target image and the first target image, the first target image of the electronic equipment is adjusted until the first target image shot through the electronic equipment overlaps with the second target image observed through the wearable equipment, and the target adjustment parameter of the overlapped image is recorded. Based on the target adjustment parameter, the camera of the electronic equipment is calibrated, so that the calibrated camera matches the size of the object in the virtual environment regardless of the angle the calibrated camera turns to, thereby improving the user experience.
Drawings herein are incorporated into the specification and constitute a part of the specification, showing embodiments consistent with the present application, and are used to explain the principles of the present application in conjunction with the specification.
To more clearly illustrate the technical solutions in the embodiments of the present application or the related art, the following briefly introduces the drawings needed in the description of the embodiments or the related art. Obviously, those skilled in the art can obtain other drawings from these drawings without creative effort.
The realization of the objective, functional characteristics, and advantages of the present application is further described with reference to the accompanying drawings.
It should be understood that the specific embodiments described herein are only used to explain the present application and are not used to limit the present application.
The main technical solution of the embodiments of the present application is: shooting, by the electronic equipment, a first target image, and sending the first target image to the wearable equipment; receiving, by the wearable equipment, the first target image sent by the electronic equipment; and in response to that the first target image does not overlap with a second target image observed through the wearable equipment, adjusting an image size parameter of the first target image, and returning to execute a step of sending the first target image to the wearable equipment until the first target image overlaps with the second target image, and obtaining a target adjustment parameter. Therefore, the calibrated camera can match the size of the object in the virtual environment regardless of the angle the calibrated camera turns to, thereby improving the user experience.
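The adjust-and-resend loop in this solution can be sketched, under simplifying assumptions, as a one-dimensional search over the size parameter. The `shoot` callback, tolerance, and step size below are illustrative stand-ins, not details from the original method:

```python
def calibrate_scale(shoot, observed_size, tol=0.01, step=0.05, max_iters=200):
    """Adjust the image size parameter until the first target image
    overlaps (here: matches in apparent size) with the second target
    image observed through the wearable equipment.

    shoot(scale)   -- hypothetical helper returning the apparent size of
                      the first target image at a given size parameter
    observed_size  -- apparent size of the second target image
    """
    scale = 1.0
    for _ in range(max_iters):
        shot_size = shoot(scale)
        if abs(shot_size - observed_size) / observed_size <= tol:
            return scale  # the target adjustment parameter
        # Increase the parameter if the shot image is smaller than the
        # observed one, reduce it if larger.
        scale += step if shot_size < observed_size else -step
    raise RuntimeError("calibration did not converge")
```

For instance, if the shot image renders at 80% of the observed size, the loop settles near a scale of 1.25.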
In an embodiment, with reference to
Step S10, shooting, by the electronic equipment, a first target image, and sending the first target image to the wearable equipment.
In this embodiment, it should be noted that the wearable equipment includes a virtual reality (VR) equipment, an augmented reality (AR) equipment, and other equipment. In an embodiment, the wearable equipment is a head-mounted equipment. The electronic equipment can be a mobile phone, a tablet computer, a camera, and the like. The electronic equipment is provided with a camera. As shown in
It should be further explained that the shooting sight corresponding to the camera of the electronic equipment is the same as the shooting sight corresponding to the wearable equipment, and the shooting angle corresponding to the camera of the electronic equipment is the same as the shooting angle corresponding to the wearable equipment. The user can observe the image of the preset target through the lens of the wearable equipment, and the first target image is an image that includes the preset target. The preset target includes a person, an animal, a real scene, and the like.
The target image of the preset target is obtained by shooting with the camera of the electronic equipment, then foreground/background separation is performed on the target image to obtain a foreground image, and the foreground image is used as the first target image. Then the first target image is transmitted to the wearable equipment.
The above step S10, shooting, by the electronic equipment, a first target image, and sending the first target image to the wearable equipment includes following steps.
Step S11, shooting, by a camera of the electronic equipment, a preset target, to obtain a target image.
Step S12, removing background of the target image to obtain the first target image, and transmitting the first target image to the wearable equipment.
In this embodiment, the preset target is first shot through the camera of the electronic equipment according to the preset shooting angle and the shooting sight to obtain the target image, and then the background of the target image is removed through a preset foreground/background separation method. For example, the foreground and background of the target image obtained by the camera are separated by an optical flow algorithm, low-rank decomposition, or a Gaussian mixture modeling algorithm, to obtain the first target image. The first target image is uploaded to the wearable equipment to project the first target image on the lens of the wearable equipment.
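As a deliberately simple stand-in for the separation algorithms named above (optical flow, low-rank decomposition, Gaussian mixture modeling), background removal can be sketched with frame differencing against a static reference frame. The threshold value and the availability of a clean background frame are assumptions of this sketch:

```python
import numpy as np

def remove_background(frame, background, thresh=30):
    """Return the foreground (first target image) and its mask.
    `frame` and `background` are HxWx3 uint8 arrays of the same shape."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    mask = diff.max(axis=2) > thresh                   # foreground pixels
    foreground = np.where(mask[..., None], frame, 0)   # zero the background
    return foreground.astype(np.uint8), mask
```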
Step S20, receiving, by the wearable equipment, the first target image sent by the electronic equipment.
After the above step S20, receiving, by the wearable equipment, the first target image sent by the electronic equipment, the shooting calibration method further includes following steps.
Step A10, projecting the first target image on a lens screen of the wearable equipment.
Step A20, determining an offset between the second target image and the first target image.
Step A30, performing image offset compensation on the first target image based on the offset, to determine whether the second target image overlaps with the offset-compensated first target image.
In this embodiment, it should be noted that, due to the imaging principle of the wearable equipment, there will be an offset between the image projected onto the wearable equipment and the image observed through the wearable equipment. In an embodiment, the wearable equipment receives the first target image sent by the electronic equipment, and then analyzes the first target image. Based on the designed imaging focal length of the wearable equipment, the offset of the first target image with reference to the second target image observed through the wearable equipment can be calculated. Based on the offset, image offset compensation is performed on the first target image during imaging, so that the user can observe the offset-compensated first target image on the lens screen of the wearable equipment, thereby more accurately judging whether the second target image overlaps with the offset-compensated first target image.
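The compensation step might look like a plain pixel shift. In this sketch the offset (dx, dy) is assumed to have already been derived from the designed imaging focal length, a derivation the text does not spell out:

```python
import numpy as np

def compensate_offset(image, dx, dy):
    """Shift `image` by (dx, dy) pixels to compensate the display offset of
    the wearable equipment's optics; vacated pixels are zero-filled."""
    h, w = image.shape[:2]
    out = np.zeros_like(image)
    dst_x = slice(max(dx, 0), min(w + dx, w))
    dst_y = slice(max(dy, 0), min(h + dy, h))
    src_x = slice(max(-dx, 0), min(w - dx, w))
    src_y = slice(max(-dy, 0), min(h - dy, h))
    out[dst_y, dst_x] = image[src_y, src_x]
    return out
```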
Step S30, in response to that the first target image does not overlap with a second target image observed through the wearable equipment, adjusting an image size parameter of the first target image, and returning to execute a step of sending the first target image to the wearable equipment until the first target image overlaps with the second target image, and obtaining a target adjustment parameter.
In this embodiment, it should be noted that, in one possible implementation, the time that the user observes the second target image through the lens of the wearable equipment is the same as the time that the electronic equipment shoots the first target image. In an embodiment, after wearing the wearable equipment, the user will adjust the wearable equipment to make the shooting sight of the wearable equipment facing the preset target consistent with the shooting sight of the electronic equipment facing the preset target, and make the shooting angle of the wearable equipment facing the preset target consistent with the shooting angle of the electronic equipment facing the preset target. Then the preset target is shot through the electronic equipment to obtain the corresponding first target image, and the second target image is obtained by observing the preset target through the wearable equipment. In another possible implementation, the time that the user observes the second target image through the lens of the wearable equipment is different from the time that the electronic equipment shoots the target. In an embodiment, when the electronic equipment obtains the shooting instruction, the preset target is shot through the electronic equipment to obtain the target image, and then the background of the target image is removed to obtain the first target image. The first target image is stored. Then, when the user wears the wearable equipment, the electronic equipment sends the stored first target image to the wearable equipment, and displays the first target image on the lens of the wearable equipment. Then the shooting angle and the shooting sight of the wearable equipment are adjusted to be the same as the shooting angle and the shooting sight of the electronic equipment, to obtain the second target image of the preset target observed by the human eye.
In an embodiment, the first target image displayed on the lens screen of the wearable equipment is compared with the second target image observed through the wearable equipment. If the first target image displayed on the lens screen of the wearable equipment matches the second target image observed through the wearable equipment, it means that the size of the first target image shot through the camera of the current electronic equipment is consistent with the image size of the preset target observed by the human eye through the lens of the wearable equipment. As shown in
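One concrete way to make the overlap judgment is an intersection-over-union test on the two target silhouettes. The boolean masks and the threshold value here are illustrative assumptions rather than details from the original text:

```python
import numpy as np

def masks_overlap(mask_a, mask_b, iou_thresh=0.95):
    """True if two boolean silhouette masks (e.g. the projected first
    target image and the observed second target image) overlap closely
    enough, measured by intersection over union."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return union > 0 and inter / union >= iou_thresh
```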
The in response to that the first target image does not overlap with the second target image observed through the wearable equipment, adjusting the image size parameter of the first target image, and returning to execute the step of sending the first target image to the wearable equipment until the first target image overlaps with the second target image, and obtaining the target adjustment parameter includes following steps.
Step S31, in response to that the size of the first target image is smaller than the size of the second target image, increasing an image size parameter of the first target image to obtain the first target image after parameter adjustment; and returning to execute the step of sending the first target image to the wearable equipment until the first target image overlaps with the second target image, and configuring the image size parameter corresponding to an overlapped image as the target adjustment parameter.
In this embodiment, as shown in
Step S32, in response to that the size of the first target image is larger than the size of the second target image, reducing the image size parameter of the first target image to obtain the first target image after parameter adjustment; and returning to execute the step of sending the first target image to the wearable equipment until the first target image overlaps with the second target image, and configuring the image size parameter corresponding to an overlapped image as the target adjustment parameter.
In this embodiment, as shown in
In the embodiment of the present application, the second target image observed through the lens of the wearable equipment is compared with the first target image shot through the electronic equipment. When there is no overlap between the second target image and the first target image, the first target image of the electronic equipment is adjusted until the first target image shot through the electronic equipment overlaps with the second target image observed through the wearable equipment, and the target adjustment parameter of the overlapped image is recorded. Based on the target adjustment parameter, the camera of the electronic equipment is calibrated, so that the calibrated camera matches the size of the object in the virtual environment regardless of the angle the calibrated camera turns to, thereby improving the user experience.
As shown in
Step B10, calibrating a camera of the electronic equipment based on the target adjustment parameter.
Step B20, sending, by the wearable equipment, an environment virtual image shot in a virtual space to the calibrated electronic equipment.
Step B30, based on a target image shot through the calibrated electronic equipment and the environment virtual image sent through the wearable equipment, superimposing the environment virtual image and the target image to obtain a composite image.
In this embodiment, based on the target adjustment parameter, the camera of the electronic equipment is calibrated. Based on the environment virtual image shot through the wearable equipment in the virtual space and the target image shot through the calibrated electronic equipment, the wearable equipment transmits the environment virtual image to the electronic equipment. Then the electronic equipment removes the background of the target image, configures the target image after the background is removed as the foreground, and configures the environment virtual image shot through the wearable equipment as the background. After that, the background and the foreground will be superimposed to obtain a composite image. Since the calibrated electronic equipment is used for shooting, the shot target object matches the virtual environment of the wearable equipment. Further, when the user wears the wearable equipment, the electronic equipment sends the stored composite image to the wearable equipment, and the user can observe the composite image on the screen of the wearable equipment, so that only the electronic equipment and the wearable equipment need to be used to achieve the VR shooting, which is convenient and fast, thereby improving the user experience.
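The superimposition step can be sketched as a masked blend: target pixels come from the background-removed target image, and all other pixels come from the environment virtual image. The boolean mask and array shapes are assumptions of this sketch:

```python
import numpy as np

def composite(foreground, fg_mask, virtual_background):
    """Superimpose the foreground (target image with background removed)
    on the environment virtual image. `fg_mask` is a boolean HxW array;
    `foreground` and `virtual_background` are HxWx3 arrays."""
    return np.where(fg_mask[..., None], foreground, virtual_background)
```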
In the embodiment of the present application, the calibrated electronic equipment is used for shooting, and the shot target image matches the virtual environment of the wearable equipment, so that the obtained composite image is more in line with the actual situation, thereby improving the user experience.
As shown in
As shown in
In an embodiment, the electronic equipment may further include a user interface, a network interface, a camera, a radio frequency (RF) circuit, a sensor, an audio circuit, a Wi-Fi module, and the like. The user interface may include a display and an input sub-module, such as a keyboard. The user interface may further include a standard wired interface and a wireless interface. The network interface may include a standard wired interface and a wireless interface, such as a Wi-Fi interface.
Those skilled in the art will understand that the electronic equipment structure shown in
As shown in
In the electronic equipment shown in
The specific implementation of the electronic equipment in the present application is basically the same as the above-mentioned shooting calibration method, which will not be repeated here.
In addition, as shown in
In an embodiment, the sending module is further used for:
In an embodiment, the shooting calibration system is further used for:
In an embodiment, the correction module is further used for:
In an embodiment, the correction module is further used for:
In an embodiment, the shooting calibration system is further used for:
The specific implementation of the shooting calibration system of the present application is basically the same as the above-mentioned shooting calibration method, which will not be repeated here.
Embodiments of the present application provide a storage medium, and the storage medium is a non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium stores one or more programs. When the one or more programs are executed by one or more processors, steps of the shooting calibration method described in any one of the above embodiments are implemented.
The specific implementation of the non-transitory computer-readable storage medium of the present application is basically the same as the various embodiments of the shooting calibration method described above, which will not be repeated here.
The above are only some embodiments of the present application, and do not limit the scope of the present application thereto. Under the concept of the present application, any equivalent structural transformation or equivalent process transformation made according to the description and drawings of the present application, or direct/indirect application in other related technical fields shall fall within the claimed scope of the present application.
Foreign Application Priority Data: Chinese Patent Application No. 202210438331.8, filed Apr. 2022 (CN, national).
The present application is a continuation application of International Application No. PCT/CN2022/102382, filed on Jun. 29, 2022, which claims priority to Chinese Patent Application No. 202210438331.8, filed on Apr. 25, 2022. The disclosures of the above-mentioned applications are incorporated herein by reference in their entireties.
Related Application Data: Parent application PCT/CN2022/102382, filed Jun. 2022 (WO); child application No. 18/820,957 (US).