The present disclosure relates to the field of unmanned aerial vehicles and, more particularly, to an unmanned aerial vehicle, a control method thereof, and a storage medium.
In recent years, first-person-view (FPV) flight of unmanned aerial vehicles has become increasingly popular, and its immersive flight experience has attracted the attention of many people. However, FPV flight is difficult to operate, requiring a user to perform fancy flying maneuvers with the unmanned aerial vehicle to capture thrilling and exciting images. The existing technology provides a solution based on a panoramic camera. The panoramic camera is arranged on an unmanned aerial vehicle, and a panoramic video is captured by the panoramic camera during the flight of the unmanned aerial vehicle. The user then edits the panoramic video using video post-production software and cuts out the desired video effect.
However, this post-production method does not allow the user to view thrilling and exciting pictures in real time during the flight of the unmanned aerial vehicle, and thus still falls short of the user's needs.
In accordance with the disclosure, there is provided a control method for an aerial vehicle including obtaining a control stick value sent by a control device that is in communication connection with the aerial vehicle, determining, according to the control stick value, a target image region in a panoramic image captured by one or more photographing devices carried by the aerial vehicle, and sending the target image region to the control device, to enable the control device to display the target image region.
Also in accordance with the disclosure, there is provided an aerial vehicle including one or more photographing devices configured to capture a panoramic image, a memory storing a computer program, and a processor configured to execute the computer program to obtain a control stick value sent by a control device in communication connection with the aerial vehicle, determine a target image region in the panoramic image according to the control stick value, and send the target image region to the control device, to enable the control device to display the target image region.
100—Unmanned aerial vehicle, 1—Memory, 2—Processor, 3—Photographing device, 10—First arm, 20—Second arm, 30—Rotation shaft.
The technical solutions of the present disclosure will be described below in conjunction with the drawings in the embodiments of the present disclosure. Obviously, the described embodiments are just some of the embodiments of the present disclosure, but not all of the embodiments. Based on the embodiments in this disclosure, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the scope of this disclosure.
The flow charts shown in the drawings are just illustrations, and do not necessarily include all contents and operations/steps, nor must they be performed in the order described. For example, some operations/steps can be decomposed, combined or partly combined, so the actual order of execution may be changed according to the actual situation.
The embodiments of the present disclosure will be described below in conjunction with the drawings in the embodiments of the present disclosure. In the case of no conflict, the following embodiments and features in the embodiments may be combined with each other.
The present disclosure provides an unmanned aerial vehicle, a control method thereof, and a storage medium. The unmanned aerial vehicle may be provided with photographing devices for capturing a panoramic image and may be in communication connection with a control device. A control stick value sent by the control device may be obtained, a target image region in the panoramic image captured by the photographing devices may be determined according to the control stick value, and the target image region may then be sent to the control device, such that the control device displays the target image region. Because the unmanned aerial vehicle is able to determine the target image region in time according to the control stick value and return it to the control device, technical support is provided for the user to view the scene corresponding to the control stick value in real time. When receiving the target image region, the control device may display it promptly, allowing the user to view the corresponding scene, such as various thrilling and exciting scenes, in real time. Therefore, the user's needs may be satisfied, and the user experience may be improved.
The present disclosure provides a control method of an unmanned aerial vehicle.
The unmanned aerial vehicle may be provided with one or more photographing devices for capturing a panoramic image. In some embodiments, the unmanned aerial vehicle may be provided with a plurality of photographing devices, and the panoramic image may be formed by splicing the images captured by the plurality of photographing devices. In practical applications, the number of photographing devices needed depends on the fields of view (FOVs) of the selected photographing devices and the required splicing quality. The smaller the FOV of each photographing device, the more photographing devices may be needed to achieve 360° full coverage.
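As a rough sketch of this sizing consideration, the following Python snippet estimates how many cameras a given FOV requires for 360° coverage; the ring arrangement and the default stitching-overlap value are our own assumptions for illustration, not specified by the disclosure.

```python
import math

def min_cameras_for_full_coverage(fov_deg: float, overlap_deg: float = 20.0) -> int:
    """Estimate how many cameras cover 360 degrees, given each camera's
    horizontal FOV and a desired stitching overlap between neighbors.
    Assumes cameras arranged in a single ring (illustrative only)."""
    effective = fov_deg - overlap_deg  # usable angle per camera after overlap
    if effective <= 0:
        raise ValueError("FOV must exceed the required overlap")
    return math.ceil(360.0 / effective)
```

For example, 120° cameras with 20° overlap would need four units, while wide fisheye lenses need only two, which is consistent with the two-fisheye arrangement described below.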
For example, the photographing devices may include two fisheye photographing devices which are respectively arranged above and below the unmanned aerial vehicle, and each fisheye photographing device may cover more than ½ of the FOV of the panoramic image. The FOVs of the two fisheye photographing devices may partially overlap each other. As shown in
As shown in
In some embodiments, the control method of the unmanned aerial vehicle includes S101, S102, and S103, as described in more detail below.
At S101, a control stick value sent by a control device is obtained.
At S102, a target image region in a panoramic image captured by one or more photographing devices is determined according to the control stick value.
At S103, the target image region is sent to the control device, such that the control device displays the target image region.
In some embodiments, the unmanned aerial vehicle may be in communication connection with the control device. The control device may be any device capable of sending control commands to which the unmanned aerial vehicle can respond. The control device may be, but is not limited to, a remote controller, a user apparatus, a terminal apparatus, etc. The control device may also be a combination of two or more devices, such as a remote controller and a user apparatus, or a remote controller and a terminal apparatus.
The control stick value may be a control instruction for determining the target image region in the panoramic image captured by the photographing devices. The control stick value may be issued by the user by pushing a joystick, by touching a virtual joystick on a touch screen, by directly inputting instructions, and so on. There may be many ways to determine the target image region in the panoramic image according to the control stick value. For example, the panoramic image may be divided into a plurality of image regions in advance, and a correspondence relationship between preset control stick values and the preset panoramic image regions may be configured in advance. The target image region may then be determined according to the control stick value sent by the control device and the correspondence relationship. In another example, the control stick value may map to a virtual attitude angle, and the target image region may be determined according to the virtual attitude angle.
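The first approach, a preset correspondence between stick values and panoramic regions, can be sketched as a simple lookup. The region names, yaw ranges, and deflection thresholds below are hypothetical placeholders, not values from the disclosure.

```python
# Hypothetical preset mapping from stick deflection to panoramic regions.
# Each region is a (start, end) yaw range in degrees within the panorama.
REGION_TABLE = {
    "left":  (0, 90),
    "front": (90, 270),
    "right": (270, 360),
}

def region_for_stick(stick_x: float) -> tuple:
    """Map a horizontal stick deflection in [-1, 1] to a preset region.
    Thresholds are illustrative; a real system would use its own bands."""
    if stick_x < -0.33:
        return REGION_TABLE["left"]
    if stick_x > 0.33:
        return REGION_TABLE["right"]
    return REGION_TABLE["front"]
```

The second approach, mapping the stick value to a virtual attitude angle first, is developed in the embodiments below.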
The target image region may be sent to the control device, such that the control device displays the target image region. The user may be able to view the scene corresponding to the control stick value in real time, such as various thrilling and exciting scenes. Therefore, the user's needs may be satisfied, and the user experience may be improved.
In the present disclosure, the unmanned aerial vehicle may be provided with the photographing devices for capturing the panoramic image and may be in communication connection with the control device. The control stick value sent by the control device may be obtained, the target image region in the panoramic image captured by the photographing devices may be determined according to the control stick value, and the target image region may be sent to the control device, such that the control device displays the target image region. The unmanned aerial vehicle is thus able to determine the target image region according to the control stick value issued by the control device and send it back to the control device in time, providing technical support for the user to view the images corresponding to the control stick value in real time, especially during the flight of the unmanned aerial vehicle. When receiving the target image region, the control device may display it promptly, and the user may view the corresponding images, such as various thrilling and exciting scenes, in real time.
Therefore, the user's needs may be satisfied, and the user experience may be improved.
The application of the method in the embodiments of the present disclosure in various specific application scenarios is described in detail below.
In some embodiments, the method provided by the present disclosure may be applied in an application scenario involving an unmanned aerial vehicle, a remote controller, and a head-mounted display device. That is, the control device may include the remote controller and the head-mounted display device. In this scenario, obtaining the control stick value sent by the control device (S101) may include: obtaining the control stick value sent by the remote controller. Correspondingly, sending the target image region to the control device such that the control device displays the target image region (S103) may include: sending the target image region to the head-mounted display device, such that the head-mounted display device displays the target image region.
The head-mounted display device may use a set of optical systems (mainly precision optical lenses) to magnify an image on an ultra-micro display screen and project the image onto the retina, thereby presenting a large-screen image to the eyes of the viewer, much like viewing an object through a magnifying glass. Different effects, such as virtual reality (VR), augmented reality (AR), or mixed reality (MR), may be realized by sending optical signals to the user's eyes through the head-mounted display device. Unlike a normal display device, which the user must look at directly, the head-mounted display device does not require the user to face a separate screen. Further, since the head-mounted display device is usually in the shape of a hat or glasses, it may be easy to carry and use at any time. Since a small display screen is used, it may be very power-efficient, and a significant energy-saving effect may be achieved, especially when a large virtual display is formed.
In the application scenario involving the unmanned aerial vehicle, the remote controller, and the head-mounted display device, the user may send the control stick value through the remote controller. The unmanned aerial vehicle may determine the target image region in the panoramic image captured by the photographing device after receiving the control stick value, and then send the target image region to the head-mounted display device. After receiving the target image region, the head-mounted display device may display the target image region, and the user may be able to immersively view the target image region.
In some other embodiments, the method provided by the present disclosure may be applied in an application scenario involving an unmanned aerial vehicle, a remote controller, and a terminal device. That is, the control device may include the remote controller and the terminal device. In this scenario, obtaining the control stick value sent by the control device (S101) may include: obtaining the control stick value sent by the remote controller. Correspondingly, sending the target image region to the control device such that the control device displays the target image region (S103) may include: sending the target image region to the terminal device, such that the terminal device displays the target image region.
The terminal device may include, but is not limited to: a smartphone, a ground control station, a PC, a PDA, etc. For example, a user's mobile phone may have an application program installed therein, and the user may send the control stick value through the remote controller. The unmanned aerial vehicle may determine the target image region in the panoramic image captured by the photographing device after receiving the control stick value, and then send the target image region to the user's mobile phone. After receiving the target image region, the user's mobile phone may display the target image region in the screen of the mobile phone, and the user may be able to view the target image region. In some embodiments, the unmanned aerial vehicle may send the target image region to the remote controller through a private communication link, and the remote controller may forward the target image region to the mobile phone. For example, the remote controller may forward the target image region to the mobile phone through a connection line. In another example, the unmanned aerial vehicle may send the target image region directly to the mobile phone through a standard communication link, such as WIFI, 4G, etc.
In some other embodiments, the method provided by the present disclosure may be applied in an application scenario involving an unmanned aerial vehicle and a control device including a control area and a display area. That is, the control device may be provided with the control area and the display area. In this scenario, obtaining the control stick value sent by the control device (S101) may include: obtaining the control stick value sent by the control device, where the control stick value may be generated according to a user's operation in the control area. Correspondingly, sending the target image region to the control device such that the control device displays the target image region (S103) may include: sending the target image region to the control device, such that the control device displays the target image region in the display area.
In some embodiments, the control device may be provided with the control area and the display area. The control area may be configured for the user to operate for generating and sending the control stick value. The display area may be configured for displaying. The user may generate and send the control stick value based on the control area of the control device. The unmanned aerial vehicle may determine the target image region in the panoramic image captured by the photographing device after receiving the control stick value, and then send the target image region to the control device. After receiving the target image region, the control device may display the target image region in the display area, and the user may be able to view the target image region.
As shown in
The virtual attitude angle is a virtual, rather than physically measured, attitude angle. The virtual attitude angle may include at least one of a pitch angle, a yaw angle, or a roll angle, and may be used to determine the target image region in the panoramic image. In some embodiments, the target image region may not be determined directly from the control stick value; instead, the control stick value may be mapped to the virtual attitude angle, and the target image region may then be determined according to the virtual attitude angle. Therefore, the method of determining the target image region may be more intuitive, flexible, diverse, and convenient, to better meet various needs of users.
In some embodiments, determining the target image region according to the virtual attitude angle (S1022) may include: determining the target image region according to a preset FOV and the virtual attitude angle.
An FOV is also called a view field in optical engineering. The size of the FOV determines the view field of an optical instrument: the larger the FOV, the larger the view field. In some embodiments, the preset FOV may be used to determine the range of the target image region in the panoramic image, and the virtual attitude angle may be used to determine the center of the target image region. As shown in
Since the range of the target image region could be determined according to the preset FOV and the center of the target image region could be determined according to the virtual attitude angle, the target image region may be determined quickly and accurately.
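This determination can be sketched as a crop in an equirectangular panorama, where the virtual attitude angle fixes the region center and the preset FOV fixes its extent. The equirectangular mapping and the plain rectangular crop are our simplifying assumptions; a real implementation would re-project the spherical image rather than crop it.

```python
def target_region(yaw_deg, pitch_deg, fov_deg, pano_w, pano_h):
    """Return (left, top, right, bottom) pixel bounds of the target
    image region in an equirectangular panorama of size pano_w x pano_h.
    Center comes from the virtual attitude angle, extent from the FOV."""
    cx = (yaw_deg % 360.0) / 360.0 * pano_w    # center column from yaw
    cy = (90.0 - pitch_deg) / 180.0 * pano_h   # center row from pitch
    half_w = fov_deg / 360.0 * pano_w / 2.0    # half-extent along yaw axis
    half_h = fov_deg / 180.0 * pano_h / 2.0    # half-extent along pitch axis
    return (int(cx - half_w), int(cy - half_h),
            int(cx + half_w), int(cy + half_h))
```

For instance, a 90° preset FOV centered at yaw 180°, pitch 0° in a 3600×1800 panorama selects the middle quarter of the image.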
In some embodiments, the virtual attitude angle to which the control stick value maps may be related to the flight control quantity of the unmanned aerial vehicle to which the control stick value maps. Since the control stick value may map to the flight control quantity of the unmanned aerial vehicle in addition to the virtual attitude angle, the virtual attitude angle may be related to the flight control quantity. Therefore, the target image region determined according to the virtual attitude angle may be related to the flight process of the unmanned aerial vehicle, such that the user is able to view the target image region of the unmanned aerial vehicle during the flight in time and have the immersive FPV flight experience.
In some embodiments, the virtual attitude angle to which the control stick value maps may be related to both the control stick value and the flight control quantity of the unmanned aerial vehicle to which the control stick value maps. Correspondingly, the target image region determined according to the virtual attitude angle may be related to both the control stick value and the flight control quantity of the unmanned aerial vehicle to which the control stick value maps. That is, the user may be able to further control the target image region through the control stick during the flight process of the unmanned aerial vehicle, such that the user is able to view the target image region that the user further wants to see during the flight of the unmanned aerial vehicle in time.
In some embodiments, determining the flight control quantity of the unmanned aerial vehicle to which the control stick value maps according to the control stick value (S10211) may include: determining the flight control quantity of the unmanned aerial vehicle according to the control stick value and a preset virtual aircraft control model. The preset virtual aircraft control model may include a correspondence relationship between the control stick value and the flight control quantity of the unmanned aerial vehicle.
In some embodiments, the preset virtual aircraft control model may be provided in advance and may include the correspondence relationship between the control stick value and the flight control quantity of the unmanned aerial vehicle. The flight control quantity of the unmanned aerial vehicle may be determined according to the received control stick value and this correspondence relationship. Correspondingly, the user may experience the flight feel of the preset virtual aircraft control model rather than that of the current unmanned aerial vehicle. For example, in some embodiments, the preset virtual aircraft control model may include a preset virtual FPV aircraft control model, and the user may experience the flight feel of an FPV unmanned aerial vehicle. In some other embodiments, the preset virtual aircraft control model may include a preset virtual aerial photography aircraft control model, and the user may experience the flight feel of an aerial photography unmanned aerial vehicle.
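One way such a correspondence relationship could look is a per-model mapping from a normalized stick value to a speed command, with each model shaping the response differently. The model names, maximum speeds, and expo curves below are assumptions for illustration, not values from the disclosure.

```python
# Illustrative "virtual aircraft control models": each maps a normalized
# stick value in [-1, 1] to a speed command with its own gain and curve.
VIRTUAL_MODELS = {
    "fpv":    {"max_speed": 27.0, "expo": 3},  # aggressive, non-linear feel
    "aerial": {"max_speed": 10.0, "expo": 1},  # gentle, linear feel
}

def stick_to_speed(stick: float, model: str) -> float:
    """Map a stick value to a speed command (m/s) under the chosen
    virtual aircraft control model. Odd 'expo' keeps the sign of stick."""
    cfg = VIRTUAL_MODELS[model]
    return cfg["max_speed"] * stick ** cfg["expo"]
```

With the same half deflection, the "fpv" model commands a small speed near center (cubed stick) while the "aerial" model responds linearly, which is the kind of feel difference the embodiment describes.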
In some embodiments, the control stick value may include a first control stick value and a second control stick value. Therefore, determining the flight control quantity of the unmanned aerial vehicle to which the control stick value maps according to the control stick value (S10211) may include: determining a first flight control quantity of the unmanned aerial vehicle for flying upwards or downwards in the vehicle body coordinate system according to the first control stick value; and determining a second flight control quantity of the unmanned aerial vehicle for flying forwards or backwards in the vehicle body coordinate system according to the second control stick value.
In some embodiments, the first control stick value may be configured to control the unmanned aerial vehicle to fly upwards or downwards, and the second control stick value may be configured to control the unmanned aerial vehicle to fly forwards or backwards.
Correspondingly, the first control stick value and the second control stick value may be used to control the unmanned aerial vehicle to fly upwards or downwards, or forwards or backwards.
In the remote controller in some embodiments, the stick value issued by the throttle joystick may be referred to as the first control stick value, and the stick value issued by the pitch joystick may be referred to as the second control stick value.
The first control stick value and the second control stick value may be speed control quantities.
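Treating both stick values as speed control quantities, the mapping to the two body-frame flight control quantities can be sketched as below; the maximum-speed gains are hypothetical, not taken from the disclosure.

```python
def flight_control_quantities(stick_throttle: float, stick_pitch: float,
                              max_vz: float = 4.0, max_vx: float = 14.0):
    """First control stick value -> vertical speed command (up/down);
    second control stick value -> longitudinal speed command
    (forward/backward), both in the vehicle body frame.
    Sticks are normalized to [-1, 1]; max speeds are illustrative."""
    vz = max_vz * stick_throttle   # first flight control quantity (climb > 0)
    vx = max_vx * stick_pitch      # second flight control quantity (forward > 0)
    return vz, vx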
In some embodiments, the control stick value may further include a third control stick value. In this scenario, determining the virtual attitude angle to which the control stick value maps according to the control stick value and the flight control quantity of the unmanned aerial vehicle (S10212) may include determining a yaw angle in the virtual attitude angle according to the third control stick value; and determining a pitch angle in the virtual attitude angle according to the first flight control quantity and the second flight control quantity.
The pitch angle in the virtual attitude angle may be related to the first flight control quantity and the second flight control quantity, while the yaw angle in the virtual attitude angle may be determined according to the third control stick value. In this way, a virtual attitude angle that meets the needs of the user may be obtained, to obtain a target image region that better meets the needs of the user.
In some embodiments, after the yaw angle in the virtual attitude angle is determined according to the third control stick value, a prediction may be performed on the movement trajectory of the unmanned aerial vehicle to obtain a predicted trajectory of the unmanned aerial vehicle, a yaw offset angle may be determined according to the predicted trajectory and the yaw angle in the virtual attitude angle may be adjusted according to the yaw offset angle.
In some embodiments, the movement trajectory of the unmanned aerial vehicle may be predicted and the yaw offset angle may be obtained based on the predicted trajectory. Then the yaw angle in the virtual attitude angle may be adjusted according to the yaw offset angle. In this way, the yaw angle in the virtual attitude angle may be made as consistent with the yaw angle of the user's desire as possible, to improve the user experience.
In some embodiments, determining the yaw offset angle according to the predicted trajectory may include obtaining a preset forward-looking time, determining a target trajectory point in the predicted trajectory according to the preset forward-looking time, and determining the yaw offset angle according to the target trajectory point.
As shown in
In this way, during a turn of the unmanned aerial vehicle, the virtual attitude angle (the attitude angle indicated by the cone in the figure) may be made to face the future trajectory of the unmanned aerial vehicle. Similar to a driving scenario, where a driver, when turning, often looks toward the region beyond the turn, in some embodiments the movement trajectory of the unmanned aerial vehicle may be predicted to obtain the predicted trajectory, and the yaw offset angle may be obtained based on the predicted trajectory. The yaw angle in the virtual attitude angle may then be adjusted according to the yaw offset angle. Correspondingly, the target image region determined according to the virtual attitude angle may better match the user's habits.
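The forward-looking adjustment above can be sketched as follows: pick the predicted trajectory point at (or nearest after) the preset forward-looking time, aim the virtual yaw at it, and return the resulting offset. The `(t, x, y)` trajectory format and the nearest-after selection rule are our assumptions.

```python
import math

def yaw_offset_from_trajectory(traj, lookahead_s: float, yaw_deg: float):
    """traj: list of (t, x, y) predicted positions, earliest first.
    Returns the yaw offset angle (degrees, wrapped to [-180, 180)) that
    turns the current virtual yaw toward the target trajectory point
    reached after the preset forward-looking time."""
    t0, x0, y0 = traj[0]
    # target trajectory point: first sample at or beyond the look-ahead time
    tx, ty = next(((x, y) for t, x, y in traj if t - t0 >= lookahead_s),
                  (traj[-1][1], traj[-1][2]))
    desired_yaw = math.degrees(math.atan2(ty - y0, tx - x0))
    return (desired_yaw - yaw_deg + 180.0) % 360.0 - 180.0  # wrap angle
```

With a trajectory that curves left and a two-second look-ahead, the function returns a positive offset, steering the displayed region into the turn before the vehicle gets there.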
In some embodiments, determining the virtual attitude angle to which the control stick value maps according to the control stick value (S1021) may further include determining the virtual attitude angle in the virtual camera coordinate system according to the control stick value.
As shown in
The present disclosure also provides an unmanned aerial vehicle.
As shown in
The processor 2 may be, for example, a microcontroller unit, a central processing unit, or a digital signal processor.
The memory 1 may be, for example, a flash chip, a read only memory, a magnetic disk, an optical disk, a flash drive, or a mobile hard disk.
The memory 1 is used to store a computer program. The processor 2 is used to execute the computer program and, when the computer program is executed, to implement the following processes: obtaining a control stick value sent by the control device; determining, according to the control stick value, a target image region in a panoramic image captured by the photographing devices; and sending the target image region to the control device to make the control device display the target image region.
In some embodiments, the control device may include a remote controller and a head-mounted display device. When executing the computer program, the processor may be configured to implement the following processes: obtaining the control stick value sent by the remote controller; sending the target image region to the head-mounted display device, such that the head-mounted display device displays the target image region.
In some embodiments, the control device may include a remote controller and a terminal device. When executing the computer program, the processor may be configured to implement the following processes: obtaining the control stick value sent by the remote controller; sending the target image region to the terminal device, such that the terminal device displays the target image region.
In some embodiments, the control device may be provided with a control area and a display area. When executing the computer program, the processor may be configured to implement the following processes: obtaining the control stick value sent by the control device, where the control stick value is generated based on a user's operation in the control area; and sending the target image region to the control device, such that the display area of the control device displays the target image region.
In some embodiments, when executing the computer program, the processor may be configured to: determine a virtual attitude angle to which the control stick value maps according to the control stick value; and determine the target image region according to the virtual attitude angle.
In some embodiments, when executing the computer program, the processor may be configured to: determine the target image region according to a preset FOV and the virtual attitude angle.
In some embodiments, the virtual attitude angle to which the control stick value maps may be related to a flight control quantity of the unmanned aerial vehicle to which the control stick value maps.
In some embodiments, when executing the computer program, the processor may be configured to: determine the flight control quantity of the unmanned aerial vehicle to which the control stick value maps according to the control stick value; and determine the virtual attitude angle to which the control stick value maps according to the control stick value and the flight control quantity of the unmanned aerial vehicle.
In some embodiments, when executing the computer program, the processor may be configured to: determine the flight control quantity of the unmanned aerial vehicle according to the control stick value and a preset virtual aircraft control model, where the preset virtual aircraft control model may include a correspondence relationship between the flight control quantity of the unmanned aerial vehicle and the control stick value.
In some embodiments, the preset virtual aircraft control model may include a preset virtual FPV aircraft control model.
In some embodiments, the control stick value may include a first control stick value and a second control stick value. Therefore, when executing the computer program, the processor may be configured to: determine a first flight control quantity of the unmanned aerial vehicle for flying upwards or downwards in the vehicle body coordinate system according to the first control stick value; and determine a second flight control quantity of the unmanned aerial vehicle for flying forwards or backwards in the vehicle body coordinate system according to the second control stick value.
In some embodiments, the control stick value may further include a third control stick value. Therefore, when executing the computer program, the processor may be configured to: determine a yaw angle in the virtual attitude angle according to the third control stick value; and determine a pitch angle in the virtual attitude angle according to the first flight control quantity and the second flight control quantity.
In some embodiments, the first control stick value and the second control stick value may be speed control quantities.
In some embodiments, when executing the computer program, the processor may be configured to: perform prediction on the movement trajectory of the unmanned aerial vehicle to obtain the predicted trajectory of the unmanned aerial vehicle; determine a yaw offset angle according to the predicted trajectory; and adjust the yaw angle in the virtual attitude angle according to the yaw offset angle.
In some embodiments, when executing the computer program, the processor may be configured to: obtain a preset forward-looking time; determine a target trajectory point in the predicted trajectory according to the preset forward-looking time; and determine the yaw offset angle according to the target trajectory point.
In some embodiments, when executing the computer program, the processor may be configured to: determine the virtual attitude angle in the virtual camera coordinate system according to the control stick value.
In some embodiments, the unmanned aerial vehicle may include one or more photographing devices.
In some embodiments, the unmanned aerial vehicle may include a first arm and a second arm. The second arm may be connected to the first arm through a rotation shaft. The photographing devices may be disposed at two ends of the rotation shaft. The photographing devices may be fisheye photographing devices.
The present disclosure also provides a computer-readable storage medium. The computer-readable storage medium may be configured to store a computer program. When the computer program is executed by a processor, the control method of the unmanned aerial vehicle provided by various embodiments of the present disclosure may be implemented.
The computer-readable storage medium may be an internal storage unit of the unmanned aerial vehicle described in any of the foregoing embodiments of the present disclosure, such as a hard disk or a memory of the device. The computer-readable storage medium may also be an external storage device of the device, such as a plug-in hard disk equipped on the device, a smart memory card (SMC), a secure digital card (SD), or a flash card, etc.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to limit the scope of the present disclosure.
The term “and/or” used in the present disclosure and the appended claims refers to any combination of one or more of the associated listed items and all possible combinations, and includes these combinations.
The above are only specific implementations of embodiments of the present disclosure, but the scope of the present disclosure is not limited thereto. One of ordinary skill in the art can easily conceive of various modifications or equivalent replacements within the technical scope disclosed in the present disclosure. These modifications or replacements shall be included within the scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
This application is a continuation of International Application No. PCT/CN2020/141085, filed Dec. 29, 2020, the entire content of which is incorporated herein by reference.
| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | PCT/CN2020/141085 | Dec 2020 | US |
| Child | 18343369 | | US |