The present application is based on PCT filing PCT/JP2020/024421, filed Jun. 22, 2020, which claims priority to Japanese Application No. 2019-146147, filed Aug. 8, 2019, the entire contents of each of which are incorporated herein by reference.
The present technology relates to an information processing apparatus, a moving body, an information processing system, an information processing method, and a program.
In the past, technologies for manipulating a remote-controlled moving body such as a drone have been disclosed. Such a moving body is generally equipped with a camera for imaging a subject (for example, Patent Literature 1). In recent years, it has been proposed to use a system including a plurality of such moving bodies in cases where, for example, aerial imaging of scenery or remote patrol security is performed.
However, in a system including a plurality of moving bodies, when the imaging range of one moving body is unknown, another moving body may appear in the imaging range of the one moving body.
In this regard, the present disclosure proposes an information processing apparatus, a moving body, an information processing system, an information processing method, and a program that are capable of ensuring that another moving body does not appear in an imaging range of a moving body.
In order to achieve the above-mentioned object, an information processing apparatus according to an embodiment of the present technology includes: an acquisition unit; and a control unit.
The acquisition unit acquires state information of an imaging unit of a moving object.
The control unit calculates an imageable region of the imaging unit on the basis of the state information.
The information processing apparatus transmits information regarding the imageable region to a moving body.
The acquisition unit may acquire the state information of the moving object and performance information of the moving object, and
The control unit may calculate a movable region in which the moving object is movable on the basis of the state information of the moving object and the performance information of the moving object, and calculate the imageable region on the basis of the movable region.
The state information of the moving object may include information regarding a current position of the moving object, and the performance information of the moving object may include information regarding at least one of a maximum speed, a maximum ascending speed, a maximum descending speed, a maximum acceleration, a maximum ascending acceleration, or a maximum descending acceleration of the moving object.
The acquisition unit may acquire performance information of the imaging unit, and
The state information of the imaging unit may include information regarding an imaging direction and an angle of view of the imaging unit,
The control unit may calculate an imageable range of the imaging unit on the basis of the state information of the imaging unit and the performance information of the imaging unit, and calculate the imageable region on the basis of the imageable range.
The information processing apparatus may be a server.
In order to achieve the above-mentioned object, a moving body according to an embodiment of the present technology includes: an acquisition unit; and a control unit.
The acquisition unit acquires information regarding an imageable region of an imaging unit of a moving object, the imageable region being calculated on the basis of state information of the imaging unit.
The control unit generates a moving route of the moving body, the moving route not intersecting the imageable region.
The moving body may be a flying object.
In order to achieve the above-mentioned object, a moving body according to an embodiment of the present technology includes: an acquisition unit; and a control unit.
The acquisition unit acquires state information of an imaging unit of a moving object.
The control unit calculates an imageable region of the imaging unit on the basis of the state information and generates a moving route of the moving body, the moving route not intersecting the imageable region.
The moving body may be a flying object.
In order to achieve the above-mentioned object, an information processing system according to an embodiment of the present technology includes: an information processing apparatus; and a moving body.
The information processing apparatus acquires state information of an imaging unit of a moving object and calculates an imageable region of the imaging unit on the basis of the state information.
The moving body acquires information regarding the imageable region from the information processing apparatus and generates a moving route of the moving body, the moving route not intersecting the imageable region.
In order to achieve the above-mentioned object, an information processing method for an information processing apparatus according to an embodiment of the present technology includes:
State information of an imaging unit of a moving object is acquired.
An imageable region of the imaging unit is calculated on the basis of the state information.
Information regarding the imageable region is transmitted to a moving body.
In order to solve the problems described above, a program according to an embodiment of the present technology causes an information processing apparatus to execute the steps of:
State information of an imaging unit of a moving object is acquired.
An imageable region of the imaging unit is calculated on the basis of the state information.
Information regarding the imageable region is transmitted to a moving body.
Hereinafter, embodiments of the present technology will be described with reference to the drawings.
[Configuration of Information Processing System]
The drone aircrafts 10 and 20 and the information processing apparatus 30 are communicably connected to each other via a network N. The network N may be the Internet, a mobile communication network, a local area network, or the like, or may be a network in which a plurality of types of the networks are combined.
The drone aircraft 20 and the controller 40 are connected to each other by wireless communication. The communication standard for connecting the drone aircraft 20 and the controller 40 to each other is typically LTE (Long Term Evolution) communication, but is not limited thereto, and may be, for example, Wi-Fi.
(Drone Aircraft 10)
As shown in
The camera 101 is an apparatus for generating a captured image by imaging a real space using, for example, an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) and a CCD (Charge Coupled Device), and various members such as a lens for controlling formation of a subject image on the image sensor. The camera 101 may capture a still image or may capture a moving image. The camera 101 is an example of the “imaging unit” in the claims.
The GPS sensor 102 receives a signal from a GPS satellite and measures the current latitude and longitude of the drone aircraft 10 in real time. The GPS sensor 102 outputs, to the storage unit 108 and a movable region calculation unit 3021, sensor data regarding the latitude and longitude of the drone aircraft 10 calculated on the basis of the signal acquired from the GPS satellite.
The air pressure sensor 103 is a pressure sensor that measures the air pressure and converts it into an altitude to measure a flight altitude (air pressure altitude) of the drone aircraft 10. The air pressure sensor 103 also detects a total pressure, which includes the influence of wind received by the drone aircraft 10, and the atmospheric pressure received by the drone aircraft 10, and measures a flight speed (airspeed) of the drone aircraft 10 on the basis of the difference therebetween.
The air pressure sensor 103 outputs, to the storage unit 108 and the movable region calculation unit 3021, the sensor data obtained by measuring the flight altitude and the flight speed of the drone aircraft 10. The air pressure sensor 103 may be, for example, a piezoresistive pressure sensor, and the type thereof is not limited.
The acceleration sensor 104 constantly detects the acceleration of the drone aircraft 10. The acceleration sensor 104 detects various types of movement such as a tilt and vibration of the drone aircraft 10. The acceleration sensor 104 outputs, to the storage unit 108 and the movable region calculation unit 3021, the sensor data obtained by detecting the acceleration of the drone aircraft 10. The acceleration sensor 104 may be, for example, a piezoelectric acceleration sensor, a servo-type acceleration sensor, a strain-type acceleration sensor, a semiconductor-type acceleration sensor, or the like, and the type thereof is not limited.
The camera control unit 105 generates a control signal for changing the imaging direction, posture, and imaging magnification of the camera 101 on the basis of the control of the control unit 106, and outputs this signal to the camera 101. The camera control unit 105 controls the imaging direction and the angle of view of the camera 101 via, for example, a platform (not shown) including a built-in motor, such as a 3-axis gimbal, and outputs the control signal at this time to the storage unit 108 and an imaging range calculation unit 3022.
The control unit 106 controls the entire operation of the drone aircraft 10 or part thereof in accordance with a program stored in the storage unit 108. The communication unit 107 communicates with the information processing apparatus 30 through the network N. The communication unit 107 functions as a communication interface of the drone aircraft 10.
The storage unit 108 stores sensor data output from the GPS sensor 102, the air pressure sensor 103, and the acceleration sensor 104, and a control signal output from the camera control unit 105.
(Drone Aircraft 20)
As shown in
The communication unit 201 communicates with the information processing apparatus 30 via the network N. The communication unit 201 functions as a communication interface of the drone aircraft 20. The control unit 202 controls the entire operation of the drone aircraft 20 or part thereof in accordance with a program stored in the storage unit 203.
The control unit 202 functionally includes a moving route generation unit 2021. The moving route generation unit 2021 sets, on the basis of the imageable region of the drone aircraft 10, a waypoint that is an intermediate target point of the drone aircraft 20, and generates a moving route of the drone aircraft 20 that goes through the set waypoint.
(Information Processing Apparatus 30)
As shown in
The communication unit 301 communicates with the drone aircrafts 10 and 20 via the network N. The communication unit 301 functions as a communication interface of the information processing apparatus 30.
The control unit 302 controls the entire operation of the information processing apparatus 30 or part thereof in accordance with a program stored in the storage unit 303.
The control unit 302 functionally includes the movable region calculation unit 3021, the imaging range calculation unit 3022, and an imaging region calculation unit 3023.
The movable region calculation unit 3021 calculates, on the basis of the sensor data (the latitude, the longitude, and the altitude of the drone aircraft 10) acquired from the GPS sensor 102 and the air pressure sensor 103 and the aircraft performance of the drone aircraft 10 (maximum speed, maximum ascending speed, maximum acceleration, maximum ascending acceleration, and maximum descending acceleration), a movable region in which the drone aircraft 10 is movable.
Note that the sensor data acquired from the GPS sensor 102 and the air pressure sensor 103 is an example of the “state information of the moving object” in the claims, and the aircraft performance of the drone aircraft 10 is an example of the “performance information of the moving object” in the claims.
The imaging range calculation unit 3022 calculates, on the basis of the control signal (the imaging direction and the angle of view of the camera 101) acquired from the camera control unit 105 and the camera performance of the camera 101 (the changeable range and the change speed of the imaging direction of the camera 101 and the changeable range and the change speed of the angle of view of the camera 101), the imageable range of the camera 101. Note that the control signal from the camera control unit 105 is an example of the “state information of the imaging unit”, and the camera performance of the camera 101 is an example of the “performance information of the imaging unit”.
The imaging region calculation unit 3023 superimposes, on the basis of the movable region of the drone aircraft 10 calculated by the movable region calculation unit 3021 and the imageable range of the camera 101 calculated by the imaging range calculation unit 3022, all the imageable ranges calculated from the angle of view and the imaging direction of the camera 101 obtained at the corresponding time at each point of the movable region to calculate the imageable region of the drone aircraft 10.
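The superimposition described above can be illustrated with a short sketch. The following Python fragment is illustrative only: the discretised set-of-cells representation of space and the `imageable_range_at` callback are assumptions, not part of the present disclosure.

```python
def imageable_region(movable_points, imageable_range_at):
    """Union of the imageable ranges obtained at every point of the
    movable region, as described above (hypothetical representation)."""
    region = set()
    for point in movable_points:
        # imageable_range_at(point) returns the set of space cells the
        # camera 101 could image from this point (assumed callback).
        region |= imageable_range_at(point)
    return region
```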
The storage unit 303 stores data (
Further, the storage unit 303 periodically acquires the current state of the drone aircraft 10 and the camera 101 from the drone aircraft 10 and stores this. Specifically, the storage unit 303 periodically acquires, from the drone aircraft 10, information regarding the current latitude, longitude, altitude, and speed of the drone aircraft 10 and information regarding the current horizontal angle (angle formed by the optical axis of the camera 101 and the vertical plane), the vertical angle (angle formed by the optical axis of the camera 101 and the horizontal direction), the horizontal angle of view, and the vertical angle of view of the camera 101, and updates this.
Further, the storage unit 303 stores a time interval from when acquiring an aircraft state table described below to when setting a waypoint (hereinafter, the predetermined time period t1), the update interval of the aircraft state table (hereinafter, the predetermined time period t2), and the local feature amount of each of a plurality of drone aircrafts.
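For illustration, the periodically updated state described above could be held in a structure like the following; all field names are hypothetical and merely mirror the items listed in the description.

```python
from dataclasses import dataclass

@dataclass
class AircraftStateTable:
    """Hypothetical model of the aircraft state table described above."""
    latitude_deg: float                  # current latitude of the drone aircraft 10
    longitude_deg: float                 # current longitude
    altitude_m: float                    # current altitude (air pressure altitude)
    speed_mps: float                     # current speed
    horizontal_angle_deg: float          # optical axis vs. the vertical plane
    vertical_angle_deg: float            # optical axis vs. the horizontal direction
    horizontal_angle_of_view_deg: float  # current horizontal angle of view
    vertical_angle_of_view_deg: float    # current vertical angle of view
```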
(Controller 40)
The controller 40 is a manipulation apparatus for manipulating the drone aircraft 20 and includes a display unit 41. The display unit 41 is, for example, a display device such as an LCD and an organic EL display.
The display unit 41 displays video taken by a camera 116 mounted on the drone aircraft 20. As a result, a user can operate the drone aircraft 20 while watching the video displayed on the display unit 41.
(Hardware Configuration)
The information processing apparatus 100 includes a CPU (Central Processing Unit) 110, a ROM (Read Only Memory) 111, and a RAM (Random Access Memory) 112. The control units 106, 202, and 302 may be the CPU 110.
Further, the information processing apparatus 100 may include a host bus 113, a bridge 114, an external bus 115, an interface 121, an input device 122, an output device 123, a storage device 124, a drive 125, a connection port 126, and a communication device 127.
Further, the information processing apparatus 100 may include a camera control unit 117, a camera 116, a GPS sensor 118, an acceleration sensor 119, and an air pressure sensor 120.
The information processing apparatus 100 may include a processing circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit) instead of or in addition to the CPU 110.
The CPU 110 functions as an arithmetic processing unit and a control device, and controls the entire operation of the information processing apparatus 100 or part thereof in accordance with various programs stored in the ROM 111, the RAM 112, the storage device 124, or a removable recording medium 50. The storage units 108, 203, and 303 may be the ROM 111, the RAM 112, the storage device 124, or the removable recording medium 50.
The ROM 111 stores the programs, calculation parameters, and the like used by the CPU 110. The RAM 112 temporarily stores the programs used in the execution by the CPU 110, parameters that change as appropriate during the execution, and the like.
The CPU 110, the ROM 111, and the RAM 112 are connected to each other via the host bus 113 including an internal bus such as a CPU bus. Further, the host bus 113 is connected to the external bus 115 such as a PCI (Peripheral Component Interconnect/Interface) bus via the bridge 114.
The input device 122 is a device operated by a user, such as a touch panel, a button, a switch, and a lever. The input device 122 may be, for example, a remote control device using infrared rays or other radio waves, or an external connection device 60, such as a mobile phone, that supports the operation of the information processing apparatus 100.
The input device 122 includes an input control circuit that generates an input signal on the basis of information input by a user and outputs the signal to the CPU 110. A user operates this input device 122 to input various types of data to the information processing apparatus 100 and instruct a processing operation.
The output device 123 includes a device capable of notifying a user of the acquired information using senses such as visual, auditory, and haptic senses. The output device 123 may be, for example, a display device such as an LCD (Liquid Crystal Display) and an organic EL (Electro-Luminescence) display, an audio output device such as a speaker and a headphone, or a vibrator.
The output device 123 outputs the result obtained by the processing of the information processing apparatus 100 as text, video such as an image, sound such as voice, vibration, or the like.
The storage device 124 is a device for data storage configured as an example of the storage unit of the information processing apparatus 100. The storage device 124 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 124 stores, for example, the program to be executed by the CPU 110, various types of data, various types of data acquired from the outside, and the like.
The drive 125 is a reader/writer for the removable recording medium 50 such as a magnetic disk, an optical disk, a magneto-optical disk, and a semiconductor memory, and is built in or externally attached to the information processing apparatus 100. The drive 125 reads the information recorded in the removable recording medium 50 mounted thereon, and outputs the read information to the RAM 112. Further, the drive 125 writes a record to the removable recording medium 50 mounted thereon.
The connection port 126 is a port for connecting a device to the information processing apparatus 100. The connection port 126 may be, for example, a USB (Universal Serial Bus) port, an IEEE1394 port, or an SCSI (Small Computer System Interface) port. Further, the connection port 126 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like. By connecting the external connection device 60 to the connection port 126, various types of data can be exchanged between the information processing apparatus 100 and the external connection device 60.
The communication device 127 is, for example, a communication interface including a communication device for connecting to the network N. The communication device 127 may be, for example, a communication card for a LAN (Local Area Network), Bluetooth (registered trademark), Wi-Fi, WUSB (Wireless USB), or LTE. Further, the communication device 127 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various types of communication, or the like.
The communication device 127 transmits/receives signals or the like to/from, for example, the Internet or another communication device using a predetermined protocol such as TCP/IP. Further, the network N to be connected to the communication device 127 is a wirelessly-connected network, and may include, for example, the Internet, infrared communication, radio wave communication, short-range wireless communication, or satellite communication. The communication units 107, 201, and 301 may be the communication device 127.
The camera control unit 117, the camera 116, the GPS sensor 118, the acceleration sensor 119, and the air pressure sensor 120 respectively correspond to the camera control unit 105, the camera 101, the GPS sensor 102, the acceleration sensor 104, and the air pressure sensor 103.
The configuration example of the information processing system 1 has been described above. The respective components described above may be configured by using general-purpose members or may be configured by members specialized for functions of the respective components. Such a configuration may be changed as appropriate depending on the technical level at the time of implementation.
[Operation of Information Processing System]
(Step S101: Calculate Movable Region)
First, the camera 116 mounted on the drone aircraft 20 images the drone aircraft 10. Next, the movable region calculation unit 3021 performs predetermined image processing on the captured image obtained by imaging the drone aircraft 10 to specify the model name or the model number of the drone aircraft 10. Specifically, the movable region calculation unit 3021 extracts the local feature amount of the 3D shape of the drone aircraft 10 from the captured image obtained by imaging the drone aircraft 10.
The local feature amount is, for example, a feature amount calculated by SIFT (scale invariant feature transform), SURF (speeded-up robust features), RIFF (rotation invariant fast feature), BRIEF (binary robust independent elementary features), BRISK (binary robust invariant scalable keypoints), ORB (oriented FAST and rotated BRIEF), or CARD (compact and real-time descriptors).
The movable region calculation unit 3021 detects the drone aircraft 10 by feature amount matching in which the local feature amount of the 3D shape of the drone aircraft 10 and the local feature amount of each of a plurality of drone aircrafts stored in the storage unit 303 in advance are compared with each other, thereby specifying the model name or the model number of the drone aircraft 10.
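A minimal sketch of such feature-amount matching, using ORB via OpenCV, is shown below. The registry layout, the descriptor-distance threshold, and the match-count threshold are assumptions; the embodiment does not prescribe a specific library.

```python
import cv2

def identify_model(captured_image, registry, min_matches=30):
    """Match ORB feature amounts of a captured drone image against a
    registry {model_name: stored_descriptors} and return the best model
    name, or None if no model matches well enough (a sketch only)."""
    orb = cv2.ORB_create()
    _, query_desc = orb.detectAndCompute(captured_image, None)
    if query_desc is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    best_name, best_count = None, 0
    for model_name, stored_desc in registry.items():
        matches = matcher.match(query_desc, stored_desc)
        # Keep only reasonably close descriptor pairs (assumed threshold).
        good = [m for m in matches if m.distance < 40]
        if len(good) > best_count:
            best_name, best_count = model_name, len(good)
    return best_name if best_count >= min_matches else None
```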
Next, the movable region calculation unit 3021 acquires, from the drone aircraft 10, data (hereinafter, the aircraft state table) (see
Subsequently, the movable region calculation unit 3021 refers to the data table (
Next, the movable region calculation unit 3021 calculates a maximum moving range E1 in the horizontal direction (XY plane direction) in the case of accelerating at the maximum acceleration from the current speed to the maximum speed, which is the upper limit, of the drone aircraft 10 starting from the current position (the latitude, the longitude, and the altitude) of the drone aircraft 10.
Part a of
Lh = V0·t1 + (ah·t1²)/2 (1)
E1 = (Lh)²·π (2)
Next, the movable region calculation unit 3021 calculates a maximum ascending range E2 in the vertical plane direction (XZ plane direction) in the case of ascending at the maximum ascending acceleration from the current speed to the maximum ascending speed, which is the upper limit, of the drone aircraft 10 starting from the current position (the latitude, the longitude, and the altitude) of the drone aircraft 10.
Part b of
Lup = V0·t1 + (aup·t1²)/2 (3)
E2 = {(Lup)²·π}/2 (4)
Similarly, the movable region calculation unit 3021 calculates a maximum descending range E3 in the vertical plane direction in the case of descending at the maximum descending acceleration from the current speed to the maximum descending speed, which is the upper limit, of the drone aircraft 10 starting from the current position (the latitude, the longitude, and the altitude) of the drone aircraft 10.
The maximum descending range E3 is calculated by, for example, the following formulae (5) and (6), V0, adown, and Ldown respectively representing the current speed, the maximum descending acceleration, and the maximum descending distance of the drone aircraft 10.
Ldown = V0·t1 + (adown·t1²)/2 (5)
E3 = {(Ldown)²·π}/2 (6)
Subsequently, the movable region calculation unit 3021 combines the calculated maximum moving range E1, the calculated maximum ascending range E2, and the calculated maximum descending range E3 with each other to calculate a maximum movable region E defined by these ranges, and thus specifies the maximum movable region E of the drone aircraft 10 within the predetermined time period t1. The movable region calculation unit 3021 outputs the calculation result of calculating the maximum movable region E to the moving route generation unit 2021. The maximum movable region E is an example of the “movable region” in the claims.
The maximum movable region E may be defined as a cylinder calculated by, for example, the following formula (7), Lh, Lup, and Ldown respectively representing the maximum moving distance, the maximum ascending distance, and the maximum descending distance.
E = {Lup + Ldown}·(Lh)²·π (7)
Alternatively, as shown in
E = (4/3)·π·[(Lh)²·{(Lup + Ldown)/2}] (8)
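The calculation of Step S101 can be sketched as follows, directly implementing formulae (1) to (8). Units (metres, seconds), parameter names, and the omission of clipping at the maximum speed, maximum ascending speed, and maximum descending speed (which the text treats as upper limits) are simplifying assumptions.

```python
import math

def movable_region(v0, a_h, a_up, a_down, t1, as_ellipsoid=False):
    """Sketch of Step S101: maximum movable region of the drone
    aircraft 10 within the predetermined time period t1."""
    l_h = v0 * t1 + (a_h * t1 ** 2) / 2        # (1) maximum moving distance
    e1 = (l_h ** 2) * math.pi                  # (2) maximum moving range
    l_up = v0 * t1 + (a_up * t1 ** 2) / 2      # (3) maximum ascending distance
    e2 = ((l_up ** 2) * math.pi) / 2           # (4) maximum ascending range
    l_down = v0 * t1 + (a_down * t1 ** 2) / 2  # (5) maximum descending distance
    e3 = ((l_down ** 2) * math.pi) / 2         # (6) maximum descending range
    if as_ellipsoid:
        # (8) ellipsoid with semi-axes l_h, l_h, and (l_up + l_down) / 2
        e = (4 / 3) * math.pi * (l_h ** 2) * ((l_up + l_down) / 2)
    else:
        # (7) cylinder of radius l_h and height l_up + l_down
        e = (l_up + l_down) * (l_h ** 2) * math.pi
    return {"E1": e1, "E2": e2, "E3": e3, "E": e}
```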
(Step S102: Calculate Imageable Range)
The imaging range calculation unit 3022 refers to the data table (
Next, the imaging range calculation unit 3022 calculates a maximum angle range A1 in the horizontal direction of the camera 101 within the predetermined time period t2 starting from the current horizontal angle of the camera 101 acquired in the previous Step S101 to the horizontal-angle changeable range that is the upper limit.
The maximum angle range A1 is calculated by, for example, the following formula (9), θH, θHs, and θH(min) to θH(max) respectively representing the current horizontal angle, the horizontal-angle change speed, and the horizontal-angle changeable range of the camera 101.
θH − (θHs·t2) ≤ A1 ≤ θH + (θHs·t2) (however, θH(min) ≤ A1 ≤ θH(max)) (9)
Subsequently, the imaging range calculation unit 3022 calculates a maximum angle range A2 in the vertical direction of the camera 101 within the predetermined time period t2 starting from the current vertical angle of the camera 101 to the vertical-angle changeable range that is the upper limit.
The maximum angle range A2 is calculated by, for example, the following formula (10), θV, θVs, and θV(min) to θV(max) respectively representing the current vertical angle, the vertical-angle change speed, and the vertical-angle changeable range of the camera 101.
θV − (θVs·t2) ≤ A2 ≤ θV + (θVs·t2) (however, θV(min) ≤ A2 ≤ θV(max)) (10)
Subsequently, the imaging range calculation unit 3022 calculates a maximum horizontal angle of view θhav of the camera 101 within the predetermined time period t2 starting from the current horizontal angle of view of the camera 101 to the horizontal-angle-of-view changeable range that is the upper limit.
The maximum horizontal angle of view θhav is calculated by, for example, the following formula (11), θAH, θAHs, and θAH(min) to θAH(max) respectively representing the current horizontal angle of view, the horizontal-angle-of-view change speed, and the horizontal-angle-of-view changeable range of the camera 101.
θAH − (θAHs·t2) ≤ θhav ≤ θAH + (θAHs·t2) (however, θAH(min) ≤ θhav ≤ θAH(max)) (11)
Similarly, the imaging range calculation unit 3022 calculates a maximum vertical angle of view θvav of the camera 101 within the predetermined time period t2 starting from the current vertical angle of view of the camera 101 to the vertical-angle-of-view changeable range that is the upper limit.
The maximum vertical angle of view θvav is calculated by, for example, the following formula (12), θVH, θVHs, and θVH(min) to θVH(max) respectively representing the current vertical angle of view, the vertical-angle-of-view change speed, and the vertical-angle-of-view changeable range of the camera 101.
θVH − (θVHs·t2) ≤ θvav ≤ θVH + (θVHs·t2) (however, θVH(min) ≤ θvav ≤ θVH(max)) (12)
Next, the imaging range calculation unit 3022 calculates a maximum imageable range R on the basis of the maximum angle ranges A1 and A2, the maximum horizontal angle of view θhav, and the maximum vertical angle of view θvav, and specifies the maximum imageable range R within the predetermined time period t2 from the current imaging direction and the current angle of view of the camera 101. Note that the maximum imageable range R is an example of the “imageable range” in the claims.
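A sketch of Step S102 implementing formulae (9) to (12) follows; the dictionary layout for the current state and the camera performance is hypothetical.

```python
def max_imageable_ranges(state, perf, t2):
    """Sketch of Step S102: maximum angle ranges and angles of view of
    the camera 101 within t2. `perf` holds the camera performance, i.e.
    the change speed and the changeable range (min, max) of each
    quantity; all keys are assumptions."""
    def spread(current, speed, lo, hi):
        # Interval reachable within t2, clipped to the changeable range.
        return max(lo, current - speed * t2), min(hi, current + speed * t2)

    a1 = spread(state["h_angle"], perf["h_angle_speed"], *perf["h_angle_range"])  # (9)
    a2 = spread(state["v_angle"], perf["v_angle_speed"], *perf["v_angle_range"])  # (10)
    hav = spread(state["h_aov"], perf["h_aov_speed"], *perf["h_aov_range"])       # (11)
    vav = spread(state["v_aov"], perf["v_aov_speed"], *perf["v_aov_range"])       # (12)
    return a1, a2, hav, vav
```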
(Step S103: Has Predetermined Time Period Elapsed?)
The imaging range calculation unit 3022 according to this embodiment repeats the previous Step S102 every predetermined time period t2 in the case where the predetermined time period t1 has not elapsed since the aircraft state table was acquired from the drone aircraft 10 (NO in Step S103). As a result, the aircraft state table stored in the storage unit 303 is updated every predetermined time period t2.
That is, the imaging range calculation unit 3022 calculates the maximum imageable range R each time the aircraft state table is acquired from the drone aircraft 10, from when the aircraft state table was first acquired to when the predetermined time period t1 elapses. That is, as shown in
(Step S104: Calculate Imageable Region)
In the case where the predetermined time period t1 has elapsed since the aircraft state table was acquired from the drone aircraft 10 (Yes in Step S103), the imaging region calculation unit 3023 superimposes all the maximum imageable ranges R calculated by the imaging range calculation unit 3022 every predetermined time period t2 to calculate the maximum imageable region D, and thus calculates the maximum imageable region D of the camera 101 within the predetermined time period t1. The maximum imageable region D is an example of the “imageable region” in the claims.
θVT = θVP + θVH + {(θVHs·t2)/2 + (θVs·t2)} (13)
ΔLh = LV·tan(θVT) (14)
D = {(π·LV)/3}·[(Lh)² + Lh·(Lh + ΔLh) + (Lh + ΔLh)²] (15)
Note that the height LV of the maximum imageable region D is a height at which the camera 101 of the drone aircraft 10 cannot recognize the drone aircraft 20 (i.e., the drone aircraft 20 does not appear in the captured image), and is calculated by, for example, the following formula (16), Ds, F, and FV respectively representing the lateral size of the image sensor of the camera 101, the focal length of the camera 101 at the maximum zoom, and the field of view size of the camera 101.
LV = (F·FV)/Ds (16)
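The following sketch computes the height LV by formula (16) and the volume of the maximum imageable region D, reading formula (15) as the volume of a truncated cone of height LV with radii Lh and Lh + ΔLh; that reading, and the function names, are assumptions.

```python
import math

def region_height(f, fv, ds):
    """Formula (16): height LV at which the camera 101 can no longer
    recognize the drone aircraft 20, from the focal length F at maximum
    zoom, the field-of-view size FV, and the lateral sensor size Ds."""
    return (f * fv) / ds

def max_imageable_region(l_h, l_v, theta_vt_deg):
    """Volume of the maximum imageable region D as a truncated cone of
    height LV with radii Lh and Lh + ΔLh (assumed reading of (15))."""
    delta_l_h = l_v * math.tan(math.radians(theta_vt_deg))      # (14)
    r1, r2 = l_h, l_h + delta_l_h
    return (math.pi * l_v / 3) * (r1 ** 2 + r1 * r2 + r2 ** 2)  # (15)
```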
The imaging region calculation unit 3023 outputs the calculation result of calculating the maximum imageable region D to the moving route generation unit 2021 and the controller 40.
The display unit 41 of the controller 40 displays the maximum imageable region D of the drone aircraft 10. At this time, the display unit 41 generates an overlay image in which the maximum imageable region D is virtually superimposed on the video taken by the camera 116, and displays this image. As a result, a user can recognize the maximum imageable region D of the drone aircraft 10 as visualized information.
(Step S105: Generate Moving Route)
Specifically, for example, the moving route generation unit 2021 calculates the coordinate position (xp, yp, zp) of the waypoint P on the basis of the coordinate position of each point of the point cloud data forming the maximum movable region E and an aircraft width L1 of the drone aircraft 20, and generates the moving route L that goes through this coordinate position (xp, yp, zp).
At this time, in the case where the moving route L passes through the center of the drone aircraft 20 in the width direction, the moving route generation unit 2021 sets the coordinate position (xp, yp, zp) such that, for example, a distance L2 between a coordinate position (xa, ya, za) of the point cloud data forming the outermost periphery of the maximum imageable region D and the coordinate position (xp, yp, zp) is larger than the aircraft width L1. Note that the aircraft width L1 is, for example, a distance from the center of the drone aircraft 20 in the width direction to the end in the width direction.
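The clearance test described above can be sketched as a brute-force scan over the point cloud; the data layout and the scan itself are assumptions.

```python
import math

def clears_imageable_region(waypoint, boundary_points, l1):
    """Accept the waypoint P only if its distance L2 to every point of
    the point cloud forming the outermost periphery of the maximum
    imageable region D exceeds the aircraft width L1 (a sketch only)."""
    return all(math.dist(waypoint, p) > l1 for p in boundary_points)
```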
The information processing system 1 according to this embodiment repeatedly executes the series of processes from the previous Step S101 to Step S105 every predetermined time period t1. As a result, the maximum imageable region D of the drone aircraft 10 according to the situation at that time is generated every predetermined time period t1, and the waypoint P is intermittently set every predetermined time period t1.
[Operation and Effects]
The information processing apparatus 30 generates the maximum imageable range R every predetermined time period t2 within the range in which the drone aircraft 10 is capable of moving within the predetermined time period t1, and calculates, from these maximum imageable ranges R, the maximum imageable region D of the drone aircraft 10 within the predetermined time period t1. The drone aircraft 20 then generates the moving route L that does not intersect the maximum imageable region D.
As a result, even in the case where the drone aircraft 10 takes an unexpected operation such as a sudden rise or a sudden drop within the predetermined time period t1, the imaging range of the camera 101 falls within the maximum imageable region D. Therefore, when the drone aircraft 20 moves in accordance with the moving route L avoiding the maximum movable region E and the maximum imageable region D, it is possible to reliably prevent the drone aircraft 20 from appearing in the imaging range of the camera 101 within the predetermined time period t1.
Further, in the information processing system 1 according to this embodiment, the information processing apparatus 30 executes the arithmetic processing of calculating the maximum movable region E and the maximum imageable region D of the drone aircraft 10. That is, the information processing apparatus 30 is responsible for part of the arithmetic processing to be executed by the drone aircraft 20 in order to prevent the drone aircraft 20 from appearing in the imaging range of the drone aircraft 10. As a result, it is possible to significantly reduce the calculation load of the drone aircraft 20. Further, since it is not necessary to increase the arithmetic processing capacity of the drone aircraft 20, the design cost of the drone aircraft 20 is reduced.
[Operation of Information Processing System]
The second embodiment is different from the first embodiment in that the imageable region that the drone aircraft 10 can image within a specific time period is predicted from the past history of the state of the drone aircraft 10, and the drone aircraft 20 avoids the predicted imageable region.
(Step S201: Select Moving Region and Angle Change Range)
Here, the “moving region” is a region in which the drone aircraft 10 actually moved within the predetermined time period t1.
Further, the “horizontal angle change range” is the actual change range of the horizontal angle of the camera 101 within the predetermined time period t1, and the “vertical angle change range” is the actual change range of the vertical angle of the camera 101 within the predetermined time period t1.
Further, the “horizontal-angle-of-view change range” is the actual change range of the horizontal angle of view of the camera 101 within the predetermined time period t1, and the vertical-angle-of-view change range is the actual change range of the vertical angle of view of the camera 101 within the predetermined time period t1. Note that the definitions of the above-mentioned “horizontal angle change range”, “vertical angle change range”, “horizontal-angle-of-view change range”, and “vertical-angle-of-view change range” are the same also in the following description.
First, the control unit 302 acquires, from the drone aircraft 10, the aircraft state table showing the current state of the drone aircraft 10 (Part a in
Next, the control unit 302 refers to the data table (
Subsequently, the control unit 302 selects the moving region, the horizontal angle change range, the vertical angle change range, the horizontal-angle-of-view change range, and the vertical-angle-of-view change range associated with the selected history information (Part c of
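The history selection of Step S201 can be sketched as a nearest-neighbor lookup; the state keys and the Euclidean similarity metric below are assumptions, since the embodiment only requires that the most similar history information be selected.

```python
def select_similar_history(current, history):
    """Sketch of Step S201: return the change ranges associated with
    the stored history entry most similar to the current state of the
    drone aircraft 10. Record layout is hypothetical."""
    state_keys = ("speed", "altitude", "h_angle", "v_angle", "h_aov", "v_aov")

    def distance(entry):
        return sum((entry[k] - current[k]) ** 2 for k in state_keys) ** 0.5

    best = min(history, key=distance)
    range_keys = ("moving_region", "h_angle_change", "v_angle_change",
                  "h_aov_change", "v_aov_change")
    return {k: best[k] for k in range_keys}
```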
(Step S202: Predict Imageable Range)
The imaging range calculation unit 3022 predicts an angle range A1′ of the camera 101 in the horizontal direction within the predetermined time period t2, starting from the current horizontal angle of the camera 101 to the horizontal-angle changeable range that is the upper limit.
The angle range A1′ is calculated by, for example, the following formula (16), θH, θHa, and θH(min) to θH(max) respectively representing the current horizontal angle, the horizontal angle change range, and the horizontal-angle changeable range of the camera 101.
θH − (θHa/2) ≤ A1′ ≤ θH + (θHa/2) (however, θH(min) ≤ A1′ ≤ θH(max)) (16)
Subsequently, the imaging range calculation unit 3022 predicts an angle range A2′ of the camera 101 in the vertical direction within the predetermined time period t2, starting from the current vertical angle of the camera 101 to the vertical-angle changeable range that is the upper limit.
The angle range A2′ is calculated by, for example, the following formula (17), θV, θVa, and θV(min) to θV(max) respectively representing the current vertical angle, the vertical angle change range, and the vertical-angle changeable range of the camera 101.
θV − (θVa/2) ≤ A2′ ≤ θV + (θVa/2) (however, θV(min) ≤ A2′ ≤ θV(max)) (17)
Subsequently, the imaging range calculation unit 3022 predicts a horizontal angle of view θhav′ of the camera 101 within the predetermined time period t2, starting from the current horizontal angle of view of the camera 101 to the horizontal-angle-of-view changeable range that is the upper limit.
The horizontal angle of view θhav′ is calculated by, for example, the following formula (18), θAH, θAHa, and θAH(min) to θAH(max) respectively representing the current horizontal angle of view, the horizontal-angle-of-view change range, and the horizontal-angle-of-view changeable range of the camera 101.
θAH − (θAHa/2) ≤ θhav′ ≤ θAH + (θAHa/2) (however, θAH(min) ≤ θhav′ ≤ θAH(max)) (18)
Similarly, the imaging range calculation unit 3022 calculates a vertical angle of view θvav′ of the camera 101 within the predetermined time period t2, starting from the current vertical angle of view of the camera 101 to the vertical-angle-of-view changeable range that is the upper limit.
The vertical angle of view θvav′ is calculated by, for example, the following formula (19), θVH, θVHa, and θVH(min) to θVH(max) respectively representing the current vertical angle of view, the vertical-angle-of-view change range, and the vertical-angle-of-view changeable range of the camera 101.
θVH − (θVHa/2) ≤ θvav′ ≤ θVH + (θVHa/2) (however, θVH(min) ≤ θvav′ ≤ θVH(max)) (19)
Next, the imaging range calculation unit 3022 calculates an imageable range R′ on the basis of the angle ranges A1′ and A2′, the horizontal angle of view θhav′, and the vertical angle of view θvav′, and predicts the imageable range R′ within the predetermined time period t2 from the current imaging direction and the current angle of view of the camera 101.
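A sketch of Step S202 implementing formulae (16) to (19) follows; as in the Step S102 sketch, the dictionary layout is hypothetical. Each predicted interval is centred on the current value, extends by half the historical change range on each side, and is clipped to the changeable range.

```python
def predicted_ranges(state, hist, perf):
    """Sketch of Step S202: predicted angle ranges and angles of view of
    the camera 101 within t2, from the selected history information."""
    def spread(current, change_range, lo, hi):
        half = change_range / 2
        return max(lo, current - half), min(hi, current + half)

    a1p = spread(state["h_angle"], hist["h_angle_change"], *perf["h_angle_range"])  # (16)
    a2p = spread(state["v_angle"], hist["v_angle_change"], *perf["v_angle_range"])  # (17)
    havp = spread(state["h_aov"], hist["h_aov_change"], *perf["h_aov_range"])       # (18)
    vavp = spread(state["v_aov"], hist["v_aov_change"], *perf["v_aov_range"])       # (19)
    return a1p, a2p, havp, vavp
```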
(Step S203: Has Predetermined Time Period Elapsed?)
The imaging range calculation unit 3022 repeats the previous Step S202 every predetermined time period t2 in the case where the predetermined time period t1 has not elapsed since the aircraft state table was acquired from the drone aircraft 10 (No in Step S203). As a result, the data table (
That is, the imaging range calculation unit 3022 predicts the imageable range R′ each time the aircraft state table is acquired from the drone aircraft 10, from when the aircraft state table was first acquired to when the predetermined time period t1 elapses. That is, the imaging range calculation unit 3022 outputs, every predetermined time period t2, the imageable range R′ corresponding to the imaging direction (the horizontal angle, the vertical angle) and the angle of view (the horizontal angle of view, the vertical angle of view) of the camera 101 at that time. The imaging range calculation unit 3022 outputs, to the imaging region calculation unit 3023, the calculation result of calculating the imageable range R′ every predetermined time period t2.
(Step S204: Predict Imageable Region)
The imaging region calculation unit 3023 superimposes all the imageable ranges R′ calculated by the imaging range calculation unit 3022 every predetermined time period t2 to predict an imageable region D′ in the case where the predetermined time period t1 has elapsed since the aircraft state table was acquired from the drone aircraft 10 (YES in Step S203). The imaging region calculation unit 3023 calculates the imageable region D′ of the camera 101 within the predetermined time period t1, similarly to the first embodiment, and outputs, to the controller 40, the calculation result of calculating the imageable region D′.
The display unit 41 of the controller 40 displays the imageable region D′ of the drone aircraft 10. At this time, the display unit 41 generates an overlay image in which the imageable region D′ is virtually superimposed on the video taken by the camera 116, and displays this image. As a result, a user can recognize the imageable region D′ of the drone aircraft 10 as visualized information.
(Step S205: Generate Moving Route)
Specifically, the imaging region calculation unit 3023 determines whether or not the waypoint P that is not included in the maximum imageable region D can be set from the aircraft performance of the drone aircraft 20. In the case where such a waypoint P can be set, the moving route generation unit 2021 sets the waypoint P that is not included in the maximum imageable region D and generates the moving route L in which the drone aircraft 20 goes through the waypoint P, similarly to the above-mentioned first embodiment.
Meanwhile, the imaging region calculation unit 3023 outputs, in the case of determining that the waypoint P that is not included in the maximum imageable region D cannot be set from the aircraft performance of the drone aircraft 20, the calculation result of calculating the imageable region D′ to the moving route generation unit 2021.
The moving route generation unit 2021 calculates a coordinate position (xp′, yp′, zp′) of a waypoint P′ that avoids, as virtual obstacles, the imageable region D′ and the moving region selected in the previous Step S201, and generates a moving route L′ that goes through this coordinate position (xp′, yp′, zp′).
The information processing system 1 according to the second embodiment repeatedly executes the series of processes from the previous Step S201 to Step S205 every predetermined time period t1. As a result, the imageable region D′ of the drone aircraft 10 according to the situation at that time is predicted every predetermined time period t1, and the waypoint P′ is intermittently set every predetermined time period t1.
[Operation and Effects]
The information processing apparatus 30 according to the second embodiment selects, from the history of the change range in which the imaging direction (the horizontal angle change range, the vertical angle change range) and the angle of view (the horizontal-angle-of-view change range, the vertical-angle-of-view change range) of the camera 101 are actually changed, the change range associated with history information most similar to the current imaging direction (the horizontal angle, the vertical angle) and the current angle of view (the horizontal angle of view, the vertical angle of view) of the camera 101, and calculates the imageable region D′ of the camera 101 on the basis of the selected change range. The drone aircraft 20 then generates a moving route L′ that does not intersect the imageable region D′.
As a result, it is possible to prevent the drone aircraft 20 from appearing in the imaging range of the drone aircraft 10 within the predetermined time period t1 while considering the past operation tendency and imaging tendency of a user manipulating the drone aircraft 10.
Further, the information processing apparatus 30 according to the second embodiment determines, on the basis of the aircraft performance of the drone aircraft 20, whether to generate a moving route that avoids the maximum imageable region D or the imageable region D′. Here, the information processing apparatus 30 generates, in the case of determining that a moving route that avoids the maximum imageable region D cannot be generated, the moving route L′ that avoids the imageable region D′.
As a result, since the moving route L′ can be set even in the case where the moving route L that avoids the maximum imageable region D cannot be set, the flexibility when preventing the drone aircraft 20 from appearing in the imaging range of the drone aircraft 10 is improved.
The third embodiment is different from the first embodiment in that in the case where, for example, the arithmetic processing capacity itself of the drone aircraft 20 has improved or the drone aircraft 20 cannot communicate with the information processing apparatus 30, the drone aircraft 20 consistently executes the processing of calculating the maximum imageable region D of the drone aircraft 10 and generating the moving route of itself that does not intersect this maximum imageable region D.
[Configuration of Drone Aircraft]
As shown in
[Operation of Information Processing System]
The information processing system 3 according to the third embodiment executes the operation according to the flowchart shown in
Although the embodiments of the present technology have been described above, the present technology is not limited to the embodiments described above, and it goes without saying that various modifications may be made thereto.
For example, although the maximum imageable range R is calculated on the basis of the current imaging direction and the current angle of view of the drone aircraft 10 and the aircraft performance of the drone aircraft 10 in the first embodiment described above, the present technology is not limited thereto. For example, in the case where the aircraft state table is not updated due to a communication failure between the drone aircraft 10 and the information processing apparatus 30, the information processing apparatus 30 may calculate the maximum imageable range R and the maximum imageable region D on the basis of the upper limit value of the stored aircraft performance of the drone aircraft 10, and may generate a moving route that does not intersect this maximum imageable region D.
Further, although an overlay image in which the maximum imageable region D (the imageable region D′) of the drone aircraft 10 is virtually superimposed on the video taken by the camera 116 is displayed on the display unit 41 in the embodiments described above, the present technology is not limited thereto. Instead of or in addition to the overlay image, information that calls attention to the user may be displayed on the display unit 41.
Further, although the model number of the drone aircraft 10 is specified by performing image processing on a captured image obtained by imaging the drone aircraft 10 in the embodiments described above, the present technology is not limited thereto. The movable region calculation unit 3021 may acquire information regarding the model number or the model name of the drone aircraft 10 together with the aircraft state table from the drone aircraft 10 and may read, from the storage unit 303, the aircraft performance of the drone aircraft 10 and the camera 101 associated therewith for use.
In addition, although the maximum movable region E is calculated using all the maximum speed, the maximum ascending speed, the maximum descending speed, the maximum acceleration, the maximum ascending acceleration, and the maximum descending acceleration of the drone aircraft 10 in the embodiments described above, the present technology is not limited thereto. At least one of the maximum speed, the maximum ascending speed, the maximum descending speed, the maximum acceleration, the maximum ascending acceleration, or the maximum descending acceleration may be used for calculating the maximum movable region E.
<Supplement>
The embodiments of the present technology may include, for example, the information processing apparatus, the system, the information processing method executed by the information processing apparatus or the system, the program for operating the information processing apparatus, and a non-transitory tangible medium in which the program is recorded, as described above.
Further, although the description has been made on the premise that the moving body is a flying object in the embodiments described above, the present technology is not limited thereto. The present technology may be applied to a moving body (e.g., a robot) other than the flying object, and an application thereof is not particularly limited. Note that the flying object may include, in addition to the drone aircraft, an aircraft, an unmanned aerial vehicle, an unmanned helicopter, and the like.
Further, the effects described herein are not limitative, but are merely descriptive or illustrative. In other words, the present technology may have other effects apparent to those skilled in the art from the description herein together with the effects described above or in place of the effects described above.
The favorable embodiments of the present technology have been described above in detail with reference to the accompanying drawings. However, the present technology is not limited to such examples. It is clear that persons who have common knowledge in the technical field of the present technology could conceive various alterations or modifications within the scope of the technical idea described in the claims. It is understood that of course such alterations or modifications also fall under the technical scope of the present technology.
It should be noted that the present technology may also take the following configurations.
(1)
An information processing apparatus, including:
(2)
The information processing apparatus according to (1) above, in which
(3)
The information processing apparatus according to (2) above, in which
(4)
The information processing apparatus according to (2) or (3) above, in which
(5)
The information processing apparatus according to any one of (1) to (4) above, in which
(6)
The information processing apparatus according to (5) above, in which
(7)
The information processing apparatus according to (5) or (6) above, in which
(8)
The information processing apparatus according to any one of (1) to (7) above, in which
(9)
A moving body, including:
(10)
The moving body according to (9) above, in which
(11)
A moving body, including:
(12)
The moving body according to (11) above, in which
(13)
An information processing system, including:
(14)
An information processing method for an information processing apparatus, including:
(15)
A program that causes an information processing apparatus to execute the steps of:
Number | Date | Country | Kind |
---|---|---|---|
2019-146147 | Aug 2019 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/024421 | 6/22/2020 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2021/024627 | 2/11/2021 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
20110304617 | Nishida | Dec 2011 | A1 |
20160299233 | Levien | Oct 2016 | A1 |
20160371864 | Gotoh | Dec 2016 | A1 |
20190094850 | Li | Mar 2019 | A1 |
20190212751 | Zhou | Jul 2019 | A1 |
20190265734 | Liu | Aug 2019 | A1 |
20200066142 | Fowe | Feb 2020 | A1 |
Number | Date | Country |
---|---|---|
6103672 | Mar 2017 | JP |
2018-173960 | Nov 2018 | JP |
2019-514236 | May 2019 | JP |
2019039099 | Feb 2019 | WO |
2019244626 | Dec 2019 | WO |
Entry |
---|
International Search Report and Written Opinion mailed on Sep. 1, 2020, received for PCT Application PCT/JP2020/024421, Filed on Jun. 22, 2020, 12 pages including English Translation. |
Temma et al., “Enhancing Drone Interface Using Spatially Coupled Two Perspectives”, Proceedings of Interaction, 2019, pp. 102-111. |
Number | Date | Country | Kind
---|---|---|---
20220205791 | Jun 2022 | US | A1