The present application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2016-124913, filed Jun. 23, 2016, the entire contents of which are incorporated herein by reference.
The present invention relates to a technique of imaging plural subjects or a single subject.
In recent years, flying devices of a so-called drone type, which are provided with four propulsion units using rotor blades driven by electric motors, have been widely used in various fields. In particular, a drone-type flying device with a digital camera fixed thereon is used for performing an imaging operation at a high place that is unreachable, or very dangerous, for a human. Such drone-type flying devices having the digital camera have come into wide use by making use of a timer function of the digital camera and a remote control operation.
When the drone-type flying device is used to image a subject, an operator of the drone-type flying device is required to control the drone-type flying device to move the digital camera fixed to the drone-type flying device to a position appropriate for imaging the subject.
A conventional technique for solving the above inconvenience is disclosed in Japanese Unexamined Patent Publication No. 2004-118087, which technique uses a photo balloon controlled by a signal sent from a cellular phone. Upon receiving from the cellular phone a signal requesting an imaging operation, the photo balloon moves in the air toward an area including a position indicated by position data contained in the received signal, and operates its image pickup unit to image the subject when it is determined that the photo balloon has reached the area.
According to one aspect of the invention, there is provided an image pickup apparatus which comprises a propulsion unit which is used for propulsion of the image pickup apparatus, an image pickup unit which images plural subjects or a single subject, and a controlling unit which controls the propulsion unit to move the image pickup apparatus such that at least two out of the plural subjects and the image pickup unit do not interfere in position with each other, or such that the single subject and the image pickup unit do not interfere in position with each other, and instructs the image pickup unit to image the plural subjects or the single subject.
According to another aspect of the invention, there is provided an image pickup method in an image pickup apparatus, the image pickup apparatus being provided with a controlling unit, a propulsion unit, and an image pickup unit for imaging plural subjects or a single subject, the image pickup method comprising steps of making the controlling unit control the propulsion unit to move the image pickup apparatus such that at least two out of the plural subjects and the image pickup unit do not interfere in position with each other, or such that the single subject and the image pickup unit do not interfere in position with each other, and making the controlling unit instruct the image pickup unit to image the plural subjects or the single subject.
According to still another aspect of the invention, there is provided a non-transitory computer-readable recording medium with a computer program stored thereon, the computer program prepared for making a computer control an image pickup apparatus, wherein the image pickup apparatus is provided with a propulsion unit for propulsion of the image pickup apparatus and an image pickup unit for imaging plural subjects or a single subject, and the computer program, when installed on the computer, makes the computer control the propulsion unit to move the image pickup apparatus such that at least two out of the plural subjects and the image pickup unit do not interfere in position with each other, or such that the single subject and the image pickup unit do not interfere in position with each other, and makes the computer instruct the image pickup unit to image the plural subjects or the single subject.
Additional objects and advantages of the invention will be set forth in the following description, and in part will be obvious from the description, or may be learned by practice of the invention. The objects and advantages of the invention may be realized and obtained by means of the instrumentalities and combinations particularly pointed out hereinafter.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the inventions, and together with the general description given above and the detailed description of the embodiments given below, serve to explain the principles of the invention.
The embodiments of the present invention are described in detail with reference to the accompanying drawings. An image pickup apparatus according to the embodiments of the invention is provided with a propulsion unit, an image pickup unit, and a controlling unit. The propulsion unit serves to propel the image pickup apparatus in the air. The image pickup unit serves to image plural subjects or a single subject. The image pickup apparatus according to the embodiments of the invention is a so-called flight apparatus of a drone type. The controlling unit controls the propulsion unit to move the flight apparatus such that at least two out of the plural subjects and the image pickup unit do not interfere in position with each other, or such that the single subject and the image pickup unit do not interfere in position with each other, thereby allowing the image pickup unit to image the plural subjects or the single subject.
As shown in
The main frame 101 contains a circuit box 105. The circuit box 105 stores a motor driver for driving the motors 104, a controller, and sensors. A camera (an image pickup unit) 106 is installed on a lower part of the main frame 101.
As shown in
The power sensor 206 supplies the motor drivers 205 with power while monitoring a voltage of a battery 207. A push button which can sense contact may be used in place of the touch sensor 204. Although not shown in
The controller 201 receives information representing a posture of the body of the flight apparatus 100 from the flight sensor 203 in real time. Further, the controller 201 sends each of the motor drivers (#1 to #4) 205 a pulse-width modulated power-instruction signal based on a duty ratio, while monitoring the voltage of the battery 207 through the power sensor 206. Receiving the power-instruction signals, the motor drivers (#1 to #4) 205 control rotational speeds of the motors (#1 to #4) 104, respectively.
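A duty-ratio power-instruction computation of the kind described above can be sketched as follows. This is a minimal illustration only: the function name, the nominal battery voltage, the 8-bit duty resolution, and the voltage-compensation scheme are all assumptions, not details taken from the disclosure.

```python
def pwm_command(power_fraction, battery_v, nominal_v=11.1, resolution=255):
    """Convert a desired power fraction (0.0-1.0) into a PWM duty value.

    As the monitored battery voltage sags, the duty ratio is scaled up so
    that the power actually delivered to the motor stays roughly constant.
    All parameter names and values here are illustrative assumptions.
    """
    compensated = power_fraction * (nominal_v / battery_v)
    clamped = min(max(compensated, 0.0), 1.0)   # keep the duty ratio valid
    return round(clamped * resolution)
```

In this sketch the power sensor's voltage reading feeds directly into the duty computation, which is one plausible reason the controller monitors the battery while issuing power-instruction signals.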
The controller 201 controls the camera system 202 to adjust an image pickup operation of the camera 106 (Refer to
The controller 201, the camera system 202, the flight sensor 203, the motor drivers (#1 to #4) 205, the power sensor 206 and the battery 207 shown in
The operation of the flight apparatus 100 having the configuration shown in
As shown in
The operation of the controller 201 in the first embodiment of the invention is specifically described with reference to the flow chart in
The controller 201 watches a voltage variation of the touch sensor 204 to repeatedly judge whether the flight apparatus 100 has left a hand of the user or whether the flight apparatus 100 has been thrown by the user (step S401 in
When it is determined that the flight apparatus 100 has left the hand of the user or the flight apparatus 100 has been thrown by the user (YES at step S401), the controller 201 controls the motor drivers (#1 to #4) 205 to drive the motors (#1 to #4) 104 and rotor blades 103, thereby making the flight apparatus 100 fly through the air to detect the line of the subject group as shown in
More specifically, the controller 201, for example, performs a face recognition process on image data obtained by the camera 102 and the camera system 202 to recognize the faces of the plural subjects. Then, the controller 201 confirms a state in which these recognized faces form a line, thereby detecting the line of the subject group. Further, once having confirmed a state in which the line of the subject group remains still, the controller 201, for example, repeatedly performs the face recognition process on the image data to continuously capture the faces of the plural subjects.
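The judgment that the recognized faces form a line can be sketched with a simple geometric test on the face-center coordinates: the vertical spread of the centers should be small relative to their horizontal extent. The function name, coordinate representation, and tolerance value are illustrative assumptions; a real implementation would work on face regions produced by an actual face recognition process.

```python
def faces_form_line(centers, tolerance=0.05):
    """Judge whether detected face centers (x, y) lie roughly on one
    horizontal line, as a simplified stand-in for the line-detection
    judgment. The tolerance threshold is an illustrative assumption.
    """
    if len(centers) < 2:
        return False                     # a single face cannot form a line
    xs = [c[0] for c in centers]
    ys = [c[1] for c in centers]
    width = max(xs) - min(xs)            # horizontal extent of the group
    spread = max(ys) - min(ys)           # vertical scatter of the faces
    return width > 0 and spread / width <= tolerance
```

Repeating such a test over successive frames, and requiring the result to stay stable, corresponds to confirming that the line of the subject group remains still.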
When the flight apparatus 100 flies through the air to detect the line of the subject group (step S402), the controller 201 judges whether the line of the subject group has satisfactorily been detected (step S403).
When it is determined that the line of the subject group has not been detected (NO at step S403), the controller 201 returns to the process of step S402 and tries to detect the line of the subject group, again.
When it is determined that the line of the subject group has been detected (YES at step S403), the controller 201 performs an image processing on the image data obtained by the camera 102 and the camera system 202, thereby controlling the motor drivers (#1 to #4) 205 to drive the motors (#1 to #4) 104 and rotor blades 103 to make the flight apparatus 100 fly through the air to the position in the direction perpendicular to the longitudinal direction of the line of the subject group, as shown in
After performing the flight controlling process at step S404, the controller 201 judges whether the camera 102 has been brought to the position in the direction perpendicular to the longitudinal direction of the line of the subject group (step S405).
When it is determined that the camera 102 has not been brought to the position (NO at step S405), the controller 201 returns to the process of step S404, and tries to perform the flight controlling process, again.
When it is determined that the camera 102 has been brought to the position (YES at step S405), the controller 201 controls the camera system 202 to instruct the camera 102 to image the subject group at the position (step S406). At this time, in accordance with a result of the face recognition process performed on the subjects by the controller 201, the camera system 202 performs an auto-focus controlling operation and an auto-exposure controlling operation of the camera 102. Then, the controller 201 finishes the image pickup operation controlling process in the first embodiment of the invention.
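The overall control flow of steps S401 to S406 can be sketched as a loop over injected callbacks, with each while-loop corresponding to one judgment-and-retry pair in the flow chart. All callback names are illustrative assumptions; the real judgments would come from the touch sensor, the face recognition process, and the flight control.

```python
def run_first_embodiment(released, line_detected, move_to_front, at_front, shoot):
    """Control flow of steps S401-S406 over injected callbacks (names are
    illustrative assumptions, not taken from the disclosure)."""
    while not released():        # S401: wait until the apparatus leaves the hand
        pass
    while not line_detected():   # S402-S403: fly and search for the line
        pass
    while not at_front():        # S404-S405: move toward the perpendicular position
        move_to_front()
    return shoot()               # S406: instruct the image pickup operation
```

Expressed this way, the "NO" branches of steps S403 and S405 are simply the loop conditions that send control back to the search and flight steps.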
With availability of the autonomous control of the flight apparatus 100 based on the operation of the controller 201 in the first embodiment described above, the user, simply by throwing the flight apparatus 100 into the air, can make the camera 102 of the flight apparatus 100 image the plural subjects belonging to the line of the subject group from a front and appropriate position of the subject group in an extremely simple manner.
As shown in
The operation of the controller 201 in the second embodiment of the invention is specifically described with reference to the flow chart of
The controller 201 repeatedly judges whether the flight apparatus 100 has left the hand of the user or whether the flight apparatus has been thrown by the user (step S601 in
When it is determined that the flight apparatus 100 has left the hand of the user or the flight apparatus 100 has been thrown by the user (YES at step S601), the controller 201 controls the motor drivers (#1 to #4) 205 to drive the motors (#1 to #4) 104 and rotor blades 103, thereby making the flight apparatus 100 fly through the air to detect the plural subjects or the single subject as shown in
More specifically, the controller 201, for example, performs the face recognition process on image data obtained by the camera 102 and the camera system 202 to recognize the faces of the plural subjects or the face of the single subject. Then, the controller 201 detects the plural subjects or the single subject. Further, similarly to the first embodiment, once having confirmed a state in which the plural subjects or the single subject remains still, the controller 201, for example, repeatedly performs the face recognition process on the image data to continuously capture the faces of the plural subjects or the face of the single subject.
When the flight apparatus 100 flies through the air to detect the plural subjects or the single subject (step S602), the controller 201 judges whether the plural subjects or the single subject has been detected satisfactorily (step S603).
When it is determined that the plural subjects or the single subject has not been detected (NO at step S603), the controller 201 returns to the process of step S602 and tries to detect the plural subjects or the single subject, again.
When it is determined that the plural subjects or the single subject has been detected (YES at step S603), the controller 201, for example, estimates an azimuth of the sun based on information output from the geomagnetic sensor of the flight sensor 203. Further, the controller 201 performs the image processing on information of the azimuth of the sun and the image data obtained by the camera 102 and the camera system 202 to detect the shadow of the flight apparatus 100 (step S604).
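The geometry behind the shadow detection at step S604 can be sketched as follows: given the sun's azimuth and elevation (the azimuth estimated, for example, from the geomagnetic sensor) and the apparatus's altitude, the shadow falls on the ground at a predictable offset from the point directly beneath the apparatus. The trigonometry is standard; the function name, the interface, and the convention of azimuth measured clockwise from north are assumptions for illustration.

```python
import math

def shadow_offset(altitude_m, sun_azimuth_deg, sun_elevation_deg):
    """Estimate the ground offset (east, north) of the apparatus's shadow
    from the point directly beneath it. Azimuth is assumed to be measured
    in degrees clockwise from north; names are illustrative assumptions.
    """
    # Horizontal distance of the shadow grows as the sun gets lower.
    distance = altitude_m / math.tan(math.radians(sun_elevation_deg))
    # The shadow is cast away from the sun (azimuth + 180 degrees).
    away = math.radians(sun_azimuth_deg + 180.0)
    return distance * math.sin(away), distance * math.cos(away)
```

Combining such a predicted offset with the image processing described above narrows down where in the image data the shadow of the flight apparatus 100 should be searched for.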
The controller 201 judges whether the detected shadow of the flight apparatus 100 overlaps with the plural subjects or the single subject detected at steps S602 and S603 (step S605).
When it is determined that the shadow of the flight apparatus 100 overlaps with the plural subjects or the single subject (YES at step S605), then the controller 201, while keeping capturing the plural subjects or the single subject, controls the motor drivers (#1 to #4) 205 to drive the motors (#1 to #4) 104 and rotor blades 103, thereby making the flight apparatus 100 fly through the air, for example, for a fixed distance in a random direction (step S606).
Thereafter, the controller 201 returns to the process of step S604, and detects the shadow of the flight apparatus 100, again and judges whether the detected shadow of the flight apparatus 100 overlaps with the plural subjects or the single subject. The controller 201 repeatedly performs these operations.
When it is determined that the detected shadow of the flight apparatus 100 does not overlap with the plural subjects or the single subject (NO at step S605), similarly to the process at step S406 of
With availability of the autonomous control of the flight apparatus 100 based on the operation of the controller 201 in the second embodiment described above, the user, simply by throwing the flight apparatus 100 into the air, can make the camera 102 of the flight apparatus 100 image the plural subjects belonging to the subject group or the single subject in a simple manner, while avoiding an influence of the shadow of the flight apparatus 100.
A modification may be made to the disclosed second embodiment, in which the controller 201 of the flight apparatus 100 moves the flight apparatus 100 to a position, at which the shadow of the flight apparatus 100 does not overlap with at least one subject belonging to the plural subjects or the single subject and back-light imaging does not occur, and then makes the camera 102 and the camera system 202 perform the image pickup operation at the position.
In the third embodiment of the invention, the controller 201 of the flight apparatus 100 moves the flight apparatus 100 to a position, at which another subject of the plural subjects does not overlap with a face of at least one of the plural subjects, as shown in
The operation of the controller 201 in the third embodiment of the invention is specifically described with reference to the flow chart of
When it is determined that the flight apparatus 100 has left the hand of the user or the flight apparatus 100 has been thrown by the user (YES at step S801), the controller 201 controls the motor drivers (#1 to #4) 205 to drive the motors (#1 to #4) 104 and rotor blades 103, thereby making the flight apparatus 100 fly through the air to detect the subject group as shown in
More specifically, similarly to the process at step S602 of
When the flight apparatus 100 flies through the air to detect the subject group (step S802), the controller 201 judges whether the subject group has been detected satisfactorily (step S803).
When it is determined that the subject group has not been detected (NO at step S803), the controller 201 returns to the process of step S802 and tries to detect the subject group, again.
When it is determined that the subject group has been detected (YES at step S803), the controller 201, for example, detects an outline of the face of the subject recognized in the face recognition process which has been performed at step S802 on the image data obtained by the camera 102 and the camera system 202. Then, the controller 201 judges whether any subject whose outline of the face is deficient is included in the subject group, thereby determining whether plural subjects in the subject group overlap each other (step S804).
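A simple stand-in for the overlap judgment at step S804 is to test whether any two detected face regions intersect: a face whose region is intruded upon by another is likely to have a deficient outline. The bounding-box representation and the function name are illustrative assumptions; the disclosure itself judges deficiency of the detected face outlines.

```python
def faces_overlap(boxes):
    """Judge whether any two face regions overlap. Each box is given as
    (left, top, right, bottom) in image coordinates; this representation
    is an assumption made for illustration.
    """
    for i in range(len(boxes)):
        for j in range(i + 1, len(boxes)):
            al, at, ar, ab = boxes[i]
            bl, bt, br, bb = boxes[j]
            # Two axis-aligned boxes intersect when they overlap on both axes.
            if al < br and bl < ar and at < bb and bt < ab:
                return True
    return False
```

When such a test returns True, the apparatus moves (step S805, analogous to step S606) and re-checks, which matches the retry loop in the flow chart.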
When it is determined that the plural subjects in the subject group overlap each other (YES at step S804), similarly to the process at step S606 in
Thereafter, the controller 201 returns to the process of step S804, and judges whether the plural subjects in the subject group overlap each other, again.
When it is determined that the plural subjects in the subject group do not overlap each other (NO at step S804), similarly to the process at step S406 of
With availability of the autonomous control of the flight apparatus 100 based on the operation of the controller 201 in the third embodiment of the invention described above, the user, simply by throwing the flight apparatus 100 into the air, can make the camera 102 of the flight apparatus 100 image the plural subjects belonging to the line of the subject group without any faces of the plural subjects overlapping each other.
In the fourth embodiment of the invention, the controller 201 of the flight apparatus 100 moves the flight apparatus 100 to a position, at which another subject belonging to the another subject group does not overlap with at least one subject belonging to the target subject group or a single subject, or a position, at which another plural subjects belonging to the another subject group do not overlap with the at least one subject belonging to the target subject group or the single subject as shown in
The operation of the controller 201 in the fourth embodiment of the invention is specifically described with reference to a flow chart in
When it is determined that the flight apparatus 100 has left the hand of the user or the flight apparatus 100 has been thrown by the user (YES at step S1001), the controller 201 controls the motor drivers (#1 to #4) 205 to drive the motors (#1 to #4) 104 and rotor blades 103 to make the flight apparatus 100 fly through the air to detect the target subject group as shown in
More specifically, similarly to the process at step S602 in
When the flight apparatus 100 flies through the air to detect the target subject group (step S1002), the controller 201 judges whether the target subject group has been detected satisfactorily (step S1003).
When it is determined that the target subject group has not been detected (NO at step S1003), the controller 201 returns to the process of step S1002 and tries to detect the target subject group, again.
When it is determined that the target subject group has been detected (YES at step S1003), the controller 201 controls the motor drivers (#1 to #4) 205 to drive the motors (#1 to #4) 104 and rotor blades 103, thereby making the flight apparatus 100 hover at the current position. Then, the controller 201 performs the face recognition process on the image data output from the camera system 202 to detect the another subject not belonging to the target subject group or the another plural subjects not belonging to the target subject group (step S1004).
The controller 201 judges whether the another subject or the another plural subjects not belonging to the target subject group, detected at step S1004, overlap with the target subject group or the single subject detected at steps S1002 and S1003 (step S1005).
When it is determined YES at step S1005, similarly to the process at step S606 in
Thereafter, the controller 201 returns to the process of step S1004, and detects the another subject not belonging to the target subject group or the another plural subjects not belonging to the target subject group, again, and then performs the overlapping judgment of step S1005, again.
When it is determined that the subject groups or the subjects do not overlap each other (NO at step S1005), similarly to the process at step S406 in
With availability of the autonomous control of the flight apparatus 100 based on the operation of the controller 201 in the fourth embodiment of the invention described above, the user, simply by throwing the flight apparatus 100 into the air, can make the camera 102 of the flight apparatus 100 image, in a simple manner, the plural subjects belonging to the line of the target subject group or the single subject without their being overlapped by the another subject or the another plural subjects not belonging to the target subject group.
The operation of the controller 201 in the fifth embodiment of the invention is specifically described with reference to the flow chart in
When it is determined that the flight apparatus 100 has left the hand of the user or the flight apparatus 100 has been thrown by the user (YES at step S1101), the controller 201 controls the motor drivers (#1 to #4) 205 to drive the motors (#1 to #4) 104 and rotor blades 103, thereby making the flight apparatus 100 fly through the air to detect the plural subjects or the single subject as shown in
More specifically, similarly to the process at step S602 in
When the flight apparatus 100 flies through the air to detect the plural subjects or the single subject (step S1102), the controller 201 judges whether the plural subjects or the single subject has been detected satisfactorily (step S1103).
When it is determined that the plural subjects or the single subject has not been detected (NO at step S1103), the controller 201 returns to the process of step S1102 and tries to detect the plural subjects or the single subject, again.
When it is determined that the plural subjects or the single subject has been detected (YES at step S1103), the controller 201 controls the motor drivers (#1 to #4) 205 to drive the motors (#1 to #4) 104 and rotor blades 103, thereby making the flight apparatus 100 hover at the current position. Then, the controller 201 detects the light emitted from another flight apparatus, while hovering at the current position (step S1104). More particularly, the controller 201 detects, as the light to be detected, a light whose position in the air changes over time.
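The detection of a light that moves over time (step S1104) can be sketched by comparing the positions of bright spots between consecutive frames. The frame representation (a mapping from pixel coordinates to brightness), the threshold, and the function name are all illustrative assumptions; an actual implementation would operate on the image data from the camera system 202.

```python
def detect_moving_light(frames, brightness_threshold=200):
    """Detect a bright spot whose position changes between consecutive
    frames, i.e. a light moving in the air over time. The frame format
    and threshold are illustrative assumptions.
    """
    def bright_spots(frame):
        return {p for p, v in frame.items() if v >= brightness_threshold}

    previous = None
    for frame in frames:
        spots = bright_spots(frame)
        # A nonempty set of bright spots at new positions means movement.
        if previous is not None and spots and spots != previous:
            return True
        previous = spots
    return False
```

A stationary bright spot (for example, a reflection) would not trigger this test, which is consistent with restricting detection to the light of another flight apparatus moving through the air.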
The controller 201 judges whether the light detected at step S1104 overlaps with the plural subjects or the single subject detected at steps S1102 and S1103 (step S1105).
When it is determined that the light overlaps with the plural subjects or the single subject (YES at step S1105), similarly to the process at step S606 in
Thereafter, the controller 201 returns to the process of step S1104, and detects the light and judges at step S1105 whether the light overlaps with the plural subjects or the single subject, again.
When it is determined that the light does not overlap with the plural subjects or the single subject (NO at step S1105), similarly to the process at step S406 in
With availability of the autonomous control of the flight apparatus 100 based on the operation of the controller 201 in the fifth embodiment of the invention described above, the user, simply by throwing the flight apparatus 100 into the air, can make the camera 102 of the flight apparatus 100 image the plural subjects belonging to the line of the subject group or the single subject in a simple manner, while avoiding an influence of the light of the another flight apparatus.
In the sixth embodiment of the invention, when the another drone-type flight apparatus is flashing the light, the flight apparatus 100 sends a request to the another flight apparatus, asking to stop flashing light for a predetermined period, and instructs the camera 102 and the camera system 202 to perform the image pickup operation.
In the flow chart of
The process at step S1201 in the flow chart of
As a result, having confirmed that the another flight apparatus has turned off the light and the light does not overlap with the plural subjects or the single subject (NO at step S1105), the controller 201 instructs the camera 102 and the camera system 202 to perform the image pickup operation.
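The sixth-embodiment sequence — request the other apparatus to stop flashing, confirm that its light no longer overlaps the subjects, then image — can be sketched as follows. The callback names, the hold duration, and the timeout are illustrative assumptions; the disclosure does not specify the communication mechanism between the two apparatuses.

```python
import time

def image_with_flash_suppression(send_stop_request, light_overlaps, shoot,
                                 hold_seconds=5.0, timeout=10.0):
    """Ask another flight apparatus to stop flashing for a fixed period,
    wait until its light no longer overlaps the subjects, then image.
    All names and durations are illustrative assumptions.
    """
    send_stop_request(hold_seconds)        # step S1201: request to stop flashing
    deadline = time.monotonic() + timeout
    while light_overlaps():                # re-check, as at step S1105
        if time.monotonic() > deadline:
            return None                    # give up rather than wait forever
        time.sleep(0.01)
    return shoot()                         # perform the image pickup operation
```

The timeout is a defensive addition not described in the disclosure: if the other apparatus never honors the request, the controller abandons the attempt instead of hovering indefinitely.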
With availability of the autonomous control of the flight apparatus 100 based on the operation of the controller 201 in the sixth embodiment of the invention described above, the user, simply by throwing the flight apparatus 100 into the air, can make the camera 102 of the flight apparatus 100 image the plural subjects belonging to the line of the subject group or the single subject while avoiding an influence of the light of the another flight apparatus.
In the respective embodiments of the invention, the plural subjects or the single subject can be imaged in a simple manner as described above.
In the above description, the invention has been described taking as an example the flight apparatus of the drone type, having the propulsion unit for flying through the air, but the invention is not restricted to the flight apparatus. The present invention, for example, can be applied to a robot having a propulsion member for proceeding on the ground.
Further, in the above respective embodiments of the invention, the simple face recognition process is performed on the image data to capture the plural subjects belonging to the subject group or the single subject, but it is also possible to recognize smiling faces to capture the plural subjects belonging to the subject group or the single subject.
In the above described embodiments of the invention, it is possible for the user to use a switch to select the process to be performed.
In the above described embodiments of the invention, the flight apparatus 100 is allowed to perform the imaging operation, thereby obtaining any number of still images. The flight apparatus 100 can obtain not only still images but also moving images of an arbitrary length.
In the above described embodiments of the invention, the flight apparatus 100 has the propulsion unit including the motors 104 and rotor blades 103. But a propulsion unit using compressed air and/or an engine can be used in place of the propulsion unit including the motors 104 and rotor blades 103.
In the above described embodiments of the invention, a relationship between the plural subjects and the operator of the flight apparatus 100 or a relationship between the single subject and the operator of the flight apparatus 100 has not been explained. It is possible that the operator is included in the plural subjects or the operator is the single subject.
In the above described embodiments of the invention, the controlling unit controls the propulsion unit to move the flight apparatus such that the at least two out of the plural subjects and the image pickup unit do not interfere in position with each other, or such that the single subject and the image pickup unit do not interfere in position with each other, thereby allowing the image pickup unit to image the plural subjects or the single subject. But an image pickup timing is not restricted to the previously described timing.
Although specific embodiments of the invention have been described in the foregoing detailed description, it will be understood that the invention is not limited to the particular embodiments described herein, but modifications and rearrangements may be made to the disclosed embodiments while remaining within the scope of the invention as defined by the following claims. It is intended to include all such modifications and rearrangements in the following claims and their equivalents.