This application is based upon and claims the benefit of priority under 35 USC 119 of Japanese Patent Application No. 2016-076961 filed on Apr. 7, 2016, the entire disclosure of which, including the description, claims, drawings, and abstract, is incorporated herein by reference in its entirety.
The present invention relates to a technology for controlling movement of a moving device such that the moving device moves from afar back to a user.
Flying cameras for performing imaging from the air, such as so-called drones, are becoming widespread. Also, technologies for using an industrial product or a security product to track an object and image the object have been known.
As a prior art of flying cameras having a tracking function, the following technology has been known (for example, a technology disclosed in JP-A-2014-53821). A flying unit having a camera identifies a mark given to a worker by image recognition or the like, flies following the mark when the worker moves, and acquires images of areas around the worker. The images acquired by the flying unit are transmitted to a base apparatus, and are relayed from the base apparatus to a monitoring center in real time. Therefore, the monitoring center can recognize the movement and work contents of the worker from the images. According to this configuration, even in cases where security objects move or where it is difficult to install permanent monitoring cameras, it is possible to flexibly acquire images of objects.
As a prior art of flying cameras having a tracking function, the following technology has also been known (for example, a technology disclosed in JP-A-2015-48025). A defense device 1 includes an umbrella unit for covering a moving object 25 from above, thereby defending the moving object against predetermined disturbances, an aerial movement mechanism for moving the umbrella unit in the air, an aerial movement drive unit therefor, an activation information receiving unit, an imaging unit, and a control unit. If the activation information receiving unit receives activation information, the control unit activates the defense device and controls the imaging unit such that the imaging unit starts imaging. After the activation, the control unit recognizes the moving object based on images acquired by the imaging unit. Also, if the control unit detects movement of the moving object based on the images acquired by the imaging unit, it controls the aerial movement drive unit such that movement of the defense device in the air follows the movement of the moving object recognized by a recognizing means, whereby the umbrella unit defends the moving object against the predetermined disturbances even when the moving object moves.
According to one aspect of the invention, a moving device for moving along a terminal device includes a first control unit and a second control unit. The first control unit is configured to move the moving device from a position far from the terminal device to a vicinity of the terminal device based on a current position of the terminal device. The second control unit is configured to recognize the terminal device or a user of the terminal device in the vicinity of the current position of the terminal device.
According to another aspect of the invention, in a moving system, a moving device moves by communication with a terminal device. The terminal device transmits a current position of the terminal device to the moving device. The moving device includes a first processing unit which is configured to receive information on the current position of the terminal device and to move toward the current position of the terminal device. The moving device further includes a second processing unit which is configured to identify the terminal device or a user of the terminal device when the moving device is close to the current position of the terminal device based on the received information.
According to still another aspect of the invention, a terminal device performs communication with a moving device and controls the moving device such that the moving device flies to a position above an object and performs imaging on the object. The terminal device includes a terminal-side position detecting unit and a terminal-side control unit. The terminal-side position detecting unit is configured to detect a position of the terminal device. The terminal-side control unit is configured to detect a current position of the terminal device by the terminal-side position detecting unit based on a call instruction of a user, and is configured to perform a current-position-information transmitting process of transmitting information on the current position to the moving device.
According to still another aspect of the invention, a method of controlling a moving device includes: receiving information on a current position of a terminal device and electric waves of a positioning system, and controlling a position of the moving device, until the moving device reaches a vicinity of the terminal device; and after the moving device reaches the vicinity of the terminal device, identifying the terminal device or a user of the terminal device.
Hereinafter, a mode for carrying out the present invention will be described in detail with reference to the accompanying drawings.
In the flying camera device 100, four motor frames 102 (supporting units) are attached to a main frame 101. The motor frames 102 are configured to be capable of supporting motors 104, and rotor blades 103 are fixed to motor shafts of the motors 104. The rotor blades 103 have a predetermined attack angle such that their rotation causes lift. The four pairs of the motors 104 and the rotor blades 103 constitute propelling units.
On the lower portion of the main frame 101, a camera 105 is attached as a camera unit. This camera 105 is, for example, a hemispherical camera, and can simultaneously or sequentially acquire images of areas below the flying camera device 100 in the range of 360 degrees. Around the camera 105, landing legs 107 are installed. The main frame 101 contains a control box 106 which contains various control devices to be described below with reference to
As shown by a reference symbol “122” in
The visible-light flickering object 111 (a visible-light emitting unit) is a device capable of driving, for example, a light emitting diode (LED) so as to flicker, thereby emitting flickering visible light. The visible-light flickering object 111 may be installed at a portion of the wearable device 110, or may be assembled on a bracelet, a brooch, or a pendant, independently from the wearable device 110.
When the flying camera device 100 is close to the current position of the wearable device 110, it can catch the flickering light emitted from the visible-light flickering object 111 close thereto, by the camera 105, as shown by a reference symbol “124” in
In this state, if the parent performs a call instruction operation on the wearable device 110, in response to the call instruction, the wearable device 110 detects the current position of the wearable device by a GPS sensor, and performs a current-position-information transmitting process of transmitting information on the current position to the flying camera device 100 (a reference symbol “201” of
If the flying camera device 100 receives the information on the current position of the wearable device 110 from the wearable device, it performs a process of controlling the propelling units including the motors 104 and the rotor blades 103 while sequentially comparing the current position of the wearable device based on the received information with current positions of the flying camera device sequentially detected by the GPS sensor such that the flying camera device flies toward the current position of the wearable device 110 (a reference symbol “202” of
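The compare-and-propel control described above can be condensed into a small sketch, in which the device repeatedly compares its own GPS fix with the reported position of the wearable device and advances toward it. The helper names, the simulated 1 m step, and the 2 m arrival threshold are illustrative assumptions, not details from the specification.

```python
# Sketch of the fly-toward-target control: repeatedly compare the wearable's
# reported position with the device's own GPS fix and advance toward it.
# The 1 m step and 2 m arrival threshold are illustrative assumptions.
import math

LAT_M = 111_320.0  # approximate metres per degree of latitude

def step_toward(current, target, step_m=1.0):
    """Advance `current` (lat, lon) one step of `step_m` metres toward `target`.

    Returns the new position and the remaining distance in metres."""
    # Convert the lat/lon difference into an approximate local offset in metres.
    dlat_m = (target[0] - current[0]) * LAT_M
    dlon_m = (target[1] - current[1]) * LAT_M * math.cos(math.radians(current[0]))
    dist = math.hypot(dlat_m, dlon_m)
    if dist <= step_m:                      # close enough: snap to the target
        return target, 0.0
    new_lat = current[0] + (dlat_m / dist) * step_m / LAT_M
    new_lon = current[1] + (dlon_m / dist) * step_m / (LAT_M * math.cos(math.radians(current[0])))
    return (new_lat, new_lon), dist - step_m

def fly_to(current, target, arrive_m=2.0):
    """Repeat the GPS compare-and-advance until within `arrive_m` of `target`."""
    remaining = float("inf")
    while remaining > arrive_m:
        current, remaining = step_toward(current, target)
    return current
```

In the actual device, the advance step would be realized by commanding the propelling units (the motors 104 and rotor blades 103) rather than by updating coordinates directly.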
If the flying camera device 100 gets close to the current position of the wearable device 110 based on the received information while searching for, for example, flickering visible light emitted by the visible-light flickering object 111 of the child close to the wearable device 110 of the parent (reference symbols "203" and "204" of
In this case, during flight start, the flying camera device 100 performs a process of acquiring the position of the flying camera device from the GPS sensor and storing the acquired position as a flight start position. Also, if imaging finishes, the flying camera device 100 performs a returning process of flying back to the flight start position (a reference symbol “206” of
As described above, in the present embodiment, in response to the call instruction from the wearable device 110 operated by the parent, the flying camera device 100 can perform a series of automatic imaging operations in which it flies to an area above the child, automatically recognizes and images the child, and returns to its original position when the imaging finishes. As another usage scene of the present embodiment, for example, when a user having the wearable device 110 including the visible-light flickering object 111 is surfing in the sea, the user can call the flying camera device 100 disposed at a coast, with the wearable device 110, such that the flying camera device performs a series of automatic imaging operations in which it flies to an area above the user, automatically images the surfing user, and returns to the coast when the imaging finishes. As a further usage scene of the present embodiment, for example, when a user having the wearable device 110 including the visible-light flickering object 111 is fishing and a fish is caught, the user can call the flying camera device 100 disposed at a rocky area, with the wearable device 110, such that the flying camera device performs a series of automatic imaging operations in which it flies to an area above the user, automatically images the fishing user, and returns to the rocky area when the imaging finishes. In these cases, the wearable device 110 can inform the flying camera device 100 of the current position of the wearable device, for example, based on a mobile telephone communication standard, and the flying camera device 100 can fly to an area over the wearable device 110, for example, based on the GPS or the above-described beacon. Therefore, even if the flying camera device 100 is disposed at first in a place or an environment where it cannot specify a user by imaging of the camera 105, it can specify and follow the user from there.
The controller 301, the camera system 302, the flight sensor 303, the motor drivers 304, the communication control unit 305, the power sensor 306, and the battery 307 shown in
The wearable device 110 is also connected to the visible-light flickering object 111. As described above, the visible-light flickering object 111 may be configured integrally with or separately from the wearable device 110. In a case where the wearable device 110 and the visible-light flickering object 111 are separated from each other, they may be wirelessly connected, for example, based on Bluetooth, which is a wireless communication standard.
First, the CPU 401 determines whether an operation button of the operation unit 405 (
If the determination result of STEP S501 becomes “YES”, in STEP S502, the CPU 401 controls the GPS sensor included in the sensor unit 403 such that the GPS sensor acquires the current position.
In STEP S503, the CPU 401 determines whether the GPS sensor has acquired the current position.
If the determination result of STEP S503 is “YES”, in STEP S505, the CPU 401 transmits information on the current position acquired in STEP S502, to the flying camera device 100 through the communication control unit 406.
Subsequently, the CPU 401 repeatedly performs the series of the processes of STEPS S502, S503, and S505 described above while the determination result of STEP S506 remains "NO", that is, until it receives a searching start notification from the flying camera device 100 through the communication control unit 406.
If the CPU 401 receives a searching start notification from the flying camera device 100 through the communication control unit 406, whereby the determination result of STEP S506 becomes “YES”, in STEP S507, the CPU controls the visible-light flickering object 111 such that the visible-light flickering object starts flickering.
Subsequently, in STEP S508, the CPU 401 displays a message urging the user to turn the visible-light flickering object 111 to the flying camera device 100 flying toward the user, on the display of the touch panel display 404.
Subsequently, in STEP S509, the CPU 401 displays an imaging mode menu for allowing the user to designate an imaging mode such as a still image shooting mode, a video shooting mode, a time-lapse imaging mode, or the like.
Subsequently, in STEP S510, the CPU 401 determines whether the user has designated any one imaging mode on the touch panel display 404.
If the determination result of STEP S510 becomes “YES”, in STEP S511, the CPU 401 transmits information on the imaging mode designated by the user, to the flying camera device 100 through the communication control unit 406.
If the determination result of STEP S510 becomes “NO”, the CPU 401 skips the process of STEP S511.
Thereafter, in STEP S512, the CPU 401 determines whether the user has operated an operation button of the operation unit 405 for instructing finish of imaging.
If the determination result of STEP S512 becomes “NO”, the CPU 401 returns to the determining process of STEP S510, and repeatedly performs the series of the processes of STEPS S510 to S512.
If the determination result of STEP S512 becomes “YES”, the CPU 401 transmits the imaging finish instruction to the flying camera device 100 through the communication control unit 406. Thereafter, the CPU 401 finishes the control process shown by the flow chart of
If the GPS sensor has failed to acquire the current position in the process of STEP S502, so that the determination result of STEP S503 becomes "NO", the flying camera device 100 cannot grasp the current position of the wearable device 110. Therefore, in STEP S504, the CPU 401 transmits a finish instruction to the flying camera device 100 through the communication control unit 406. Thereafter, the CPU 401 returns to the process of STEP S501.
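The wearable-side flow of STEPS S501 through S512 can be summarized by a small sketch of which messages one pass of the loop would transmit; the function signature and the message names are hypothetical, chosen only to mirror the STEP numbers above.

```python
# Hypothetical sketch of one pass of the wearable-side flow (STEPS S501-S512).
# Message names and arguments are illustrative, not from the specification.
def wearable_cycle(gps_fix, search_started, imaging_mode=None, finish=False):
    """Return the messages the wearable device 110 would send in one pass."""
    msgs = []
    if gps_fix is None:                              # S503 "NO": no position
        msgs.append(("finish_instruction", None))    # S504
        return msgs
    if not search_started:
        msgs.append(("current_position", gps_fix))   # S505, repeated until S506
        return msgs
    # S506 "YES": the camera has begun searching; the LED flickers (S507),
    # the user is prompted (S508) and may pick an imaging mode (S509-S511).
    if imaging_mode is not None:
        msgs.append(("imaging_mode", imaging_mode))  # S511
    if finish:
        msgs.append(("imaging_finish", None))        # S512 "YES"
    return msgs
```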
First, in STEP S601 of
Subsequently, in STEP S602 of
If the determination result of STEP S602 becomes “NO”, since flight is impossible, the controller 301 immediately finishes the control process shown by the flow charts of
If the determination result of STEP S602 becomes “YES”, in STEP S603 of
Subsequently, in STEP S604 of
If the determination result of STEP S604 becomes “NO”, in STEP S605 of
If the determination result of STEP S605 also becomes “NO”, the controller 301 returns to STEP S604.
In a case where the determination result of STEP S605 becomes “YES”, since the controller does not know the position of the wearable device 110, flight is impossible. Therefore, the controller 301 immediately finishes the control process shown by the flow charts of
If the determination result of STEP S604 becomes “YES”, in STEP S606 of
During the flight, the controller 301 acquires the current position by the GPS sensor included in the flight sensor 303 (STEP S607 of
Then, in STEP S608 of
If the determination result of STEP S608 becomes “NO”, since the flying camera device cannot fly any more, in STEP S609 of
If the determination result of STEP S608 becomes “YES”, in STEP S610 of
If the determination result of STEP S610 becomes “NO”, the controller 301 proceeds to the process of STEP S606 such that the flying camera device keeps flying.
If the determination result of STEP S610 becomes “YES”, the controller 301 proceeds to the process of STEP S611 of
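The arrival determination of STEP S610 amounts to comparing two GPS fixes against a distance threshold. One way to compute that distance is the haversine great-circle formula, sketched below; the 3 m arrival threshold is an illustrative assumption, as the specification does not give a figure.

```python
# One possible arrival test for STEP S610: haversine distance between the
# device's GPS fix and the wearable's reported position, against a threshold.
# The 3 m threshold is an illustrative assumption.
import math

def haversine_m(p, q, radius_m=6_371_000.0):
    """Great-circle distance in metres between (lat, lon) points p and q."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * radius_m * math.asin(math.sqrt(a))

def has_arrived(device_pos, wearable_pos, threshold_m=3.0):
    """True if the flying camera device has reached the wearable's vicinity."""
    return haversine_m(device_pos, wearable_pos) <= threshold_m
```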
Subsequently, in STEP S612 of
Thereafter, in STEP S613 of
Then, in STEP S614 of
If the determination result of STEP S614 becomes “NO”, in STEP S615 of
If the determination result of STEP S615 becomes “NO”, the controller 301 returns to the process of STEP S613 in which the controller keeps searching for the visible-light flickering object 111.
If the determination result of STEP S615 becomes “YES”, the controller 301 performs the series of the processes of STEPS S627 to S630 of
When the processes of STEPS S613 to S615 are repeatedly performed, if the visible-light flickering object 111 is found, whereby the determination result of STEP S614 becomes “YES”, in STEP S616 of
Subsequently, in STEP S617 of
Subsequently, in STEP S618 of
Thereafter, in STEP S619, the controller 301 determines whether the face of the user has been recognized.
If the determination result of STEP S619 becomes "NO", in STEP S620, the controller 301 causes the flying camera device to slightly gain altitude by controlling the first to fourth motor drivers 304 based on the output of the flight sensor 303 of
Thereafter, the controller 301 returns to the process of STEP S613 in which it searches for the visible-light flickering object 111 again.
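A minimal sketch of how the search of STEPS S613 and S614 might distinguish the flickering LED from a steady light source: track the per-frame brightness of a candidate bright spot and count on/off transitions. The brightness threshold and minimum transition count are illustrative assumptions, not values from the specification.

```python
# Sketch of a flicker test for a candidate bright spot (STEPS S613-S614):
# a flickering LED toggles across a brightness threshold frame after frame,
# while a steady light does not. Threshold and count are illustrative.
def is_flickering(brightness, threshold=128, min_transitions=4):
    """True if the per-frame brightness toggles across `threshold` often."""
    states = [b >= threshold for b in brightness]
    transitions = sum(1 for a, b in zip(states, states[1:]) if a != b)
    return transitions >= min_transitions

blinking = [200, 30, 210, 25, 205, 20, 215, 35]   # LED toggling each frame
steady   = [200, 205, 198, 202, 207, 201, 199, 204]
```

In practice, the brightness samples would come from a small image region tracked across frames captured by the camera 105.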
If the face recognition succeeds, whereby the determination result of STEP S619 becomes “YES”, in STEP S621 of
If the determination result of STEP S621 becomes “YES”, in STEP S622 of
If the determination result of STEP S621 becomes “NO”, in STEP S623 of
Subsequently, in STEP S624 of
If the determination result of STEP S624 becomes “NO”, in STEP S625 of
If the determination result of STEP S625 also becomes “NO”, the controller 301 returns to the process of STEP S621 in which it controls the camera system 302 such that the camera system keeps imaging by the camera 105.
In a case where the imaging finishes, or finish of the imaging is instructed, whereby the determination result of STEP S624 or STEP S625 becomes “YES”, in STEP S626 of
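The imaging repetition of STEPS S621 through S625 can be sketched as a loop that captures frames until either the planned amount of imaging completes or a finish instruction arrives from the wearable device. The frame-count model and the per-pass finish flag are illustrative simplifications, not from the specification.

```python
# Sketch of the imaging loop (STEPS S621-S625): keep capturing until the
# planned imaging finishes or the wearable sends a finish instruction.
# The frame-count model and per-pass finish flags are illustrative.
def imaging_loop(planned_frames, finish_signals):
    """Return the number of frames captured before imaging stopped."""
    frames = 0
    for finish_requested in finish_signals:   # one flag per loop pass (S625)
        if frames >= planned_frames:          # S624 "YES": imaging finished
            break
        if finish_requested:                  # S625 "YES": user stopped it
            break
        frames += 1                           # S621/S623: capture one frame
    return frames
```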
Thereafter, in STEP S627 of
Subsequently, in STEP S628 of
If the determination result of STEP S628 becomes “NO”, since the flying camera device cannot fly any more, in STEP S629 of
If the determination result of STEP S628 becomes “YES”, in STEP S630, the controller 301 performs a returning process.
First, in STEP S801, the controller 301 controls the first to fourth motor drivers 304 such that the flying camera device starts to fly toward a returning point corresponding to the flight start position stored in STEP S603 of
During the flight, the controller 301 acquires the current position by the GPS sensor included in the flight sensor 303 (STEP S802).
Subsequently, in STEP S803, the controller 301 determines whether the GPS sensor has acquired the current position.
If the determination result of STEP S803 becomes "NO", since the flying camera device cannot fly any more, in STEP S804, the controller 301 controls the first to fourth motor drivers 304 such that the flying camera device lands at its current location. Thereafter, the controller 301 finishes the process of STEP S630 of
If the determination result of STEP S803 becomes “YES”, in STEP S805, the controller 301 determines whether the flying camera device has reached the returning point, by comparing the current position of the flying camera device acquired in STEP S802 with the flight start position stored in STEP S603 of
If the determination result of STEP S805 becomes “NO”, the controller 301 proceeds to the process of STEP S801 in which it controls such that the flying camera device keeps the returning flight.
If the determination result of STEP S805 becomes “YES”, in STEP S806, the controller 301 controls the first to fourth motor drivers 304 such that the flying camera device lands at the returning point. Thereafter, the controller 301 finishes the process of STEP S630 of
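The returning process of STEPS S801 through S806 can be sketched as a loop that consumes GPS fixes while flying toward the stored flight start position, landing at the returning point on arrival or landing in place the moment a fix fails. The function names, the flat-earth distance approximation, and the 2 m arrival threshold are illustrative assumptions.

```python
# Sketch of the returning process (STEPS S801-S806): fly toward the stored
# flight start position while GPS fixes keep arriving; land in place if a
# fix fails. Names, distance approximation, and threshold are illustrative.
import math

def dist_m(p, q, lat_m=111_320.0):
    """Approximate distance in metres between two nearby (lat, lon) points."""
    dlat = (q[0] - p[0]) * lat_m
    dlon = (q[1] - p[1]) * lat_m * math.cos(math.radians(p[0]))
    return math.hypot(dlat, dlon)

def returning_process(gps_fixes, home, arrive_m=2.0):
    """Consume GPS fixes until home is reached (S805) or a fix fails (S803)."""
    last = None
    for fix in gps_fixes:
        if fix is None:                        # S803 "NO": GPS failed
            return ("landed_in_place", last)   # S804: land where we are
        last = fix
        if dist_m(fix, home) <= arrive_m:      # S805 "YES": at returning point
            return ("landed_at_home", fix)     # S806: land there
        # S805 "NO": keep the returning flight (back to S801)
    return ("landed_in_place", last)
```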
Even in a case where the visible-light flickering object 111 has not been found for the predetermined time, whereby the determination result of STEP S615 of
According to the above-described embodiment, even if the flying camera device 100 is disposed at first in a place or an environment where it cannot specify the user by imaging of the camera 105, it can specify and follow the user from there, and can automatically return to the flight start position when imaging finishes.
However, after imaging finishes, the flying camera device may land at its current location, or hover in the air, based on a notification transmitted from the wearable device 110 in response to a user's instruction, without returning to the flight start position.
Also, although the flying camera device has been described as an example of a moving device in the above-described embodiment, flight is not essential for movement, and the moving device may move on the ground or on water. The moving device may have a plurality of camera units. Also, the camera unit is not essential.
Although the preferred embodiment and modifications of the present invention have been described, the present invention is not limited to the specific embodiment, and inventions disclosed in claims and equivalents to those inventions are included in the present invention.
From the present invention, various embodiments and modifications can be made without departing from the broad spirit and scope of the present invention. Also, the above-described embodiment is for explaining the present invention, and does not limit the scope of the present invention. In other words, the scope of the present invention is defined by the claims, not by the embodiment. Therefore, various modifications which are made within the scope of the claims and the scope of inventions equivalent thereto are considered to fall within the scope of the present invention.
Number | Date | Country | Kind
---|---|---|---
2016-076961 | Apr 2016 | JP | national