This application is the U.S. national phase of International Application No. PCT/IB2019/055237 filed 21 Jun. 2019, which designated the U.S. and claims priority to GB Patent Application No. 1810285.5 filed 22 Jun. 2018, the entire contents of each of which are hereby incorporated by reference.
The invention relates to a device and a method for gesture-based teleoperation.
Teleoperated robots, including for example unmanned aerial vehicles (UAVs or drones), are increasingly used for the exploration of unsafe or difficult-to-access areas. Current control methods rely on third-party devices, which use predefined actions and unnatural mappings between the pilot's movements and the actions of the aircraft. Extensive training is therefore required, and controlling such a device demands the operator's full attention.
Alternative, more intuitive control approaches reduce the time needed to reach proficiency and allow the operator to perform secondary tasks in parallel with the steering, such as visual exploration of the robot's environment by means of an embedded camera. Such approaches have been taken in recently developed gesture-based interfaces. However, these systems employ gesture patterns established in advance and may therefore not implement the most intuitive control mapping for each user.
In the field of gesture-based teleoperation, a number of patent publications describe wearable systems for the distal control of a robot.
Furthermore, a number of scientific publications address the question of user-friendly and intuitive control gestures in the case of drone steering:
In addition, the usability of predefined gestures to control drones has been demonstrated in several works:
In the field of optimization of sensor use and power consumption for a wearable control interface, there are for example two patent publications, wherein:
The problem with a standard remote controller is that the user has to learn how to control the distal robot. The present invention aims to provide a wearable remote controller that instead learns how the user wants to control the distal robot. One aim of the present invention is therefore to provide a device and a method for an improved, intuitive control mapping.
Accordingly, in a first aspect, the invention provides a method for remotely controlling an operated unmanned object, comprising:
In a preferred embodiment, the defining of the set of control movements comprises a definition of gestures of different body parts.
In a further preferred embodiment, the defining of the set of control movements comprises a recording of the movements performed by at least one operator observing the operated unmanned object or a virtual representation thereof executing the desired behaviors while executing control movements corresponding to the desired behaviors.
In a further preferred embodiment, the operator chooses the set of movements corresponding to the different actions of the operated unmanned object.
In a further preferred embodiment, the defining of the set of control movements comprises a recording of the movements performed by at least one operator observing the operated unmanned object or a virtual representation thereof executing the desired behaviors while executing control movements corresponding to the desired behaviors. The operator chooses the set of movements corresponding to the different actions of the operated unmanned object. The method further comprises determining a minimum set of signals necessary to efficiently acquire and discriminate the movements of the operator using non-negative sparse principal component analysis.
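As a minimal illustrative sketch of this signal-selection step (not part of the original disclosure), the fragment below ranks IMU feature channels by their loadings in a sparse principal component decomposition and retains only the most informative fraction. It uses scikit-learn's SparsePCA as a stand-in; the non-negativity constraint named above is not enforced here, absolute loadings being used as a proxy, and all function names, parameters and thresholds are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

def select_discriminant_channels(X, n_components=3, alpha=1.0, keep_ratio=0.5):
    """X: (n_samples, n_channels) matrix of IMU features recorded while the
    operator performed the chosen control movements."""
    spca = SparsePCA(n_components=n_components, alpha=alpha, random_state=0)
    spca.fit(X)
    # Channel importance = summed absolute loading across the sparse components.
    importance = np.abs(spca.components_).sum(axis=0)
    ranking = np.argsort(importance)[::-1]
    n_keep = max(1, int(keep_ratio * X.shape[1]))
    return ranking[:n_keep], importance
```

In practice X could be a matrix of windowed IMU statistics (for example accelerometer and gyroscope features) gathered during the recording phase described above.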
In a further preferred embodiment, the method further comprises determining a mapping of the operator's movements to the behaviors of the operated unmanned object.
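One simple way to realize such a mapping, sketched below under assumed names, is a regression from the retained gesture features to the robot's command space; ridge regression is used here purely for illustration, as the disclosure does not prescribe a specific model.

```python
from sklearn.linear_model import Ridge

def learn_mapping(features, commands, alpha=1e-2):
    """features: (n_samples, n_kept_channels) gesture features from the retained
    IMUs; commands: (n_samples, n_dof) robot behaviors shown to the operator
    during the recording phase."""
    model = Ridge(alpha=alpha)
    model.fit(features, commands)
    return model

# At run time the wearable streams features and the learned model emits commands:
# command = model.predict(current_features.reshape(1, -1))
```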
In a further preferred embodiment, the step of sensing the operator's body movements comprises using at least one inertial measurement unit. The method further comprises determining a subset of discriminant inertial measurement units, amongst the inertial measurement units, from which gesture data results in at least one command; identifying a maximum frequency recorded by the subset of discriminant inertial measurement units; turning off the remaining inertial measurement units, which do not provide any discriminant feature; and reducing a sampling frequency of the subset of inertial measurement units to twice the maximum frequency.
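A minimal sketch of this power policy is given below, assuming that the discriminant subset and the maximum gesture frequency have already been determined; all identifiers are hypothetical.

```python
def configure_imus(all_imu_ids, discriminant_ids, f_max_hz):
    """Turn off IMUs that provide no discriminant feature and sample the
    retained subset at twice the maximum frequency observed in the gestures."""
    settings = {}
    for imu_id in all_imu_ids:
        if imu_id in discriminant_ids:
            settings[imu_id] = {"enabled": True, "sampling_hz": 2.0 * f_max_hz}
        else:
            settings[imu_id] = {"enabled": False, "sampling_hz": 0.0}
    return settings
```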
In a second aspect, the invention provides a network of wearable sensors for controlling distal robots by at least one command, comprising:
In a further preferred embodiment, the network comprises power consumption control means configured to minimize the power consumption of the inertial measurement units by:
The invention will be understood better through the description of example embodiments and in reference to the drawings, wherein
The same references will be used to designate the same or similar objects throughout the figures and the description.
Hence, the invention provides at least a wearable interface—in
In addition, the invention provides a methodology to determine
Referring now to
The user may access different functionalities (not shown in
This section describes a method implemented in the wearable system, configured to adapt the optimal set of gestures for a user-specific interaction and control of a distant robot. The method is described in
The method, which allows an optimal set of gestures to be selected for interacting with and controlling a distant robot and which minimizes the number, type and positioning of the sensors necessary to detect the user's intentions, comprises:
This section describes a further method included in the wearable system, configured to optimize the usage of the sensors, i.e. the inertial measurement units, in order to reduce power consumption. The dynamics of the gestures may differ from user to user; therefore, a user-specific optimization of sensor usage may reduce the power consumption of the system.
The method used to optimize sensor usage is shown in
Once the desired gestures have been recorded, the discriminant sensors, and in particular the discriminant features extracted from those sensors, are identified online.
At this point, the dynamics of the features are analyzed and the highest frequency (f) of each sensor is identified (Analysis of Features Dynamics box). Then, according to the Nyquist-Shannon sampling theorem, which requires the sampling frequency (fs) to exceed the Nyquist rate, i.e., fs > 2f, the sampling frequency of each sensor is set accordingly (Sampling Frequency Adaptation box). All the sensors not providing discriminant features are turned off.
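These two boxes may be sketched as follows, assuming that each discriminant sensor's gesture recording is available as a one-dimensional array captured at a known recording rate; the spectral energy threshold and all names are illustrative assumptions, not part of the original disclosure.

```python
import numpy as np

def highest_frequency(signal, fs_recording, energy_threshold=0.05):
    """Estimate the highest frequency carrying meaningful energy in a recording
    (Analysis of Features Dynamics)."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs_recording)
    significant = freqs[spectrum >= energy_threshold * spectrum.max()]
    return float(significant.max()) if significant.size else 0.0

def adapt_sampling(recordings, fs_recording):
    """recordings: dict sensor_id -> gesture signal. Returns a per-sensor
    sampling frequency set to twice the highest observed frequency
    (Sampling Frequency Adaptation)."""
    return {sensor_id: 2.0 * highest_frequency(sig, fs_recording)
            for sensor_id, sig in recordings.items()}
```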
Number | Date | Country | Kind |
---|---|---|---|
1810285.5 | Jun 2018 | GB | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/IB2019/055237 | 6/21/2019 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2019/244112 | 12/26/2019 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
5155684 | Burke | Oct 1992 | A |
8903568 | Wang et al. | Dec 2014 | B1 |
9283674 | Hoffman | Mar 2016 | B2 |
9582080 | Tilton | Feb 2017 | B1 |
9620000 | Wang et al. | Apr 2017 | B2 |
20080084385 | Ranta et al. | Apr 2008 | A1 |
20090222149 | Murray et al. | Sep 2009 | A1 |
20110109545 | Touma | May 2011 | A1 |
20110288696 | Lefebure | Nov 2011 | A1 |
20110304650 | Campillo | Dec 2011 | A1 |
20110312311 | Abifaker et al. | Dec 2011 | A1 |
20120025946 | Chuang et al. | Feb 2012 | A1 |
20130124891 | Donaldson | May 2013 | A1 |
20130173088 | Callou et al. | Jul 2013 | A1 |
20130181810 | Plotsker | Jul 2013 | A1 |
20140111414 | Hayner | Apr 2014 | A1 |
20150054630 | Xu et al. | Feb 2015 | A1 |
20150140934 | Abdurrahman et al. | May 2015 | A1 |
20150143601 | Longinotti-Buitoni et al. | May 2015 | A1 |
20160189534 | Wang et al. | Jun 2016 | A1 |
20160306434 | Ferrin | Oct 2016 | A1 |
20160313801 | Wagner et al. | Oct 2016 | A1 |
20160338644 | Connor | Nov 2016 | A1 |
20170031446 | Clark | Feb 2017 | A1 |
20170108857 | Line | Apr 2017 | A1 |
20170174344 | Lema | Jun 2017 | A1 |
20170192518 | Hygh et al. | Jul 2017 | A1 |
20170220119 | Potts et al. | Aug 2017 | A1 |
20170259428 | Assad et al. | Sep 2017 | A1 |
20180020951 | Kaifosh | Jan 2018 | A1 |
20190056725 | Su | Feb 2019 | A1 |
20190258239 | Floreano | Aug 2019 | A1 |
20210240984 | Nishio | Aug 2021 | A1 |
Number | Date | Country |
---|---|---|
103927761 | Jul 2014 | CN |
204406085 | Jun 2015 | CN |
105005383 | Oct 2015 | CN |
105739525 | Jul 2016 | CN |
105955306 | Sep 2016 | CN |
106406544 | Feb 2017 | CN |
2415486 | Dec 2005 | GB |
2015-174192 | Oct 2015 | JP |
2016-0050863 | May 2016 | KR |
10-1658332 | Sep 2016 | KR |
2016-0105053 | Sep 2016 | KR |
2008150062 | Dec 2008 | WO |
2015200209 | Dec 2015 | WO |
2016079774 | May 2016 | WO |
2016167946 | Oct 2016 | WO |
2016191050 | Dec 2016 | WO |
2017052077 | Mar 2017 | WO |
WO-2017146531 | Aug 2017 | WO |
2017157313 | Sep 2017 | WO |
Entry |
---|
English Translation of CN103927761 (Year: 2014). |
English Translation of WO 2017146531 A1 (Year: 2017). |
Artoni, Fiorenzo, et al., “RELICA: a method for estimating the reliability of independent components”, Neuroimage, vol. 103, Dec. 2014, pp. 391-400. |
Cauchard, Jessica R., et al., “Drone & Me: An Exploration Into Natural Human-Drone Interaction,” Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Sep. 2015, pp. 361-365. |
Lupashin, Sergei, et al., “A platform for aerial robotics research and demonstration: The Flying Machine Arena,” Mechatronics, vol. 24, No. 1, 2014, pp. 41-54. |
Miyoshi, Kensho, et al., “Above Your Hand: direct and natural interaction with aerial robot,” ACM SIGGRAPH 2014 Emerging Technologies, Aug. 2014, 1 page. |
Peshkova, Ekaterina, et al., “Exploring User-Defined Gestures and Voice Commands to Control an Unmanned Aerial Vehicle,” Intelligent Technologies for Interactive Entertainment, 2016, pp. 47-62. |
Peshkova, Ekaterina, et al., “Natural Interaction Techniques for an Unmanned Aerial Vehicle System,” IEEE Pervasive Computing, vol. 16, No. 1, Jan.-Mar. 2017, pp. 34-42. |
Pfeil, Kevin P., et al., “Exploring 3D Gesture Metaphors for Interaction with Unmanned Aerial Vehicles,” Proceedings of the 2013 International Conference on Intelligent User Interfaces, Mar. 2013, pp. 257-266. |
Sakamoto, Mizuki, et al., “Human Interaction Issues in a Digital-Physical Hybrid World,” The 2nd IEEE International Conference on Cyber-Physical Systems, Networks, and Applications, 2014, pp. 49-54. |
Sanna, Andrea, et al., “A Kinect-based natural interface for quadrotor control,” Entertainment Computing, vol. 4, No. 3, 2013, pp. 179-186. |
Waibel, Markus, “Controlling a Quadrotor Using Kinect,” IEEE Spectrum: Technology, Engineering, and Science News, Jul. 2, 2011, http://spectrum.ieee.org/automaton/robotics/robotics-software/quadrotorinteraction, 1 page. |
International Search Report for PCT/IB2019/055237, mailed Sep. 26, 2019, 5 pages. |
Written Opinion of the ISA for PCT/IB2019/055237, mailed Sep. 26, 2019, 6 pages. |
Combined Search and Examination Report under Sections 17 & 18(3) dated Jan. 11, 2019, issued in United Kingdom Application No. GB1810285.5, 3 pages. |
Number | Date | Country
---|---|---
20210247758 A1 | Aug 2021 | US