The present disclosure relates to the field of control systems for robots and, in particular, to a system permitting augmented and mixed reality applications.
It has been proposed to provide systems permitting augmented and mixed reality applications.
“Augmented reality” corresponds to a direct or indirect live view of a physical real world environment whose elements are “augmented” by computer-generated information, such as visual and audio information, that is superposed on the live view.
“Mixed reality”, also known as hybrid reality, is the merging of real and virtual worlds to produce new environments and visualizations where physical and digital objects can coexist and interact in real-time. Mixed reality derives its name from the fact that the world is neither entirely physical nor entirely virtual, but is a mixture of both worlds.
There is however a technical difficulty in providing mixed reality environments in which events involving virtual elements in a virtual world can be synchronized with the dynamic behavior of real objects in the physical world.
It is an aim of embodiments of the present description to at least partially address one or more difficulties in the prior art.
According to one aspect, there is provided a processing device for implementing a mixed reality system, the processing device comprising: one or more processing cores; and one or more instruction memories storing instructions that, when executed by the one or more processing cores, cause the one or more processing cores to: maintain a virtual world involving at least a first virtual replica corresponding to a first robot in the real world; generate one or more virtual events impacting the first virtual replica in the virtual world; generate a control signal for controlling the first robot in response to the one or more virtual events; and transmit the control signal to the first robot to modify the behavior of the first robot and provide a real world response to the one or more virtual events.
According to one embodiment, the instructions further cause the one or more processing cores to receive, prior to generating the control signal, a user or computer-generated command intended to control the first robot, wherein generating the control signal comprises modifying the user or computer-generated command based on the one or more virtual events.
According to one embodiment, the instructions further cause the one or more processing cores to limit the control signal resulting from a user or computer-generated command in the absence of a virtual event to a first range, wherein the control signal providing a real world response to the one or more virtual events exceeds the first range.
According to one embodiment, the instructions further cause the one or more processing cores to generate a mixed reality video stream to be relayed to a display interface, the mixed reality video stream including one or more virtual features from the virtual world synchronized in time and space and merged with a raw video stream captured by a camera.
According to one embodiment, the instructions cause the one or more processing cores to generate virtual features in the mixed reality video stream representing virtual events triggered by the behavior of the first robot in the real world.
According to one embodiment, the instructions further cause the one or more processing cores to continuously track the 6 Degrees of Freedom coordinates of the first robot, corresponding to its position and orientation, based on tracking data provided by a tracking system.
According to one embodiment, the instructions further cause the one or more processing cores to generate the control signal to ensure contactless interactions of the first robot with one or more real static or mobile objects or further robots, based at least on the tracking data of the first robot and the 6 Degrees of Freedom coordinates of the one or more real static or mobile objects or further robots.
According to a further aspect, there is provided a mixed reality system comprising: the above processing device; an activity zone comprising the first robot and one or more further robots under control of the processing device; and a tracking system configured to track relative positions and orientations of the first robot and the one or more further robots.
According to one embodiment, the first robot is a drone or a land-based robot.
According to one embodiment, the mixed reality system further comprises one or more user control interfaces for generating user commands.
According to a further aspect, there is provided a method of controlling one or more robots in a mixed reality system, the method comprising: maintaining, by one or more processing cores under control of instructions stored by one or more instruction memories, a virtual world involving at least a first virtual replica corresponding to a first robot in the real world; generating one or more virtual events impacting the first virtual replica in the virtual world; generating a control signal for controlling the first robot in response to the one or more virtual events; and transmitting the control signal to the first robot to modify the behavior of the first robot and provide a real world response to the one or more virtual events.
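Purely by way of illustration, and not as a definition of the claimed method, the following sketch (in Python) shows one possible structure for the above method; the names used (for example VirtualWorld, generate_virtual_events and send_control) and the data formats are hypothetical assumptions.

```python
# Illustrative sketch only: maintain a virtual world containing a virtual
# replica of a real robot, generate virtual events impacting that replica,
# derive a control signal, and transmit it so that the real robot responds
# to the virtual events. All names and data formats are assumptions.

class VirtualWorld:
    def __init__(self):
        self.replicas = {}                    # robot_id -> replica pose/state

    def update_replica(self, robot_id, pose):
        self.replicas[robot_id] = pose        # mirror the tracked real robot


def generate_virtual_events(world, robot_id):
    """Return virtual events impacting the replica (e.g. a virtual gust of
    wind or a virtual collision). Placeholder logic for illustration."""
    return []


def control_signal_for(events, base_command=0.0):
    """Derive a control signal providing a real-world response to the events."""
    return base_command + sum(e.get("command_offset", 0.0) for e in events)


def mixed_reality_step(world, robot_id, tracked_pose, send_control):
    world.update_replica(robot_id, tracked_pose)       # maintain the virtual world
    events = generate_virtual_events(world, robot_id)  # virtual events on the replica
    ctrl = control_signal_for(events)                  # response to the virtual events
    send_control(robot_id, ctrl)                       # transmit to the real robot
```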
The foregoing features and advantages, as well as others, will be described in detail in the following description of specific embodiments, given by way of illustration and not limitation, with reference to the accompanying drawings.
Throughout the present disclosure, the term “coupled” is used to designate a connection between system elements that may be direct, or may be via one or more intermediate elements such as buffers, communication interfaces, intermediate networks, etc.
Furthermore, throughout the present description, the following terms will be considered to have the following definitions:
“Robot”—any machine or mechanical device that operates to some extent automatically and to some extent under control of a user. For example, as will be described in more detail hereafter, a robot may be at least partially remotely controlled via a wireless control interface based on user commands.
“Mixed-reality application”—an application in which there are interactions between the real world and a virtual world. For example, events occurring in the real world are tracked and applied to the virtual world, and events occurring in the virtual world result in real world effects. Some examples of mixed-reality interactive video games are provided at the internet site www.drone-interactive.com. The name “Drone Interactive” may correspond to one or more registered trademarks. While in the following description embodiments of a mixed reality system are described based on an example application of an interactive game, it will be apparent to those skilled in the art that the system described herein could have other applications, such as for maintenance of machines or buildings, for exploration, including space exploration, for the manufacturing industry, such as in a manufacturing chain, for search and rescue, or for training, including pilot or driver training in the context of any of the above applications.
“Virtual replica”—a virtual element in the virtual world that corresponds to a real element in the real world. For example, a wall, mountain, tree or other type of element may be present in the real world, and is also defined in the virtual world based on at least some of its real world properties, and in particular its 6 Degrees of Freedom (DoF) coordinates corresponding to its relative position and orientation, its 3D model or its dynamic behavior in the case of mobile elements. Some virtual replicas may correspond to mobile elements, such as robots, or even to a user in certain specific cases described in more detail below. While the 6 DoF coordinates of static elements are for example stored once for a given application, the 6 DoF coordinates of mobile elements, such as robots, are tracked and applied to their virtual replica in the virtual world, as will be described in more detail below. Finally, the behavior of each virtual replica mimics that of the corresponding mobile elements in the real world.
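By way of illustration only, a virtual replica and its 6 DoF coordinates could be represented by a data structure such as the following sketch (in Python); the field names, units and quaternion convention are assumptions rather than features of the present disclosure.

```python
# Illustrative representation of a virtual replica and its 6 DoF coordinates.
from dataclasses import dataclass, field


@dataclass
class Pose6DoF:
    x: float = 0.0        # translation components (assumed to be in metres)
    y: float = 0.0
    z: float = 0.0
    qw: float = 1.0       # orientation as a unit quaternion (assumed convention)
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0


@dataclass
class VirtualReplica:
    name: str                                   # e.g. "drone_1" or "wall_north"
    mobile: bool                                # static poses stored once, mobile poses tracked
    pose: Pose6DoF = field(default_factory=Pose6DoF)
    model_path: str = ""                        # reference to the 3D model of the real element

    def apply_tracking(self, tracked_pose: Pose6DoF) -> None:
        """Mirror the tracked 6 DoF coordinates of the corresponding real element."""
        if self.mobile:
            self.pose = tracked_pose
```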
The system 100 for example comprises an activity zone 102 of any shape and dimensions. The activity zone 102 for example defines a volume in which the mixed reality system can operate, and in particular in which a number of robots may operate and in which the 6 DoF coordinates (position and orientation) of the robots can be tracked. While in the example of
One or more robots are for example present within the activity zone 102 and may interact with each other, with other mobile or static real objects in the activity zone and with virtual elements in the virtual world. For example, the activity zone 102 defines a gaming zone in which robots forming part of a mixed reality game are used. In the example of
Each of the robots within the activity zone 102 is for example a remotely controlled robot that is at least partially controllable over a wireless interface. It would however also be possible for one or more robots to include wired control lines.
It is assumed herein that each of the robots within the activity zone 102 comprises a source of power, such as a battery, and one or more actuators, motors, etc. for causing parts of each robot to move based on user commands and/or under control of one or more automatic control loops. For example: the drones include one or more propellers creating forward, backward, lateral and/or vertical translations; and the land-based robots in the form of model vehicles include a motor for driving one or more wheels of the vehicle and one or more actuators for steering certain wheels of the vehicle. Of course, the particular types of motors or actuators used for moving the robots will depend on the type of robot and the types of operations it is designed to perform.
The computing system 120 is for example configured to track activity in the real world (within the activity zone 102) and also to maintain a virtual world, and merge the real and virtual worlds in order to provide one or more users and/or spectators with a mixed reality experience, as will now be described in more detail.
The mixed reality system 100 for example comprises a tracking system 112 capable of tracking the relative positions and orientations (6 DoF coordinates) of the robots, and in some cases of other mobile or static objects, within the activity zone 102. The position information is for example tracked with relatively high accuracy, for example with a precision of 1 cm or less, and the orientation is for example measured with a precision of 1 degree or less. Indeed, the overall performance of the system for accurately synchronizing the real and virtual worlds and creating interactions between them will depend to some extent on the accuracy of the tracking data. In some embodiments, the robots have six degrees of freedom, three being translation components and three being rotation components, and the tracking system 112 is capable of tracking the position and orientation of each of them with respect to these six degrees of freedom.
In some embodiments, the robots may each comprise a plurality of active or passive markers (not illustrated) that can be detected by the tracking system 112. For example, the tracking system 112 may comprise one or more emitters that emit light at non-visible wavelengths, such as infrared light, into the activity zone 102, and cameras, which may be integrated in the light emitters, detect the 6 DoF coordinates of the robots based on the light reflected by these markers. For example, each tracked object (including robots) has a unique pattern of markers that permits it to be identified among the other tracked objects and permits its orientation to be determined. Many tracking systems based on this type of tracking technology are available, an example being the one marketed under the name “Optitrack” (the name “Optitrack” may correspond to a registered trademark).
In further embodiments, the light is in the form of light beams, and the robots comprise light capture elements (not illustrated) that detect when the robot traverses a light beam, and by identifying the light beam, the 6 DoF coordinates of the robot can be estimated. Such a system is for example marketed by the company HTC under the name “Lighthouse” (the names “HTC” and “Lighthouse” may correspond to registered trademarks).
It would also be possible for the robots to include on-board tracking systems, for example based on inertial measurement units or any other positioning devices, permitting the robots to detect their 6 DoF coordinates (position and orientation), and relay this information to the computing system 120.
In yet further embodiments, different types of tracking systems could be used, such as systems based on UWB (ultra-wide band) modules, or systems based on visible cameras in which image processing is used to perform object recognition and to detect the 6 DoF coordinates (position and orientation) of the robots.
The computing system 120 for example receives information from the tracking system 112 indicating, in real time, the 6 DoF coordinates (position and orientation) of each of the tracked objects (including robots) in the activity zone 102. Depending on the type of tracking system, this information may be received via a wired connection and/or via a wireless interface.
The mixed reality system 100 comprises cameras for capturing real time (streaming) video images of the activity zone that are processed to create mixed reality video streams for display to users and/or spectators. For example, the mixed reality system 100 comprises one or more fixed cameras 114 positioned inside or outside the activity zone 102 and/or one or more cameras 116 mounted on some or all of the robots. One or more of the fixed cameras 114 or of the robot cameras 116 is for example a pan and tilt camera, or a pan-tilt-zoom (PTZ) camera. In the case of a camera 114 external to the activity zone 102, it may be arranged to capture the entire zone 102, providing a global view of the mixed reality scene.
The video streams captured by the cameras 114 and/or 116 are for example relayed wirelessly to the computing system 120, although for certain cameras, such as the fixed cameras 114, wired connections could be used.
The computing system 120 is for example capable of wireless communications with the robots within the activity zone 102. For example, the computing system 120 includes, for each robot, a robot control interface with one or several antennas 122 permitting wireless transmission of the control signals to the robots and a robot video interface with one or several antennas 123 permitting the wireless reception of the video streams from the robot cameras 116. While a single antenna 122 and a single antenna 123 are illustrated in
The computing system 120 is for example a central system via which all of the robots in the activity zone 102 can be controlled, all interactions between the real and virtual worlds are managed, and all video processing is performed to create mixed reality video streams. Alternatively, the computing system 120 may be formed of several units distributed at different locations.
User interfaces for example permit users to control one or more of the robots and/or permit users or spectators to be immersed in the mixed reality game or application by seeing mixed reality images of the activity zone 102. For example, one or more control interfaces 125 are provided, such as a joystick 126, a hand-held game controller 128, and/or a steering wheel 130, although any type of control interface could be used. The control interfaces 125 are for example connected by wired connections to the computer system 120, although in alternative embodiments wireless connections could be used. Furthermore, to permit users and/or spectators to be immersed in the mixed reality game or application, one or more display interfaces 132 are provided, such as a virtual reality (VR) headset or video glasses 136, and/or a display screen 138, and/or a see-through augmented reality (AR) headset 134, although any type of display could be used. In some embodiments, audio streams are provided to each user, for example via headphones equipping the headsets 134 and 136. Additionally or alternatively, a speaker 140 may provide audio to users and/or spectators. The display interfaces 132 are for example connected by wired connections to the computer system 120, although in alternative embodiments wireless connections could be used.
The activity zone 102 for example comprises, in addition to the robots, one or more further static or mobile objects having virtual replicas in the virtual world. For example, in
In some embodiments, users may have direct interaction with robots in the activity zone 102. For example,
In some cases, one or more users may interact with one or more robots in a different manner than by using one of the control interfaces 125 described above (a game controller, joystick or the like). For example, the user in the activity zone 102 may use a wand 144 or any other physical object to interact directly with the robots. The tracking system 112 for example tracks movements of the wand 144, and the computing system 120 for example controls the robots as a function of these movements. For example, one or more drones may be repelled by the wand 144, or directed to areas indicated by the wand 144, although any type of interaction could be envisaged.
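As a sketch of how such a wand interaction could be computed, the following illustrative function pushes a drone away from the tracked wand position; the inverse-distance repulsion law, gain and radius are arbitrary assumptions.

```python
# Illustrative repulsion of a drone by a tracked wand: the closer the wand,
# the stronger the velocity offset pushing the drone away. The gain and
# radius values are arbitrary illustration parameters.
import math


def wand_repulsion(drone_pos, wand_pos, radius=1.0, gain=0.5):
    """Return an (x, y, z) velocity offset pushing the drone away from the wand."""
    dx = drone_pos[0] - wand_pos[0]
    dy = drone_pos[1] - wand_pos[1]
    dz = drone_pos[2] - wand_pos[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist >= radius or dist < 1e-6:
        return (0.0, 0.0, 0.0)                  # wand out of range (or degenerate case)
    strength = gain * (radius - dist) / radius  # grows as the wand gets closer
    return (strength * dx / dist, strength * dy / dist, strength * dz / dist)
```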
The system 120 for example comprises a processing device (PROCESSING DEVICE) 202 implemented by one or more networked computers. The processing device 202 for example comprises an instruction memory (INSTR MEMORY) 204 and one or more processing cores (PROCESSING CORE(S)) 206. The processing device 202 also for example comprises a storage memory (STORAGE MEMORY) 208, storing the data processed by the processing cores 206, as will be described in more detail below.
The processing device 202 for example receives user commands (CMD) from the one or more control interfaces (CONTROL INTERFACE(S)) 125. A user command corresponds to the user's desired control of the robot, indicating for example a desired displacement and/or other desired behavior of the robot. In addition, a user command may also correspond to any action that the user wishes to trigger in the mixed reality game or application. In some embodiments, the processing device 202 generates feedback signals FB that are sent back to the control interface(s) 125. These feedback signals for example cause the control interface(s) 125 to vibrate in response to events in the mixed reality game or application, or provide other forms of feedback response (haptic feedback or other).
The computing system 120 for example comprises a robot camera(s) interface (ROBOT CAMERA(S) INTERFACE) 210 that wirelessly receives raw video stream(s) (RAW VIDEO STREAM(S)) from the robot cameras 116 of one or more robots and transmits these raw video stream(s) to the processing device 202. In addition, the computing system 120 for example comprises a robot control interface (ROBOT CONTROL INTERFACE) 212 that receives robot control signals (CTRL) from the processing device 202 and wirelessly transmits these control signals to one or more robots. The computing system 120 for example comprises a fixed camera(s) interface (FIXED CAMERA(S) INTERFACE) 214 that receives raw video streams from the fixed cameras 114 via a wireless or wired interface and transmits these raw video streams to the processing device 202. While not illustrated in
The processing device 202 for example modifies the raw video streams received from the fixed camera(s) 114 and/or the robot camera(s) 116 to generate mixed reality video streams (MIXED REALITY VIDEO STREAM(S)), and in some cases (not illustrated) audio streams, which are transmitted to the display interfaces (DISPLAY INTERFACE(S)) 132.
The processing device 202 also for example receives tracking data (TRACKING DATA) corresponding to the 6 DoF coordinates (position and orientation) of all tracked objects (robots and static/mobile objects) from the tracking system (TRACKING SYSTEM) 112.
The processing device 202 for example implements a mixed reality module (MIXED REALITY MODULE) 302, comprising a display module (DISPLAY MODULE) 304 and a real-virtual interaction engine (REAL-VIRTUAL INTERACT. ENGINE) 305. The processing device 202 also for example comprises a database (DATABASE) 306 stored in the storage memory 208, a robot control module (ROBOT CONTROL MODULE) 310 and in some cases an artificial intelligence module (A.I. MODULE) 309.
The mixed-reality module 302 receives user commands (CMD) for controlling corresponding robots from the control interface(s) (CONTROL INTERFACE(S)) 125 of the user interfaces (USER INTERFACES), and in some embodiments generates the feedback signal(s) FB sent back to these control interfaces 125. Additionally or alternatively, one or more robots may be controlled by commands (CMD_AI) generated by the artificial intelligence module 309 and received by the mixed-reality module 302.
The database 306 for example stores one or more of the following: robot data, including the robot dynamic models; data characterizing the real elements having virtual replicas, such as their 3D models and, for static elements, their 6 DoF coordinates; and application-specific rules, such as gameplay data in the case of a video game.
The mixed reality module 302 constructs and maintains the virtual world, which is composed of all the virtual elements including the virtual replicas of the robots and the static/mobile real objects in the activity zone 102. In particular, the real-virtual interaction engine 305 receives the tracking data (TRACKING DATA) from the tracking system 112 and uses the data stored in the database 306 to ensure synchronization of the 6 DoF coordinates (position and orientation) between the real elements (the robots and the static/mobile real objects in the activity zone 102) and their corresponding virtual replicas in the virtual world.
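A minimal sketch of this synchronization, assuming the tracking data arrives as a mapping from object names to 6 DoF poses, could be as follows.

```python
# Illustrative synchronization step: the tracked 6 DoF coordinates of every
# real element are mirrored onto its virtual replica. Data formats are assumed.


def synchronize(virtual_world, tracking_data):
    """virtual_world: dict mapping names to replica objects with a 'pose' attribute.
    tracking_data: dict mapping the same names to 6 DoF poses from the tracking system."""
    for name, pose in tracking_data.items():
        replica = virtual_world.get(name)
        if replica is not None:
            replica.pose = pose    # keep the virtual replica aligned with the real element
```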
The engine 305 also for example generates modified command signals CMD′ for controlling one or more robots based on an initial user command (CMD) or AI-generated command (CMD_AI) and on the real-virtual interactions relating to the one or more robots. For example, these real-virtual interactions are generated as a function of the tracked 6 DoF coordinates (position and orientation) of the robots, of the robot data (including the robot dynamic models) from the database 306, of events occurring in the mixed reality application and/or, depending on the application, of other specific rules from the database 306. In the case of a video game, these rules may be defined in the gameplay data. The engine 305 also for example implements anti-collision routines in order to prevent collisions between the robots themselves and/or between any robot and another real object in the activity zone 102, and in some cases between any robot and a virtual element in the virtual world. Some examples of real-virtual interactions will be described below with reference to
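Purely as an illustration of how a modified command CMD′ and a basic anti-collision rule might be combined, the sketch below adds an interaction offset to the initial command and then suppresses any velocity component directed towards an object closer than an assumed safety distance; the threshold values and data formats are assumptions.

```python
# Illustrative generation of a modified command CMD' from an initial command
# CMD (user or AI generated): an interaction offset is added, then a simple
# anti-collision rule removes any velocity component directed towards an
# object closer than a safety distance. All values and formats are assumed.
import math

SAFETY_DISTANCE = 0.5   # metres, illustrative value


def modified_command(cmd_velocity, robot_pos, obstacle_positions,
                     interaction_offset=(0.0, 0.0, 0.0)):
    vx = cmd_velocity[0] + interaction_offset[0]
    vy = cmd_velocity[1] + interaction_offset[1]
    vz = cmd_velocity[2] + interaction_offset[2]
    for obs in obstacle_positions:
        dx, dy, dz = obs[0] - robot_pos[0], obs[1] - robot_pos[1], obs[2] - robot_pos[2]
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        if 1e-6 < dist < SAFETY_DISTANCE:
            # Component of the commanded velocity directed towards the obstacle.
            towards = (vx * dx + vy * dy + vz * dz) / dist
            if towards > 0.0:
                vx -= towards * dx / dist
                vy -= towards * dy / dist
                vz -= towards * dz / dist
    return (vx, vy, vz)
```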
The display module 304 for example generates mixed reality video stream(s) based on the raw video stream(s) from the fixed camera(s) 114 and/or the robot camera(s) 116 and relays them to corresponding display interfaces 132 after incorporating virtual features (such as the view of one or more virtual elements, head-up display data, visual special effects, etc.) generated by the real-virtual interaction engine 305. For example, virtual features generated by the real-virtual interaction engine 305 are synchronized in time and space and merged with the raw video stream(s). For example, the view of one or more virtual elements in the mixed reality application is presented on a display interface in a position and orientation that depends on the field of view and the 6 DoF coordinates (position and orientation) of the corresponding fixed or robot camera 114/116.
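As an illustration of this spatial synchronization, the following sketch projects the 3D position of a virtual element into the image of a camera using its tracked 6 DoF pose and a simple pinhole model; the intrinsic parameters, the pose convention and the use of numpy are assumptions.

```python
# Illustrative placement of a virtual element in a camera image: the element's
# 3D world position is projected into pixel coordinates from the tracked pose
# of the camera using a pinhole model. Intrinsics and conventions are assumed.
import numpy as np


def project_virtual_point(point_world, cam_rotation, cam_center,
                          fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """cam_rotation: 3x3 world-to-camera rotation; cam_center: camera position in world frame."""
    rotation = np.asarray(cam_rotation, dtype=float)
    p_cam = rotation @ (np.asarray(point_world, dtype=float) - np.asarray(cam_center, dtype=float))
    if p_cam[2] <= 0.0:
        return None                      # the virtual element is behind the camera
    u = fx * p_cam[0] / p_cam[2] + cx    # horizontal pixel coordinate
    v = fy * p_cam[1] / p_cam[2] + cy    # vertical pixel coordinate
    return (u, v)
```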
The robot control module 310 for example receives the modified command signals CMD′ generated by the real-virtual interaction engine 305 and generates one or more control signals CTRL based on these command signals for controlling one or more of the robots (TO ROBOT CONTROL INTERFACE), as will be described in more detail below in relation with
Operation of the mixed reality module 302 will now be described in more detail with reference to
The display module 304 generates a mixed reality video stream by merging the raw video stream of the real world captured by the camera 116 of the real drone 404 with virtual images of the virtual world corresponding to the view point of the virtual camera 116′ of the virtual replica 404′ of the drone 404, as will now be described in more detail with reference to
In some embodiments, the display module 304 generates, for each image of the raw video stream being processed, an image mask similar to that of
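One possible way of merging a rendered virtual layer into a raw camera frame using such an image mask is sketched below; the array shapes and the masking convention (1 where the virtual element is visible and not occluded, 0 elsewhere) are assumptions.

```python
# Illustrative compositing of a rendered virtual layer into a raw camera frame
# using a binary image mask. Array shapes and dtypes are assumed.
import numpy as np


def composite(raw_frame, virtual_layer, mask):
    """raw_frame, virtual_layer: HxWx3 uint8 images; mask: HxW array of 0/1 values."""
    mask3 = mask.astype(raw_frame.dtype)[:, :, None]          # broadcast over the color channels
    return raw_frame * (1 - mask3) + virtual_layer * mask3    # virtual pixels where mask == 1
```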
The display module 304 for example processes each raw video stream received from a robot/fixed camera 116/114 in a similar manner to the example of
As represented in
The robot control module 310 for example comprises a transfer function module 701 that transforms each modified command CMD′ into a desired robot state (DESIRED STATE), including the desired 6 DoF coordinates (position and orientation) of the robot. The module 310 also comprises a subtraction module 702 that continuously calculates an error state value (ERR_STATE) as the difference between the desired robot state and the measured robot state (MEASURED STATE) generated by a further transfer function module 703 based on the tracking data (TRACKING DATA) provided by the tracking system 112. The error state value is provided to a controller (CONTROLLER) 704, which for example uses the robot dynamic model (ROBOT DYNAMIC MODELS) from the database 306, and aims to generate control signals CTRL that minimize this error state value. The generated control signals CTRL are for example wirelessly transmitted to the robots 108 via the robot control interface.
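Purely by way of illustration, the closed-loop structure described above could be sketched as follows, with a simple proportional law standing in for the model-based controller 704; the transfer functions, state format and gain are hypothetical.

```python
# Illustrative closed-loop step: a desired state is derived from CMD', an error
# state is computed against the measured state obtained from the tracking data,
# and a simple proportional law (a stand-in for the model-based controller 704)
# produces the control signal CTRL. The gain value is arbitrary.


def control_step(cmd_prime, tracking_data, cmd_to_state, tracking_to_state, kp=1.2):
    desired_state = cmd_to_state(cmd_prime)             # role of transfer function module 701
    measured_state = tracking_to_state(tracking_data)   # role of transfer function module 703
    err_state = [d - m for d, m in zip(desired_state, measured_state)]   # subtraction module 702
    ctrl = [kp * e for e in err_state]                  # controller minimizing the error state
    return ctrl
```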
The modification of the command signals CMD by the real-virtual interaction engine 305 will now be described in more detail through a few examples with reference to
An enlarged version of the thrust gage 806′ is shown at the top of
In the example of
The real-virtual interaction engine 305 detects the presence of the drone 802 in this zone 804′, and thus increases the thrust to a boosted level between CMD_MAX and CMD_MAX′, as indicated by the thrust gage 806′. As represented by an arrow 818, the speed of the drone 802 for example increases to a high level as a result. For example, the real-virtual interaction engine 305 determines the new thrust based on the user command CMD, increased by a certain percentage, such as 100%.
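As a numerical illustration of this boost behavior, the sketch below increases the thrust command by 100% while the drone is in the boost zone and allows it to exceed the normal limit CMD_MAX up to an extended limit CMD_MAX′; the numeric limits are assumptions.

```python
# Illustrative thrust modification: inside a virtual boost zone the command is
# increased by a given percentage and allowed to reach the extended limit
# CMD_MAX' (CMD_MAX_BOOST below); otherwise it is limited to CMD_MAX.
# All numeric values are assumptions for illustration.

CMD_MAX = 1.0          # normal upper limit of the thrust command
CMD_MAX_BOOST = 1.5    # extended upper limit (CMD_MAX') while boosted


def thrust_command(cmd, in_boost_zone, boost_percent=100.0):
    if in_boost_zone:
        boosted = cmd * (1.0 + boost_percent / 100.0)    # e.g. +100% of the user command
        return min(boosted, CMD_MAX_BOOST)
    return min(cmd, CMD_MAX)
```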
While in the example of
In some cases, the real-virtual interaction engine 305 may also simulate damage to a robot following a collision, for example by reducing any user command CMD by a certain percentage to simulate a loss of thrust.
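A corresponding illustrative sketch for this damage simulation simply attenuates the user command by an assumed damage percentage.

```python
# Illustrative damage simulation: after a virtual collision, user commands are
# attenuated to mimic a loss of thrust. The 30% figure is an arbitrary example.


def damaged_command(cmd, damage_percent=30.0):
    return cmd * (1.0 - damage_percent / 100.0)
```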
An advantage of the embodiments described herein is that they permit a mixed reality system to be implemented in which events in a virtual world can be used to generate responses in the real world. This is achieved by generating, by the real-virtual interaction engine 305, modified robot commands to create specific robot behaviors in the real world. This for example permits relatively close simulation of virtual events in the real world, leading to a particularly realistic user experience.
Having thus described at least one illustrative embodiment, various alterations, modifications and improvements will readily occur to those skilled in the art. For example, it will be apparent to those skilled in the art that the various functions of the computing system described herein could be implemented entirely in software or at least partially in hardware.
Furthermore, it will be apparent to those skilled in the art that the various features described in relation with the various embodiments could be combined, in alternative embodiments, in any combination.
The present patent application claims priority from International Application Number PCT/EP2020/052321, filed on Jan. 30, 2020, which claims the benefit of the French patent application filed on Jan. 31, 2019 and assigned application serial number FR19/00974, the contents of which are hereby incorporated by reference.