The present disclosure relates to command and control interfaces for collaborative robotics.
Collaborative robots, or “cobots,” are robots designed to operate collaboratively with humans. Modern robotics is expanding into environments in closer proximity to humans, allowing humans and robots to work together collaboratively. Features of modern robotics that facilitate collaboration include increases in dexterity as well as decreases in form factor and weight. Because of their closer proximity to humans, collaborative robots benefit from a command and control system that allows the robots to work safely while still carrying out their assigned tasks.
Figures may not be drawn to scale; like reference numbers denote like elements across the various drawings.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in an implementation combining software and hardware that may all generally be referred to herein as a “circuit,” “module,” “component,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
Any combination of one or more computer readable media may be utilized. The computer readable media may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an appropriate optical fiber with a repeater, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language, such as JAVA®, SCALA®, SMALLTALK®, EIFFEL®, JADE®, EMERALD®, C++, C#, VB.NET, PYTHON®, or the like; conventional procedural programming languages, such as the “C” programming language, VISUAL BASIC®, FORTRAN® 2003, Perl, COBOL 2002, PHP, or ABAP®; dynamic programming languages such as PYTHON®, RUBY®, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider), in a cloud computing environment, or offered as a service such as a Software as a Service (SaaS).
Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatuses (systems) and computer program products according to aspects of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable instruction execution apparatus, create a mechanism for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that when executed can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions when stored in the computer readable medium produce an article of manufacture including instructions which when executed, cause a computer to implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatuses or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The terminology used herein is for the purpose of describing particular aspects only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a,” “an,” and “the” are intended to comprise the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Collaborative robotics suffers from the difficulty of programming directives for the actions expected from collaborative robots. The difficulty arises from the broad variability of command and control interfaces, the power that a robot can exert on a foreign object, the geometry of the robots, and safety considerations for the robot, for the subject of the robot's action, and for the human controlling the robot. The problems are even greater if the desired action is to be performed by multiple robots simultaneously or in succession. Aspects of the embodiments can overcome certain issues, including (P1) the difficulty of pointing out the exact desired subject of action among other possible subjects, (P2) expressing the future desired position of the subjects, and (P3) orchestrating the beginning and end of a collaborative move.
In embodiments, the user device can make use of IMU 108 to allow the user to hold the user device 102 as a steering wheel. The user can rotate the device 102 clockwise or counterclockwise to steer the direction of movement of the subject. The user can tilt the user device 102 forward or backward to change the speed of the subject's movement. The user can shift the user device 102 left or right and forward or backward to control the relative speed and/or alignment of the respective sides of the subject.
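The steering-wheel gesture mapping above can be sketched as a simple translation from orientation samples to a movement command. This is a minimal illustration only; the names `ImuReading`, `SteeringCommand`, and `imu_to_command`, and the specific angle-to-speed scaling, are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ImuReading:
    """Hypothetical orientation sample from IMU 108 (angles in degrees)."""
    roll: float     # clockwise/counterclockwise rotation, as with a steering wheel
    pitch: float    # forward/backward tilt
    shift_x: float  # lateral shift of the device (left/right)
    shift_y: float  # longitudinal shift (forward/backward)

@dataclass
class SteeringCommand:
    heading_deg: float  # desired direction of subject movement
    speed: float        # normalized speed, 0.0..1.0
    side_bias: float    # relative speed of the subject's sides, -1.0..1.0

def imu_to_command(reading: ImuReading, max_tilt: float = 45.0) -> SteeringCommand:
    """Translate a steering-wheel style gesture into a subject movement command."""
    # Rotating the device steers the direction of movement of the subject.
    heading = reading.roll
    # Tilting forward or backward scales the speed, clamped to [0, 1].
    speed = max(0.0, min(1.0, reading.pitch / max_tilt))
    # Shifting left or right biases the relative speed of the subject's sides.
    side_bias = max(-1.0, min(1.0, reading.shift_x))
    return SteeringCommand(heading_deg=heading, speed=speed, side_bias=side_bias)
```

In a real controller the raw IMU samples would be filtered and dead-banded before being mapped to commands; the clamping here only bounds the output range.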
The user device 102 communicates with a network-based API 122. The API 122 can include distributed functionality, such as a human-robot-interface (HRI) 124 and a robot controller (RTC) 126. The HRI 124 and the RTC 126 can be distributed across a network-based cloud system 120.
The user device 102 can use the API 122 to control one or more robots 112a and 112b. The robots 112a-b can be located in a room 114 that includes one or more objects 116. The user can be in the room 114 to send real-time control information, or can be in a different room. The user can capture a photograph of the room 114 including the object 116 while in the room, but can process the image, reposition an image of the object, and transmit the image data to the API 122 across a network from a remote location.
The user can open the image in an application interface for robot controls or an image processor that allows the user to select objects within the image.
By selecting an outline or cutout 526 of the object, the user makes a copy or clone of the subject (206). The user can store the copy of the object as a separate file. The copy, shown as subject image 534, can be displayed onto the captured image (208).
The user can move the copy of the image 534 to a new position on the image. The interface can receive an indication of a repositioning of the subject image file within the captured image of the region of interest (210). The interface can create and store an augmented image that includes the original position of the object and the new position of the object (212). The interface can transmit the captured image, the subject image, and the augmented image to robot command and control across a network (214).
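The clone-reposition-transmit flow of steps (206)–(214) can be sketched as data handling around a cutout record. The `Cutout` structure, the `reposition` and `to_payload` helpers, and the JSON payload shape are all illustrative assumptions, not the disclosed implementation.

```python
from dataclasses import dataclass
import json

@dataclass
class Cutout:
    """Clone of the selected subject (cutout 526), stored as a separate file (206)."""
    subject_id: str
    x: int       # current top-left position within the captured image
    y: int
    width: int
    height: int

def reposition(cutout: Cutout, new_x: int, new_y: int) -> dict:
    """Build an augmented-image record holding both the original and the
    new position of the object (steps 210 and 212)."""
    return {
        "subject_id": cutout.subject_id,
        "original_position": {"x": cutout.x, "y": cutout.y},
        "new_position": {"x": new_x, "y": new_y},
        "size": {"width": cutout.width, "height": cutout.height},
    }

def to_payload(captured_image_ref: str, subject_image_ref: str, augmented: dict) -> str:
    """Serialize the three artifacts for transmission to command and control (214)."""
    return json.dumps({
        "captured_image": captured_image_ref,
        "subject_image": subject_image_ref,
        "augmented_image": augmented,
    })
```

Keeping both positions in one record lets the command and control side derive the desired displacement of the physical object directly from the image-space edit.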
In embodiments, the CandC system can receive data from the robot (410). The data can include an image of the object, an image of the environment, location information about the robot and/or the object, telemetry information, movement information, and other information that allows the CandC system to monitor the progress of the robot and to provide updates of the robot's progress to a user.
The CandC system can provide some or all of the data to the user (412). In some embodiments, the CandC system (e.g., the RTC) can notify the user about the progress of the move by providing a visual indication of the robot's movements and/or the movement of the object. For example, the CandC system can change the position of the cutout of the object on the background image by sending successive updates of the image to the user device to be displayed on the display.
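One simple way to generate those successive cutout positions is to interpolate between the original and target positions as the robot reports progress. The generator below is a hedged sketch under that assumption; the disclosure does not specify how intermediate positions are produced.

```python
def progress_updates(original, target, steps):
    """Yield successive cutout positions between the original and the target
    position, so the user device can redraw the cutout on the background
    image as the robot's move progresses (illustrative linear interpolation)."""
    ox, oy = original
    tx, ty = target
    for i in range(1, steps + 1):
        t = i / steps  # fraction of the move completed
        yield (round(ox + (tx - ox) * t), round(oy + (ty - oy) * t))
```

In practice the interpolation fraction would come from the robot's reported telemetry rather than a fixed step count, so the displayed cutout tracks the actual object.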
The CandC system can receive an additional instruction from the user to reposition the object and/or the robot (404), which restarts the cycle.
In embodiments, as a robot moves an object into a new scene (e.g., a different room or into the field of view of a different imaging device), the process of capturing the image of the environment repeats from the beginning. That is, an image of the scene, and of the object in the scene, can be displayed. A user or controller can select an image of the object and move the image to a desired location in the image of the new scene. The image data can be sent to a CandC system, which processes the image data to control the robot.
In embodiments, more than one robot can be controlled by the CandC system to move an object or to move multiple objects selected in the scene. A single robot can also use multiple appendages to move multiple objects in a single CandC session.
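When multiple objects are selected in a scene, the CandC system must decide which robot (or appendage) handles which object. A plausible sketch is a greedy nearest-robot assignment; the disclosure does not describe the assignment policy, so this heuristic and the function names are assumptions.

```python
import math

def assign_objects(robots, objects):
    """Greedily assign each selected object to the nearest free robot.
    robots/objects: dicts mapping a name (e.g., "112a", "116") to an
    (x, y) position in the scene. Returns {object_name: robot_name}."""
    free = dict(robots)
    assignment = {}
    for obj_name, obj_pos in objects.items():
        if not free:
            break  # more objects than available robots/appendages
        nearest = min(free, key=lambda r: math.dist(free[r], obj_pos))
        assignment[obj_name] = nearest
        del free[nearest]  # each robot handles one object in this sketch
    return assignment
```

A single robot with multiple appendages could be modeled by listing each appendage as a separate entry in `robots`, matching the single-session, multi-object case described above.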
The figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various aspects of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The corresponding structures, materials, acts, and equivalents of any means or step plus function elements in the claims below are intended to include any disclosed structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the disclosure. The aspects of the disclosure herein were chosen and described in order to best explain the principles of the disclosure and the practical application, and to enable others of ordinary skill in the art to understand the disclosure, with various modifications contemplated.
While the present disclosure has been described in connection with preferred embodiments, it will be understood by those of ordinary skill in the art that other variations and modifications of the preferred embodiments described above may be made without departing from the scope of the disclosure. Other embodiments will be apparent to those of ordinary skill in the art from a consideration of the specification or practice of the disclosure disclosed herein. It will also be understood by those of ordinary skill in the art that the scope of the disclosure is not limited to use in a collaborative robotics context, but rather that embodiments of the disclosure may be used in any application having a need to command, control, or monitor devices of any type. The specification and the described examples are considered as exemplary only, with the true scope and spirit of the disclosure indicated by the following claims.