Coordinated robotic control

Information

  • Patent Grant
  • Patent Number
    9,566,711
  • Date Filed
    Tuesday, March 4, 2014
  • Date Issued
    Tuesday, February 14, 2017
Abstract
Coordinated robotic control technology is described. A network of robotic devices is established. An anthropomorphic motion is sensed from an operator. One or more signals are generated that are representative of at least a portion of the anthropomorphic motion. The one or more signals are converted into a collective set of commands to actuate the network of robotic devices. The collective set of commands is functionally equivalent to the anthropomorphic motion.
Description
BACKGROUND

Robots are electro-mechanical machines that are controlled by one or more computer programs and/or electronic circuitry. Autonomous robots are robots that can perform desired tasks in unstructured environments without continuous human guidance. Semi-autonomous robots and non-autonomous robots, in contrast, often require human guidance.


Robots are used in a variety of fields, including, for example, manufacturing, space exploration, and medicine. Specialized robots are generally designed to perform a single task or a single set of tasks, such as painting the body of a car.


Humanoid robots are a category of robots that attempt to emulate some human tasks including dirty or dangerous jobs. A humanoid robot is a robot with its body shape built to resemble that of the human body. A humanoid might be designed for functional purposes, such as interacting with human tools and environments. In general, humanoid robots have a torso, a head, two arms, and two legs. Some humanoid robots may also have heads designed to replicate human sensory features such as eyes or ears.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a network of robotic devices, in accordance with one exemplary embodiment.



FIG. 2 illustrates another view of the example network of robotic devices of FIG. 1.



FIG. 3 illustrates an example portion of a coordinated robotic control system being used by an operator.



FIG. 4 is a component block diagram illustrating an example coordinated robotic system.



FIG. 5 illustrates an example serpentine robotic crawler.



FIG. 6 illustrates an example of a serpentine robotic crawler being controlled by an operator.



FIG. 7 is a flowchart illustrating an example coordinated robotic control method for a network of robotic devices.



FIG. 8 is a block diagram illustrating an example of a computing device that may be used in coordinated robotic control.





DETAILED DESCRIPTION

A coordinated robotic control technology is described. In particular, the technology may allow coordinated use of two or more robots as a network of robotic devices. The network of robotic devices may be able to remotely accomplish various coordinated human-like, complex, or multifaceted tasks, or even simple coordinated tasks (hereinafter “coordinated tasks”). A human operator may provide control by manipulating a set of sensors. A single robotic device in the network of robotic devices may accomplish a portion of a human-like task in a bounded environment while the remaining robotic devices in the network accomplish the remaining portions of the task in that environment. In such an arrangement, a remotely located operator could wear a mastering device that carries out real-time measurement of his or her joint positions, view the remote scene as a single or stereo presentation on a head-mounted display, for example, and communicate the desired motions to the collection of robotic devices within the network, which could carry out the motions as though they were anatomically consistent with the operator. In short, the operator need not be concerned that his or her interface with the objects in the environment is via a number of robotic devices, none of which is a humanoid robot, only that they are collectively able to function as though they were, within the context of the environment in which they are located. In this way, multiple robots can be coordinated to act like a single unit, without the robotic devices needing to be physically connected.


As one example, driving a vehicle may be a human-like task involving several different types of anthropomorphic motions. These motions might include, for example, turning a steering wheel, flipping on a turn signal, engaging a clutch, moving a shifter, pressing an accelerator, and applying a brake.


As a more particular example, an operator may indicate flipping on the turn signal by moving a left hand in a downward motion on an edge of a remote steering wheel prop while keeping a right hand fixed on the remote steering wheel prop. A set of sensors may be advantageously placed on or around the operator to detect such motion. This motion may then be carried out by the network of robotic devices. One robotic device in the network of robotic devices might hold the steering wheel steady while another robotic device may flip the turn signal on, for example.
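For illustration only, the following minimal sketch (in Python, with invented robot identifiers, threshold, and action names not taken from the patent) shows how one sensed motion on the steering wheel prop might be split into a coordinated pair of commands:

```python
# Hypothetical sketch: one operator motion becomes commands for two robots.
# Robot identifiers, the threshold, and action names are all invented.
from dataclasses import dataclass

@dataclass
class Command:
    robot_id: str
    action: str

def convert_turn_signal_motion(left_hand_delta_y: float) -> list:
    """Map a downward left-hand motion on the steering wheel prop to a
    coordinated pair of commands: one robot steadies the wheel while
    another flips the turn signal lever."""
    commands = []
    if left_hand_delta_y < -0.05:  # downward past a small threshold (meters)
        commands.append(Command("robot_102b", "hold_wheel_steady"))
        commands.append(Command("robot_102c", "flip_turn_signal_on"))
    return commands

print(convert_turn_signal_motion(-0.10))
```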


Accordingly, with reference to FIG. 1 and FIG. 2, an example network of robotic devices 100 is illustrated. In particular, the network of robotic devices 100 includes four robotic devices 102a-d. A camera 106 also acts as a feedback device providing visual information to an operator of the network of robotic devices 100.


The network of robotic devices 100 operates in an environment 104. More particularly, the environment 104 is a driver's area in a vehicle. The robotic devices 102a-d may receive a command and actuate a motion based on the command. In this way, the network of robotic devices 100 may act in a coordinated fashion to perform human-like tasks. The operator may be remotely located and provide anthropomorphic motion through a set of sensors to control the network of robotic devices 100. More robotic devices may be added to the network of robotic devices 100 in order to perform more complex tasks, or tasks requiring a greater range of motion.


As an example, to drive a vehicle as illustrated, a human may need to be able to see, turn a steering wheel, engage a turn signal, move a shifter, press an accelerator or press a brake. Rather than implementing a single human form factor robotic device, four robotic devices 102a-d may act in a coordinated fashion to provide functionally equivalent motion to that of the operator.


More particularly, the robotic device 102a may be positioned on a seat to be able to position the camera 106 to see through a windshield and laterally out the side windows and rear view mirrors of the vehicle. The robotic device 102b may also be positioned on the seat and may turn the steering wheel of the vehicle. The robotic device 102c may be positioned on the seat and may engage the turn signal and/or move the shifter. The robotic device 102d may be positioned on the floor and may press the accelerator or the brake. In this way, the network of robotic devices 100 may act in a coordinated way to perform human-like tasks.



FIG. 3 illustrates an example portion of a coordinated robotic control system being used by an operator 300. In particular, the coordinated robotic control system includes a plurality of control sensors 306, 308, 310, 312, 314, 316 to sense anthropomorphic motion from the operator 300. When the operator 300 moves, the control sensors 306, 308, 310, 312, 314, 316 sense the motion and generate an output signal representative of the motion. The control sensors 306, 308, 310, 312, 314, 316, for example, may be accelerometers or digital positioning devices that provide three-dimensional coordinates for the sensors. As another example, the sensors 306, 308, 310, 312, 314, 316 may measure angular rotation, pressure, or force at the operator's 300 joints. The plurality of control sensors may be attached to an operator, or may sense motion from some distance from the operator. For example, a radar, LIDAR, or other 3D depth-sensing device may be placed a few feet from an operator and pointed towards the operator to sense a portion of the anthropomorphic motion. Various combinations of sensors may be used to sense the anthropomorphic motion of the operator 300.
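The sensor outputs described above might be represented as simple timestamped samples. The sketch below is illustrative only; the field names and units are assumptions, not part of the patent:

```python
# Illustrative control-sensor sample: a sensor may report 3-D position,
# joint angle, or force. Field names and units are assumptions.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SensorSample:
    sensor_id: str
    timestamp: float                                       # seconds
    position: Optional[Tuple[float, float, float]] = None  # x, y, z (meters)
    joint_angle: Optional[float] = None                    # radians
    force: Optional[float] = None                          # newtons

sample = SensorSample("left_wrist", 12.5, position=(0.42, 1.10, 0.87))
print(sample)
```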


The coordinated robotic control system may also include a coordinated robotic control device. The coordinated robotic control device is communicatively connected to the plurality of control sensors 306, 308, 310, 312, 314, 316 and may convert the anthropomorphic motion sensed by the plurality of control sensors 306, 308, 310, 312, 314, 316 into a collective set of commands. The collective set of commands may actuate a plurality of robotic devices, such as those described in FIGS. 1 and 2. In this way, the collective set of commands may actuate one or more serpentine robotic crawlers, such as those shown and described in U.S. patent application Ser. No. 14/026,284 filed on Sep. 13, 2013 and Ser. No. 13/665,669 filed on Oct. 31, 2012, each of which is incorporated by reference in its entirety herein.


The motion within the network of robotic devices actuated by the collective set of commands can be functionally equivalent to the anthropomorphic motion sensed by the plurality of control sensors. For example, the motion may be the operator 300 pulling back a foot, moving the foot towards the left and pushing the foot forward, to simulate movement of a foot from an accelerator pedal to a brake pedal (i.e., this may comprise a desired lower level task that may be considered or termed a sub-task to the higher level coordinated task of driving a car to be completed by the network of robotic devices). While, as depicted in FIGS. 1 and 2, the functionally equivalent motion performed by the robotic device 102d may not include a single robotic foot moving from an accelerator pedal to a brake pedal, one end of the robotic device 102d may disengage the accelerator pedal while the other end of the robotic device 102d may engage the brake pedal. In this way, the objects in the environment of the robotic device 102d, namely the accelerator pedal and the brake pedal, are acted upon in a functionally equivalent manner. Thus, the plurality of robotic devices may receive commands and actuate one or more motions based on the commands, such that the collective set of commands results in movement or motions by the robotic devices that are functionally equivalent or substantially functionally equivalent to the anthropomorphic motions of the operator (i.e., those motions within the network needed to accomplish the coordinated task). As will be described in more detail later, the functionally equivalent motions within the network of robotic devices can be carried out by one or a plurality of robotic devices, the motions being coordinated (e.g., motions by two or more robots used to carry out a task initiated by the operator).
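As a sketch of this notion of functional equivalence, the hypothetical fragment below maps the operator's single foot motion onto two end-effector actions of the robotic device 102d; the "ends" and action names are invented for illustration:

```python
# Hedged sketch: the operator's one foot motion (accelerator to brake)
# becomes two coordinated actions on robot 102d. Names are invented.
def pedal_commands(foot_moved_to_brake: bool) -> list:
    """Return (end_effector, action) pairs for robot 102d."""
    if foot_moved_to_brake:
        return [("102d/end_1", "disengage_accelerator"),
                ("102d/end_2", "engage_brake")]
    return [("102d/end_2", "disengage_brake"),
            ("102d/end_1", "engage_accelerator")]

print(pedal_commands(True))
```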


The coordinated robotic control device may be housed in a backpack 302, as illustrated in FIG. 3, or may be housed in another location and communicatively connected to the plurality of control sensors 306, 308, 310, 312, 314, 316. Similarly, the coordinated robotic control system may also include one or more feedback devices, such as a video device 304. In this way, feedback may be sensed from the network of robotic devices.


The robotic control device may use one or more computer systems configured with executable instructions to convert the anthropomorphic motion to the collective set of commands. In this way, the backpack 302 may house a programmable computer with software instructions configured to perform the conversion. The backpack 302 may also house circuitry implementing similar processing to perform the conversion.


Once the anthropomorphic motion, as sensed by the plurality of control sensors 306, 308, 310, 312, 314, 316, has been converted into a collective set of commands, the collective set of commands may be communicated to the network of robotic devices. Distribution of the collective set of commands may be performed in a variety of ways. For example, a respective subset of commands may be communicated to each robotic device in the network of robotic devices. In this way, each robotic device in the network of robotic devices may directly receive a subset of commands. Alternatively, the network of robotic devices may include a master robotic device and one or more slave robotic devices. The master robotic device may receive the collective set of commands and then may distribute the collective set of commands to the network of robotic devices as appropriate. In this way, each robotic device in the network of robotic devices may receive a subset of commands either directly (e.g., the master robotic device) or indirectly by way of the master robotic device (e.g., the slave robotic devices).
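The two distribution strategies just described (direct per-robot subsets, or delivery of the full set to a master that forwards it) might be sketched as below; the Command shape and the send() callback are assumptions, not the patent's interfaces:

```python
# Sketch of command distribution: direct per-robot subsets, or master/slave
# forwarding. The Command shape and send() callback are assumptions.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Command:
    robot_id: str
    action: str

def split_by_robot(collective):
    """Group the collective command set into per-robot subsets."""
    subsets = defaultdict(list)
    for cmd in collective:
        subsets[cmd.robot_id].append(cmd)
    return subsets

def distribute_direct(collective, send):
    # Each robot directly receives its own subset of commands.
    for robot_id, subset in split_by_robot(collective).items():
        send(robot_id, subset)

def distribute_via_master(collective, send, master_id="robot_102a"):
    # The master receives the whole set and forwards subsets to the slaves.
    send(master_id, collective)

cmds = [Command("robot_102b", "hold_wheel_steady"),
        Command("robot_102c", "flip_turn_signal_on")]
distribute_direct(cmds, lambda rid, sub: print(rid, [c.action for c in sub]))
```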



FIG. 4 is a component block diagram illustrating an example coordinated robotic system. The system 400 may be used to implement the functionality heretofore described with reference to FIGS. 1-3 or further examples discussed below with reference to FIGS. 5-8. The system 400 may include one or more computing devices 406, a network of robotic devices 410, a set of feedback devices 412 and a set of sensors 402. A network 408 may communicatively connect the set of feedback devices 412 and the network of robotic devices 410 with the computing device 406.


The network 408 may include any useful computing or signal network, including an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless data network, a cell network, a direct RF link, a stateless relay network or any other such network or combination thereof, and may utilize a variety of protocols for transmission thereon, including for example, Internet Protocol (IP), the transmission control protocol (TCP), user datagram protocol (UDP) and other networking protocols. Components utilized for such a system may depend at least in part upon the type of network and/or environment selected. Communication over the network may be enabled by wired, fiber optic, or wireless connections and combinations thereof.
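As one hedged example of carrying a command subset over one of the protocols named above, the sketch below serializes commands as JSON and sends them in a UDP datagram; the address, port, and wire format are assumptions, not specified by the patent:

```python
# Minimal UDP example: serialize a command subset as JSON and send it to a
# robot's address. The wire format, address, and port are assumptions.
import json
import socket

def send_command_subset(robot_addr, commands):
    payload = json.dumps({"commands": commands}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, robot_addr)

send_command_subset(("192.168.1.42", 9000),
                    [{"robot_id": "robot_102b", "action": "hold_wheel_steady"}])
```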


Each of the robotic devices 460a-d in the network of robotic devices 410, along with each feedback device 470a-b in the set of feedback devices 412 and each of the sensors 420a-i in the set of sensors 402, may have certain data processing and storage capabilities. A robotic device, for instance, may have a processor, a memory and a data store. Likewise, a feedback device or a sensor may also have a processor, a memory and a data store. Further, all devices may have network interface circuitry (NIC) for interfacing with the network 408. The computing device 406, for instance, is depicted with a NIC 436.


While the network of robotic devices 410 and the set of feedback devices 412 are depicted as being connected through the network 408 to the computing device 406, it is appreciated that the network of robotic devices 410 and the set of feedback devices 412 may be connected through separate networks to the computing device 406. Further, while the set of sensors 402 is shown as directly connecting to the computing device 406 through a local interface 404, it is appreciated that the set of sensors 402 may connect through a network such as the network 408 to communicate with the computing device 406.


The computing device 406 may comprise, for example, a server computer or any other system providing computing capability. Alternatively, a plurality of computing devices 406 may be employed that are arranged, for example, in one or more server banks or computer banks or other arrangements. For purposes of convenience, the computing device 406 may be referred to in the singular, but it is understood that a plurality of computing devices 406 may be employed in various arrangements as described above.


Various processes and/or other functionality, as discussed herein, may be executed in the system 400 according to various examples. The computing device 406 may, for example, provide some central server processing services while other devices in the coordinated robotic system 400 may provide local processing services and interface processing services to interface with the services of the computing device 406. Therefore, it is envisioned that processing services, as discussed herein, may be centrally hosted functionality or a service application that may receive requests and provide output to other services or customer devices.


For example, services may be considered on-demand computing that is hosted in a server, cloud, grid, or cluster computing system. An application program interface (API) may be provided for each service to enable a second service to send requests to and receive output from the first service. Such APIs may also allow third parties to interface with the service and make requests and receive output from the service. A processor 430 may provide processing instructions by communicating with a memory 432 on the computing device 406. That is, the memory device may include instructions operable to be executed by the processor to perform a set of actions. The processor 430 and/or the memory 432 may directly or indirectly communicate with a data store 434. Each robotic device in the network of robotic devices 410, each sensor in the set of sensors 402 and each feedback device in the set of feedback devices 412, may include similar processing capabilities. Alternatively, some or all of the processing capabilities may be provided by the computing device 406 or other devices in the coordinated robotic system such as a master robotic device.


Various data may be stored in a data store 434 that is accessible to the computing device 406. The term “data store” may refer to any device or combination of devices capable of storing, accessing, organizing and/or retrieving data, which may include any combination and number of data servers, relational databases, object oriented databases, cloud storage systems, data storage devices, data warehouses, flat files and data storage configuration in any centralized, distributed, or clustered environment. The storage system components of the data store 434 may include storage systems such as a SAN (Storage Area Network), cloud storage network, volatile or non-volatile RAM, optical media, or hard-drive type media. The data store 434 may be representative of a plurality of data stores 434.


The data stored in the data store 434 may include, for example, an environment data store 440, a robot state data store 442, a sensor data store 444 and a feedback data store 446. The environment data store 440 may include details about the environment in which the robotic devices 460a-d are configured to operate effectively. In this way, the environment data store may contain some information regarding the functional limitations of the system 400 and include information regarding conversion of the sensed anthropomorphic motion into a collective set of commands to actuate degrees of motion by the network of robotic devices 410. Thus, an environment may be programmatically defined for a kinematic convertor module 454, which limits functional equivalency. For example, the environment may be programmatically defined with an extensible markup language (XML) document and stored in the data store 440. Another standard that may be used is the joint architecture for unmanned systems (JAUS). The robot state data store 442 may include details about the state of the robotic devices 460a-d such as positioning information, loading information, velocity information, acceleration information, location information, signal strength information, stability information, and environmental surroundings. The sensor data store 444 may act as a buffer or processing area for data provided by the sensors 420a-i. Likewise, the feedback data store 446 may also act as a buffer or processing area for data provided by the feedback devices 470a-b.
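For illustration, a programmatic XML environment definition of the kind described might look like the hypothetical document below (the schema is invented); a converter could load it with a standard XML parser:

```python
# Hypothetical XML environment definition; the tags and attributes are
# invented to illustrate a programmatically defined environment.
import xml.etree.ElementTree as ET

ENVIRONMENT_XML = """
<environment name="vehicle_driver_area">
  <object id="steering_wheel" type="rotary" min_deg="-540" max_deg="540"/>
  <object id="turn_signal" type="lever" states="off,left,right"/>
  <object id="accelerator_pedal" type="pedal" travel_mm="60"/>
  <object id="brake_pedal" type="pedal" travel_mm="80"/>
</environment>
"""

root = ET.fromstring(ENVIRONMENT_XML)
for obj in root.findall("object"):
    print(obj.get("id"), obj.get("type"))
```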


A sensor module 450 may provide an interface for receiving and processing data from the set of sensors 402. In this way, the sensor module 450 may interact with the sensor data store 444 and other modules such as a kinematic converter module 454. Thus, a control sensor may be configured to sense anthropomorphic motion and communicate the anthropomorphic motion to the computing device 406. A feedback module 452 may be an interface for the set of feedback devices 412 and may interact with the feedback data store 446. In this way, a feedback device may be configured to receive robotic feedback. Sensed feedback may then be presented to an operator through actuating an operator device, generating audio, engaging vibratory pads, or through a visual display. Moreover, received feedback may be presented to an operator through a device integrated with at least one of a plurality of control sensors, for instance.
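A feedback module of this kind might route each piece of sensed feedback to a presentation channel. The sketch below is illustrative; the feedback kinds and channel names are assumptions:

```python
# Illustrative routing of robot feedback to operator presentation channels.
# Feedback kinds and channel names are assumptions.
def presentation_channel(feedback: dict) -> str:
    kind = feedback.get("kind")
    if kind == "video":
        return "head_mounted_display"
    if kind == "contact_force":
        return "vibratory_pad"
    if kind == "alarm":
        return "audio_speaker"
    return "visual_display"

print(presentation_channel({"kind": "contact_force", "value": 3.2}))
```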


The kinematic convertor module 454 may convert the anthropomorphic motion, sensed by the sensors 420a-i, into a collective set of commands to actuate degrees of motion by one or more robotic devices 460a-d within the network of robotic devices 410. The collective set of commands can be functionally equivalent or substantially equivalent to the anthropomorphic motion, meaning that a function carried out by the user is equivalently carried out within the network of robotic devices 410 by one or a plurality of robotic devices. In this way, the kinematic convertor module 454 may interact with the sensor data store 444, the environment data store 440 and the robot state data store 442. Moreover, the kinematic convertor module 454 may also interact with other modules such as an actuation module 456 to communicate the collective set of commands to the network of robotic devices 410 for actuation thereby. In this way, the actuation module 456 may provide an interface for sending commands to the network of robotic devices 410 for actuation. Thus, the kinematic convertor module 454 may be configured to convert an anthropomorphic motion, sensed by the plurality of control sensors, into a collective set of commands to actuate degrees of motion by the network of robotic devices, wherein the collective set of commands is functionally equivalent to the anthropomorphic motion, and a communication interface may be configured to communicate with the network of robotic devices 410. Any of the feedback devices 470a-b may include devices such as a video display, an audio speaker, or a tactile feedback device. Further, the plurality of control sensors may sense three-dimensional motion.
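At its simplest, the conversion performed by a kinematic convertor might be sketched as a lookup from classified operator motions to per-robot actions; the table and all names below are invented for illustration:

```python
# Hedged sketch of the kinematic conversion: classified anthropomorphic
# motions map to a collective, multi-robot command set. The lookup table
# and all names are invented.
MOTION_TO_ACTIONS = {
    "steer_left":  [("robot_102b", "rotate_wheel_ccw")],
    "signal_left": [("robot_102b", "hold_wheel_steady"),
                    ("robot_102c", "flip_signal_left")],
    "brake":       [("robot_102d", "engage_brake")],
}

def kinematic_convert(sensed_motions):
    """Return the collective command set for a list of classified motions."""
    commands = []
    for motion in sensed_motions:
        commands.extend(MOTION_TO_ACTIONS.get(motion, []))
    return commands

print(kinematic_convert(["signal_left", "brake"]))
```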


In one embodiment, the kinematic converter module 454 may be configured beforehand to anticipate scenarios in which the robots 460a-d may be used. For example, the kinematic converter module 454 may include pre-programmed primitives designed for a specific high level task, such as driving a car, placing physiologic sensors on an unconscious human, or operating a control panel that is in a hazardous environment. In this way, an operator may be assured that the robotic devices 460a-d within the network collectively comprise sufficient degrees of freedom in order to carry out specific tasks within the environment. Nonetheless, the collection of robots can be configured and reconfigured to carry out new or different tasks within the same or different environments. For example, the collection of robots may be configured in a travel mode and individually operated to move from one location to another, wherein upon reaching a desired destination the collection of robotic devices may then be reconfigured and located in a position capable of carrying out or accomplishing the desired task. For illustrative purposes only, a plurality of serpentine (snake-like) robots may be individually configured in a travel or drive mode to gain access to a desired environment via a dimensionally restricted route (e.g., down a borehole or mineshaft), and then reconfigured and positioned within the desired environment to provide coordinated dexterous tasks within the environment, such as being positioned within a vehicle, wherein the task to be accomplished is the driving of the vehicle. Once in position, the operator can carry out the dexterous tasks within the environment via tele-operation using, for example, a mastering device that is kinematically consistent with the operator, wherein output from the mastering device is transformed to provide commands to the robotic devices within the network to accomplish what otherwise could have been accomplished had the commands been sent to a kinematically equivalent slave robot.


In one exemplary embodiment illustrating coordinated control of the various robotic devices, the sensors 420a-i may include defined zones or bounded areas of control space. Corresponding operator movement within these zones of control space may indicate control of one or more robots 460 based upon the operator movement. Once the operator leaves a zone of control space, a sensor 420 may no longer interpret operator movement as controlling one or more robots 460. However, another zone of control for another sensor 420 may interpret operator movement as controlling one or more other robots 460 from within that zone of control (i.e., other degrees of freedom used to accomplish the task). In other words, a single motion by an operator across multiple defined zones may cause multiple degrees of freedom to activate within the network of robotic devices, which multiple degrees of freedom may be across or located about multiple robotic devices. Such zones of operation can be based on or determined by factors such as the environment itself, the ranges of motion of the various robotic devices within the network of robotic devices, the ranges of motion of the individual degrees of freedom of the various robotic devices, the particular location and number of robotic devices, and others. The zones of operation can be represented numerically (such as in terms of their Cartesian coordinates relative to a reference point), graphically (e.g., mapped environment), or by any other system or method or means apparent to those skilled in the art used to define a range of motion of a degree of freedom of a robotic device.
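The zone-based routing described above might be sketched as follows; the one-dimensional zone extents, coordinates, and robot identifiers are invented (real zones could be three-dimensional regions of control space):

```python
# Illustrative zone routing: an operator position selects the zone(s) of
# control space, and each zone routes motion to particular robots.
# Zone extents and robot identifiers are invented.
ZONES = {
    "wheel_zone":  {"xmin": 0.0, "xmax": 0.6, "robots": ["robot_102b"]},
    "signal_zone": {"xmin": 0.6, "xmax": 0.9, "robots": ["robot_102c"]},
}

def robots_for_position(x: float):
    """Return the robots controlled from the zones containing position x."""
    robots = []
    for zone in ZONES.values():
        if zone["xmin"] <= x <= zone["xmax"]:
            robots.extend(zone["robots"])
    return robots

print(robots_for_position(0.7))  # ['robot_102c']
```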



FIG. 5 illustrates an example serpentine robotic crawler 500, which is similar to the robotic crawler described in U.S. patent application Ser. No. 14/026,284, filed Sep. 13, 2013, and entitled, “Serpentine Robotic Crawler for Performing Dexterous Operations,” (Attorney Docket No. 2865-12.3329.US.NP), which is incorporated by reference in its entirety herein. The serpentine robotic crawler 500 may be used as a robotic device in the network of robotic devices and may include a plurality of dexterous manipulators that can be positioned and articulated to perform dexterous operations in a variety of situations, some of which will be discussed in detail below. As shown, a first dexterous manipulator 562 can be coupled about a distal end 556 of a first frame 568, and a second dexterous manipulator 522 can be coupled about a distal end 516 of a second frame 528. Frames 568, 528 can comprise a distal end 556 and 516, respectively, and a proximal end 550 and 510, respectively. The serpentine robotic crawler can further comprise a propulsion system configured to cause the frames 568 and 528 to move about a ground or other surface, and relative to one another, such as to achieve different configurations and propulsion modes. The propulsion system can comprise a drive subsystem 554, which can be supported about and operable with frame 568, and a similar drive subsystem 514, which can be supported about and operable with frame 528. The drive subsystems may include various motors, drive trains, controls, etc. The propulsion system may further comprise one or more surface contacting elements that facilitate propulsion of the serpentine robotic crawler, such as a continuous or endless track 552 rotatably supported about frame 568 and operable with drive subsystem 554, and a continuous or endless track 512 rotatably supported about frame 528 and operable with drive subsystem 514. Addition of the rotatable endless tracks 552 and 512 to their respective frame units provides mobility to the serpentine robotic crawler 500 in a way that allows the serpentine robotic crawler 500 to move about the ground or other surfaces, and to overcome numerous obstacles. Other types of surface contacting elements may be employed, as will be recognized by those skilled in the art, such as wheels, rotating joints, etc., each of which is contemplated herein.


The serpentine robotic crawler 500 can further comprise a multiple degree of freedom linkage arm 540 coupling together the frames 568 and 528 at or near proximal ends 550 and 510 of each respective frame 568 and 528. In one exemplary embodiment, the multiple degree of freedom linkage arm 540 can be moveable relative to frames 568 and 528. Movement of the multiple degree of freedom linkage arm 540 can be passive, actuated, or braked. In the embodiment shown, the multiple degree of freedom linkage arm 540 can include pivoting articulating joints 580a-e and rotating articulating joints 590a-b. All or some of these articulating joints 580a-e and 590a-b can be actuatable to achieve selective positioning of the joints relative to one another and the frames 568, 528. Indeed, the articulating joints can facilitate the serpentine robotic crawler 500 assuming a variety of configurations and positioning of the first and second frames 568, 528 relative to one another, and also the first dexterous manipulator 562 relative to the second dexterous manipulator 522. The serpentine robotic crawler 500 can assume a tank-like configuration having the frames 568, 528 and the rotatable endless tracks 552 and 512 in a side-by-side arrangement with each other. In other situations, the serpentine robotic crawler 500 can assume alternative configurations, such as a configuration with the frame units in a tandem relationship relative to one another. These different configurations are discussed in more detail below.


Frame units 568 and 528 may each be equipped with stops 570 and 530, respectively, or other limiter devices or systems, which may limit the degree of rotation of the multiple degree of freedom linkage arm 540, such that the joints coupled to frames 568 and 528 are prohibited from rotating to such a degree that the joints interfere with the operation of the endless tracks 552 and 512.


The dexterous manipulators 562 and 522 may each comprise respective jointed members 558 and 518 pivotally connected or coupled to or otherwise about the distal ends 556 and 516 of frames 568 and 528, respectively. The jointed members 558 and 518 can help facilitate the dexterous manipulators 562, 522 being capable of operating or functioning in a wrist-like manner, meaning to move in multiple degrees of freedom about multiple different axes similar to the human wrist.


One or both of the dexterous manipulators 562 and 522 can further comprise an end effector (e.g., see end effectors 560 and 520 operable with jointed members 558 and 518, respectively) configured to operate on or manipulate, or otherwise interface with a work piece (e.g., an object, another end effector, the ground or other surface, etc.). Essentially, the dexterous manipulators 562 and 522, with their end effectors 560 and 520, respectively, can be configured to manipulate or otherwise interface with an object or thing for the purpose of performing a dexterous operation.


The end effectors 560, 520 can comprise a variety of different configurations, depending upon the task to be performed. For example, the end effectors can be designed to comprise components operable to apply two opposing forces on a work piece, giving it some functionality similar to a human hand. In one aspect, such as in the embodiment shown, the end effectors may comprise opposing finger components that move relative to one another, and that are configured to apply opposing forces in a direction towards one another, or to constrict, similar to a human finger against an opposable thumb. In another aspect, the end effectors may comprise components configured to be operated to apply counter or opposing forces in a direction away from one another, or to expand.


The unique positioning capabilities of the frames and articulating linkage of the serpentine robotic crawler, along with the jointed members 558 and 518 in conjunction with their respective end effectors 560 and 520, facilitate dynamic positioning of the serpentine robotic crawler, and more particularly its dexterous manipulators 562, 522, relative to one or more given work pieces, and/or relative to each other. Further, similar to the stops 570 and 530 for the multiple degree of freedom linkage arm 540, stops 566 and 526 may be affixed to respective frames 568 and 528 in order to ensure that the jointed members 558 and 518 do not interfere with the respective rotating endless tracks 552 and 512.


To provide additional dexterity to, and to facilitate enhanced positioning and capabilities of, the dexterous manipulators 562 and 522, one or both of the dexterous manipulators 562 and 522 may further comprise a rotational joint, such as rotational joints 564 and 524, operable with the jointed members 558, 518 and the end effectors 560, 520, respectively, to allow the dexterous manipulators 562, 522 to function in a wrist-like manner having multiple degrees of freedom (e.g., to provide pitch, yaw and roll functionality to or as imparted to the end effector). Each of the rotational joints 564 and 524 can be rotatably coupled to the jointed members 558, 518 and configured to rotate (i.e., twist) back and forth within a full 360 degrees about the end of jointed members 558, 518, respectively, about axis B. Additionally, rotational joints 564 and 524 may also be configured such that they may rotate continuously, i.e., they may be able to perform infinite continuous and successive complete revolutions in a first or clockwise direction as well as in an opposing, second or counterclockwise direction. Further, each end effector 560 and 520 may be pivotally coupled to the rotational joints 564 and 524, respectively, and configured to pivot in a bi-directional manner within a range (e.g., 0-360 degrees; 0-180 degrees, etc., as measured about axis B, and depending upon the design and configuration of the dexterous manipulator and the various joints therein). The various degrees of freedom provided by the jointed members, the rotational joints and the end effector, as operably coupled together, as well as the various degrees of freedom within the articulated linkage, allow the dexterous manipulators 562, 522, and particularly the end effectors 560 and 520, to be positioned in virtually any orientation with respect to their respective frames 568, 528 and a workpiece, or with respect to each other.


The various components of the serpentine robotic crawler can be actively articulated or passively articulated. For example, in one exemplary embodiment, the dexterous manipulators 562 and 522, as well as the various joints making up the multiple degree of freedom linkage arm 540, may be actively actuated using servo motors, driveshaft systems, chain drive systems, hydraulic systems, tendon/pulley type systems, or any other suitable actuation means as will be recognized by those skilled in the art. Alternatively, the dexterous manipulators 562, 522, as well as the various joints in the multiple degree of freedom linkage arm 540, may be operated using one or more types of passive systems, such as braking systems, locking systems, or any other suitable system capable of maintaining these components in a locked position. These active or passive articulation systems can operate to facilitate positioning of the various movable joints of each respective dexterous manipulator 562 and 522, as well as the multiple degree of freedom linkage arm 540, to place the dexterous manipulators 562, 522 in a desired or needed position.


It should be noted that for the particular embodiment shown in FIG. 5, the configurations and features described in relation to frame 568 and associated dexterous manipulator 562 can be similarly applicable to frame 528 and its associated dexterous manipulator 522. Nonetheless, frame 528 and dexterous manipulator 522 can be configured differently, such as to employ varying end effectors, wrists and jointed members, as those skilled in the art will appreciate, which different configurations are contemplated herein.



FIG. 6 illustrates an example of a serpentine robotic crawler 600 being controlled by an operator 602. The serpentine robotic crawler 600 comprises frame units and dexterous manipulators 606, 608 held to an object (e.g., a ferromagnetic material wall) by a clamping device 604 (e.g., suction cups, a gripper, or a magnetic clamp, such as an electromagnet or permanent magnets with a variable flux return path to control the clamping force). As described herein, the serpentine robotic crawler 600 can perform single or two-handed dexterous operations.



FIG. 7 is a flowchart illustrating an example coordinated robotic control method 700. In method element 702, a network of robotic devices is established. In method element 704, an anthropomorphic motion may be sensed from an operator. In method element 706, one or more signals representative of at least a portion of the anthropomorphic motion may be generated, and the one or more signals may be converted into a collective set of commands to actuate the network of robotic devices, as shown in method element 708. The collective set of commands is functionally equivalent to the anthropomorphic motion. The method 700 may be embodied on a non-transitory computer-readable medium.
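As an end-to-end sketch of method elements 702-708, the hypothetical fragment below strings the four steps together; every function body, data shape, and threshold is a stand-in for illustration, not the patented implementation:

```python
# End-to-end sketch of method 700 (elements 702-708). All names, data
# shapes, and thresholds are stand-ins.

def establish_network(robot_ids):                    # element 702
    return {rid: [] for rid in robot_ids}            # robot_id -> command queue

def sense_motion(raw_samples):                       # element 704
    return [s for s in raw_samples if abs(s["delta"]) > 0.05]

def generate_signals(motions):                       # element 706
    return [{"joint": m["joint"], "delta": m["delta"]} for m in motions]

def convert_to_commands(signals, network):           # element 708
    for sig in signals:
        target = "robot_102d" if sig["joint"] == "ankle" else "robot_102b"
        network[target].append({"action": "track_motion", "delta": sig["delta"]})
    return network

net = establish_network(["robot_102b", "robot_102d"])
signals = generate_signals(sense_motion([{"joint": "ankle", "delta": 0.12}]))
print(convert_to_commands(signals, net))
```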



FIG. 8 is a block diagram 800 illustrating an example of a computing device 802 that may be used in coordinated robotic control. In particular, the computing device 802 illustrates a high level example of a device on which modules of the disclosed technology may be executed. The computing device 802 may include one or more processors 804 that are in communication with memory devices 806. The computing device 802 may include a local communication interface 818 for the components in the computing device. For example, the local communication interface may be a local data bus and/or any related address or control busses as may be desired.


The memory device 806 may contain modules that are executable by the processor(s) 804 and data for the modules. Located in the memory device 806 are various modules 810 implementing functionality heretofore described. The various modules 810 are executable by the processor(s) 804. A data store 808 may also be located in the memory device 806 for storing data related to the modules and other applications along with an operating system that is executable by the processor(s) 804.


Other applications may also be stored in the memory device 806 and may be executable by the processor(s) 804. Components or modules discussed in this description may be implemented in the form of software using high-level programming languages that are compiled, interpreted, or executed using a hybrid of these methods.


The computing device may also have access to I/O (input/output) devices 814 that are usable by the computing devices. An example of an I/O device is a display screen 820 that is available to display output from the computing devices. Other known I/O devices may be used with the computing device as desired. Networking devices 816 and similar communication devices may be included in the computing device. The networking devices 816 may be wired or wireless networking devices that connect to the Internet, a LAN, WAN, or other computing network.


The components or modules that are shown as being stored in the memory device 806 may be executed by the processor(s) 804. The term “executable” may mean a program file that is in a form that may be executed by a processor 804. For example, a program in a higher level language may be compiled into machine code in a format that may be loaded into a random access portion of the memory device 806 and executed by the processor 804, or source code may be loaded by another executable program and interpreted to generate instructions in a random access portion of the memory to be executed by a processor. The executable program may be stored in any portion or component of the memory device 806. For example, the memory device 806 may be random access memory (RAM), read only memory (ROM), flash memory, a solid state drive, memory card, a hard drive, optical disk, floppy disk, magnetic tape, or any other memory components.


The processor 804 may represent multiple processors and the memory 806 may represent multiple memory units that operate in parallel with the processing circuits. This may provide parallel processing channels for the processes and data in the system. The local interface 818 may be used as a network to facilitate communication between any of the multiple processors and multiple memories. The local interface 818 may use additional systems designed for coordinating communication such as load balancing, bulk data transfer and similar systems.


While the flowcharts presented for this technology may imply a specific order of execution, the order of execution may differ from what is illustrated. For example, the order of two or more blocks may be rearranged relative to the order shown. Further, two or more blocks shown in succession may be executed in parallel or with partial parallelization. In some configurations, one or more blocks shown in the flow chart may be omitted or skipped. Any number of counters, state variables, warning semaphores, or messages might be added to the logical flow for purposes of enhanced utility, accounting, performance, measurement, troubleshooting or for similar reasons.


Some of the functional units described in this specification have been labeled as modules, in order to more particularly emphasize their implementation independence. For example, a module may be implemented as a hardware circuit comprising custom VLSI circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components. A module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices or the like.


Modules may also be implemented in software for execution by various types of processors. An identified module of executable code may, for instance, comprise one or more blocks of computer instructions, which may be organized as an object, procedure, or function. Nevertheless, the executables of an identified module need not be physically located together, but may comprise disparate instructions stored in different locations which comprise the module and achieve the stated purpose for the module when joined logically together.


Indeed, a module of executable code may be a single instruction or many instructions and may even be distributed over several different code segments, among different programs and across several memory devices. Similarly, operational data may be identified and illustrated herein within modules and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices. The modules may be passive or active, including agents operable to perform desired functions.


The technology described here may also be stored on a computer readable storage medium that includes volatile and non-volatile, removable and non-removable media implemented with any technology for the storage of information such as computer readable instructions, data structures, program modules, or other data. Computer readable storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tapes, magnetic disk storage or other magnetic storage devices, or any other computer storage medium which may be used to store the desired information and described technology.


The devices described herein may also contain communication connections or networking apparatus and networking connections that allow the devices to communicate with other devices. Communication connections are an example of communication media. Communication media typically embodies computer readable instructions, data structures, program modules and other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. A “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example and not limitation, communication media includes wired media such as a wired network or direct-wired connection and wireless media such as acoustic, radio frequency, infrared and other wireless media. The term computer readable media as used herein includes communication media.


Reference was made to the examples illustrated in the drawings and specific language was used herein to describe the same. It will nevertheless be understood that no limitation of the scope of the technology is thereby intended. Alterations and further modifications of the features illustrated herein and additional applications of the examples as illustrated herein are to be considered within the scope of the description.


Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more examples. In the preceding description, numerous specific details were provided, such as examples of various configurations to provide a thorough understanding of examples of the described technology. It will be recognized, however, that the technology may be practiced without one or more of the specific details, or with other methods, components, devices, etc. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring aspects of the technology.


Although the subject matter has been described in language specific to structural features and/or operations, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features and operations described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. Numerous modifications and alternative arrangements may be devised without departing from the spirit and scope of the described technology.

Claims
  • 1. A coordinated robotic control method comprising: under the control of one or more computer systems configured with executable instructions: establishing a network of individual autonomous robotic devices; sensing an anthropomorphic motion from an operator; generating one or more signals representative of at least a portion of the anthropomorphic motion; and converting the one or more signals into a collective set of commands to actuate a plurality of robotic devices within the network of robotic devices, wherein the collective set of commands is functionally equivalent to the anthropomorphic motion.
  • 2. The method of claim 1, further comprising communicating the collective set of commands to the network of robotic devices.
  • 3. The method of claim 2, wherein communicating the collective set of commands comprises communicating to each robotic device in the network of robotic devices a respective subset of commands from the collective set of commands.
  • 4. The method of claim 1, wherein the network of robotic devices comprises a master robotic device and one or more slave robotic devices, and wherein the master robotic device receives the collective set of commands and distributes the collective set of commands to the network of robotic devices.
  • 5. The method of claim 1, further comprising: sensing feedback from the network of robotic devices; and presenting sensed feedback to an operator.
  • 6. The method of claim 5, wherein presenting sensed feedback to the operator includes at least one of actuating an operator device, generating audio, or engaging vibratory pads.
  • 7. The method of claim 5, wherein presenting sensed feedback to the operator includes visually displaying sensed feedback.
  • 8. The method of claim 1 embodied on a non-transitory computer-readable medium.
  • 9. A coordinated robotic system comprising: a plurality of control sensors to sense an anthropomorphic motion; a plurality of robotic devices to receive a command and actuate a motion based on the command; and a coordinated robotic control device, communicatively connected to the plurality of control sensors and the plurality of individual autonomous robotic devices, to convert the anthropomorphic motion, sensed by the plurality of control sensors, into a collective set of commands to actuate the plurality of robotic devices, wherein the collective set of commands is functionally equivalent to the anthropomorphic motion.
  • 10. The system of claim 9, wherein the plurality of control sensors attach to an operator.
  • 11. The system of claim 9, wherein the plurality of robotic devices includes a serpentine robotic crawler.
  • 12. The system of claim 9, wherein the coordinated robotic control device converts the anthropomorphic motion to the collective set of commands with one or more computer systems configured with executable instructions.
  • 13. A coordinated robotic control device comprising: a processor; a memory device including instructions to be executed by the processor to perform a set of actions; a plurality of control sensors configured to sense anthropomorphic motion; a feedback device configured to receive robotic feedback; a kinematic convertor configured to convert an anthropomorphic motion, sensed by the plurality of control sensors, into a collective set of commands to actuate degrees of motion by a plurality of individual autonomous robotic devices within a network of robotic devices, wherein the collective set of commands is functionally equivalent to the anthropomorphic motion; and a communication interface configured to communicate with the network of robotic devices.
  • 14. The device of claim 13, wherein the feedback device comprises at least one of a video display, an audio speaker, or a tactile feedback device.
  • 15. The device of claim 13, wherein feedback received from the feedback device is presented to an operator through a device integrated with at least one of the plurality of control sensors.
  • 16. The device of claim 13, wherein the plurality of control sensors sense three-dimensional motion.
  • 17. The device of claim 13, wherein an environment is programmatically defined for the kinematic convertor which limits functional equivalency.
  • 18. The device of claim 17, wherein the environment is programmatically defined with an extensible markup language (XML) document.
US Referenced Citations (317)
Number Name Date Kind
1107874 Appleby Aug 1914 A
1112460 Leavitt Oct 1914 A
1515756 Roy Nov 1924 A
1975726 Martinage Oct 1934 A
2025999 Myers Dec 1935 A
2082920 Aulmont Jun 1937 A
2129557 Beach Sep 1938 A
2311475 Schmeiser Feb 1943 A
2312072 Broadwater Feb 1943 A
2329582 Bishop Sep 1943 A
2345763 Mayne Apr 1944 A
2701169 Cannon Feb 1955 A
2850147 Hill Sep 1958 A
2933143 Robinson Apr 1960 A
2967737 Moore Jan 1961 A
3037571 Zelle Jun 1962 A
3060972 Sheldon Oct 1962 A
3107643 Edwards Oct 1963 A
3166138 Dunn, Jr. Jan 1965 A
3190286 Stokes Jun 1965 A
3215219 Forsyth Nov 1965 A
3223462 Dalrymple Dec 1965 A
3266059 Stelle Aug 1966 A
3284964 Saito Nov 1966 A
3311424 Taylor Mar 1967 A
3362492 Hansen Jan 1968 A
3387896 Sobota Jun 1968 A
3489236 Goodwin Jan 1970 A
3497083 Anderson Feb 1970 A
3565198 Ames Feb 1971 A
3572325 Bazell Mar 1971 A
3609804 Morrison Oct 1971 A
3650343 Helsell Mar 1972 A
3700115 Johnson Oct 1972 A
3707218 Payne Dec 1972 A
3712481 Harwood Jan 1973 A
3715146 Robertson Feb 1973 A
3757635 Hickerson Sep 1973 A
3808078 Snellman Apr 1974 A
3820616 Juergens Jun 1974 A
3841424 Purcell Oct 1974 A
3864983 Jacobsen Feb 1975 A
3933214 Guibord Jan 1976 A
3934664 Pohjola Jan 1976 A
3974907 Shaw Aug 1976 A
4015553 Middleton Apr 1977 A
4051914 Pohjola Oct 1977 A
4059315 Jolliffe Nov 1977 A
4068905 Black Jan 1978 A
4107948 Molaug Aug 1978 A
4109971 Black Aug 1978 A
4132279 Van der Lende Jan 1979 A
4218101 Thompson Aug 1980 A
4260053 Onodera Apr 1981 A
4332317 Bahre Jun 1982 A
4332424 Thompson Jun 1982 A
4339031 Densmore Jul 1982 A
4393728 Larson Jul 1983 A
4396233 Slaght Aug 1983 A
4453611 Stacy, Jr. Jun 1984 A
4483407 Iwamoto et al. Nov 1984 A
4489826 Dubson Dec 1984 A
4494417 Larson Jan 1985 A
4551061 Olenick Nov 1985 A
4589460 Albee May 1986 A
4621965 Wilcock Nov 1986 A
4636137 Lemelson Jan 1987 A
4646906 Wilcox, Jr. Mar 1987 A
4661039 Brenhold Apr 1987 A
4671774 Owen Jun 1987 A
4700693 Lia Oct 1987 A
4706506 Lestelle Nov 1987 A
4712969 Kimura Dec 1987 A
4713896 Jennens Dec 1987 A
4714125 Stacy, Jr. Dec 1987 A
4727949 Rea Mar 1988 A
4736826 White et al. Apr 1988 A
4752105 Barnard Jun 1988 A
4756662 Tanie Jul 1988 A
4765795 Rebman Aug 1988 A
4784042 Paynter Nov 1988 A
4796607 Allred, III Jan 1989 A
4806066 Rhodes Feb 1989 A
4815319 Clement Mar 1989 A
4815911 Bengtsson Mar 1989 A
4818175 Kimura Apr 1989 A
4828339 Thomas May 1989 A
4828453 Martin et al. May 1989 A
4848179 Ubhayakar Jul 1989 A
4862808 Hedgecoxe Sep 1989 A
4878451 Siren Nov 1989 A
4900218 Sutherland Feb 1990 A
4909341 Rippingale Mar 1990 A
4924153 Toru et al. May 1990 A
4932491 Collins, Jr. Jun 1990 A
4932831 White et al. Jun 1990 A
4936639 Pohjola Jun 1990 A
4997790 Woo Mar 1991 A
5018591 Price May 1991 A
5021798 Ubhayakar Jun 1991 A
5022812 Coughlan Jun 1991 A
5046914 Holland et al. Sep 1991 A
5080000 Bubic Jan 1992 A
5130631 Gordon Jul 1992 A
5142932 Moya Sep 1992 A
5172639 Wiseman et al. Dec 1992 A
5174168 Takagi Dec 1992 A
5174405 Carra Dec 1992 A
5186526 Pennington Feb 1993 A
5199771 James Apr 1993 A
5205612 Sugden et al. Apr 1993 A
5214858 Pepper Jun 1993 A
5219264 McClure et al. Jun 1993 A
5252870 Jacobsen Oct 1993 A
5297443 Wentz Mar 1994 A
5317952 Immega Jun 1994 A
5337732 Grundfest Aug 1994 A
5337846 Ogaki et al. Aug 1994 A
5350033 Kraft Sep 1994 A
5354124 James Oct 1994 A
5363935 Schempf Nov 1994 A
5386741 Rennex Feb 1995 A
5413454 Movsesian May 1995 A
5426336 Jacobsen Jun 1995 A
5428713 Matsumaru Jun 1995 A
5435405 Schempf Jul 1995 A
5440916 Stone et al. Aug 1995 A
5443354 Stone et al. Aug 1995 A
5451135 Schempf Sep 1995 A
5465525 Mifune Hiroshi Nov 1995 A
5466056 James Nov 1995 A
5469756 Feiten Nov 1995 A
5516249 Brimhall May 1996 A
5519814 Rodriguez et al. May 1996 A
5551545 Gelfman Sep 1996 A
5556370 Maynard Sep 1996 A
5562843 Yasumoto Oct 1996 A
5567110 Sutherland Oct 1996 A
5570992 Lemelson Nov 1996 A
5573316 Wankowski Nov 1996 A
5588688 Jacobsen Dec 1996 A
5672044 Lemelson Sep 1997 A
5697285 Nappi Dec 1997 A
5712961 Matsuo Jan 1998 A
5749828 Solomon May 1998 A
5770913 Mizzi Jun 1998 A
5816769 Bauer Oct 1998 A
5821666 Matsumoto Oct 1998 A
5842381 Feiten Dec 1998 A
5845540 Rosheim Dec 1998 A
RE36025 Suzuki Jan 1999 E
5878783 Smart Mar 1999 A
5888235 Jacobsen Mar 1999 A
5902254 Magram May 1999 A
5906591 Dario May 1999 A
5984032 Gremillion Nov 1999 A
5996346 Maynard Dec 1999 A
6016385 Yee et al. Jan 2000 A
6030057 Fikse Feb 2000 A
6056237 Woodland May 2000 A
6107795 Smart Aug 2000 A
6109705 Courtemanche Aug 2000 A
6113343 Goldenberg et al. Sep 2000 A
6132133 Muro et al. Oct 2000 A
6138604 Anderson Oct 2000 A
6162171 Ng Dec 2000 A
6186604 Fikse Feb 2001 B1
6203126 Harguth Mar 2001 B1
6232735 Baba et al. May 2001 B1
6260501 Agnew Jul 2001 B1
6263989 Won Jul 2001 B1
6264293 Musselman Jul 2001 B1
6264294 Musselman et al. Jul 2001 B1
6272396 Taitler Aug 2001 B1
6281489 Tubel et al. Aug 2001 B1
6323615 Khairallah Nov 2001 B1
6325749 Inokuchi et al. Dec 2001 B1
6333631 Das et al. Dec 2001 B1
6339993 Comello Jan 2002 B1
6380889 Herrmann et al. Apr 2002 B1
6394204 Haringer May 2002 B1
6405798 Barrett et al. Jun 2002 B1
6408224 Okamoto Jun 2002 B1
6411055 Fujita Jun 2002 B1
6422509 Yim Jul 2002 B1
6430475 Okamoto Aug 2002 B2
6431296 Won Aug 2002 B1
6446718 Barrett et al. Sep 2002 B1
6450104 Grant Sep 2002 B1
6477444 Bennett et al. Nov 2002 B1
6484083 Hayward Nov 2002 B1
6488306 Shirey et al. Dec 2002 B1
6505896 Boivin Jan 2003 B1
6512345 Borenstein Jan 2003 B2
6522950 Conca et al. Feb 2003 B1
6523629 Buttz Feb 2003 B1
6529806 Licht Mar 2003 B1
6535793 Allard Mar 2003 B2
6540310 Cartwright Apr 2003 B1
6557954 Hattori May 2003 B1
6563084 Bandy May 2003 B1
6574958 Macgregor Jun 2003 B1
6576406 Jacobsen et al. Jun 2003 B1
6595812 Haney Jul 2003 B1
6610007 Belson Aug 2003 B2
6619146 Kerrebrock Sep 2003 B2
6636781 Shen et al. Oct 2003 B1
6651804 Thomas Nov 2003 B2
6652164 Stiepel et al. Nov 2003 B2
6668951 Won Dec 2003 B2
6708068 Sakaue Mar 2004 B1
6715575 Karpik Apr 2004 B2
6725128 Hogg et al. Apr 2004 B2
6772673 Seto Aug 2004 B2
6773327 Felice Aug 2004 B1
6774597 Borenstein Aug 2004 B1
6799815 Krishnan Oct 2004 B2
6820653 Schempf Nov 2004 B1
6831436 Gonzalez Dec 2004 B2
6835173 Couvillon, Jr. Dec 2004 B2
6837318 Craig Jan 2005 B1
6840588 Deland Jan 2005 B2
6866671 Tierney Mar 2005 B2
6870343 Borenstein Mar 2005 B2
6889118 Murray et al. May 2005 B2
6917176 Schempf Jul 2005 B2
6923693 Borgen Aug 2005 B2
6936003 Iddan Aug 2005 B2
6959231 Maeda Oct 2005 B2
6971141 Tak Dec 2005 B1
7017687 Jacobsen et al. Mar 2006 B1
7020701 Gelvin et al. Mar 2006 B1
7040426 Berg May 2006 B1
7044245 Anhalt et al. May 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7090637 Danitz Aug 2006 B2
7137465 Kerrebrock Nov 2006 B1
7144057 Young et al. Dec 2006 B1
7171279 Buckingham et al. Jan 2007 B2
7188473 Asada Mar 2007 B1
7188568 Stout Mar 2007 B2
7228203 Koselka et al. Jun 2007 B2
7235046 Anhalt et al. Jun 2007 B2
7331436 Pack et al. Feb 2008 B1
7387179 Anhalt et al. Jun 2008 B2
7415321 Okazaki et al. Aug 2008 B2
7475745 DeRoos Jan 2009 B1
7539557 Yamauchi May 2009 B2
7546912 Pack et al. Jun 2009 B1
7597162 Won Oct 2009 B2
7597762 Albanese et al. Oct 2009 B2
7600592 Goldenberg et al. Oct 2009 B2
7645110 Ogawa et al. Jan 2010 B2
7654348 Ohm et al. Feb 2010 B2
7762362 Cutkosky et al. Jul 2010 B2
7775312 Maggio Aug 2010 B2
7798264 Hutcheson et al. Sep 2010 B2
7843431 Robbins et al. Nov 2010 B2
7845440 Jacobsen Dec 2010 B2
7860614 Reger Dec 2010 B1
7865266 Moll et al. Jan 2011 B2
7874386 Ben-Tzvi et al. Jan 2011 B2
7974736 Morin et al. Jul 2011 B2
8002365 Jacobsen Aug 2011 B2
8002716 Jacobsen Aug 2011 B2
8042630 Jacobsen Oct 2011 B2
8162410 Hirose et al. Apr 2012 B2
8185241 Jacobsen May 2012 B2
8205695 Jacobsen et al. Jun 2012 B2
8225892 Ben-Tzvi Jul 2012 B2
8317555 Jacobsen et al. Nov 2012 B2
8392036 Jacobsen Mar 2013 B2
8393422 Pensel Mar 2013 B1
8571711 Jacobsen et al. Oct 2013 B2
9031698 Smith May 2015 B2
20070289786 Cutkosky et al. Dec 2007 A1
20080115687 Gal et al. May 2008 A1
20080164079 Jacobsen Jul 2008 A1
20080167662 Kurtz Jul 2008 A1
20080167752 Jacobsen Jul 2008 A1
20080168070 Naphade Jul 2008 A1
20080192569 Ray et al. Aug 2008 A1
20080215185 Jacobsen Sep 2008 A1
20080217993 Jacobsen Sep 2008 A1
20080272647 Hirose et al. Nov 2008 A9
20080281231 Jacobsen Nov 2008 A1
20080281468 Jacobsen Nov 2008 A1
20080284244 Hirose et al. Nov 2008 A1
20090025988 Jacobsen et al. Jan 2009 A1
20090030562 Jacobsen Jan 2009 A1
20090035097 Loane Feb 2009 A1
20090095209 Jamieson Apr 2009 A1
20090132088 Taitler May 2009 A1
20090171151 Choset et al. Jul 2009 A1
20090212157 Arlton et al. Aug 2009 A1
20090248202 Osuka et al. Oct 2009 A1
20100030377 Unsworth Feb 2010 A1
20100036544 Mashiach Feb 2010 A1
20100174422 Jacobsen Jul 2010 A1
20100201185 Jacobsen Aug 2010 A1
20100201187 Jacobsen Aug 2010 A1
20100228548 Liu et al. Sep 2010 A1
20100258365 Jacobsen Oct 2010 A1
20100268470 Kamal et al. Oct 2010 A1
20100317244 Jacobsen Dec 2010 A1
20100318242 Jacobsen et al. Dec 2010 A1
20120072019 Sanders et al. Mar 2012 A1
20120185095 Rosenstein et al. Jul 2012 A1
20120205168 Flynn et al. Aug 2012 A1
20120264414 Fung Oct 2012 A1
20120277914 Crow et al. Nov 2012 A1
20120292120 Ben-Tzvi Nov 2012 A1
20130054021 Murai et al. Feb 2013 A1
20140121835 Smith May 2014 A1
20150081092 Jacobsen Mar 2015 A1
20150127150 Ponulak et al. May 2015 A1
20150127155 Passot et al. May 2015 A1
Foreign Referenced Citations (122)
Number Date Country
2512299 Sep 2004 CA
1603068 Apr 2005 CN
2774717 Apr 2006 CN
1970373 May 2007 CN
101583820 May 2011 CN
3025840 Feb 1982 DE
3626238 Feb 1988 DE
3626328 Feb 1988 DE
19617852 Oct 1997 DE
19714464 Oct 1997 DE
19704080 Aug 1998 DE
10018075 Jan 2001 DE
102004010089 Sep 2005 DE
0105418 Apr 1984 EP
0584520 Mar 1994 EP
0818283 Jan 1998 EP
0924034 Jun 1999 EP
1444043 Aug 2004 EP
1510896 Mar 2005 EP
1832501 Sep 2007 EP
1832502 Sep 2007 EP
2081814 Jul 2009 EP
2082159 Jul 2009 EP
2170683 Apr 2010 EP
2444006 Apr 2012 EP
2549165 Jan 2013 EP
2092265 Apr 2013 EP
2638813 May 1990 FR
2660730 Oct 1991 FR
2609335 Jul 1988 FR
2850350 Jul 2004 FR
1199729 Jul 1970 GB
S50-108110 Aug 1975 JP
S51-106391 Sep 1976 JP
S52-57625 May 1977 JP
S52-122431 Sep 1977 JP
S58-032870 Feb 1983 JP
S58-89480 May 1983 JP
S58-80387 May 1983 JP
S59-139494 Sep 1984 JP
S60-015275 Jan 1985 JP
S60-047771 Mar 1985 JP
S60-060516 Apr 1985 JP
S60-139576 Jul 1985 JP
S60-211315 Oct 1985 JP
S61-001581 Jan 1986 JP
S61-180885 Jan 1986 JP
S61-020484 Feb 1986 JP
S61-054378 Mar 1986 JP
S61-075069 Apr 1986 JP
S61-089182 May 1986 JP
S61-260988 Nov 1986 JP
S62-36885 Mar 1987 JP
S62-165207 Jul 1987 JP
S62-162626 Oct 1987 JP
S61-51353 Oct 1987 JP
S63-32084 Mar 1988 JP
S63-501208 May 1988 JP
S63-170174 Jul 1988 JP
S63-306988 Dec 1988 JP
H02-109691 Apr 1990 JP
H03-007388 Jan 1991 JP
H03-104572 May 1991 JP
H04-092784 Mar 1992 JP
H04-126656 Apr 1992 JP
H05-003087 Jan 1993 JP
H05-069350 Mar 1993 JP
H05-147560 Jun 1993 JP
H05-270454 Oct 1993 JP
H05-286460 Nov 1993 JP
H06-115465 Apr 1994 JP
H07-216936 Aug 1995 JP
H07-329841 Dec 1995 JP
H07-329837 Dec 1995 JP
H08-133141 May 1996 JP
H08-133151 May 1996 JP
H09-109069 Apr 1997 JP
H09-109070 Apr 1997 JP
H09-142347 Jun 1997 JP
H10-277976 Oct 1998 JP
H11-277466 Oct 1999 JP
H11-347970 Dec 1999 JP
2001-010560 Jan 2001 JP
2001-195113 Jul 2001 JP
2003-019985 Jan 2003 JP
2003-237618 Aug 2003 JP
2003-315486 Nov 2003 JP
2003-334783 Nov 2003 JP
2004-080147 Mar 2004 JP
3535508 Jun 2004 JP
2004-536634 Dec 2004 JP
2005-19331 Jan 2005 JP
2005-081447 Mar 2005 JP
2005-111595 Apr 2005 JP
2005-169561 Jun 2005 JP
2006-510496 Mar 2006 JP
2006-107024 Apr 2006 JP
2006-173782 Jun 2006 JP
2007-237991 Sep 2007 JP
2010-509129 Mar 2010 JP
2012-187698 Oct 2012 JP
2013-010165 Jan 2013 JP
2013-091114 May 2013 JP
WO 8702635 May 1987 WO
WO 9637727 Nov 1996 WO
WO 9726039 Jul 1997 WO
WO 0010073 Feb 2000 WO
WO 0216995 Feb 2002 WO
WO 02095517 Nov 2002 WO
WO 03030727 Apr 2003 WO
WO 03037515 May 2003 WO
WO 2004056537 Jul 2004 WO
WO 2005018428 Mar 2005 WO
WO 2006068080 Jun 2006 WO
WO 2008049050 Apr 2008 WO
WO 2008073203 Jun 2008 WO
WO 2008076194 Jun 2008 WO
WO 2008127310 Oct 2008 WO
WO 2008135978 Nov 2008 WO
WO 2009009673 Jan 2009 WO
WO 2010070666 Jun 2010 WO
WO 2012061932 May 2012 WO
Non-Patent Literature Citations (73)
Entry
Hutchison et al., Development of Control for a Serpentine Robot, 2007, Proceedings of the 7th IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA2007), Jun. 20-23, 2007, Jacksonville, USA, pp. 1-6.
Office Action for U.S. Appl. No. 14/026,284 dated Aug. 11, 2015, 29 pages.
Arnold, Henry, “Cricket the robot documentation.” online manual available at http://www.parallaxinc.com, 22 pages.
Iagnemma, Karl et al., “Traction control of wheeled robotic vehicles in rough terrain with application to planetary rovers.” International Journal of Robotics Research, Oct.-Nov. 2004, pp. 1029-1040, vol. 23, No. 10-11.
Ijspeert et al; From Swimming to Walking with a Salamander Robot Driven by a Spinal Cord Model; Science; Mar. 2007; pp. 1416-1419; vol. 315; American Association for the Advancement of Science.
Hirose, et al., "Snakes and strings: new robotic components for rescue operations," International Journal of Robotics Research, Apr.-May 2004, pp. 341-349, vol. 23, No. 4-5.
Paap et al., “A robot snake to inspect broken buildings,” IEEE, 2000, pp. 2079-2082, Japan.
Braure, Jerome, “Participation to the construction of a salamander robot: exploration of the morphological configuration and the locomotion controller”, Biologically Inspired Robotics Group, master thesis, Feb. 17, 2004, pp. 1-46.
Jacobsen, et al., "Advanced intelligent mechanical sensors (AIMS)," Proc. IEEE Transducers, Jun. 24-27, 1991, abstract only, San Francisco, CA.
Jacobsen, et al., “Research robots for applications in artificial intelligence, teleoperation and entertainment”, International Journal of Robotics Research, 2004, pp. 319-330, vol. 23.
Jacobsen, et al., "Multiregime MEMS sensor networks for smart structures," Procs. SPIE 6th Annual Inter. Conf. on Smart Structures and Materials, Mar. 1-5, 1999, pp. 19-32, vol. 3673, Newport Beach, CA.
Maclean et al., "A digital MEMS-based strain gage for structural health monitoring," Procs. 1997 MRS Fall Meeting Symposium, Nov. 30-Dec. 4, 1997, pp. 309-320, Boston, Massachusetts.
Berlin et al., "MEMS-based control of structural dynamic instability," Journal of Intelligent Material Systems and Structures, Jul. 1998, pp. 574-586, vol. 9.
Goldfarb, "Design and energetic characterization of a liquid-propellant-powered actuator for self-powered robots," IEEE Transactions on Mechatronics, Jun. 2003, vol. 8, No. 2.
Dowling, “Limbless Locomotion: Learning to crawl with a snake robot,” The Robotics Institute at Carnegie Mellon University, Dec. 1997, pp. 1-150.
Matthew Heverly & Jaret Matthews: “A wheel-on-limb rover for lunar operation” Internet article, Nov. 5, 2008, pp. 1-8, http://robotics.estec.esa.int/i-SAIRAS/isairas2008/Proceedings/SESSION%2026/m116-Heverly.pdf.
NASA: "NASA's newest concept vehicles take off-roading out of this world" Internet article, Nov. 5, 2008, http://www.nasa.gov/mission_pages/constellation/main/lunar_truck.html.
Revue Internationale de Défense, "3-D vision and urchin," Oct. 1, 1988, p. 1292, vol. 21, No. 10, Geneva, CH.
Advertisement, International Defense review, Jane's information group, Nov. 1, 1990, p. 54, vol. 23, No. 11, Great Britain.
Ren Luo, "Development of a multibehavior-based mobile robot for remote supervisory control through the internet," IEEE/ASME Transactions on Mechatronics, IEEE Service Center, Piscataway, NY, Dec. 1, 2000, vol. 5, No. 4.
Nilas Sueset et al., "A PDA-based high-level human-robot interaction," Robotics, Automation and Mechatronics, IEEE Conference, Singapore, Dec. 1-3, 2004, vol. 2, pp. 1158-1163.
Mehling et al; A Minimally Invasive Tendril Robot for In-Space Inspection; Feb. 2006; The First IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '06); pp. 690-695.
Mahabatra et al; “Design and Implementation of Coordinated Multipurpose Robotic System with RF and Light Communication Channels”; Paper entirely based on study, research and experiments.
Simmons et al; "Coordinated Deployment of Multiple, Heterogeneous Robots"; School of Computer Science, Carnegie Mellon University, Pittsburgh, PA; Honeywell Technology Center, Minneapolis, MN; Intelligent Robot Systems, 2000; vol. 3, pp. 2254-2260.
Blackburn, et al.; Improved mobility in a multi-degree-of-freedom unmanned ground vehicle; Unmanned Ground Vehicles Technology VI; Sep. 2, 2004; 124-134; Proceedings of SPIE vol. 5422.
Schenker, et al.; Reconfigurable robots for all terrain exploration; Jet Propulsion Laboratory, California Institute of Technology; 2000; 15 pages.
Celaya et al.; Control of a Six-Legged Robot Walking on Abrupt Terrain; Proceedings of the 1996 IEEE International Conference on Robotics and Automation, Minneapolis, Minnesota; Apr. 1996; 6 pages.
Van Der Burg et al.; Anti-Lock Braking and Traction Control Concept for All-Terrain Robotic Vehicles; Proceedings of the 1997 IEEE International Conference on Robotics and Automation; Albuquerque, New Mexico; Apr. 1997; 6 pages.
Hutchison et al.; Development of Control for a Serpentine Robot; Proceedings of the 7th IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA2007); Jun. 2007; 6 pages; Jacksonville, FL.
PCT Application PCT/US10/38331; filed Jun. 11, 2010; Stephen C. Jacobsen; international search report issued Dec. 1, 2010.
PCT Application PCT/US2010/038339; filed Jun. 11, 2010; Stephen C. Jacobsen; international search report mailed Feb. 9, 2011.
PCT Application PCT/US2013/042739; filed May 24, 2013; Raytheon Company; international search report dated Aug. 21, 2013.
PCT Application PCT/US2013/067840; filed Oct. 31, 2013; Raytheon Company; international search report mailed Aug. 29, 2014.
PCT Application PCT/US2014/055673; filed Sep. 15, 2014; Raytheon Company; international search report mailed Jun. 25, 2015.
EP Application 08826145.8; filed Jul. 10, 2008; Raytheon Company; European Search Report dated Apr. 5, 2013.
U.S. Appl. No. 12/171,144, filed Jul. 10, 2008; Stephen C. Jacobsen; office action issued Aug. 11, 2010.
U.S. Appl. No. 12/694,996, filed Jan. 27, 2010; Stephen C. Jacobsen; office action issued Sep. 30, 2010.
U.S. Appl. No. 11/985,324, filed Nov. 13, 2007; Stephen C. Jacobsen; office action issued Nov. 1, 2010.
U.S. Appl. No. 12/151,730, filed May 7, 2008; Stephen C. Jacobsen; office action issued Nov. 15, 2010.
U.S. Appl. No. 12/820,881, filed Jun. 22, 2010; Stephen C. Jacobsen; office action issued Nov. 30, 2010.
U.S. Appl. No. 12/765,618, filed Apr. 22, 2010; Stephen C. Jacobsen; office action issued Apr. 6, 2011.
U.S. Appl. No. 11/985,320, filed Nov. 13, 2007; Stephen C. Jacobsen; office action issued Apr. 12, 2011.
U.S. Appl. No. 11/985,336, filed Nov. 13, 2007; Stephen C. Jacobsen; office action issued Jun. 14, 2011.
U.S. Appl. No. 12/350,693, filed Jan. 8, 2009; Stephen C. Jacobsen; office action issued Oct. 12, 2011.
U.S. Appl. No. 11/985,320, filed Nov. 13, 2007; Stephen C. Jacobsen; office action issued Nov. 25, 2011.
U.S. Appl. No. 12/814,302, filed Jun. 11, 2010; Stephen C. Jacobsen; office action issued Jan. 10, 2012.
U.S. Appl. No. 12/171,146, filed Jul. 10, 2008; Stephen C. Jacobsen; office action issued Feb. 9, 2012.
U.S. Appl. No. 12/350,693, filed Jan. 8, 2009; Stephen C. Jacobsen; office action issued Mar. 28, 2012.
U.S. Appl. No. 13/181,380, filed Jul. 12, 2011; Stephen C. Jacobsen; office action dated Jul. 17, 2012.
U.S. Appl. No. 12/171,146, filed Jul. 10, 2008; Stephen C. Jacobsen; office action dated Aug. 20, 2012.
U.S. Appl. No. 12/814,304, filed Jun. 11, 2010; Stephen C. Jacobsen; office action dated Nov. 13, 2012.
U.S. Appl. No. 12/117,233, filed May 8, 2008; Stephen C. Jacobsen; office action dated Nov. 23, 2012.
U.S. Appl. No. 12/117,233, filed May 8, 2008; Stephen C. Jacobsen; office action dated Aug. 15, 2013.
U.S. Appl. No. 12/814,304, filed Jun. 11, 2010; Stephen C. Jacobsen; office action dated Oct. 24, 2013.
U.S. Appl. No. 12/814,304, filed Jun. 11, 2010; Stephen C. Jacobsen; office action dated May 22, 2014.
U.S. Appl. No. 13/665,669, filed Oct. 31, 2012; Fraser M. Smith; office action dated Jul. 7, 2014.
U.S. Appl. No. 14/026,284, filed Sep. 13, 2013; Stephen C. Jacobsen; office action issued Apr. 3, 2015.
U.S. Appl. No. 14/196,951, filed Mar. 4, 2014; Fraser M. Smith; office action dated Jun. 1, 2015.
U.S. Appl. No. 12/151,730, filed May 7, 2008; Stephen C. Jacobsen; notice of allowance issued Apr. 15, 2011.
U.S. Appl. No. 11/985,324, filed Nov. 13, 2007; Stephen C. Jacobsen; notice of allowance issued Apr. 18, 2011.
U.S. Appl. No. 12/820,881, filed Jun. 22, 2010; Stephen C. Jacobsen; notice of allowance issued Jun. 9, 2011.
U.S. Appl. No. 11/985,336, filed Nov. 13, 2007; Stephen C. Jacobsen; notice of allowance issued Jan. 19, 2012.
U.S. Appl. No. 12/765,618, filed Apr. 22, 2010; Stephen C. Jacobsen; notice of allowance issued Feb. 2, 2012.
U.S. Appl. No. 12/814,302, filed Jun. 11, 2010; Stephen C. Jacobsen; notice of allowance dated Jul. 25, 2012.
U.S. Appl. No. 12/350,693, filed Jan. 8, 2009; Stephen C. Jacobsen; notice of allowance dated Sep. 20, 2012.
U.S. Appl. No. 13/481,631, filed May 25, 2012; Ralph W. Pensel; notice of allowance dated Sep. 24, 2012.
U.S. Appl. No. 13/181,380, filed Jul. 12, 2011; Stephen C. Jacobsen; notice of allowance dated Dec. 24, 2012.
U.S. Appl. No. 12/171,146, filed Jul. 10, 2008; Stephen C. Jacobsen; notice of allowance dated Jun. 24, 2013.
U.S. Appl. No. 12/814,304, filed Jun. 11, 2010; Stephen C. Jacobsen; notice of allowance mailed Sep. 10, 2014.
U.S. Appl. No. 13/665,669, filed Oct. 31, 2012; Fraser Smith.
U.S. Appl. No. 14/026,284, filed Sep. 13, 2013; Stephen C. Jacobsen.
U.S. Appl. No. 14/196,951, filed Mar. 4, 2014; Fraser Smith.
Akin et al, “MORPHbots: Lightweight Modular Self Reconfigurable Robotics for Space Assembly, Inspection, and Servicing”, Space, Sep. 2006, 11 pages, University of Maryland.
Related Publications (1)
Number Date Country
20150251316 A1 Sep 2015 US