Robotics is an active area of research, and many different types of robotic vehicles have been developed for various tasks. For example, unmanned aerial vehicles have been quite successful in military aerial reconnaissance. Less success has been achieved with unmanned ground vehicles, however, in part because the ground environment is significantly more difficult to traverse than the airborne environment.
Unmanned ground vehicles face many challenges when attempting mobility. Terrain can vary widely, including, for example, loose and shifting materials, obstacles, vegetation, limited-width or limited-height openings, steps, and the like. A vehicle optimized for operation in one environment may perform poorly in other environments.
There are also tradeoffs associated with the size of the vehicle. Large vehicles can handle some obstacles better, including, for example, steps, drops, gaps, and the like. On the other hand, large vehicles cannot easily negotiate narrow passages or crawl inside pipes, and are more easily deterred by vegetation. Large vehicles also tend to be more readily spotted, and thus can be less desirable, such as for discreet surveillance applications. In contrast, while small vehicles are more discreet, surmounting obstacles becomes a greater navigational challenge.
A variety of mobility configurations have been adapted to traverse difficult terrain. These options include legs, wheels, and tracks. Legged robots can be agile, but they require complex control mechanisms to move and achieve stability. Wheeled vehicles can provide high mobility, but they offer limited traction and require width in order to achieve stability.
To operate and control the various functions of these robotic devices, such as the drive systems, sensor systems, processing systems, or any other type of on-board system, the robot platforms they are built upon comprise dedicated systems fully integrated into the design and configuration of the robotic devices. These dedicated systems comprise individual component parts that are specifically configured for use within each robotic device. In other words, these dedicated systems are a part of the very design of the robotic devices. They are integrated, built-in components that are not interchangeable, modular, or intended for operation as stand-alone electronic devices, nor are they self-contained (i.e., comprising their own chassis, framework, etc.) and removable relative to the robotic device.
Features and advantages of the invention will be apparent from the detailed description which follows, taken in conjunction with the accompanying drawings, which together illustrate, by way of example, features of the invention.
Reference will now be made to the exemplary embodiments illustrated, and specific language will be used herein to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended.
An initial overview of technology embodiments is provided below, and specific technology embodiments are then described in further detail. This initial summary is intended to aid readers in understanding the technology more quickly, but it is not intended to identify key or essential features of the technology, nor is it intended to limit the scope of the claimed subject matter.
A variety of robotic devices are known and have traditionally been configured with dedicated onboard control systems for controlling both the robotic device as well as various internal and external sensors. Such dedicated control systems can be complicated, costly, and difficult to reconfigure if modifications to the system are desired. For example, adding an external sensor can necessitate a reconfiguration of the dedicated control system. Robotic devices and robotic systems are disclosed that include non-dedicated smart control devices that can be associated therewith and that allow a high degree of customization. It is noted that the present scope includes any type of robotic system or device.
In one embodiment, for example, a robotic device is disclosed that can have a plurality of non-dedicated, smart control devices. Each smart control device can provide smart functionality to control an operational function of the robotic device.
A robotic system is also disclosed that can include a robotic device having a local non-dedicated, smart control device providing smart functionality to control an operational function of the robotic device. The robotic system can also include a remote control device to communicate operational information with the local smart control device to facilitate user control of the robotic device.
The robotic device 101 can include a communication subsystem 120 functionally or operationally coupled to and interfaced with (e.g., electrically coupled or in communication with, mechanically coupled, or both of these) the local smart control device 110 to facilitate control by the local smart control device 110 of one or more operational subsystems of the robotic device, such as a sensor subsystem 122, a motion subsystem (e.g., a drive subsystem 124 or a pose control subsystem 126), a local control subsystem (e.g., a local or degree of freedom (DOF) control subsystem 125), etc. The communication subsystem 120 can comprise any communication medium capable of controlling the robotic device 101. Non-limiting examples of such communication media can include electrical coupling, optical coupling, wireless coupling, Bluetooth coupling, and the like, including combinations thereof. As such, the type of communication medium may dictate the structure of the communication subsystem itself. For example, a communication subsystem utilizing electrical coupling can include wired connections from the various components of the robotic device. A communication subsystem utilizing wireless communication, on the other hand, can include a series of transmitters and receivers coupling the smart control device to the various subsystems.
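As a minimal illustrative sketch (not part of the original disclosure), the following Python example suggests how a communication subsystem might present one routing interface to the smart control device while the underlying medium varies; all names here (Transport, WiredTransport, CommunicationSubsystem) are assumptions chosen for illustration:

```python
# Hypothetical sketch of a medium-agnostic communication subsystem.
from abc import ABC, abstractmethod


class Transport(ABC):
    """Abstracts the physical medium (wired, wireless, Bluetooth, ...)."""

    @abstractmethod
    def send(self, subsystem_id: str, payload: bytes) -> None: ...

    @abstractmethod
    def receive(self, subsystem_id: str) -> bytes: ...


class WiredTransport(Transport):
    """Models direct electrical coupling; an in-memory bus stands in for wires."""

    def __init__(self):
        self._bus = {}

    def send(self, subsystem_id, payload):
        self._bus.setdefault(subsystem_id, []).append(payload)

    def receive(self, subsystem_id):
        queue = self._bus.get(subsystem_id, [])
        return queue.pop(0) if queue else b""


class CommunicationSubsystem:
    """Routes smart-control-device traffic to named operational subsystems."""

    def __init__(self, transport: Transport):
        self.transport = transport

    def command(self, subsystem_id: str, payload: bytes) -> None:
        self.transport.send(subsystem_id, payload)

    def feedback(self, subsystem_id: str) -> bytes:
        return self.transport.receive(subsystem_id)


# The same routing logic would work whether the transport is wired or wireless.
comms = CommunicationSubsystem(WiredTransport())
comms.command("drive", b"forward:0.5")
```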
For example, the smart control device 110 can control one or more operational functions of the robotic device 101, such as controlling movement or pose of the robotic device 101, or controlling sensors of the robotic device 101. The smart control device 110 can control movement or pose of the robotic device 101 through actuation of the drive subsystem 124 or actuation of the pose control subsystem 126. Information from one or more sensors 123 of the sensor subsystem 122 or information from one or more sensors 111 of the smart control device 110 can be used to determine movement or pose of the robotic device 101. Operation of the sensors 111, 123 can be controlled by the smart control device 110. The smart control device 110 can also include a high-level functions module 112, which can provide functions such as localization and mapping, path retrace, etc., as discussed in more detail below. The communication subsystem 120 can be functionally coupled to the drive subsystem 124, the local control subsystem 125, the pose control subsystem 126, or the sensor subsystem 122. The communication subsystem 120 is also functionally coupled to the smart control device 110. The smart control device 110 can perform sensing, data processing, logic processing, command execution, and other functions germane to the robot to control movement or pose of the robotic device 101. Commands from the smart control device 110 can be delivered to the drive subsystem 124, the local control subsystem 125, the pose control subsystem 126, the sensor subsystem 122, or any other system or subsystem of the robotic device 101 in order to control the operation, control, movement, or positioning of the robotic device 101. Similarly, feedback communication (e.g., from position or movement sensors) from the drive subsystem 124, the local control subsystem 125, the pose control subsystem 126, the sensor subsystem 122, or any other system or subsystem of the robotic device 101 can be delivered to the smart control device 110 via the communication subsystem 120. The sensor 111 of the smart control device 110 can also provide feedback. Thus, the smart control device 110 can be in communication with each component of the robotic device 101 via the communication subsystem 120, as applicable, to facilitate selective or concurrent actuation of any portion of the drive subsystem 124, the local control subsystem 125, or the pose control subsystem 126, deployment and operation of sensors, or control and operation of any other functionality of the robotic device 101.
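Continuing the sketch above, one hedged illustration of this command-and-feedback cycle is a simple proportional heading correction; the message format, the gain, and the sign convention are assumptions for this example, not the disclosed control law:

```python
# Hypothetical feedback loop: read pose feedback, compute a corrective drive
# command, and send it back through the communication subsystem sketched above.
def control_step(comms, target_heading_deg: float, gain: float = 0.05) -> float:
    raw = comms.feedback("pose")                      # e.g., b"heading:92.0"
    heading = float(raw.split(b":")[1]) if raw else target_heading_deg
    error = target_heading_deg - heading              # positive -> turn left (assumed)
    comms.command("drive", f"turn:{gain * error:.3f}".encode())
    return error
```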
As described herein, robotic devices according to aspects of the present disclosure can be designed such that logic processing, command execution, data processing, sensing, or any other “smart” functions are not an integral part of the build of the robotic device, but rather are performed on a non-dedicated smart control device operable to be mechanically and electrically coupled to the robotic device (i.e., supported by or about the robotic device and operable to interface with one or more systems or subsystems of the robotic device), and in many cases one that is removable. In this disclosure, the term non-dedicated refers to a smart control device capable of operating as a stand-alone electronic device and removable relative to the robotic device. By operating a smart control device with a robotic device, “smart” functions can be performed on the smart control device as opposed to a dedicated system, such as one that is not a stand-alone device or that is fully integrated into (and perhaps non-removable from) the robotic device. In other words, the dedicated control, sensing, and processing operations (i.e., “smart” functions) that have previously been performed by the robotic device can now be more efficiently performed on the smart control device, with a significant amount, if not all, of the “smart” functionality of various integrated systems being packaged in the smart control device. This arrangement provides various advantages. For example, a damaged smart control device can be easily removed and replaced without significant downtime for diagnostics and repair procedures. The smart control device can also be removed from the robotic device in order to facilitate reprogramming, data manipulation, and the like. In another aspect, a single smart control device can be interchangeable and used in multiple different robotic devices. It should be noted that a non-dedicated smart control device can include a pre-existing type of device (e.g., a smartphone, a tablet computer, or other portable or mobile personal computing device) programmable with software adapted for use with the robotic device. Moreover, the smart control device can comprise a self-contained device independent of the robotic device configured to support at least some, if not all, of the components needed to facilitate “smart” functionality within the robotic device. In other words, the smart control device can itself comprise a housing and a framework or chassis, with all of the internal logic, circuitry, and other components needed to provide the “smart” functions within the robotic device, and is capable of interfacing with the robotic device 101 (e.g., via the communication subsystem) mechanically, electrically, and in any other manner that enables the smart control device to operate, activate, control, and communicate with the robotic device 101, its systems and subsystems, and in some cases a remote control device.
The function of the smart control device 110 can extend far beyond controlling movement and pose of the robotic device 101 that it controls. For example, the smart control device 110 (e.g., the high-level functions module 112), when used in conjunction with other sensors (e.g., IMU, cameras, RGBD cameras, LIDAR, GPS modules, etc.), can be used to implement and execute high-level functions such as localization and mapping of the environment (e.g., using a camera and IMU) in which the robotic device is used. In another example, the smart control device 110, via the high-level functions module 112, can enable various behaviors, such as self-stabilization, target identification, target following, path recording, retracing a path and following pre-defined paths, detecting and identifying objects, terrain and environment features, animals, and people, and detecting and avoiding and/or overcoming obstacles (including developing behaviors through trial and error combined with machine learning algorithms). In yet another example, the smart control device 110 can respond to robot-internal and environment monitoring sensor inputs (e.g., map and travel along a chemical concentration gradient toward its source, identify hot spots, etc.). In addition, the smart control device 110 (e.g., via the high-level functions module 112) can implement any other behavior that aims to deal with uncertainty in the environment.
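As one hedged sketch of a high-level function named above, path recording and retrace can be reduced to logging spaced waypoints during a traverse and replaying them in reverse. The data structures below are illustrative assumptions, not the disclosed implementation:

```python
# Hypothetical path-record-and-retrace behavior for a high-level functions module.
from typing import List, Tuple

Pose = Tuple[float, float]  # (x, y) in meters, e.g., from localization and mapping


class PathRetrace:
    def __init__(self, min_spacing: float = 0.25):
        self.waypoints: List[Pose] = []
        self.min_spacing = min_spacing

    def record(self, pose: Pose) -> None:
        """Log a waypoint once the robot has moved far enough from the last one."""
        if not self.waypoints or self._dist(pose, self.waypoints[-1]) >= self.min_spacing:
            self.waypoints.append(pose)

    def retrace(self) -> List[Pose]:
        """Waypoints in reverse order: follow these to back out along the path."""
        return list(reversed(self.waypoints))

    @staticmethod
    def _dist(a: Pose, b: Pose) -> float:
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```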
Various smart control devices are contemplated, and any such device capable of controlling a robotic device is considered to be within the present scope. In general, a smart control device can include one or more sensors and can process data and provide communication (e.g., one-way, two-way, Bluetooth-based, RF-based, or optical-based). Smart control device sensors can include a camera, a microphone, an accelerometer, a gyroscope, a gravity sensor, a compass, a barometer, a proximity sensor, a temperature sensor, a relative humidity sensor, a light sensor, a magnetic field sensor, an orientation sensor, or any other suitable sensor that may be incorporated into a smart device. Non-limiting examples of smart control devices can include cellular devices, wireless network devices, Bluetooth devices, and the like, including combinations thereof. In one aspect, the smart control device can be an existing device, such as a cellular phone having on-board computing and processing capabilities and sensors, such as a smartphone. In other aspects, the smart control device can be a tablet computer, a laptop computer, a wearable computing device, or, again, any other portable or mobile computing device. Many smartphones, tablet computers, wearable computing devices, and laptop computers contain sufficient computational resources, sensors, and other “smart” functionality capable of controlling and interfacing with the robotic device, including movement of the drive subsystems, movement of the linkage subsystem, sensing and sensor data processing, data processing, video, audio, and the like. As such, upon interfacing with the systems of the robotic device, a smart control device can allow enhanced functionality of a robotic device without the need for complex integrated and dedicated control and processing systems.
Additionally, in some aspects the smart control device 110 can provide enhanced functionality to the robotic device 101. Such functionality is limited only by the capabilities of the particular smart control device being used. For example, using a smartphone allows the functionality available to or on the smartphone to be utilized in the control and operation of the robotic device 101. For instance, a smartphone can provide wireless communication to the device, wireless communication external to the device, GPS data, accelerometer data, vibration detection, a user interface, camera or video functionality, audio input, audio output, sensor processing, navigation, control, and the like, including combinations thereof. Additionally, applications can be created for the smartphone that enhance the functionality of the device. Such applications can be very dynamic in nature, and can be updated directly to the smartphone without the need for disconnection or rebooting. Accordingly, a smartphone can provide similar functionality as a dedicated control system at a fraction of the cost, with an enhanced capability of dynamic reconfiguration, and with built-in communication with a user over a wireless network, such as a cellular, Bluetooth, or Wi-Fi network, to name a few.
As has been described, the smart control device 110 can be any device with sufficient on-board resources, functions and components (e.g., computational resources and components, sensors, etc.) to control a robotic device according to aspects of the present disclosure. In some cases, the smart control device 110 can be self-contained, portable, and allow independent user interaction separate from the robotic device. Thus, the robotic device 101 can be left in one environment while changes are made to the smart control device 110 in another environment. The smart control device 110 can also be considered to be a general-purpose communication device having its own form factor. Utilizing such a device having built-in communication capabilities can greatly simplify the design and cost of a robotic device. Additionally, the smart control device 110 can be disposed of and replaced with another unit for comparably low cost and effort, including being replaced by another smart control device having enhanced size, weight and power, and computation (SWaP-C) capabilities as technology improves and new devices become commercially available.
As mentioned above, the smart control device 110 can control sensors of the robotic device 101. This includes control of the sensor 123 of the sensor subsystem 122 as well as the sensor 111 of the smart control device 110. The sensor 123 can represent external data collection devices. The communication subsystem 120 can provide communication between the smart control device 110 and any such external data collection devices. Such ancillary external data collection devices can be devices or systems utilized to collect data from or otherwise manipulate the environment external to the device. Non-limiting examples of such can include any suitable sensor (e.g., an optical sensor or camera, an RGBD camera, an RFID reader, a gas analyzer, a spectrometer (e.g., a chemical spectrometer, or spectrometers operating in different parts of the electromagnetic spectrum, such as visible, infrared, x-ray, and gamma ray, and the like), a vibration sensor, an accelerometer, a barometer, a pressure sensor, an inertial sensor, a gyroscope, a compass, a magnetometer, an explosive detector, a radioisotope sensor, an alpha or beta particle sensor, a neutron detector, an RF detector, an electromagnetic emission sensor, a physiologic sensor, a LIDAR, a stereo camera, a thermal sensor, an IR imager, an acoustic sensor, a strain sensor, a load sensor, a velocity sensor, a sound triangulation and location sensor, or an electric field sensor), measurement devices or systems, inspection devices or systems, mass spectrometers, ion mobility sensors, chemiluminescent sensors, electron capture detection devices, electrochemical analyzers, specialized gas sensors (e.g., spectroscopy for methane, propane, ammonia, CO, smoke, etc.), surface acoustic wave sensors, tactile whiskers, radiation detectors, metal detectors, other detector types, magnetometers, inertial measurement units, non-destructive inspection methods (x-ray, dye penetrant, ultrasonic, eddy current, magnetic particle, interferometry), and the like, including associated applications to handle the processing, storage, and real-time or eventual communication and use of the results. A camera can be sensitive to a certain portion of the EM spectrum. In some aspects, the external data collection device can be an additional non-dedicated smart control device, such as, for example, a smartphone.
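To suggest how a smart control device might manage such a heterogeneous collection of external data collection devices through one interface, here is a minimal, hypothetical Python sketch; the Reading fields and sensor identifiers are assumptions for illustration:

```python
# Hypothetical uniform registry so the smart control device can poll arbitrary
# external data collection devices without per-sensor wiring.
import time
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Reading:
    sensor_id: str
    timestamp: float
    value: float
    units: str


class SensorRegistry:
    """Maps sensor ids to read callbacks, so new sensors plug in at runtime."""

    def __init__(self):
        self._readers: Dict[str, Callable[[], Reading]] = {}

    def register(self, sensor_id: str, reader: Callable[[], Reading]) -> None:
        self._readers[sensor_id] = reader

    def poll_all(self) -> List[Reading]:
        return [read() for read in self._readers.values()]


registry = SensorRegistry()
registry.register("gas", lambda: Reading("gas", time.time(), 1.8, "ppm"))
print(registry.poll_all())
```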
In one aspect, the smart control device 110 can be integrated into the robotic device 101 to allow the device to operate autonomously, in some cases for an extended period of time. In one example, the high-level functions module 112 can implement various movement primitives and/or higher-level behaviors (e.g., simultaneous localization and mapping, path following and path retracing, target identification and following, object avoidance, obstacle detection and overcoming behaviors, machine learning enabled behaviors, and the like), which can be preprogrammed into the smart control device 110, including, for example, primitives to assume certain poses and primitives for movement (e.g., forward, backward). Control can include feedback from force sensors (e.g., at a joint of the robotic device 101) and environmental sensors (e.g., move to follow an increasing chemical concentration gradient; move toward a temperature hot zone; move toward a gamma radiation zone, and the like). In this way, hybrid human and automated control can be combined, and behaviors that require coordinated motion of multiple robotic devices equipped with smart control devices can be enabled. For example, high-level manual commands/primitives can be implemented using automated low-level feedback loops that execute the commands/primitives. Control functions can be divided into subsystems, including, for example, pose control, compliance control, movement control, force control, and hybrid combinations thereof. In one aspect, the local control subsystem 125 can be operable to implement behaviors at the joint level (e.g., controlling compliance, enabling a fail-safe mode of operation, and the like).
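One hedged sketch of such an environment-driven primitive is a run-and-tumble gradient climber: keep heading while chemical readings rise, and turn when they fall. The sample() and step() callbacks, thresholds, and strategy are assumptions, not the disclosed behavior:

```python
# Hypothetical gradient-following primitive: the robot supplies sample() (sensor
# concentration) and step(heading_deg) (move one increment on that heading).
import random


def follow_gradient(sample, step, max_steps: int = 100, tol: float = 0.01) -> float:
    heading = 0.0
    last = sample()
    for _ in range(max_steps):
        step(heading)
        now = sample()
        if now < last - tol:
            # Reading fell: tumble to a new random heading, away from the old one.
            heading = (heading + random.uniform(90.0, 270.0)) % 360.0
        last = now
    return last  # final concentration reached near the source
```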
In addition to controlling functionality, the smart control device 110 can provide benefits due to its removability from the robotic device 101. For example, upgrades, programming modifications, and the like can be accomplished remote from the robotic device 101 on the smart control device 110, which can then be physically or electrically (including wirelessly) coupled to the robotic device 101. As such, the smart control device 110 can be “plug-and-play,” and thus can provide functionality to the robotic device 101 upon connection thereto.
Furthermore, data collected by the sensor 123, the sensor 111, or other devices on the robotic device 101 can be stored as well as processed on the smart control device 110. Data can be removed from the robotic device 101 by merely removing the smart control device 110, upon which the data is resident. Additionally, data can be transmitted from the smart control device 110 to a remote location over a medium such as a cellular network, a computer network (e.g., LAN, internet, via Wi-Fi, etc.), a peer-to-peer network (e.g., via Bluetooth), or any other suitable wireless network. For those aspects utilizing a cellular device such as a smartphone, the cellular functionality is already resident on the smartphone, which can be readily utilized for communication to and from the smartphone. Thus, the smart control device 110 can facilitate connection to the internet 130 or other network, which can allow the smart control device 110 to share processor computational tasks with other computational assets on the network, such as a remote processor 131, and/or another smart control device.
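As a minimal sketch of sharing a computational task with a networked asset, the example below serializes a workload and hands it to a worker; a local thread pool stands in for the remote processor 131, and in practice the payload would travel over the cellular or Wi-Fi link described above. All names are illustrative assumptions:

```python
# Hypothetical offload of a processing task from the smart control device to a
# remote computational asset. The thread pool is a stand-in for the network hop.
import json
from concurrent.futures import ThreadPoolExecutor


def remote_map_update(payload: str) -> str:
    """Stand-in for work done remotely (e.g., refining a map from scan data)."""
    scans = json.loads(payload)
    return json.dumps({"cells_updated": len(scans)})


with ThreadPoolExecutor() as offload:      # placeholder for the network boundary
    future = offload.submit(remote_map_update, json.dumps([[0, 1], [1, 1]]))
    print(future.result())                 # -> {"cells_updated": 2}
```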
The remote control device 102 can be operable to wirelessly communicate with the smart control device 110 of the robotic device 101 to facilitate remote control and access of the robotic device 101 by a user. The remote control device 102 can function as a user interface to allow a user to provide control instructions to the smart control device 110 over a wireless communication medium such as a cellular network, a computer network (e.g., LAN, internet, via Wi-Fi, etc.), a peer-to-peer network (e.g., via Bluetooth), or any other suitable wireless network. As such, a user can provide communication to the smart control device 110 via the remote control device 102 in order to control the robotic device 101. The data from the various sensors 123 on the robotic device 101 or the sensor 111 of the smart control device 110 can further be transmitted by the smart control device 110 to the remote control device 102. Thus, data from sensors, video, or audio from the integrated hardware of the smart control device 110, and the like can be communicated to the user, where further control commands can be delivered. Communication with the remote control device 102 can allow the smart control device 110 to share processor computational tasks with the remote control device 102, which may include a processor or other such computational assets, or communication with another smart control device. It is noted that the wireless communication medium can include any known wireless communication medium including, without limitation, cellular communication, Wi-Fi communication, Bluetooth communication, and the like, including combinations thereof. The remote control device 102 or the smart control device 110 can be operable to encrypt and decrypt data to provide secure communication.
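The disclosure states only that data can be encrypted and decrypted for secure communication; as a deliberately simplified stand-in, the sketch below uses message authentication (an HMAC tag) rather than encryption, so the robot can reject tampered or forged commands. The shared key and message format are assumptions:

```python
# Hypothetical integrity check on the remote-control link (a stand-in for the
# encryption/decryption the disclosure describes, not the disclosed scheme).
import hashlib
import hmac

SHARED_KEY = b"provisioned-out-of-band"    # assumption: key exchange not shown


def sign(command: bytes) -> bytes:
    return hmac.new(SHARED_KEY, command, hashlib.sha256).digest()


def verify(command: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(command), tag)


cmd = b"drive:forward:0.5"
assert verify(cmd, sign(cmd))              # authentic command accepted
```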
The remote control device 102 can include any appropriate device capable of sending and receiving commands to and from the smart control device. Non-limiting examples of such remote control devices can include cellular phones, smartphones, tablet devices, laptop computers, desktop computers, and the like. In one embodiment, the remote control device 102 can comprise a non-dedicated, smart control device as described herein that can provide smart functionality facilitating user control of the robotic device 101. Thus, one non-dedicated, smart control device (e.g., the remote control device 102) can be used to control the robotic device 101, which is equipped with another non-dedicated, smart control device 110.
As a communication interface, the remote control device 102 can be configured to display information related to control or operation of the robotic device 101. Such information can be displayed in any suitable manner, such as visually, audibly, or haptically. For example, the remote control device 102 can be operable to display a video of images or reproduce audio captured by the robotic device 101. A visual display can be in a numerical or a graphical format, for example. The remote control device 102 can display information, such as information related to position or force data of joints of the robotic device 101, by producing vibrations. In one aspect, the remote control device 102 can be operable to display sensor data from the robotic device 101, such as data collected by the sensor 123 or the sensor 111. In another aspect, the remote control device 102 can be operable to display commands given by an operator or user. In addition, the remote control device 102 can be operable to display feedback from the robotic device 101 to the operator regarding the actual state of the robotic device 101, such as position or force information of one or more degrees of freedom of the robotic device 101. The remote control device 102 can also be operable to provide a comparison of the commands given by the operator with the feedback from the robotic device 101.
Additionally, in some embodiments the remote control device 102 can include or comprise a master control device, such as a replica or other type of master, for control of the robotic device 101. In a replica master control system, a replica master is located remotely from the robotic device 101. The replica master can contain the same joints as the robotic device 101, and can be manually manipulated into the desired poses. Sensors located at the joints sense the position of the joints, and these positions are communicated to the smart control device 110 on the robotic device 101 to actuate the pose control subsystem 126, the local control subsystem 125, or the drive subsystem 124 to attempt to establish the same pose. Optionally, the joints in a linkage subsystem can include force sensors, torque sensors, or both, allowing the force or torque on the joints to be measured. The joint forces or torques can optionally be communicated back to the replica master, providing force feedback into the control system. Various force feedback control systems are known that can be applied to embodiments of the present disclosure.
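A hedged sketch of one replica-master cycle follows: master joint angles stream to the robot as pose commands, and measured joint torques return as force feedback. The message shapes and callables are assumptions for illustration, not the disclosed protocol:

```python
# Hypothetical replica-master teleoperation step: mirror the master's joint
# pose on the robot, then return measured joint torques for force feedback.
from typing import Dict, List


def teleop_step(master_joints: List[float],
                send_command, read_torques) -> Dict[str, List[float]]:
    """One cycle: command the robot to match the master, then collect feedback."""
    send_command({"type": "pose", "joint_angles_rad": master_joints})
    torques = read_torques()               # e.g., [0.2, -0.1, 0.0] N*m per joint
    return {"commanded": master_joints, "feedback_torques": torques}


# Usage with stand-in I/O callables:
result = teleop_step([0.1, 0.5, -0.3],
                     send_command=lambda msg: None,
                     read_torques=lambda: [0.2, -0.1, 0.0])
print(result["feedback_torques"])
```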
The remote control device 102 can comprise any suitable master controller or other device for remotely controlling the robotic device 101. For example, the remote control device 102 can comprise a replica master controller as discussed above, a video game controller, a video game interface device, a master controller that is wearable by an operator, a master controller that measures one or more joint positions of the operator, a master controller that measures forces generated at one or more joints of the operator, a master controller that measures at least one of position and force information and converts the information as a set of commands to the robotic device, a transformation master controller where human kinematics are transformed to kinematics of the robotic device 101 (with or without constraints that may be required to compensate for information that may be missing in kinetically redundant robotic devices), a force reflection master controller operable to provide force information from the robotic device 101 to the operator in at least one of a force applied to the operator, a vibration generated by at least one vibration source, a sound, or a visual display, or any other suitable remote control device. Information can be displayed haptically to an operator by a master control device, such as by applying forces or providing resistance to movement (e.g., a force reflection master controller). For example, forces at one or more joints of the robotic device 101 can be represented at a corresponding joint of a master control device. A master control device can also be configured to control an end effector of the robotic device 101. In this case, forces experienced by the end effector can also be indicated by the master control device.
In one aspect, the remote control device 102 can comprise a master control device that can communicate position or force information of the master and communicate this information to the robotic device 101 as a set of commands. In another aspect, a master control device can receive position or force information from the robotic device 101 for feedback to the user. For example, the smart control device 110 can communicate position or force information as a set of commands for controlling the master control device to provide feedback to the operator. In one embodiment, a master control device can be configured to transform human kinematics to kinematics of the robotic device 101 as required to carry out one or more tasks. In another example, the remote control device 102 can receive position or force data and can generate its own set of commands for providing feedback to the operator through a master control device.
The smart control device 110 can be coupled to the robotic device 101 by any useful coupling technique, including, for example, a physical mount. Such a physical mount can be coupled to the robotic device 101 in a variety of ways, all of which are included or contemplated in the present scope. For example, a mounting bracket can be coupled between a physical mount and a subsystem structure (e.g., a structure of the drive subsystem 124) or a component of a subsystem. The physical mount can provide additional functionality, such as a protective case, shock absorption, etc., depending upon its configuration. It should be recognized that the smart control device 110 can be coupled anywhere on the robotic device 101, such as to a component or structure of the drive subsystem 124 or the pose control subsystem 126.
The communication subsystem 120 can be functionally or operationally (e.g., electrically, mechanically, etc.) coupled between the smart control device 110 and the robotic device 101 to facilitate control by the smart control device 110 of the drive subsystem 124, the local control subsystem 125, the pose control subsystem 126, the sensor subsystem 122, or any other system or subsystem of the robotic device 101. The design of the physical mount can thus vary depending on the communication medium between the smart control device 110 and the robotic device 101. In some aspects, the physical mount can be primarily a physical support structure. Such may be the case for wireless communication between the smart control device 110 and the robotic device 101. In other aspects, the physical mount can also include or support a physical electrical communication connector such as a pinned connection or other type of physical connector for interfacing with the smart control device 110. Such may be the case for wired communications between the smart control device 110 and the robotic device 101.
In one aspect, a mount for the non-dedicated smart control device 110 can be configured to accept and removably support the smart control device 110, which can be configured to initiate and control operational functionality within the robotic device 101 upon being connected to the robotic device 101. The communication subsystem 120 can be functionally coupled between the robotic device 101 and the mount such that the robotic device 101 and the smart control device 110 are functionally coupled upon connection. The communication subsystem 120 facilitates control by the smart control device 110 (once connected) of one or more subsystems of the robotic device 101.
Various drive subsystems are contemplated, and any useful drive mechanism is considered to be within the present scope. Non-limiting examples of drive mechanisms can include tracks, wheels, legs, arms, constriction-mediated movement devices, propellers, and the like. In one specific embodiment, a drive subsystem can include a continuous track that is movable by a motor about a support frame. Similarly, various pose control subsystems are contemplated, and any useful pose control mechanism is considered to be within the present scope. In one example, the pose control subsystem 126 can comprise a linkage subsystem. A linkage subsystem can include any suitable type of linkage mechanism to facilitate pose or motion of at least a portion of the robotic device 101. In one aspect, the linkage subsystem can include a multi-degree-of-freedom linkage. It should be recognized that joints or linkages of a linkage subsystem discussed herein can be actuated or passive, in any combination. In one aspect, a linkage mechanism can include an actuated joint to provide motion in one or more degrees of freedom. In another aspect, a linkage mechanism can include a passive joint that can be manipulated or movable to a selectively fixed position and held in place, for example, by incorporating an adjustable fastener. In a further aspect, a passive joint or linkage can include a dampener or a spring to control various aspects (e.g., those related to movement) of the joint or linkage. Joints of a linkage mechanism can be uni-axial, bi-axial, or tri-axial joints, or any other suitable joint configuration. Joints need not be limited to revolute joints, which provide bending or rotational movement. Prismatic joints that provide translational movement can also be included. Joints may incorporate both revolute and prismatic features to provide, for example, eccentric motions. Thus, joints can be configured to provide movement in rotational or translational degrees of freedom. It should be recognized that components or features of the linkage subsystem discussed herein can be applicable to a drive subsystem as well. Those skilled in the art will recognize the many different configurations or types of robotic devices that could be designed and used with a smart control device as taught herein to provide smart functionality to the robotic device as discussed herein. For example, these can include snake-like or serpentine type robotic devices, legged robotic devices, wheeled robotic devices, tracked robotic devices, and other ground-traversing robotic devices. These can also include aerial or amphibious robotic devices. Essentially, it is contemplated that the smart control device and associated smart functionality can be used to control the operational functions of any type of robotic device. As such, the present technology should not be limited in any way to those types of robotic devices specifically discussed herein.
In one embodiment, the drive subsystem 124 can include a drive to facilitate rotation, bending, or movement of the various components of the pose control subsystem 126. Similarly, a drive can be utilized to facilitate movement of the drive subsystem 124. In some aspects, the drive that actuates the drive subsystem 124 can also actuate all or a portion of the pose control subsystem 126. In other words, the same drive that causes movement of a drive subsystem can also cause movement of an associated part of the pose control subsystem 126. In other aspects, the drive that facilitates rotation, bending or movement of the components of the pose control subsystem 126 can be a dedicated drive. As will be recognized by those skilled in the art once in possession of the present disclosure, various types of drives and coupling techniques for applying drive power to a drive subsystem or a pose control subsystem can be utilized.
In addition, the robotic system 200 can include a remote control device 202, which can be configured in any suitable manner disclosed herein. The remote control device 202 can communicate operational information with the local smart control devices 210a, 210b to facilitate user control of the robotic devices 201a, 201b. As with other examples disclosed herein, the robotic devices 201a, 201b can be in communication with the remote control device 202 (which may include a processor), a remote processor (not shown), or other computational assets over a wireless communication medium such as a cellular network, a computer network (e.g., LAN, internet, via Wi-Fi, etc.), a peer-to-peer network (e.g., via Bluetooth), or any other suitable wireless network. Communication with the remote control device 202, a remote processor, or other computational assets can allow one or more of the smart control devices 210a, 210b to share processor computational tasks with other computational assets on a network, or with other smart control devices communicating with each other either via one or more communication hubs or via peer-to-peer communication channels. In one aspect, the remote control device 202 can comprise a remote non-dedicated, smart control device as disclosed herein.
In addition, the smart control devices 210a, 210b can be in communication with one another (e.g., peer-to-peer) over a wireless communication medium (e.g., Bluetooth), which can facilitate sharing of computational resources among the local smart control devices, an exchange of data, or communication of commands (i.e., in addition to or as an alternative to communication of commands from the remote control device 202). In one aspect, peer-to-peer communication between smart control devices can facilitate the creation of a self-organizing network and/or the implementation of coordinated behaviors, sensing, detection, classification, and/or other capabilities. In one example, peer-to-peer communication capability can allow multiple robotic devices equipped with smart control devices to act as swarms, creating wide-aperture arrays where distributed multi-modal sensing capabilities along with smart control devices (each having limited SWaP-C) can significantly outperform the geo-location, mapping, detection, and effects that can be achieved by a single system equipped with a smart control device or by a large group of independently operating systems equipped with smart control devices.
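As a minimal sketch of dividing work across peer smart control devices in such a swarm, sensing tiles of a wide-aperture array might be assigned round-robin so each device processes a share; peer discovery and the transport itself are not shown, and all names are illustrative assumptions:

```python
# Hypothetical partitioning of a shared sensing/processing workload among peers.
from typing import Dict, List


def partition_tasks(tiles: List[str], peers: List[str]) -> Dict[str, List[str]]:
    """Assign each sensing tile to a peer; peers exchange results afterwards."""
    assignment: Dict[str, List[str]] = {p: [] for p in peers}
    for i, tile in enumerate(tiles):
        assignment[peers[i % len(peers)]].append(tile)
    return assignment


print(partition_tasks(["tile0", "tile1", "tile2", "tile3"],
                      ["device_210a", "device_210b"]))
# -> {'device_210a': ['tile0', 'tile2'], 'device_210b': ['tile1', 'tile3']}
```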
In one aspect, the remote control device 202 can be configured to display data collected by the sensors 223a, 223b of the robotic devices 201a, 201b. For example, multiple robotic devices can be deployed over an area or along a line to effectively provide a broad aperture array for data collection and data processing. Data collected from the one or more robotic devices (e.g., one or both of robotic devices 201a, 201b, and any others) can be coordinated and displayed by and on the remote control device 202.
The remote control device 302 can communicate operational information with one or more of the local smart control devices 310, 310′ to facilitate user control of the robotic device 301. As with other examples disclosed herein, the robotic device 301 can be in communication with the remote control device 302 (which may include a processor), a remote processor 331, or other computational assets over a wireless communication medium such as a cellular network, a computer network (e.g., LAN, internet 330, via Wi-Fi, etc.), or any other suitable wireless network. Communication with the remote control device 302, the remote processor 331, or other computational assets can allow one or more of the smart control devices 310, 310′ to share processor computational tasks with other computational assets on a network. In one aspect, the remote control device 302 can comprise a remote non-dedicated, smart control device as disclosed herein. In addition, the smart control devices 310, 310′ can be in communication with one another over a wired or a wireless communication medium, which can facilitate sharing of computational resources among the local smart control devices 310, 310′ such that processor computational tasks can be shared among multiple local smart control devices.
The sensor subsystem 322 can include one or more sensors 323. The smart control devices 310, 310′ can also include sensors 311, 311′, respectively. In addition, the smart control devices 310, 310′ can include high-level functions modules 312, 312′, respectively. Multiple local smart control devices 310, 310′ functionally coupled to the communication subsystem 320 can facilitate multiple communication paths for the on-board sensor 323 for data processing or for communication to an operator via the remote control device 302.
In some embodiments, at least two of the sensors 311, 311′, 323 can be of a common type (e.g., optical sensor type, audio/acoustic sensor type, etc.). In one aspect, sensors of a common type can provide the ability to compare data obtained from different sensors, such as a comparison between the sensor 311 or 311′ of a smart control device with the sensor 323 resident or on-board the robotic device 301. In another aspect, sensors of a common type can be used together to provide enhanced imaging. For example, at least two of the sensors 311, 311′, 323 can be optical sensors (e.g., cameras) to facilitate acquisition of three-dimensional imaging for viewing of images by an operator. In another example, at least two of the sensors 311, 311′, 323 can be audio/acoustic sensors (e.g., microphones) to facilitate acquisition of stereo sound. In yet another aspect, sensors of a common type can be operated independent of one another, which can provide multiple data references (e.g., multiple views when using cameras).
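For the stereo use of two common-type optical sensors, the standard pinhole relation gives depth from pixel disparity: z = f * B / d, where f is the focal length in pixels, B is the baseline between the two cameras, and d is the disparity of a matched feature. The calibration values below are illustrative assumptions:

```python
# Hypothetical depth-from-disparity for two common-type optical sensors
# (e.g., cameras 311 and 311' separated by a known baseline).
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters of a feature seen by both cameras; larger disparity = closer."""
    if disparity_px <= 0:
        raise ValueError("feature must be matched with positive disparity")
    return focal_px * baseline_m / disparity_px


print(stereo_depth(focal_px=800.0, baseline_m=0.12, disparity_px=24.0))  # 4.0 m
```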
When using optical sensors, it may be desirable for the sensor 323 to include a camera having a higher resolution as compared to a camera of the smart control devices 310, 310′. In addition, cameras or camera filters sensitive to desired portions of the electromagnetic spectrum can be associated with the robotic device 301 to obtain data that may not be available to an onboard imaging system of the smart control devices 310, 310′, such as, for example, IR, UV, x-ray, gamma ray, microwave, and the like. Similarly, when sensing audio, such as from an audio pick-up device (e.g., a microphone), it may be desirable for the sensor 323 to include an audio pick-up device that is more sensitive as compared to an audio pick-up device of the smart control devices 310, 310′ for enhanced audio data collection.
The present disclosure further sets forth a method for operating and controlling a robotic device comprising obtaining a robotic device, the robotic device comprising one or more systems or subsystems, each operable to facilitate control of an operational function of the robotic device; operationally coupling or connecting, and supporting on-board the robotic device, one or more non-dedicated smart control devices with at least one of the one or more systems or subsystems of the robotic device, each smart control device providing smart functionality to the robotic device, such as to control an operational function of the robotic device; and operating the smart control device to initiate and control one or more of the operational functions of the robotic device. The method can further comprise operating the robotic device with a remote control device operable to communicate operational information to the local smart control device on-board the robotic device to facilitate remote user control of the robotic device. The operation of coupling or connecting the one or more non-dedicated smart control devices with one or more systems or subsystems of the robotic device can comprise operationally coupling or connecting the smart control device to a communication subsystem of the robotic device to facilitate control by the smart control devices of at least one operational subsystem of the robotic device. For example, the communication subsystem can be in communication and operable with a drive subsystem, a pose control subsystem, a sensor subsystem, or any other system or subsystem, or any combination of these.
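An end-to-end sketch of this method, under stated assumptions, appears below: coupling is modeled as handing the device a reference to the robot's communication subsystem, after which the device initiates and controls an operational function. The classes are illustrative stand-ins, not the disclosed implementation:

```python
# Hypothetical walk-through of the method: obtain robot, couple a non-dedicated
# smart control device to its communication subsystem, then operate it.
class CommSubsystem:
    def command(self, subsystem_id: str, payload: bytes) -> None:
        print(f"{subsystem_id} <- {payload!r}")       # stand-in for real routing


class SmartControlDevice:
    """Non-dedicated controller: stand-alone, removable, plug-and-play."""

    def couple(self, comms: CommSubsystem) -> None:   # operationally couple/connect
        self.comms = comms

    def operate(self) -> None:                        # initiate and control a function
        self.comms.command("drive", b"forward:0.25")


device = SmartControlDevice()
device.couple(CommSubsystem())                        # supported on-board the robot
device.operate()                                      # -> drive <- b'forward:0.25'
```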
In one example, the method can further comprise coupling or connecting a plurality of non-dedicated smart control devices with one or more systems or subsystems on a single robotic device, each of the plurality of smart control devices being operable to function as described herein.
The method can further comprise operating the at least one smart control device within at least one of a computer network or a cellular network, or both of these, wherein the at least one smart control device is operable to communicate with the computer or cellular network. In one example, at least one of the plurality of smart control devices can be operable to share processor computational tasks with a remote processor via the computer or cellular network. In another example, two or more smart control devices can be operable to communicate with each other and to share processor computational tasks with one another.
It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment.
As used herein, the term “substantially” refers to the complete or nearly complete extent or degree of an action, characteristic, property, state, structure, item, or result. For example, an object that is “substantially” enclosed would mean that the object is either completely enclosed or nearly completely enclosed. The exact allowable degree of deviation from absolute completeness may in some cases depend on the specific context. However, generally speaking the nearness of completion will be so as to have the same overall result as if absolute and total completion were obtained. The use of “substantially” is equally applicable when used in a negative connotation to refer to the complete or near complete lack of an action, characteristic, property, state, structure, item, or result.
As used herein, a plurality of items, structural elements, compositional elements, or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no individual member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, various embodiments and examples of the present invention may be referred to herein along with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations of the present invention.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
While the foregoing examples are illustrative of the principles of the present invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details of implementation can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, it is not intended that the invention be limited, except as by the claims set forth below.
11209887 | Jung | Dec 2021 | B1 |
20010037163 | Allard | Nov 2001 | A1 |
20020038168 | Kasuga et al. | Mar 2002 | A1 |
20020128714 | Manasas et al. | Sep 2002 | A1 |
20020140392 | Borenstein et al. | Oct 2002 | A1 |
20020189871 | Won | Dec 2002 | A1 |
20030000747 | Sugiyama et al. | Jan 2003 | A1 |
20030069474 | Couvillon, Jr. | Apr 2003 | A1 |
20030097080 | Esashi et al. | May 2003 | A1 |
20030110938 | Seto et al. | Jun 2003 | A1 |
20030223844 | Schiele et al. | Dec 2003 | A1 |
20040030571 | Solomon | Feb 2004 | A1 |
20040099175 | Perrot et al. | May 2004 | A1 |
20040103740 | Townsend et al. | Jun 2004 | A1 |
20040168837 | Michaud et al. | Sep 2004 | A1 |
20040216931 | Won | Nov 2004 | A1 |
20040216932 | Giovanetti et al. | Nov 2004 | A1 |
20050007055 | Borenstein et al. | Jan 2005 | A1 |
20050027412 | Hobson et al. | Feb 2005 | A1 |
20050085693 | Belson et al. | Apr 2005 | A1 |
20050107669 | Couvillon, Jr. | May 2005 | A1 |
20050115337 | Tarumi | Jun 2005 | A1 |
20050166413 | Crampton | Aug 2005 | A1 |
20050168068 | Courtemanche et al. | Aug 2005 | A1 |
20050168070 | Dandurand | Aug 2005 | A1 |
20050225162 | Gibbins | Oct 2005 | A1 |
20050235898 | Hobson et al. | Oct 2005 | A1 |
20050235899 | Yamamoto et al. | Oct 2005 | A1 |
20050288819 | de Guzman | Dec 2005 | A1 |
20060000137 | Valdivia y Alvarado et al. | Jan 2006 | A1 |
20060005733 | Rastegar et al. | Jan 2006 | A1 |
20060010702 | Roth et al. | Jan 2006 | A1 |
20060010998 | Lloyd et al. | Jan 2006 | A1 |
20060070775 | Anhalt et al. | Apr 2006 | A1 |
20060117324 | Alsafadi et al. | Jun 2006 | A1 |
20060156851 | Jacobsen et al. | Jul 2006 | A1 |
20060225928 | Nelson | Oct 2006 | A1 |
20060229773 | Peretz | Oct 2006 | A1 |
20060290779 | Reverte et al. | Dec 2006 | A1 |
20070029117 | Goldenberg et al. | Feb 2007 | A1 |
20070156286 | Yamauchi | Jul 2007 | A1 |
20070193790 | Goldenberg et al. | Aug 2007 | A1 |
20070260378 | Clodfelter | Nov 2007 | A1 |
20070289786 | Cutkosky et al. | Dec 2007 | A1 |
20070293989 | Norris | Dec 2007 | A1 |
20080115687 | Gal | May 2008 | A1 |
20080136254 | Jacobsen | Jun 2008 | A1 |
20080164079 | Jacobsen | Jul 2008 | A1 |
20080168070 | Naphade et al. | Jul 2008 | A1 |
20080192569 | Ray et al. | Aug 2008 | A1 |
20080215185 | Jacobsen et al. | Sep 2008 | A1 |
20080272647 | Hirose et al. | Nov 2008 | A9 |
20080281468 | Jacobsen et al. | Nov 2008 | A1 |
20080284244 | Hirose et al. | Nov 2008 | A1 |
20090025988 | Jacobsen et al. | Jan 2009 | A1 |
20090035097 | Loane | Feb 2009 | A1 |
20090095209 | Jamieson | Apr 2009 | A1 |
20090171151 | Choset et al. | Jul 2009 | A1 |
20090212157 | Arlton et al. | Aug 2009 | A1 |
20090248202 | Osuka et al. | Oct 2009 | A1 |
20100012320 | Vail, III | Jan 2010 | A1 |
20100030377 | Unsworth | Feb 2010 | A1 |
20100036544 | Mashiach | Feb 2010 | A1 |
20100258365 | Jacobsen | Oct 2010 | A1 |
20100268470 | Kamal et al. | Oct 2010 | A1 |
20100318242 | Jacobsen et al. | Dec 2010 | A1 |
20110288684 | Farlow | Nov 2011 | A1 |
20120185095 | Rosenstein et al. | Jul 2012 | A1 |
20120205168 | Flynn et al. | Aug 2012 | A1 |
20120264414 | Fung | Oct 2012 | A1 |
20120277914 | Crow et al. | Nov 2012 | A1 |
20120292120 | Ben-Tzvi | Nov 2012 | A1 |
20140121835 | Smith | May 2014 | A1 |
20140246287 | Zhang | Sep 2014 | A1 |
20150081092 | Jacobsen et al. | Mar 2015 | A1 |
20150142252 | Hutson | May 2015 | A1 |
20170355405 | Podnar | Dec 2017 | A1 |
20190291277 | Oleynik | Sep 2019 | A1 |
20190358822 | Wojciechowski | Nov 2019 | A1 |
20220019236 | Ben-David | Jan 2022 | A1 |
20230371769 | Kwak | Nov 2023 | A1 |
20240107281 | Siswick | Mar 2024 | A1 |
20240149446 | Tomiie | May 2024 | A1 |
Number | Date | Country |
---|---|---|
2512299 | Sep 2004 | CA |
1603068 | Apr 2005 | CN |
2774717 | Apr 2006 | CN |
1970373 | May 2007 | CN |
3025840 | Feb 1982 | DE |
3626238 | Feb 1988 | DE |
3626328 | Feb 1988 | DE |
19617852 | Oct 1997 | DE |
19714464 | Oct 1997 | DE |
19704080 | Aug 1998 | DE |
10018075 | Jan 2001 | DE |
102004010089 | Sep 2005 | DE |
0105418 | Apr 1984 | EP |
0584520 | Mar 1994 | EP |
0818283 | Jan 1998 | EP |
0924034 | Jun 1999 | EP |
1510896 | Mar 2005 | EP |
1444043 | Dec 2005 | EP |
1832501 | Sep 2007 | EP |
1832502 | Sep 2007 | EP |
2659321 | May 2015 | EP |
2638813 | May 1990 | FR |
2660730 | Oct 1991 | FR |
2850350 | Jul 2004 | FR |
1199729 | Jul 1970 | GB |
51-106391 | Aug 1976 | JP |
52-57625 | May 1977 | JP |
58-89480 | May 1977 | JP |
52-122431 | Sep 1977 | JP |
S58-80387 | May 1983 | JP |
59-139494 | Sep 1984 | JP |
60-15275 | Jan 1985 | JP |
60-47771 | Mar 1985 | JP |
60-60516 | Apr 1985 | JP |
60-139576 | Jul 1985 | JP |
S60-211315 | Oct 1985 | JP |
61-1581 | Jan 1986 | JP |
S61-180885 | Jan 1986 | JP |
S61-20484 | Feb 1986 | JP |
61-54378 | Mar 1986 | JP |
61-75069 | Apr 1986 | JP |
61-89182 | May 1986 | JP |
S62-36885 | Mar 1987 | JP |
62-165207 | Jul 1987 | JP |
62-162626 | Oct 1987 | JP |
S63-32084 | Mar 1988 | JP |
63-306988 | Dec 1988 | JP |
H02-109691 | Apr 1990 | JP |
4-92784 | Mar 1992 | JP |
4-126656 | Apr 1992 | JP |
5-3087 | Jan 1993 | JP |
5-147560 | Jun 1993 | JP |
5-270454 | Oct 1993 | JP |
5-286460 | Nov 1993 | JP |
6-115465 | Apr 1994 | JP |
8-133141 | Nov 1994 | JP |
7-216936 | Aug 1995 | JP |
7-329837 | Dec 1995 | JP |
7-329841 | Dec 1995 | JP |
9-142347 | Jun 1997 | JP |
H11-277466 | Oct 1999 | JP |
11-347970 | Dec 1999 | JP |
2003-019985 | Jan 2003 | JP |
2003-237618 | Aug 2003 | JP |
2003-315486 | Nov 2003 | JP |
2004-080147 | Mar 2004 | JP |
3535508 | Jun 2004 | JP |
2004-536634 | Dec 2004 | JP |
2005-019331 | Jan 2005 | JP |
2005-081447 | Mar 2005 | JP |
2005-111595 | Apr 2005 | JP |
2006-510496 | Mar 2006 | JP |
2007-237991 | Sep 2007 | JP |
2010-509129 | Mar 2010 | JP |
2012-056001 | Mar 2012 | JP |
2001-0001624 | Jan 2001 | KR |
2006-0131167 | Dec 2006 | KR |
WO 8702635 | May 1987 | WO |
WO 9637727 | Nov 1996 | WO |
WO 9726039 | Jul 1997 | WO |
WO 0010073 | Feb 2000 | WO |
WO 0216995 | Feb 2002 | WO |
WO 02095517 | Nov 2002 | WO |
WO 03030727 | Apr 2003 | WO |
WO 03037515 | May 2003 | WO |
WO 2004056537 | Jul 2004 | WO |
WO 2005018428 | Mar 2005 | WO |
WO 2006068080 | Jun 2006 | WO |
WO 2008049050 | Apr 2008 | WO |
WO 2008076194 | Jun 2008 | WO |
WO 2008127310 | Oct 2008 | WO |
WO 2008135978 | Nov 2008 | WO |
WO 2009009673 | Jan 2009 | WO |
WO 2010070666 | Jun 2010 | WO |
WO 2012061932 | May 2012 | WO |
WO 2012125903 | Sep 2012 | WO |
Entry |
---|
Liquidhandwash. (Aug. 26, 2017). Mobile Phone Controlled Robot. Instructables. Retrieved Jan. 19, 2023, from https://www.instructables.com/Mobile-Phone-Controlled-Robot-1/ (Year: 2017). |
Samanfern. (Oct. 20, 2018). Bluetooth Controlled Car. Arduino Project Hub. Retrieved Jan. 19, 2023, from https://projecthub.arduino.cc/samanfern/c71cd04b-79fd-4d0a-8a4b-b1dacc2f7725 (Year: 2018). |
Jithin. (Oct. 1, 2020). How to Make a Smart Phone Controlled Robot? Complete Step by Step Instructions. Rootsaid. Retrieved Jan. 19, 2023, from https://rootsaid.com/smart-phone-controlled-robot/ (Year: 2020). |
Benefits of Encryption and How It Works. Global Payments Integrated. (Mar. 10, 2020). https://www.globalpaymentsintegrated.com/en-us/blog/2020/03/10/the-benefits-of-encryption-and-how-it-works (Year: 2020). |
Strickland, J. (Apr. 9, 2008). How shared computing works. HowStuffWorks. https://computer.howstuffworks.com/shared-computing.htm#:~:text=Shared%20computing%20is%20a%20kind,to%20help%20achieve%20a%20goal. (Year: 2008). |
Advertisement, International Defense Review, Jane's Information Group, Nov. 1, 1990, p. 54, vol. 23, No. 11, Great Britain. |
Arduino, How to Make a Bluetooth Controlled RC Car at Home—Arduino Project Hub, https://create.arduino.cc/projecthub/shubhamsuresh/how-to-make-a-bluetooth-controlled-rc-car-at-home/521212, Sep. 30, 2019, 28 pages, New York, New York. |
Arnold, Henry, "Cricket the Robot Documentation," online manual available at http://www.parallaxinc.com, 22 pages. |
Berlin et al., "MEMS-based control of structural dynamic instability," Journal of Intelligent Material Systems and Structures, Jul. 1998, pp. 574-586, vol. 9. |
Blackburn, et al.; Improved mobility in a multi-degree-of-freedom unmanned ground vehicle; Unmanned Ground Vehicles Technology VI; Sep. 2, 2004; 124-134; Proceedings of SPIE vol. 5422. |
Braure, Jerome, "Participation to the construction of a salamander robot: exploration of the morphological configuration and the locomotion controller," Biologically Inspired Robotics Group, master's thesis, Feb. 17, 2004, pp. 1-46. |
Burg et al.; Anti-Lock Braking and Traction Control Concept for All-Terrain Robotic Vehicles; Proceedings of the 1997 IEEE International Conference on Robotics and Automation; Albuquerque, New Mexico; Apr. 1997; 6 pages. |
Celaya et al.; Control of a Six-Legged Robot Walking on Abrupt Terrain; Proceedings of the 1996 IEEE International Conference on Robotics and Automation; Minneapolis, Minnesota; Apr. 1996; 6 pages. |
Dowling, “Limbless Locomotion: Learning to crawl with a snake robot,” The Robotics Institute at Carnegie Mellon University, Dec. 1997, pp. 1-150. |
Goldfarb, "Design and energetic characterization of a liquid-propellant-powered actuator for self-powered robots," IEEE Transactions on Mechatronics, Jun. 2003, vol. 8, No. 2. |
Hirose, et al., "Snakes and strings: new robotic components for rescue operations," International Journal of Robotics Research, Apr.-May 2004, pp. 341-349, vol. 23, No. 4-5. |
Iagnemma, Karl et al., “Traction control of wheeled robotic vehicles in rough terrain with application to planetary rovers.” International Journal of Robotics Research, Oct.-Nov. 2004, pp. 1029-1040, vol. 23, No. 10-11. |
Jacobsen, et al., "Advanced intelligent mechanical sensors (AIMS)," Proc. IEEE Transducers 1991, Jun. 24-27, abstract only, San Francisco, CA. |
Jacobsen, et al., “Research robots for applications in artificial intelligence, teleoperation and entertainment”, International Journal of Robotics Research, 2004, pp. 319-330, vol. 23. |
Jacobsen, et al., "Multiregime MEMS sensor networks for smart structures," Procs. SPIE 6th Annual Inter. Conf. on Smart Structures and Materials, Mar. 1-5, 1999, pp. 19-32, vol. 3673, Newport Beach, CA. |
MacLean et al., "A digital MEMS-based strain gage for structural health monitoring," Procs. 1997 MRS Fall Meeting Symposium, Nov. 30-Dec. 4, 1997, pp. 309-320, Boston, Massachusetts. |
Mahabatra et al.; "Design and Implementation of Coordinated Multipurpose Robotic System with RF and Light Communication Channels"; paper entirely based on study, research, and experiments. |
Matthew Heverly & Jaret Matthews: "A wheel-on-limb rover for lunar operation," Internet article, Nov. 5, 2008, pp. 1-8, http://robotics.estec.esa.int/i-SAIRAS/isairas2008/Proceedings/SESSION%2026/m116-Heverly.pdf. |
Mehling, et al.; "A Minimally Invasive Tendril Robot for In-Space Inspection"; Biomedical Robotics and Biomechatronics, 2006. |
NASA: "NASA's newest concept vehicles take off-roading out of this world," Internet article, Nov. 5, 2008, http://www.nasa.gov/mission_pages/constellation/main/lunar_truck.html. |
Nilas Sueset et al., "A PDA-based high-level human-robot interaction," Robotics, Automation and Mechatronics, IEEE Conference, Singapore, Dec. 1-3, 2004, vol. 2, pp. 1158-1163. |
Paap et al., “A robot snake to inspect broken buildings,” IEEE, 2000, pp. 2079-2082, Japan. |
Ren Luo, "Development of a multibehavior-based mobile robot for remote supervisory control through the internet," IEEE/ASME Transactions on Mechatronics, IEEE Service Center, Piscataway, NJ, Dec. 1, 2000, vol. 5, No. 4. |
Revue Internationale de Defense, "3-D vision and urchin," Oct. 1, 1988, p. 1292, vol. 21, No. 10, Geneva, CH. |
Schenker, et al.; “Reconfigurable robots for all terrain exploration”; 2000, CIT. |
Simmons et al.; "Coordinated Deployment of Multiple, Heterogeneous Robots"; School of Computer Science, Carnegie Mellon University, Pittsburgh, PA; Honeywell Technology Center, Minneapolis, MN; Intelligent Robot Systems, 2000; vol. 3, pp. 2254-2260. |
Number | Date | Country
---|---|---|
20220203545 A1 | Jun 2022 | US |