The present invention relates to methods and systems for sensing forces (e.g., applied to a soft robotic system).
Over the past decade, the emerging field of soft robotics has shown increasing potential to dramatically expand the capabilities of the field of robotics. Currently, however, most demonstrations have been limited to precisely that: potential. For soft robotics to emerge as a truly useful technology and become widely used in real-world applications, advances are required in actuation, controls, fabrication and system design, and integration with larger rigid systems. Also critically important is the development of novel sensor technologies able to quickly provide robust state information, both as individual sensors and as integrated sensing systems. Soft robots hold the potential for unprecedented levels of state awareness and perception of the surrounding environment, impossible with traditional rigid-linked robots. This innate ability to yield to the environment and to sense and learn from that interaction is one of the biggest potential advantages of soft robots. By embracing this ability to interact, soft robots hold the potential to fundamentally change human-robot interaction and usher in the era of Robots Among Us that has been promised for decades. To achieve this leap forward in state awareness and embodied intelligence, a rethinking of soft sensing is necessary.
Many technologies have been presented to achieve myriad sensing modes in soft robots. Soft sensors (sensors composed of compliant materials, gels, liquids, or a combination of these housed inside a soft robotics component) have been developed using conductive grease [1], capacitive liquid [2], resistive ionic gels [3], waveguides [4], and many demonstrations with liquid metals [5-7], primarily focusing on a eutectic of gallium and indium (EGaIn) [8]. These many sensor technologies can measure changes in length [9], bending [10], pressure [11], even temperature in the distal end of a soft finger [12], and several mixed-mode sensing modalities [2], including an extremely compelling sensor from Park et al. [13], able to sense two modes of stretch as well as pressure in a single sensor. Other sensing techniques used in soft robotics have involved adhering traditional bend sensors to a soft actuator [14], embedded magnets and Hall effect sensors [15], and optical methods including the SOFTcell project by Bajcsy and Fearing [16], in which tactile response was determined through optical analysis of a deformed membrane, and video tracking of markers adhered to or embedded in soft components [17].
While these studies present compelling sensors, further sensor development is necessary, particularly in utilizing a suite of sensors to increase overall state awareness. In both traditional and soft robotics, many groups have studied proprioceptive sensor systems, robot skin, and bioinspired sensing/proprioception. Thorough discussion of these broad fields can be found in several reviews of various subspecialties [18-22]. The value of multi-sensor systems to perceive different proprioceptive or exteroceptive phenomena is widely appreciated. However, as the number of sensors increases, the computation, data acquisition, and signal processing loads dramatically increase. Each sensor requires a dedicated channel to a data acquisition system or an analog-to-digital converter, and requires signal processing and computation. A sensor-skin with a grid of ten-by-ten sensors would be a relatively modest requirement for many applications. Using discrete nodes, this would require one hundred dedicated sensors. Multiplexing by separating signals into ten horizontal and ten vertical sensors reduces the load to twenty separate sensors, still a considerable burden for a single sensor-skin device. What is needed, then, are improved methods for tracking motion and sensing of soft robotic systems. The present disclosure satisfies this need.
Current sensor technologies to sense pressure, force, or various modes of displacement in robotics are largely electrical (resistive or capacitive). Thus, each sensor requires a dedicated analog input to a controller (e.g., for ten sensors, one needs ten analog inputs). Illustrative techniques described herein use displacement and deformation of a sensor (e.g., elastomeric components, fibers, and liquids) to change a state (e.g., a visual state) which is recordable by a digital camera. Robust, low cost digital cameras can record and transmit to a computer/controller megapixels of information at 30 hertz, for example. The method and system are able to harness existing machine-vision technology to dramatically broaden sensing bandwidth in the fields of robotics and soft robotics.
Example embodiments include, but are not limited to, the following.
1. A sensor system, comprising:
a material;
one or more sensors attached to the material, each of the sensors comprising a chamber containing a marker;
a digital imager (e.g., camera) positioned for capturing a series of digital images of the markers as a function of time;
an image processor for image processing the one or more images to detect:
one or more changes in the marker resulting from one or more motions of the chamber in response to one or more forces applied to the material, and
from the changes, a pressure or one or more displacement modes of the material in response to the one or more forces, the displacement modes comprising at least one of a bending mode, an elongation mode, or a twist mode.
2. The sensor system of example 1, wherein:
the chamber comprises a channel containing a cable or fluid capable of moving along the channel in response to the one or more motions, and
the marker comprises a colored portion of the cable or the fluid.
3. The sensor system of example 2, wherein the changes consist essentially of a linear displacement of the colored portion along a coordinate axis.
4. The sensor system of example 3, further comprising a display assembly guiding movement of the markers along the axis in a two-dimensional plane imaged by the digital imager to form the images.
5. The sensor system of example 1, wherein the chamber contains the marker comprising a fluid and the changes consist essentially of a change in size of the marker in response to the motions comprising an expansion or contraction of the chamber.
6. The sensor system of example 1, further comprising a display assembly comprising the markers, wherein the display assembly is outside a region of the material deforming in response to the one or more forces, such that the image processor tracks the changes even when the region is outside a field of view of the digital imager.
7. The sensor system of example 1, further comprising a display assembly comprising the markers and a lighting system, wherein the lighting system controls lighting conditions for the capturing of the images so as to enhance identification of the markers in the images during the image processing.
8. The sensor system of example 1, further comprising a network or array of the sensors (e.g., between 5 and 100 sensors), each of the sensors comprising the chamber transmitting one or more of the motions, or one or more components of the motions, to the markers. In one or more examples, the imager is a single camera or single array of the digital imagers capturing the images each comprising all of the markers.
9. The sensor system of example 8, wherein the image processor assigns each of a plurality of arrangements of the markers, or arrangements of the changes, to a different one of the displacement modes or combination of the displacement modes.
10. The sensor system of example 9, wherein:
the chambers each comprise a channel comprising a first end and a second end,
the first ends are distributed in three dimensions throughout a volume of the material deforming in response to the forces, and
the second ends containing the markers are arranged in a two dimensional plane imaged in the one or more images by the digital camera.
11. The sensor system of example 10, wherein the image processor:
associates each of the markers with locations of the first ends in the material;
determines the linear displacements of each of the markers; and
compares the linear displacements of each of the markers, taking into account the locations of the first ends associated with the each of the markers, so as to detect the displacement mode.
12. The sensor system of example 11, wherein the sensors comprise fibers, cables, or fluid moving in the channels, the first ends are distributed in array, and the markers are configured in a display assembly, so that for the displacement mode comprising:
the bending mode having a center of curvature:
a first set of the markers, attached to the first ends in a first row of the array closest to the center of curvature, have the linear displacement in an opposite direction in the one or more images, as compared to a second set of the markers attached to the first ends in a second row of the array furthest from the center of curvature; and/or
the elongation mode: all the markers have the linear displacement in the same direction in the one or more images,
the twist mode about a central twist axis: a third set of the markers, attached to the first ends at corners of the array furthest from the twist axis, have the linear displacement that is larger in the one or more images as compared to a fourth set of the markers attached to the first ends closer to the twist axis.
14. The sensor system of example 1, further comprising:
a computer comprising one or more processors including the image processor; one or more memories; and one or more programs stored in the one or more memories, wherein the one or more programs executed by the one or more image processors execute the image processing using a machine vision algorithm or machine learning.
15. The sensor system of example 1, wherein:
the marker comprises a colored cable inserted in the chamber comprising a casing, wherein the casing is attached to the material so that the cable is free to slide inside the casing in response to the displacement modes changing a shape of the casing.
16. The sensor system of example 1, wherein the chamber comprises a microfluidic channel comprising a colored fluid comprising the marker and the digital imager records displacement of the colored fluid in response to the force or pressure.
17. The sensor system of example 1, wherein the chamber comprises a channel comprising a compressible sensing part connected to a flexible incompressible transmission part passing through a display assembly, so that when the force is applied to the sensing part through the material, the channel is compressed, reducing a volume of the sensor part and forcing the marker into the transmission part in the display assembly.
18. The sensor system of example 1, wherein the chamber is embedded in or mounted on a surface of the material.
19. The sensor system of example 1, further comprising:
a display assembly comprising a window forming a boundary around each of the markers, the boundary delimiting an extent of an image frame for each of the series of images being processed by the image processing, wherein, for each image frame, the image processing:
obtains the image comprising image data;
crops the image frame to include only a portion of the image within the boundary;
converts the image data to gray scale to accentuate differences in light and dark colors and to eliminate possible noise from reflection;
scales up every pixel value within the image frame to further accentuate the difference between the marker and a white background behind the marker;
detects a line edge of each of the markers using an edge detector algorithm;
returns at least one end point pixel of each of the line edges using a probability algorithm;
uses the end point pixel of each of the line edges to calculate the change comprising a displacement of the marker between successive ones of the image frames.
20. The sensor system of example 1, further comprising a tool comprising the material, wherein the image processor:
detects, from the changes, the pressure or the one or more displacement modes of the component in response to the one or more forces, and outputs a measure of the one or more displacement modes as proprioceptive feedback to a robotic system controlling the tool.
21. In one embodiment, the system and method senses displacement modes including bending, elongation, and twist using machine vision and encased cables. In another embodiment, the system and method senses pressure and force on a surface using machine vision of a fluid-filled tube and displacement of the enclosed fluid. In yet a further embodiment, the system and method senses force and pressure on a surface using machine vision to observe shape change of one or more liquid or elastomeric dots inside an elastomer.
Referring now to the drawings in which like reference numbers represent corresponding parts throughout:
In the following description of the preferred embodiment, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration a specific embodiment in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
Technical Description
Disclosed herein are methods and systems for measuring a force by capturing images of the deformations of a sensor and processing the images.
The fiber-based deformation sensor illustrated in
The sensor can be used individually or in groups to synergistically leverage the concepts from beam theory and mechanics of materials to infer system state from a strategically located system of sensors. A system comprising a soft robot comprising a properly configured array of these deformation and pressure sensors can provide state awareness far beyond that of individual sensors.
Euler-Bernoulli beam theory and classical mechanics of materials describe that beams experience stress and tension/compression throughout their cross-sections based on the mode of the applied loading (bending, tension/compression, twist, combined loading) [33].
For simple elongation, deformation is uniform across the cross-section and proportional to the applied load:

δ = PL/(AE) (1)

where δ is total displacement, P is applied load, L is total beam length, A is cross-section area, and E is Young's modulus.
In bending, material closer to the center of curvature (smaller bend radius) experiences compression, material farther from the center of curvature (larger bend radius) experiences tension, and material along the neutral axis experiences neither tension nor compression. Within the linear elastic range, stress from bending follows the equation

σx = −My/I (2)

where σx is tensile or compressive stress, M is applied bending moment, y is the distance from the neutral surface (positive toward the center of curvature), and I is the second moment of inertia. The negative sign indicates compression toward the center of bending. Strain follows the equation

ϵx = −y/ρ (3)
where ϵx is the strain in the beam axis, y is the distance from the neutral surface (positive toward the center of curvature), and ρ is the radius of curvature of the bent beam. The negative indicates shortening toward the center of curvature.
Shearing stress due to torsion follows the equation

τ = Tρ/J (4)

where τ is shear stress, T is applied torque, ρ is distance from the axis of rotation, and J is polar moment of inertia. The angle of twist follows the equation

ϕ = TL/(JG) (5)
where ϕ is the total twist of the beam, L is beam length, J is the polar moment of inertia, and G is the shear modulus. We can find the change in length of a line (linear initially, helical after twist) parallel to the axis of the beam, a distance r from the twist axis. Initially of length L, the line becomes a helix after the beam twists by an angle ϕ about its central axis. The helix length is found from the formula

Lhelix = √(L² + (ϕr)²) (6)
where Lhelix is the length of the helix, ϕ is the angle of twist found above, L is the beam length, and r is the distance from the twist axis (see [23] for derivation). Thus, the change in length of a fiber parallel to the longitudinal axis is found to be:
ΔL = Lhelix − L (7)
where ΔL is the change in length. With L and ϕ constant for any given beam and loading condition, we see that Lhelix increases as r increases. Thus, the farther an element is from the axis of rotation, the more it will increase in length when experiencing a twist. Fibers in the corners of a square cross-section will therefore experience more displacement than fibers at the midpoints of the square's faces, and a fiber on the central twist axis (r = 0) will not elongate at all.
While these formulae hold for beams within the linear elastic region, the principles (while not necessarily the magnitudes) remain true in the large deformation regime. See [23] for details on the mechanics of materials described here, further figures on bending modes, and sign conventions [33].
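To make these relations concrete, the following minimal Python sketch evaluates Eqs. (1), (6), and (7) for a 3×3 fiber grid; the finger length, grid pitch, load, stiffness, and twist angle are hypothetical values chosen purely for illustration.

```python
import numpy as np

# Hypothetical parameters for a square elastomeric finger (illustration only).
L = 100.0               # beam length, mm
pitch = 6.0             # spacing of the 3x3 fiber grid, mm
phi = np.deg2rad(30.0)  # applied twist angle, rad

# Eq. (1): uniform elongation, identical for every fiber in the grid.
P, A, E = 5.0, 15.0**2, 0.1  # load (N), cross-section (mm^2), modulus (N/mm^2)
print("elongation, all fibers:", P * L / (A * E), "mm")

# Eqs. (6)-(7): twist-induced length change of a fiber at distance r
# from the central twist axis.
ys, zs = np.meshgrid([-pitch, 0.0, pitch], [-pitch, 0.0, pitch])
r = np.sqrt(ys**2 + zs**2)
dL = np.sqrt(L**2 + (phi * r)**2) - L
print("twist-induced displacements (mm):\n", np.round(dL, 3))
# Corner fibers (largest r) displace most, face-midpoint fibers less, and
# the center fiber (r = 0) not at all, matching the discussion above.
```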
While each fiber sensor provides local deformation information, significantly more information can be obtained from groups of the devices integrated at scale without undue hardware requirements.
Both microfluidic methods transmit to the same display assembly used to record fiber position, so a single digital camera can capture data from fiber-based deformation sensors as well as microfluidic pressure sensors. The presented configuration records eleven sensors (nine fiber, one integrated microfluidic, and one surface-mount microfluidic) captured by one digital camera, as that was sufficient for this proof of concept. Reducing the scale could greatly increase the number of discrete sensors possible with one camera.
a. Fiber-Based Deformation Sensor
Similar in concept to many soft robots, a square column-shaped soft sensor (elastomeric finger) is fabricated using multiple molding steps.
Mold 1 (first molding step). Three plastic bars (diameter 0.9 mm) were used to create the center cable chamber and two microfluidic chambers. The matrix material of the finger was a readily available elastomer, Ecoflex 00-30 (Smooth-On, Inc. Macungie, Pa., USA) in molds printed from a 3D Printer (Form 3, Formlabs, Somerville, Mass., USA).
Mold 2 (second molding step). Retaining the center plastic bar in the mold, the two other plastic bars were demolded. Silicone tube (inner diameter 0.5 mm, outer diameter 1 mm) was used to connect the microfluidic chambers at the top holes and extend the bottom holes. Then, the top carrier was attached to the center plastic bar and aligned with the other eight plastic bars into the second mold. The top carrier embedded in the soft sensor provides a surface to fix the cables.
Mold 3 (third molding step). The soft sensor was demolded from the second mold, keeping all the plastic bars and the two silicone tubes inside the sensor, and then aligned to the base holder. After alignment, the sensor was secured into the final mold, and the finger was connected to the solid base holder once cured.
Integration. The high-strength fiber cables (Monofilament nylon thread, diameter 0.5 mm) were inserted into the soft sensor chambers and fixed using screws on the top carrier. The colored liquid was then injected into the microfluidic chambers.
For illustrative purposes, the fabrication of the deformation sensor is described along with the integrated microfluidic pressure sensor because the device should be fabricated concurrently into one integrated unit. However, the deformation sensor can also be fabricated without the integrated microfluidic pressure sensor. Moreover, the integrated pressure sensor in the finger motif can also be designed into most actuator systems that use a matrix of molded elastomer.
b. Microfluidic Pressure Sensor
While this integrated sensor described above provides useful overall pressure of the soft finger, a surface-mount microfluidic pressure sensor was also developed to expand sensing capabilities (
c. Color Cell Pressure Sensor
Chromatophore cells filled with pigment appear as small dark dots. To change perceived color, radial muscle fibers stretch the chromatophore cell from roughly spherical to a wide-thin disk shape of the same volume. Thus, when viewed from an axis normal to the disk-plane, the appearance changes from a small, dark dot in a near-transparent matrix to a larger colored disk. An array of these chromatophores in various colors allows the animal to present a variety of appearances. While cephalopods use their chromatophore cells to actively modulate their appearance (camouflage), the present invention uses passive cells as sensors. Fabricated into an elastomeric matrix, external pressure causes these spherical cells to deform into disks in a plane normal to the applied force. When viewed from an axis normal to the disk plane, the diameter of the disk increases with applied force.
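As a rough geometric sketch of this principle (assuming the cell liquid is incompressible and idealizing the deformed cell as a flat cylinder of equal volume), the apparent diameter seen by the camera grows monotonically as pressure thins the cell:

```python
import numpy as np

# Idealized sphere-to-disk geometry for a passive color cell. Assumptions:
# the colored liquid is incompressible, and the deformed cell is modeled
# as a flat cylinder of the same volume as the original sphere.
r0 = 1.0                                   # undeformed cell radius (arbitrary units)
V = (4.0 / 3.0) * np.pi * r0**3            # conserved cell volume

for t in (1.0 * r0, 0.5 * r0, 0.25 * r0):  # disk thickness as pressure increases
    d = 2.0 * np.sqrt(V / (np.pi * t))     # from pi * (d/2)^2 * t = V
    print(f"thickness {t:.2f} -> apparent diameter {d:.2f}")
# The diameter viewed along the force axis increases as the cell is
# compressed, which is the quantity the vision system tracks.
```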
An algorithm was designed to process two different possible image stream inputs: a real-time camera stream, or a previously recorded video. Real-time processing was implemented using a video stream from a Raspberry Pi Camera Module 2, with the constraint of the camera being aligned such that the painted filaments are approximately parallel to the horizontal axis. The videos recorded on a separate device were filmed with the same constraint. To address alignment issues across multiple runs, boundaries were digitally positioned around each of the channels with the filaments in the camera frame (current frame for live stream, first frame for recorded videos) before beginning the algorithm.
The OpenCV Python library for image processing was used to facilitate detection in each frame. Each frame was first cropped to include only the boundaries and then converted to grayscale to accentuate differences in light and dark colors and to eliminate possible noise from reflection. Every pixel value within the frame was then scaled up to further accentuate the difference between the white background and the black filaments. The Canny edge detector algorithm was then used to determine the edges of the filaments, and the probabilistic Hough lines algorithm was used to return the start and end point pixel coordinates of each line edge. The algorithm then iterated over each detected line, and the endpoint furthest to the right within each boundary was recorded as a pixel location in a CSV file. Further details are provided in Appendix B.
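A minimal Python sketch of this per-frame pipeline is shown below; the boundary coordinates, contrast scale factor, and Canny/Hough parameters are illustrative assumptions that would be tuned to the actual display assembly, not the characterized values.

```python
import csv

import cv2
import numpy as np

# One (y0, y1, x0, x1) boundary per filament channel; hypothetical values.
boundaries = [(10, 40, 20, 300)]

def detect_endpoints(frame, boundaries):
    """Return the rightmost detected line endpoint within each boundary."""
    rows = []
    for (y0, y1, x0, x1) in boundaries:
        roi = frame[y0:y1, x0:x1]                    # crop to the boundary
        gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY) # grayscale cuts reflection noise
        gray = cv2.convertScaleAbs(gray, alpha=1.5)  # scale pixel values for contrast
        edges = cv2.Canny(gray, 50, 150)             # Canny edge detection
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=20,
                                minLineLength=15, maxLineGap=5)
        best = None
        if lines is not None:
            for x_a, y_a, x_b, y_b in lines[:, 0]:
                for x, y in ((x_a, y_a), (x_b, y_b)):
                    if best is None or x > best[0]:
                        best = (x, y)                # rightmost endpoint so far
        rows.append(best)
    return rows

cap = cv2.VideoCapture(0)  # live stream; pass a filename for recorded video
with open("endpoints.csv", "w", newline="") as f:
    writer = csv.writer(f)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        writer.writerow(detect_endpoints(frame, boundaries))
cap.release()
```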
a. Measurement Setup
The performance of the elastomeric finger containing fiber-based displacement sensors and a fluid-based pressure sensor was evaluated in each actuation mode individually using the real-time vision algorithms described above, although many applications may require mixed-mode sensing (elongation and twist combined, or bending along a non-primary axis). For the fiber-based sensor, separate characterization fixtures were employed for each mode of evaluation (bending, elongation, twist). As shown in
b. Results
Data were divided into fiber-based deformation sensors (estimating soft finger displacement) and fluid-based pressure sensors (microfluidic and color cells). Fiber-based sensor characterization investigates displacement of a 3×3 grid of fibers as described in the Methods section.
(i) Fiber-Based Deformation Sensor
As the elastomeric finger undergoes displacement in the described mode, material distorts locally, consistent with theory from classical mechanics of materials [23]. Fibers, attached at the distal end of the elastomeric finger, are free to move inside their respective tubes; thus they do not elongate or compress. Rather, they move along their tubes and back through the display assembly. Thus, when the finger undergoes Bending direction 1 (
When the finger was rotated 90° and Bending direction 2 was investigated (
Tests in elongation (
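The qualitative signatures described above (opposite rows moving in opposite directions in bending, uniform motion in elongation, corner-dominated motion in twist) suggest a simple mode classifier. The sketch below is a minimal illustration; the grid ordering, sign convention, and noise threshold are assumptions rather than the characterized values.

```python
import numpy as np

# Classify the displacement mode from nine signed fiber displacements (mm),
# ordered row by row across the bending plane of the 3x3 grid.
def classify_mode(d, tol=0.05):
    d = np.asarray(d, dtype=float).reshape(3, 3)
    if np.all(np.abs(d - d.mean()) < tol) and abs(d.mean()) > tol:
        return "elongation"      # all fibers move together (Eq. 1)
    if d[0].mean() * d[2].mean() < -tol**2:
        return "bending"         # opposite rows move in opposite directions
    corners = np.abs(d[[0, 0, 2, 2], [0, 2, 0, 2]]).mean()
    if corners > abs(d[1, 1]) + tol:
        return "twist"           # corner fibers displace most (Eq. 7)
    return "indeterminate"

print(classify_mode([0.8, 0.9, 0.8, 0.0, 0.1, 0.0, -0.8, -0.9, -0.8]))  # bending
```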
(ii) Hysteresis
The hysteresis loop (wherein the actuation path does not overlay with release path, but instead creates a loop in bend angle, elongation, or twist vs fiber displacement) was also studied. If the hysteresis were due to the internal properties of the fiber sensors, it would not negate the value of the sensing system, but it should be addressed. Analysis of the still frames from the motion capture videos indicated that the actuation and release paths of the elastomeric finger do not trace out a similar path. In other words, the shape of the elastomeric finger is different at a given angle in the actuation (0°→90°) path than in the release (90°→0°) path. Thus, it would be expected that the fibers sense different finger geometry based on the path.
(iii) Microfluidic Pressure Sensors
The elastomeric finger was configured with an integrated microfluidic pressure sensor along its entire length. Consisting of a liquid-filled microfluidic channel, this sensor was intended to sense the overall pressure state in the elastomeric finger. Thus, repeatability and range are highly desirable; however, maximizing sensitivity (the ability to perceive a light touch) was not required for this sensor.
An example elastomeric finger described herein comprises nine fiber sensors to determine its pose and two fluidic sensors to determine overall and local pressure states. By configuring fibers in a 3×3 matrix, classical mechanics of materials (see [23]) was used to determine pose during states of bending in both primary planes, twist about the primary axis, and elongation along the primary axis. While this example used a finger designed specifically to illustrate adherence to classical mechanics-of-materials theory, this state estimation could be applied to a range of soft actuators and soft robots in general. The technique can also be implemented in actuator designs similar to the soft finger described in [15,39] with a roughly square cross-section. These fiber and fluidic sensors could be used in many soft robots with actuators having rectangular, round, or trapezoidal cross-sections, requiring sensors to be placed based on beam theory for that cross-section. With their innate under-actuation and deformability, defining the pose of a soft robot with reasonable accuracy requires far more sensors than traditional robots require. One can readily imagine a soft robot requiring nine sensors (a 3×3 matrix) for each actuator to estimate its pose. Thus, using conventional technology, a three-fingered gripper would require 27 sensors, a simple quadruped would require 36, and a more complex robot would require many more. The circuitry and wiring required for this many discrete electrical sensors would quickly become burdensome. With the method and system according to embodiments presented herein, passive sensors are all routed back to one central display assembly and recorded by one digital camera. While in the example shown the number of sensors in the display assembly (11: nine deformation and two pressure sensors) was chosen because it was the number required to characterize the soft finger, other sensor numbers can be used. Any upgrading (to increase sampling frequency or resolution) could be contained to the camera system, while upgrading dozens of electrical sensors (as in conventional systems) would be a comparatively sizeable task. As illustrated herein, many fibers could be routed back to one remote display assembly, where a single digital camera can track the motion of all markers in a controlled environment, optimally lit for contrast and marker tracking.
Example Hardware Environment
In one embodiment, the computer 702 operates by the hardware processor 704A performing instructions defined by the computer program 710 under control of an operating system 708. The computer program 710 and/or the operating system 708 may be stored in the memory 706 and may interface with the user and/or other devices to accept input and commands and, based on such input and commands and the instructions defined by the computer program 710 and operating system 708, to provide output and results. Output/results may be presented on the display 722 or provided to another device for presentation or further processing or action. In one embodiment, the display 722 comprises a liquid crystal display (LCD) having a plurality of separately addressable liquid crystals. Alternatively, the display 722 may comprise a light emitting diode (LED) display having clusters of red, green and blue diodes driven together to form full-color pixels. Each liquid crystal or pixel of the display 722 changes to an opaque or translucent state to form a part of the image on the display in response to the data or information generated by the processor 704 from the application of the instructions of the computer program 710 and/or operating system 708 to the input and commands. The image may be provided through a graphical user interface (GUI) module 718. Although the GUI module 718 is depicted as a separate module, the instructions performing the GUI functions can be resident or distributed in the operating system 708, the computer program 710, or implemented with special purpose memory and processors.
In one or more embodiments, the display 722 is integrated with/into the computer 702 and comprises a multi-touch device having a touch sensing surface (e.g., track pod or touch screen) with the ability to recognize the presence of two or more points of contact with the surface. Examples of multi-touch devices include mobile devices (e.g., IPHONE, NEXUS S, DROID devices, etc.), tablet computers (e.g., IPAD, HP TOUCHPAD, SURFACE Devices, etc.), portable/handheld game/music/video player/console devices (e.g., IPOD TOUCH, MP3 players, NINTENDO SWITCH, PLAYSTATION PORTABLE, etc.), touch tables, and walls (e.g., where an image is projected through acrylic and/or glass, and the image is then backlit with LEDs).
Some or all of the operations performed by the computer 702 according to the computer program 710 instructions may be implemented in a special purpose processor 704B. In this embodiment, some or all of the computer program 710 instructions may be implemented via firmware instructions stored in a read only memory (ROM), a programmable read only memory (PROM) or flash memory within the special purpose processor 704B or in memory 706. The special purpose processor 704B may also be hardwired through circuit design to perform some or all of the operations to implement the present invention. Further, the special purpose processor 704B may be a hybrid processor, which includes dedicated circuitry for performing a subset of functions, and other circuits for performing more general functions such as responding to computer program 710 instructions. In one embodiment, the special purpose processor 704B is an application specific integrated circuit (ASIC).
The computer 702 may also implement a compiler 712 that allows an application or computer program 710 written in a programming language such as C, C++, Assembly, SQL, PYTHON, PROLOG, MATLAB, RUBY, RAILS, HASKELL, or other language to be translated into processor 704 readable code. Alternatively, the compiler 712 may be an interpreter that executes instructions/source code directly, translates source code into an intermediate representation that is executed, or that executes stored precompiled code. Such source code may be written in a variety of programming languages such as JAVA, JAVASCRIPT, PERL, BASIC, etc. After completion, the application or computer program 710 accesses and manipulates data accepted from I/O devices and stored in the memory 706 of the computer 702 using the relationships and logic that were generated using the compiler 712.
The computer 702 also optionally comprises an external communication device such as a modem, satellite link, Ethernet card, or other device for accepting input from, and providing output to, other computers 702.
In one embodiment, instructions implementing the operating system 708, the computer program 710, and the compiler 712 are tangibly embodied in a non-transitory computer-readable medium, e.g., data storage device 720, which could include one or more fixed or removable data storage devices, such as a zip drive, floppy disc drive 724, hard drive, CD-ROM drive, tape drive, etc. Further, the operating system 708 and the computer program 710 are comprised of computer program 710 instructions which, when accessed, read and executed by the computer 702, cause the computer 702 to perform the steps necessary to implement and/or use the present invention or to load the program of instructions into a memory 706, thus creating a special purpose data structure causing the computer 702 to operate as a specially programmed computer executing the method steps described herein. Computer program 710 and/or operating instructions may also be tangibly embodied in memory 706 and/or sensor system 730, 100, thereby making a computer program product or article of manufacture according to the invention. As such, the terms “article of manufacture,” “program storage device,” and “computer program product,” as used herein, are intended to encompass a computer program accessible from any computer readable device or media.
Of course, those skilled in the art will recognize that any combination of the above components, or any number of different components, peripherals, and other devices, may be used with the computer 702.
A network 804 such as the Internet connects clients 802 to server computers 806. Network 804 may utilize ethernet, coaxial cable, wireless communications, radio frequency (RF), etc. to connect and provide the communication between clients 802 and servers 806. Further, in a cloud-based computing system, resources (e.g., storage, processors, applications, memory, infrastructure, etc.) in clients 802 and server computers 806 may be shared by clients 802, server computers 806, and users across one or more networks. Resources may be shared by multiple users and can be dynamically reallocated per demand. In this regard, cloud computing may be referred to as a model for enabling access to a shared pool of configurable computing resources.
Clients 802 may execute a client application or web browser and communicate with server computers 806 executing web servers 810. Such a web browser is typically a program such as MICROSOFT INTERNET EXPLORER/EDGE, MOZILLA FIREFOX, OPERA, APPLE SAFARI, GOOGLE CHROME, etc. Further, the software executing on clients 802 may be downloaded from server computer 806 to client computers 802 and installed as a plug-in or ACTIVEX control of a web browser. Accordingly, clients 802 may utilize ACTIVEX components/component object model (COM) or distributed COM (DCOM) components to provide a user interface on a display of client 802. The web server 810 is typically a program such as MICROSOFT'S INTERNET INFORMATION SERVER.
Web server 810 may host an Active Server Page (ASP) or Internet Server Application Programming Interface (ISAPI) application 812, which may be executing scripts. The scripts invoke objects that execute business logic (referred to as business objects). The business objects then manipulate data in database 816 through a database management system (DBMS) 814. Alternatively, database 816 may be part of, or connected directly to, client 802 instead of communicating/obtaining the information from database 816 across network 804. When a developer encapsulates the business functionality into objects, the system may be referred to as a component object model (COM) system. Accordingly, the scripts executing on web server 810 (and/or application 812) invoke COM objects that implement the business logic. Further, server 806 may utilize MICROSOFT'S TRANSACTION SERVER (MTS) to access required data stored in database 816 via an interface such as ADO (Active Data Objects), OLE DB (Object Linking and Embedding DataBase), or ODBC (Open DataBase Connectivity).
Generally, these components 800-816 all comprise logic and/or data that is embodied in/or retrievable from device, medium, signal, or carrier, e.g., a data storage device, a data communications device, a remote computer or device coupled to the computer via a network or via another data communications device, etc. Moreover, this logic and/or data, when read, executed, and/or interpreted, results in the steps necessary to implement and/or use the present invention being performed.
Although the terms “user computer”, “client computer”, and/or “server computer” are referred to herein, it is understood that such computers 802 and 806 may be interchangeable and may further include thin client devices with limited or full processing capabilities, portable devices such as cell phones, notebook computers, pocket computers, multi-touch devices, and/or any other devices with suitable processing, communication, and input/output capability.
Of course, those skilled in the art will recognize that any combination of the above components, or any number of different components, peripherals, and other devices, may be used with computers 802 and 806. Embodiments of the invention are implemented as a software application on a client 802 or server computer 806. Further, as described above, the client 802 or server computer 806 may comprise a thin client device or a portable device that has a multi-touch-based display.
In one or more examples, the one or more processors, memories, and/or computer executable instructions are specially designed, configured or programmed for performing machine learning or machine vision. The computer program instructions may include a pattern matching component for pattern recognition or applying a machine learning model (e.g., for analyzing data or training data input from a data store to perform the machine vision). In one or more examples, the processors may comprise a logical circuit for performing pattern matching or recognition, or for applying a machine learning model for analyzing data or training data input from a memory/data store or other device (e.g., an image from a camera). The data store/memory may include a database. In some examples, the pattern matching model applied by the pattern matching logical circuit may be a machine learning model, such as a convolutional neural network, a logistic regression, a decision tree, or other machine learning model. In one or more examples, the logical circuit comprises a semantic segmentation logical circuit, a natural language processing/image captioning logical circuit, and an image reconstruction logical circuit.
The computer can be an embedded computer or processor, for example.
Example Process Steps
Block 900 represents fabricating or obtaining one or more sensors each comprising a chamber containing a marker.
Block 902 represents attaching the one or more sensors to a material or a tool (e.g., finger, arm) comprising the material.
Block 904 represents positioning/coupling a digital imager (e.g., a digital camera, charge coupled device (CCD), focal plane array) for capturing a series of digital images of the markers as a function of time.
Block 906 represents connecting an image processor for image processing the one or more images. The image processor is configured to detect:
one or more changes in the marker resulting from one or more motions of the chamber in response to one or more forces applied to the material, and
from the changes, a pressure or one or more displacement modes of the material in response to the one or more forces, the displacement modes comprising at least one of a bending mode, an elongation mode, or a twist mode.
Block 908 represents the end result, a sensor system. Embodiments include, but are not limited to, the following (referring also to
1. A sensor system 100, comprising:
a material 102;
one or more sensors 104 attached to the material 102, each of the sensors 104 comprising a chamber 106 containing a marker 108;
a digital imager 110 positioned for capturing one or more (or a series of) digital images 113 of the markers 108 as a function of time;
an image processor 700 for image processing the one or more images 113 to detect:
one or more changes 112 in the marker 108 resulting from one or more motions of the chamber 106 in response to one or more forces F applied to the material 102, and
from the changes 112, a pressure or one or more displacement modes of the material 102 in response to the one or more forces F, the displacement modes comprising at least one of a bending mode, an elongation mode, or a twist mode.
2. A proprioceptive sensor 100 for a soft robotic finger 190 or arm, comprising a microchannel system coupled to a machine vision system that senses the state of the arm or finger 190 (e.g., whether the arm or finger is being compressed, elongated lengthwise or from one of the sides, twisted, or bent) by analyzing images of the deformation of the arm or finger.
3. A sensor system 100, comprising:
a marker 108 attached to a compliant member 114 in a soft robot;
a digital camera 110 positioned to capture a series of images 113 of the marker 108 as a function of time as the compliant member 114 is displaced or subjected to a force F or pressure P; and
a computer 700 comprising one or more processors; one or more memories; and one or more programs stored in the one or more memories, wherein the one or more programs executed by the one or more processors execute a machine vision algorithm:
identifying a change 112 in the marker 108 as recorded in the images 113; and
measuring or quantifying, from the change 112, at least one of a displacement mode of the compliant member 114 or a pressure/force applied to the compliant member.
4. The sensor system 100 of example 1, wherein:
the chamber 106 comprises a channel 116 containing a cable 118 or fluid 120 capable of moving along the channel 116 in response to the one or more motions M, and
the marker comprises a colored portion 122 of the cable or the fluid.
5. The sensor system of example 1 or 4, wherein the changes 112 consist essentially of a linear displacement 124 of the colored portion 122 along a coordinate axis 126.
6. The sensor system of any of the examples 1, 3, 4-5, further comprising a display assembly 128 guiding movement of the markers 108 along the axis 126 in a two-dimensional plane 130 imaged by the digital imager 110 to form the images 113.
7. The sensor system of any of the examples 1 or 4-6, wherein the chamber 106 contains the marker 108 comprising a fluid 120 and the changes consist essentially of a change in size or area A of the marker 108 in response to the motions comprising an expansion or contraction of the chamber 106.
8. The sensor system of any of the examples 1, 3, 4-7, further comprising a display assembly 128 comprising the markers 108, wherein the display assembly 128 is outside a region 132 of the material 102 deforming in response to the one or more forces F, such that the image processor 700 tracks the changes 112 even when the region 132 is outside a field of view 134 of the digital imager 110.
9. The sensor system of any of the examples 1, 3, 4-8 further comprising a display assembly 128 comprising the markers 108 and a lighting system, wherein the lighting system controls lighting conditions for the capturing of the images 113 so as to enhance identification of the markers 108 in the images 113 during the image processing.
10. The sensor system of any of the examples 1-9, further comprising a network 135 or array of the sensors 104, each of the sensors 104 comprising the chamber 106 transmitting the one or more of the motions M, or one or more components of the motions, to the markers 108.
11. The sensor system of any of the examples 1 or 3-10, wherein the image processor 700 assigns each of a plurality of arrangements 136 of the changes (e.g., linear displacements) of all the markers 108 to a different one of the displacement modes or combination of the displacement modes.
12. The sensor system of any of the examples 1 or 4-11, wherein:
the chambers 106 each comprise a channel 116 comprising a first end 138 and a second end 140,
the first ends 138 are distributed in three dimensions throughout a volume of the material 102 deforming in response to the forces F, and
the second ends 140 containing/comprising the markers 108 are arranged in a two dimensional plane 130 imaged in the one or more images 113 by the digital camera 110.
13. The sensor system of example 12, wherein the image processor 700:
associates each of the markers 108 with locations of the first ends 138 in the material 102;
determines the linear displacements 124 of each of the markers 108; and
compares the changes 112 (e.g., linear displacements 124) of each of the markers 108, taking into account the locations of the first ends 138 associated with the each of the markers 108, so as to detect the displacement mode.
14. The sensor system of any of the examples 10-13, wherein the sensors 104 comprise fibers, cables 118, or fluid 120 moving in the channels 116, the first ends 138 are distributed in array 142, and the markers 108 are configured in a display assembly 128, so that for the displacement mode comprising:
the bending mode having a center of curvature:
a first set 144 of the markers 108, attached to the first ends 138 in a first row 146 of the array 142 closest to the center of curvature, have the linear displacement 124 in an opposite direction 147 in the one or more images 113, as compared to a second set 148 of the markers attached to the first ends 138 in a second row 150 of the array furthest from the center of curvature; and/or
the elongation mode: all the markers 108 have the linear displacement 124 in the same direction in the one or more images 113,
the twist mode about a central twist axis 159: a third set 152 of the markers 108, attached to the first ends at corners 154 of the array 142 furthest from the twist axis, have the linear displacement 124 that is larger in the one or more images 113 as compared to a fourth set 155 of the markers attached to the first ends 138 closer to the twist axis 159.
15. The sensor system of any of the examples 1 or 4-14, further comprising:
a computer 700 comprising one or more processors including the image processor; one or more memories; and one or more programs stored in the one or more memories, wherein the one or more programs executed by the one or more image processors execute the image processing using a machine vision algorithm or machine learning.
16. The sensor system 100 of any of the examples 1 or 4-15, wherein:
the marker 108 comprises a colored cable 300 inserted in the chamber 106 comprising a casing 302, wherein the casing 302 is attached to the material 102 so that the cable 118 is free to slide inside the casing 302 in response to the displacement modes changing a shape of the casing 302.
17. The sensor system of any of the examples 1 or 4-15, wherein the chamber 106 comprises a microfluidic channel 116 comprising a colored fluid 127 comprising the marker 108 and the digital imager 110 records displacement of the colored fluid 127 in response to the force F or pressure P.
18. The sensor system of any of the examples 1 or 3-15, wherein the chamber 106 comprises a channel 116 comprising a compressible sensing part connected to a flexible incompressible transmission part passing through a display assembly 128, so that when the force is applied to the sensing part through the material, the channel is compressed, reducing a volume of the sensor part and forcing the marker into the transmission part in the display assembly.
19. The sensor system of any of the examples 1-18, wherein the chamber 106 is embedded in or mounted on a surface 160 of the material.
20. The sensor system of any of the examples 1-19, further comprising:
a display assembly 128 comprising a window 170 forming a boundary 172 around each of the markers, the boundary delimiting an extent of an image frame 174 for each of the series of images being processed by the image processing, wherein, for each image frame, the image processing:
obtains the image comprising image data;
crops the image frame to include only a portion of the image within the boundary;
converts the image data to gray scale to accentuate differences in light and dark colors and to eliminate possible noise from reflection;
scales up every pixel value within the image frame to further accentuate the difference between the marker and a white background behind the marker;
detects a line edge 176 of each of the markers using an edge detector algorithm;
returns at least one end point pixel 178 of each of the line edges using a probability algorithm;
uses the end point pixel of each of the line edges to calculate the change comprising a displacement of the marker between successive ones of the image frames.
21. The sensor system of any of the examples 1 or 4-20, further comprising a tool 190 comprising the material 102, wherein the image processor 700:
detects, from the changes 112, the pressure or the one or more displacement modes of the component in response to the one or more forces F, and
outputs a measure of the one or more displacement modes as proprioceptive feedback to a robotic system controlling the tool.
22. The sensor system of any of the examples 1-21, wherein:
the marker comprises a plurality of colored cables 300 each inserted in a casing 302, wherein the casings 302 are attached to the compliant member so that one or more of the cables are free to slide inside their respective casing in response to the displacement modes changing a shape of the respective casings.
23. The sensor system of example 3, wherein the soft robot further comprises a display assembly attached to the compliant member and the digital camera is positioned to capture the images of the cables moving in the display assembly.
24. The sensor system of any of the examples, wherein the displacement modes comprise elongation along and twist about a longitudinal axis, or bending about two orthogonal axes perpendicular to the longitudinal axis.
25. The sensor system of example 3, wherein the compliant member includes microfluidic channels comprising a colored liquid comprising the marker and the digital camera records displacement of the colored fluid in response to the force or pressure.
26. The sensor system of example 3, wherein the markers comprise liquid or elastomeric dots 200 and the machine vision algorithm identifies the change comprising a change in shape of the dots 200 in response to the force or pressure so as to quantify the pressure or the force.
27. The sensor system of any of the examples 1 or 3 or 22, wherein the markers comprise filaments.
28. The sensor system of any of the examples including a compliant member, wherein the compliant member comprises a finger or arm.
29. The sensor system of example 28, wherein the compliant member comprises an elastomer.
30. A vision-based method or system of sensing deformation and pressure in soft robots, including only passive components inside the soft robot.
31. A fiber-based deformation sensor wherein local material displacement in a soft robot is transmitted to a remote display assembly and tracked by a digital camera.
32. A fluidic sensor, wherein a pressure in a soft robot displaces liquid inside a microfluidic channel which is transmitted back to a display assembly for readout and analysis.
33. An integrated microfluidic pressure sensor, by which the overall pressure state inside the body of a soft robot is tracked.
34. A surface-mount pressure sensor to track contacts locally on the surface of a soft robot.
35. A color-cell pressure sensor 202, wherein the passive spherical color cell 200 is embedded in an elastomeric matrix. When an external force is applied to the elastomer, the color cell is compressed in the direction of the force, expanding it radially in the plane normal to the force.
36. A multi-channel data acquisition device in the form of one or more CCD camera(s) coupled to a soft robotic system, wherein the CCD cameras are configured to record displacement in embedded liquid or fiber-based components inside an elastomeric finger-like structure. In one embodiment, the system is able to quantify elongation along and twist about a longitudinal axis, and bending about the two orthogonal axes perpendicular to the longitudinal axis. In another embodiment, the system is able to quantify contact pressure at various locations on the finger-like structure. The device may be used to detect mixed-mode perturbations (bending off axis, or elongation and bending) as well as dynamic effects.
37. The system of any of the examples, wherein the sensor translates the deformation mode to linear displacement or size change of a marker.
38.
39.
40. A color cell pressure sensor wherein the color cell pressure sensor adapts active color modulation (as in cephalopod chromatophores) to passive sensing. Spherical cells of colored liquid are embedded in an elastomeric substrate. When the substrate undergoes external pressure, local deformation causes the spherical cells to deform into a disk-like shape. Viewed from an axis normal to the disk plane, this causes the disks to appear larger than the original spheres. The applied force can then be determined from observation of a change in the diameter of the disk.
41. The system of any of the examples, comprising a network of sensors/markers (e.g., at least 10, or a number in a range of 5-20), comprising a single digital camera, CCD, or imaging sensor array for measuring the motion of the markers and the image processing of the images (each of the images containing all the markers) is used to determine the force(s).
42. A sensor outputting position and pressure data to a digital camera for real-time or offline data processing. A single camera can record and interpret data from many deformation and pressure sensors, providing a platform for state perception and embodied intelligence research. The camera does not record the elastomeric finger itself, but records instead the remotely located display assembly (
43. The system of any of the examples, comprising a bus (e.g., a mechanical bus) comprising the sensors (e.g., chambers or channels) transmitting the motions to the markers in a display.
The method may further include coupling/integrating the sensor system 100 of any of the examples in a robotic system or robot.
Method of Operation
Block 1000 represents capturing, using a digital camera, one or more digital images of one or more changes in a sensor in response to application of a force to the sensor.
Block 1002 represents computing a measurement of the response from the changes captured in the one or more images.
Embodiments of the method include, but are not limited to, the following.
1. The method comprising sensing the response comprising displacement modes of the sensor in a soft robotic system, including bending, elongation, and twist, using the machine vision and encased cables attached to the soft robotic system.
2. The method of any of the examples, further comprising sensing pressure and force on a surface of the soft robotic system using machine vision of a fluid-filled tube attached to the soft robotic system and displacement of the enclosed fluid in the tube.
3. The method of any of the examples, further comprising sensing force and pressure on a surface of the soft robotic system comprising an elastomer, using the machine vision to observe a shape change of one or more liquid or elastomeric dots inside the elastomer.
4. The method of any of the examples, wherein the computing comprises the machine vision algorithm.
5. The method of any of the examples, wherein the computing comprises measuring the changes in position coordinates of the sensor in the images in response to the force.
6. The method of any of the examples, wherein the sensor comprises one or more cables, the one or more changes comprise one or more changes in one or more positions of the one or more cables, and the measurement of the response comprises the measurement of one or more displacement modes of the sensor including at least one of a bending mode, an elongation mode, or a twist mode.
7. The method of any of the examples, performed using the system of any of the examples illustrated in the example of
Advantages and Improvements
The field of robotics has long sought methods of perceiving various modes of displacement as well as methods of perceiving contact force/pressure with high resolution across surfaces (similar to nerves in skin). These needs are becoming amplified with the growth of the Co-Bot movement, in which robots are placed among humans in the workplace and daily life. Our techniques provide solutions for robust, low cost sensing using widely available digital cameras. Our method uses displacement of components (fibers inside channels, liquid inside tubes, and deformation of a liquid cell) captured by a digital camera to sense phenomena in the environment.
More specifically, the present disclosure describes an elastomeric finger with nine embedded fiber deformation sensors, one integrated pressure sensor, and one surface-mounted pressure sensor. The fiber sensors have been experimentally characterized in two orthogonal directions of bending, twist about the finger's primary axis, and extension. All modes of deformation followed the responses expected from mechanics of materials and beam theory. The integrated microfluidic pressure sensor demonstrated a highly repeatable response to externally applied pressure, with no saturation detected at 7N of externally applied force. The surface-mounted pressure sensor (intended to sense contact locally) sensed much smaller applied forces (0.05-0.3N) but saturated when as little as 2N was applied. For a contact sensor, early detection is more useful than a high saturation level. These results on a single elastomeric finger provide a foundation upon which a wide variety of sensorized actuators utilizing the present invention can be built (including actuators described in [39]).
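For context, the expected cable displacements can be sketched from classical beam kinematics as follows (a sketch under Euler-Bernoulli, small-deformation assumptions; the symbols r for the cable's offset from the neutral axis, L for the finger length, theta for bend angle, epsilon for axial strain, and phi for twist angle are introduced here purely for illustration and are not part of the characterization above):

```latex
% Illustrative cable-displacement relations (not taken from the experiments above):
\Delta_{\mathrm{bend}}  \approx r\,\theta
    % a cable offset r from the neutral axis lengthens or shortens linearly with bend angle
\Delta_{\mathrm{elong}} = \varepsilon\,L
    % uniform axial strain scales with finger length
\Delta_{\mathrm{twist}} \approx \sqrt{L^{2} + (r\phi)^{2}} - L \approx \frac{r^{2}\phi^{2}}{2L}
    % an axial cable at radius r follows a helix under twist; second order for small twist
```

Here each Delta denotes the resulting cable displacement observed at the display.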
While the sensor designs presented here have value individually, a key advantage is that the sensors are fundamentally designed to be used in groups. Intended to be designed into a soft robot at the system level, a properly configured array of these deformation and pressure sensors can give state awareness far beyond that of individual sensors. Most sensors used in soft robots (and many sensors in general) vary in resistance or capacitance in response to a change in a physical parameter such as length, bend angle, or contact pressure. Each such sensor requires wiring, electronic circuitry, and a dedicated input to a data acquisition system before the resulting signal is sent to a computer; five sensors require five times the infrastructure. Embodiments described herein, on the other hand, use a digital camera to record the movement of markers on fiber sensors and of colored liquid in microfluidic channels, so dozens of markers and fluid channels can be monitored almost as easily as one. Other camera-based soft-robot state-estimation systems exist, but they primarily record the pose of the robot directly, thus requiring specific lighting conditions and unobstructed line-of-sight access to all parts of the robot.
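To make this scaling advantage concrete, the following toy sketch (hypothetical ROI coordinates and a placeholder readout function) shows that reading N camera-multiplexed sensors costs one frame capture plus N image crops, with no additional wiring or data-acquisition channels:

```python
import numpy as np

# Toy frame standing in for one captured image of the display assembly.
frame = np.zeros((480, 640, 3), dtype=np.uint8)

# Hypothetical per-sensor regions of interest (x, y, w, h) within the display
# image; adding a sensor adds a dictionary entry, not a DAQ channel.
SENSOR_ROIS = {
    "fiber_1": (10, 10, 40, 200),
    "fiber_2": (60, 10, 40, 200),
    "pressure_1": (110, 10, 40, 200),
}

def read_sensor(frame, roi):
    """Placeholder readout: mean intensity in the ROI (a real system would
    compute marker displacement or fluid-column length here)."""
    x, y, w, h = roi
    return float(frame[y:y + h, x:x + w].mean())

# One frame, N sensors: acquisition cost is one camera read regardless of N.
readings = {name: read_sensor(frame, roi) for name, roi in SENSOR_ROIS.items()}
```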
Recording the sensor states rather than the elastomeric finger itself (as illustrated herein) presents several advantages: the display assembly can be located remotely from the robot and imaged under controlled lighting; unobstructed line-of-sight access to all parts of the robot is not required; and many sensors can be read from a single image.
The following references are incorporated by reference herein.
1. Muth, J. T.; Vogt, D. M.; Truby, R. L.; Mengüç, Y.; Kolesky, D. B.; Wood, R. J.; Lewis, J. A. Embedded 3D Printing of Strain Sensors within Highly Stretchable Elastomers. Advanced Materials 2014, 26, 6307-6312, doi:10.1002/adma.201400334.
2. Roberts, P.; Damian, D. D.; Shan, W.; Lu, T.; Majidi, C. Soft-Matter Capacitive Sensor for Measuring Shear and Pressure Deformation. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation; May 2013; pp. 3529-3534.
3. Boivin, M.; Milutinović, D.; Wehner, M. Movement Error Based Control for a Firm Touch of a Soft Somatosensitive Actuator. In Proceedings of the 2019 American Control Conference (ACC); July 2019; pp. 7-12.
4. Zhao, H.; O'Brien, K.; Li, S.; Shepherd, R. Optoelectronically Innervated Soft Prosthetic Hand via Stretchable Optical Waveguides. Science Robotics 2016, 1, eaai7529, doi:10.1126/scirobotics.aai7529.
5. Cho, G.-S.; Park, Y.-J. Soft Gripper with EGaIn Soft Sensor for Detecting Grasp Status. Applied Sciences 2021, 11, 6957, doi:10.3390/app11156957.
6. Kim, T.; Lee, S.; Hong, T.; Shin, G.; Kim, T.; Park, Y.-L. Heterogeneous Sensing in a Multifunctional Soft Sensor for Human-Robot Interfaces. Science Robotics 2020, 5, eabc6878, doi:10.1126/scirobotics.abc6878.
7. Hammond, F. L.; Mengüç, Y.; Wood, R. J. Toward a Modular Soft Sensor-Embedded Glove for Human Hand Motion and Tactile Pressure Measurement. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems; September 2014; pp. 4000-4007.
8. Chossat, J.-B.; Park, Y.-L.; Wood, R. J.; Duchaine, V. A Soft Strain Sensor Based on Ionic and Metal Liquids. IEEE Sensors Journal 2013, 13, 3405-3414, doi:10.1109/JSEN.2013.2263797.
9. Daalkhaijav, U.; Yirmibesoglu, O. D.; Walker, S.; Mengüç, Y. Rheological Modification of Liquid Metal for Additive Manufacturing of Stretchable Electronics. Advanced Materials Technologies 2018, 3, 1700351, doi:10.1002/admt.201700351.
10. Truby, R. L.; Wehner, M.; Grosskopf, A. K.; Vogt, D. M.; Uzel, S. G.; Wood, R. J.; Lewis, J. A. Soft Somatosensitive Actuators via Embedded 3D Printing. Advanced Materials 2018, 30, 1706383.
11. Vogt, D.; Menguc, Y.; Park, Y.-L.; Wehner, M.; Kramer, R. K.; Majidi, C.; Jentoft, L. P.; Tenzer, Y.; Howe, R. D.; Wood, R. J. Progress in Soft, Flexible, and Stretchable Sensing Systems. In Proceedings of the International Workshop on Research Frontiers in Electronics Skin Technology at ICRA; 2013; Vol. 13.
12. Truby, R. L. Designing Soft Robots as Robotic Materials. Acc. Mater. Res. 2021, 2, 854-857, doi:10.1021/accountsmr.1c00071.
13. Park, Y.-L.; Chen, B.-R.; Wood, R. J. Design and Fabrication of Soft Artificial Skin Using Embedded Microchannels and Liquid Conductors. IEEE Sensors Journal 2012, 12, 2711-2718, doi:10.1109/JSEN.2012.2200790.
14. Gerboni, G.; Diodato, A.; Ciuti, G.; Cianchetti, M.; Menciassi, A. Feedback Control of Soft Robot Actuators via Commercial Flex Bend Sensors. IEEE/ASME Transactions on Mechatronics 2017, 22, 1881-1888.
15. Fast Probabilistic 3-D Curvature Proprioception with a Magnetic Soft Sensor. IEEE Conference Publication, IEEE Xplore. Available online: https://ieeexplore.ieee.org/abstract/document/9551572 (accessed on 27 Oct. 2021).
16. McInroe, B. W.; Chen, C. L.; Goldberg, K. Y.; Bajcsy, R.; Fearing, R. S. Towards a Soft Fingertip with Integrated Sensing and Actuation. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); October 2018; pp. 6437-6444.
17. Li, D.; Dornadula, V.; Lin, K.; Wehner, M. Position Control for Soft Actuators, Next Steps toward Inherently Safe Interaction. Electronics 2021, 10, 1116, doi:10.3390/electronics10091116.
18. Tapia, J.; Knoop, E.; Mutný, M.; Otaduy, M. A.; Bächer, M. MakeSense: Automated Sensor Design for Proprioceptive Soft Robots. Soft Robotics 2020, 7, 332-345, doi:10.1089/soro.2018.0162.
19. Otero, T. F. Towards Artificial Proprioception from Artificial Muscles Constituted by Self-Sensing Multi-Step Electrochemical Macromolecular Motors. Electrochimica Acta 2021, 368, 137576, doi:10.1016/j.electacta.2020.137576.
20. Shih, B.; Shah, D.; Li, J.; Thuruthel, T. G.; Park, Y.-L.; Iida, F.; Bao, Z.; Kramer-Bottiglio, R.; Tolley, M. T. Electronic Skins and Machine Learning for Intelligent Soft Robots. Science Robotics 2020, 5, eaaz9239, doi:10.1126/scirobotics.aaz9239.
21. Holmes, P.; Full, R. J.; Koditschek, D.; Guckenheimer, J. The Dynamics of Legged Locomotion: Models, Analyses, and Challenges. SIAM Rev. 2006, 48, 207-304, doi:10.1137/S0036144504445133.
22. Dahiya, R. S.; Mittendorfer, P.; Valle, M.; Cheng, G.; Lumelsky, V. J. Directions Toward Effective Utilization of Tactile Skin: A Review. IEEE Sensors Journal 2013, 13, 4121-4138, doi:10.1109/JSEN.2013.2279056.
23. Appendix B and Appendix C in the priority applications U.S. provisional patent application Ser. No. 63/282,379 filed Nov. 23, 2021 and U.S. provisional patent application Ser. No. 63/291,229 filed Dec. 17, 2021, by Keng-Yu Lin, Arturo Gamboa-Gonzalez, and Michael Wehner, entitled “SOFT ROBOTIC SENSING AND PROPRIOCEPTION VIA CABLE AND MICROFLUIDIC TRANSMISSION.”
24. Boivin, M.; Milutinović, D.; Wehner, M. Movement Error Based Control for a Firm Touch of a Soft Somatosensitive Actuator. In Proceedings of the 2019 American Control Conference (ACC); July 2019; pp. 7-12.
25. Truby, R. L.; Wehner, M.; Grosskopf, A. K.; Vogt, D. M.; Uzel, S. G.; Wood, R. J.; Lewis, J. A. Soft Somatosensitive Actuators via Embedded 3D Printing. Advanced Materials 2018, 30, 1706383.
26. Park, Y.-L.; Chen, B.-R.; Wood, R. J. Design and Fabrication of Soft Artificial Skin Using Embedded Microchannels and Liquid Conductors. IEEE Sensors Journal 2012, 12, 2711-2718, doi:10.1109/JSEN.2012.2200790.
27. Florey, E. Ultrastructure and Function of Cephalopod Chromatophores. American Zoologist 1969, 9, 429-442, doi:10.1093/icb/9.2.429.
28. Cloney, R. A.; Brocco, S. L. Chromatophore Organs, Reflector Cells, Iridocytes and Leucophores in Cephalopods. Am Zool 1983, 23, 581-592, doi:10.1093/icb/23.3.581.
29. Williams, T. L.; Senft, S. L.; Yeo, J.; Martin-Martinez, F. J.; Kuzirian, A. M.; Martin, C. A.; DiBona, C. W.; Chen, C.-T.; Dinneen, S. R.; Nguyen, H. T.; et al. Dynamic Pigmentary and Structural Coloration within Cephalopod Chromatophore Organs. Nat Commun 2019, 10, 1004, doi:10.1038/s41467-019-08891-x.
30. Giordano, G.; Carlotti, M.; Mazzolai, B. A Perspective on Cephalopods Mimicry and Bioinspired Technologies toward Proprioceptive Autonomous Soft Robots. Advanced Materials Technologies n/a, 2100437, doi:10.1002/admt.202100437.
31. Zeng, S.; Zhang, D.; Huang, W.; Wang, Z.; Freire, S. G.; Yu, X.; Smith, A. T.; Huang, E. Y.; Nguon, H.; Sun, L. Bio-Inspired Sensitive and Reversible Mechanochromisms via Strain-Dependent Cracks and Folds. Nat Commun 2016, 7, 11802, doi:10.1038/ncomms11802.
32. Rossiter, J.; Yap, B.; Conn, A. Biomimetic Chromatophores for Camouflage and Soft Active Surfaces. Bioinspir. Biomim. 2012, 7, 036009, doi:10.1088/1748-3182/7/3/036009.
33. Beer, F. P.; Johnston, E. R.; DeWolf, J. T.; Mazurek, D. F. Mechanics of Materials. New York 1992.
34. Timoshenko, S. History of Strength of Materials: With a Brief Account of the History of Theory of Elasticity and Theory of Structures; Courier Corporation, 1983.
35. Young, W. C.; Budynas, R. G.; Sadegh, A. M. Roark's Formulas for Stress and Strain; McGraw-Hill Education, 2012.
36. Boresi, A. P.; Schmidt, R. J.; Sidebottom, O. M. Advanced Mechanics of Materials; Wiley: New York, 1985; Vol. 6.
37. Aziz, M. S.; El sherif, A. Y. Biomimicry as an Approach for Bio-Inspired Structure with the Aid of Computation. Alexandria Engineering Journal 2016, 55, 707-714, doi:10.1016/j.aej.2015.10.015.
38. Soter, G.; Garrad, M.; Conn, A. T.; Hauser, H.; Rossiter, J. Skinflow: A Soft Robotic Skin Based on Fluidic Transmission. In Proceedings of the 2019 2nd IEEE International Conference on Soft Robotics (RoboSoft); IEEE: Seoul, Korea (South), April 2019; pp. 355-360.
39. Lin, K.-Y.; Gupta, S. K. Soft Fingers with Controllable Compliance to Enable Realization of Low Cost Grippers. In Proceedings of the Biomimetic and Biohybrid Systems; Mangan, M., Cutkosky, M., Mura, A., Verschure, P. F. M. J., Prescott, T., Lepora, N., Eds.; Springer International Publishing: Cham, 2017; pp. 544-550.
40. OpenCV: Canny Edge Detection Available online: https://docs.opencv.org/3.4/da/d22/tutorial_py_canny.html (accessed on 16 Nov. 2021).
41. Lee, S. Lines Detection with Hough Transform Available online: https://towardsdatascience.com/lines-detection-with-hough-transform-84020b3b1549 (accessed on 16 Nov. 2021).
This concludes the description of the preferred embodiment of the present invention. The foregoing description of one or more embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto.
This application claims the benefit under 35 U.S.C. Section 119(e) of co-pending and commonly-assigned U.S. provisional patent application Ser. No. 63/282,379 filed Nov. 23, 2021 and U.S. provisional patent application Ser. No. 63/291,229 filed Dec. 17, 2021, by Keng-Yu Lin, Arturo Gamboa-Gonzalez, and Michael Wehner, entitled “SOFT ROBOTIC SENSING AND PROPRIOCEPTION VIA CABLE AND MICROFLUIDIC TRANSMISSION,” client reference 2022-822, which applications are incorporated by reference herein.