The present disclosure relates to robotic apparatuses, and more specifically to techniques for generating instructions to control movement of robotic apparatuses.
A transport structure such as an automobile, truck, or aircraft employs a large number of interior and exterior nodes. These nodes provide structure and support to the transport structure, and respond appropriately to the many different types of forces that are generated by or result from various actions like accelerating and braking. Nodes of varying sizes and geometries may be integrated in a transport structure, for example, to provide an interface between panels, extrusions, and/or other structures. Thus, nodes are an integral part of transport structures.
Most nodes must be coupled to, or interface with, another part or structure in a secure, well-designed way. In order to connect securely with another part or structure, a node may need to undergo one or more preparatory processes. For example, the node may be machined at an interface in order to connect with various other parts or structures. Further examples of such processes include surface preparation operations, heat treatment, electrocoating, electroplating, anodization, chemical etching, cleaning, support removal, powder removal, and so forth.
In order to produce a transport structure (e.g., a vehicle, an aircraft, a metro system, etc.), one or more assembly operations may be performed after a node is constructed. For example, a node may be connected with a part, e.g., in order to form a portion of a transport structure (e.g., a vehicle chassis, etc.). Such assembly may involve a degree of accuracy that is within one or more tolerance thresholds of an assembly system, e.g., in order to ensure that the node is securely connected with the part and, therefore, the transport structure may be satisfactorily produced.
When robotic apparatuses (e.g., robotic arms) perform assembly operations, the robotic apparatuses must be accurately positioned in order for the assembly operations to be performed accurately. For example, a robotic arm with which a node is engaged may be positioned so that the node is accurately connected with a part. Thus, a need exists for an approach to positioning at least one robotic apparatus (e.g., a robotic arm) with a degree of precision that is within the tolerance threshold(s) of an assembly system when performing various assembly operations.
The present disclosure generally relates to assembly operations performed in association with production of transport structures. Such assembly operations may include connection of nodes (e.g., additively manufactured nodes) with parts and/or other structures. Because transport structures are to be safe, reliable, and so forth, approaches to accurately performing the various assembly operations associated with production of transport structures may be beneficial. Such assembly operations may be performed by at least one robotic arm instructed via computer-generated instructions. Accordingly, a computer may implement various techniques to generate instructions that cause the at least one robotic arm to be correctly positioned when performing various assembly operations.
In the present disclosure, systems and apparatuses for positioning a robotic arm may be described. In one aspect, a first apparatus may comprise a first robotic arm having a distal end and a proximal end, and the distal end may be configured for movement while the proximal end may secure the first robotic arm. The first apparatus may include a camera connected with the distal end of the first robotic arm, and the camera may be configured to capture image data of a marker connected with a second robotic arm and provide the image data to a computer. The movement of the first robotic arm may be caused by the computer based on the image data of the marker.
In another aspect, a first system for positioning a robotic arm may be described. The first system may comprise a first robotic arm having a first camera disposed on a first distal end. The system may include a first controller connected with the first robotic arm and the first controller may be configured to cause movement of the first robotic arm. The system may include a second robotic arm having a second marker disposed on a second distal end. The first camera may be positioned such that the second marker is within a field of view of the first camera, and the first camera may be configured to capture first image data of the second marker and provide the first image data to a computer. The first controller may be configured to cause movement of the first robotic arm based on instructions obtained from the computer. In various aspects, the system may further include a second controller connected with the second robotic arm, and the second controller may be configured to cause movement of the second robotic arm. The first robotic arm may include a first marker disposed on the first distal end and the second robotic arm may include a second camera disposed on the second distal end. The second camera may be configured to capture second image data of the first marker and provide the second image data to the computer, and the second controller may be configured to cause the movement of the second robotic arm based on instructions obtained from the computer.
In another aspect, a second apparatus for positioning a robotic arm may be described.
The second apparatus may include a memory and at least one processor coupled to the memory and configured to obtain, from a first camera disposed on a first robotic arm, first image data of a second marker disposed on a second robotic arm. The at least one processor may be further configured to generate a first set of instructions associated with causing connection of a first part engaged by the first robotic arm to a second part engaged by the second robotic arm. The at least one processor may be further configured to provide the first set of instructions to a first controller connected with the first robotic arm, and the first set of instructions may cause first movement of the first robotic arm in association with the connection of the first part to the second part. In some embodiments, the first image data may further include data of a fixed marker, and the first set of instructions may be based on the fixed marker as a reference point.
It will be understood that other aspects of mechanisms for realizing adhesive connections with additively manufactured components and the manufacture thereof will become readily apparent to those skilled in the art from the following detailed description, wherein only several embodiments are shown and described by way of illustration. As will be realized by those skilled in the art, the disclosed subject matter is capable of other and different embodiments, and its several details are capable of modification in various other respects, all without departing from the scope of the invention. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not as restrictive.
The detailed description set forth below in connection with the appended drawings is intended to provide a description of various exemplary embodiments and is not intended to represent the only embodiments in which the invention may be practiced. The terms “exemplary,” “illustrative,” and the like used throughout the present disclosure mean “serving as an example, instance, or illustration,” and should not necessarily be construed as preferred or advantageous over other embodiments presented in the present disclosure. The detailed description includes specific details for the purpose of providing a thorough and complete disclosure that fully conveys the scope of the invention to those skilled in the art. However, the invention may be practiced without these specific details. In some instances, well-known structures and components may be shown in block diagram form, or omitted entirely, in order to avoid obscuring the various concepts presented throughout the present disclosure. In addition, the figures may not be drawn to scale and instead may be drawn in a way that attempts to most effectively highlight various features relevant to the subject matter described.
Additive Manufacturing (3-D Printing).
Additive manufacturing (AM) is advantageously a non-design specific manufacturing technique. AM provides the ability to create complex structures within a part. For example, nodes can be produced using AM. A node is a structural member that may include one or more interfaces used to connect to other spanning components such as tubes, extrusions, panels, other nodes, and the like. Using AM, a node may be constructed to include additional features and functions, depending on the objectives. For example, a node may be printed with one or more ports that enable the node to secure two parts by injecting an adhesive rather than welding multiple parts together, as is traditionally done in manufacturing complex products. Alternatively, some components may be connected using a brazing slurry, a thermoplastic, a thermoset, or another connection feature, any of which can be used interchangeably in place of an adhesive. Thus, while welding techniques may be suitable with respect to certain embodiments, additive manufacturing provides significant flexibility in enabling the use of alternative or additional connection techniques.
A variety of different AM techniques have been used to 3-D print components composed of various types of materials. Numerous available techniques exist, and more are being developed. For example, Directed Energy Deposition (DED) AM systems use directed energy sourced from laser or electron beams to melt metal. These systems utilize both powder and wire feeds. The wire feed systems advantageously have higher deposition rates than other prominent AM techniques. Single Pass Jetting (SPJ) combines two powder spreaders and a single print unit to spread metal powder and to print a structure in a single pass with apparently no wasted motion; its developers claim SPJ to be much quicker than conventional laser-based systems. As another illustration, electron beam additive manufacturing processes use an electron beam to deposit metal via wire feedstock or sintering on a powder bed in a vacuum chamber. Atomic Diffusion Additive Manufacturing (ADAM) is still another recently developed technology in which components are printed, layer-by-layer, using a metal powder in a plastic binder. After printing, the plastic binder is removed and the entire part is sintered at once into the desired metal.
One of several such AM techniques, as noted, is direct metal deposition (DMD), a DED-based process.
The powdered metal is then fused by the laser 106 in a melt pool region 108, which may then bond to the workpiece 112 as a region of deposited material 110. The dilution area 114 may include a region of the workpiece where the deposited powder is integrated with the local material of the workpiece. The feed nozzle 102 may be supported by a computer numerical controlled (CNC) robot or a gantry, or other computer-controlled mechanism. The feed nozzle 102 may be moved under computer control multiple times along a predetermined direction of the substrate until an initial layer of the deposited material 110 is formed over a desired area of the workpiece 112. The feed nozzle 102 can then scan the region immediately above the prior layer to deposit successive layers until the desired structure is formed. In general, the feed nozzle 102 may be configured to move with respect to all three axes, and in some instances to rotate on its own axis by a predetermined amount.
3-D modeling software, in turn, may include one of numerous commercially available 3-D modeling software applications. Data models may be rendered using a suitable computer-aided design (CAD) package, for example in an STL format. STL is one example of a file format associated with commercially available stereolithography-based CAD software. A CAD program may be used to create the data model of the 3-D object as an STL file. Thereupon, the STL file may undergo a process whereby errors in the file are identified and resolved.
Following error resolution, the data model can be “sliced” by a software application known as a slicer to thereby produce a set of instructions for 3-D printing the object, with the instructions being compatible with, and associated with, the particular 3-D printing technology to be utilized (operation 220). Numerous slicer programs are commercially available. Generally, the slicer program converts the data model into a series of individual layers representing thin slices (e.g., 100 microns thick) of the object to be printed, along with a file containing the printer-specific instructions for 3-D printing these successive individual layers to produce an actual 3-D printed representation of the data model.
The layers associated with 3-D printers and related print instructions need not be planar or identical in thickness. For example, in some embodiments depending on factors like the technical sophistication of the 3-D printing equipment and the specific manufacturing objectives, etc., the layers in a 3-D printed structure may be non-planar and/or may vary in one or more instances with respect to their individual thicknesses.
A common type of file produced when slicing data models into layers is a G-code file, which contains instructions, written in a numerical control programming language, for 3-D printing the object. The G-code file, or other file constituting the instructions, is uploaded to the 3-D printer (operation 230). Because the file containing these instructions is typically configured to be operable with a specific 3-D printing process, it will be appreciated that many formats of the instruction file are possible depending on the 3-D printing technology used.
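For illustration only, the following is a minimal Python sketch of the kind of layer-by-layer output described above: each thin slice of the data model becomes a block of printer instructions at that slice's Z height. The G-code commands, layer height, feed rates, and helper names here are assumptions for the sketch, not the output format of any particular slicer.

```python
# Minimal illustration of slicer-style output: each 100-micron slice of the
# data model becomes a block of G-code moves for that layer. The specific
# commands, layer height, and polygon source are illustrative assumptions.

LAYER_HEIGHT_MM = 0.1  # e.g., 100-micron slices, as described above


def emit_layer_gcode(layer_index, perimeter_points):
    """Return G-code lines tracing one slice's perimeter at its Z height."""
    z = (layer_index + 1) * LAYER_HEIGHT_MM
    lines = [f"G1 Z{z:.3f} F300 ; move to layer {layer_index}"]
    for x, y in perimeter_points:
        lines.append(f"G1 X{x:.3f} Y{y:.3f} F1500 ; trace slice contour")
    return lines


# Example: two square slices stacked to form a short column.
square = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]
program = []
for i in range(2):
    program.extend(emit_layer_gcode(i, square))
print("\n".join(program))
```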
In addition to the printing instructions that dictate what and how an object is to be rendered, the appropriate physical materials necessary for use by the 3-D printer in rendering the object are loaded into the 3-D printer using any of several conventional and often printer-specific methods (operation 240). In DMD techniques, for example, one or more metal powders may be selected for layering structures with such metals or metal alloys. In selective laser melting (SLM), selective laser sintering (SLS), and other PBF-based AM methods (see below), the materials may be loaded as powders into chambers that feed the powders to a build platform. Depending on the 3-D printer, other techniques for loading printing materials may be used.
The respective data slices of the 3-D object are then printed based on the provided instructions using the material(s) (operation 250). In 3-D printers that use laser sintering, a laser scans a powder bed and melts the powder together where structure is desired, and avoids scanning areas where the sliced data indicates that nothing is to be printed. This process may be repeated thousands of times until the desired structure is formed, after which the printed part is removed from a fabricator. In fused deposition modelling, as described above, parts are printed by applying successive layers of model and support materials to a substrate. In general, any suitable 3-D printing technology may be employed for purposes of the present disclosure.
Another AM technique includes powder-bed fusion (“PBF”). Like DMD, PBF creates ‘build pieces’ layer-by-layer. Each layer or ‘slice’ is formed by depositing a layer of powder and exposing portions of the powder to an energy beam. The energy beam is applied to melt areas of the powder layer that coincide with the cross-section of the build piece in the layer. The melted powder cools and fuses to form a slice of the build piece. The process can be repeated to form the next slice of the build piece, and so on. Each layer is deposited on top of the previous layer. The resulting structure is a build piece assembled slice-by-slice from the ground up.
Referring specifically to
In various embodiments, the deflector 305 can include one or more gimbals and actuators that can rotate and/or translate the energy beam source to position the energy beam. In various embodiments, energy beam source 303 and/or deflector 305 can modulate the energy beam, e.g., turn the energy beam on and off as the deflector scans so that the energy beam is applied only in the appropriate areas of the powder layer. For example, in various embodiments, the energy beam can be modulated by a digital signal processor (DSP).
The present disclosure presents various approaches to positioning at least one robotic arm in an assembly system. For example, an assembly system may include two robots, each of which may include a respective robotic arm. A first robotic arm may be configured to engage with a node during various operations performed with the node. For example, the first robotic arm may engage with a node that is to be connected with a part, and the part may be engaged by a second robotic arm. Various operations performed with the node (e.g., connecting the node with a part) may need to be performed with a relatively high degree of precision. Accordingly, at least one of the robotic arms may be positioned (e.g., repositioned) during an operation with the node in order to achieve the precision commensurate with the operation.
In some aspects, the first robotic arm may engage with the node and the second robotic arm may engage with a part. An operation with the node may include connecting the node with the part. Thus, the first robotic arm may be positioned relative to the second robotic arm and/or the second robotic arm may be positioned relative to the first robotic arm. When the first and/or second robotic arms are configured to move, each may be positioned (e.g., repositioned) relative to the other. Such positioning may correct the position(s) of the first and/or second robotic arms, e.g., to maintain the precision necessary for operations with a node, including connecting a node with a part by the first and second robotic arms.
The present disclosure provides various different embodiments of positioning one or more robotic arms of an assembly system for assembly processes and/or post-processing operations. It will be appreciated that various embodiments described herein may be practiced together. For example, an embodiment described with respect to one illustration of the present disclosure may be implemented in another embodiment described with respect to another illustration of the present disclosure.
The assembly system 400 may include a first robotic arm 402. The first robotic arm 402 may have a distal end 412 and a proximal end 410. The distal end 412 may be configured for movement, e.g., for operations associated with a node and/or part. The proximal end 410 may secure the first robotic arm 402, e.g., to a base 414.
The distal end 412 of the first robotic arm 402 may be connected with a tool flange 416. The tool flange 416 may be configured to connect with one or more components (e.g., tools) so that the first robotic arm 402 may connect with the one or more components and position the one or more components as the first robotic arm 402 moves.
In the illustrated embodiment, the distal end 412 of the first robotic arm 402 may be connected with an end effector 418, e.g., by means of the tool flange 416. That is, the end effector 418 may be connected with the tool flange 416, and the tool flange 416 may be connected with the distal end 412 of the first robotic arm 402. The end effector 418 may be a component configured to interface with various parts, nodes, and/or other structures. Illustratively, the end effector 418 may be configured to engage with a node 480 (however, the end effector 418 may be configured to engage with a part or other structure). Examples of an end effector 418 may include jaws, grippers, pins, or other similar components capable of engaging a node, part, or other structure.
The distal end 412 of the first robotic arm 402 may be further connected with a camera 408. In an aspect, the camera 408 may be connected with the end effector 418 and, therefore, the camera 408 may be connected with the distal end 412 of the first robotic arm 402 by means of the end effector 418 and the tool flange 416. The camera 408 may be configured to capture image data (e.g., still images, moving images, etc.). In one aspect, the camera 408 may be a motion-capture camera.
The field of view of the camera 408 may be away from the distal end 412 of the first robotic arm 402 and toward the second robotic arm 404. For example, the camera 408 may be positioned so that a distal end 422 of the second robotic arm 404 is at least partially within the field of view of the camera 408.
The assembly system 400 may further include a first controller 442. The first controller 442 may be communicatively coupled with the first robotic arm 402, e.g., via a wired connection, such as a bus, a wireless connection, or another connection interface. The first controller 442 may be configured to drive movement of the first robotic arm 402, which may include movement of the distal end 412 of the first robotic arm 402. Accordingly, the first controller 442 may drive movement of the tool flange 416, the end effector 418, and the camera 408 when driving movement of the distal end 412 of the first robotic arm 402. In some embodiments, the first controller 442 may further directly drive movement of one or more of the tool flange 416, the end effector 418, and/or the camera 408 (e.g., the first controller 442 may cause the tool flange 416 and/or the end effector 418 to rotate and/or translate in space without driving movement of the distal end 412 of the first robotic arm 402).
The assembly system 400 may further include a computer 460. The computer 460 may be communicatively coupled with the first controller 442 via a first connection 464a (e.g., a wired connection, such as a bus, a wireless connection, or another connection interface). The computer 460 may be further communicatively coupled with the camera 408 disposed on the first robotic arm 402 via a second connection 464b (e.g., a wired connection, such as a bus, a wireless connection, or another connection interface). The computer 460 may be configured to generate a set of instructions 452a for the first robotic arm 402. The set of instructions 452a may instruct the first robotic arm 402 to move to a position. Accordingly, the computer 460 may provide the set of instructions 452a to the first controller 442 to drive movement of the first robotic arm 402 to the position indicated by the set of instructions 452a.
The computer 460 may generate a plurality of sets of instructions 452a to instruct the position of the first robotic arm 402. For example, the computer 460 may generate a first set of instructions to instruct a first position of the first robotic arm 402 when connecting the node 480 with the part 482. While the node 480 is being connected with the part 482, the computer 460 may generate a second set of instructions to instruct a second position of the first robotic arm 402. That is, the computer 460 may reposition (e.g., correct the position of) the first robotic arm 402 so that the node 480 is accurately connected with the part 482. As described herein, the computer 460 may generate one or more sets of instructions 452a based on image data 450a obtained from the camera 408.
As illustrated, the assembly system 400 may further include a second robotic arm 404. The second robotic arm 404 may have a distal end 422 and a proximal end 420. The proximal end 420 of the second robotic arm 404 may be connected with a base 424, e.g., in order to secure the second robotic arm 404. Illustratively, the first robotic arm 402 and the second robotic arm 404 may be located in the assembly system 400 to be approximately facing one another, e.g., so that the distal end 412 of the first robotic arm 402 extends towards the distal end 422 of the second robotic arm 404 and, correspondingly, the distal end 422 of the second robotic arm 404 extends towards the distal end 412 of the first robotic arm 402. However, the first and second robotic arms 402, 404 may be differently located in the assembly system 400 in other embodiments, e.g., according to an assembly operation that is to be performed.
Similar to the first robotic arm 402, the distal end 422 of the second robotic arm 404 may be connected with a tool flange 426, and the tool flange 426 may be connected with an end effector 428. The end effector 428 may be configured to engage with a node, part, or other structure, such as the part 482 that is to be connected with the node 480.
The distal end 422 of the second robotic arm 404 may be connected with at least one marker 430. In an aspect, the marker 430 may be connected with the end effector 428 and, therefore, the marker 430 may be connected with the distal end 422 of the second robotic arm 404 by means of the end effector 428 and the tool flange 426. While
The marker 430 may be at least partially within the field of view of the camera 408.
Accordingly, the camera 408 may be configured to capture image data 450a of the marker 430. In various embodiments, the marker 430 may be of any suitable material and/or shape to be distinguishable in the image data 450a. For example, the marker 430 may be composed of a reflective material, the marker 430 may be a specific shape (e.g., an “X”, a cross, etc.), or of another material and/or shape that is detectable in the image data 450a. The marker 430 may be an active marker (e.g., the marker 430 may be configured to change color, change luminous intensity, etc.) or a passive marker.
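By way of a hedged illustration, a computer (e.g., the computer 460) might locate a reflective marker in captured image data with a simple brightness-threshold pipeline such as the following Python/OpenCV sketch. The threshold value, minimum blob area, and function structure are assumptions for illustration; the present disclosure does not prescribe a particular detection algorithm.

```python
# Hypothetical marker-detection sketch using OpenCV: threshold a bright,
# reflective marker and return its pixel centroid in the captured image.
import cv2


def find_marker_centroid(image_bgr, min_area=25.0):
    """Return the (x, y) pixel centroid of the brightest blob, or None."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # A reflective marker appears near-saturated; 200 is an assumed threshold.
    _, binary = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not blobs:
        return None  # marker not in view (or not activated)
    largest = max(blobs, key=cv2.contourArea)
    m = cv2.moments(largest)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```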
In one embodiment, the second robotic arm 404 may be stationary. In another embodiment, the second robotic arm 404 may be configured for movement, e.g., so that a distal end 422 of the second robotic arm 404 moves and, correspondingly, causes movement of the tool flange 426, the end effector 428, and the marker 430. When the second robotic arm 404 is configured for movement, the second robotic arm 404 may be communicatively coupled with a second controller 444. The second controller 444 may be configured to drive movement of the second robotic arm 404. Similar to the movement of the first robotic arm 402, the movement of the second robotic arm 404 may be based on a set of instructions 452b generated by the computer 460. The computer 460 may generate the set of instructions 452b based on image data 450b, which may be captured by a camera disposed on the second robotic arm 404 and may be of at least one other marker (e.g., a fixed marker located in the assembly system 400, a marker connected with the distal end 412 of the first robotic arm 402, etc.).
The computer 460 may be communicatively coupled with the second controller 444 via a third connection 466a, and may issue a generated set of instructions 452b to instruct the position of the second robotic arm 404 to the second controller 444 so that the second controller 444 may drive movement of the second robotic arm 404 according to the set of instructions 452b.
When the computer 460 is communicatively coupled with both the first controller 442 and the second controller 444, the computer 460 may be connected with a switch 462. The switch 462 may control the path of the sets of instructions 452a-b. For example, when the computer 460 generates a set of instructions 452a to instruct the position of the first robotic arm 402, the switch 462 may cause the set of instructions 452a to be sent over the first connection 464a to the first controller 442.
During an assembly operation of a node-based transport structure, the first robotic arm 402 may engage with the node 480 through the end effector 418 and the second robotic arm 404 may engage with the part 482 through the end effector 428. In various embodiments of the assembly operation of a node-based transport structure, the node 480 may be connected with the part 482 and, therefore, the first and second robotic arms 402, 404 may perform the assembly operation. Connection of the node 480 with the part 482 is exemplary, and the operations described herein are applicable to other assembly and/or post-processing operations without departing from the scope of the present disclosure.
According to one embodiment, the first robotic arm 402 may be configured to move to bring the node 480 to the part 482 connected with the second robotic arm 404. Thus, the distal end 412 of the first robotic arm 402 may extend away from the base 414 and toward the distal end 422 of the second robotic arm 404, at which the part 482 is engaged. In one embodiment, the movement of the distal end 412 of the first robotic arm 402 may be at least approximately horizontal (e.g., parallel to the horizon); however, the movement of the distal end 412 of the first robotic arm 402 may be additionally or alternatively approximately vertical (e.g., perpendicular to the horizon).
The movement of the distal end 412 of the first robotic arm 402 toward the distal end 422 of the second robotic arm 404 may be caused by the first controller 442. The first controller 442 may cause the movement of the distal end 412 of the first robotic arm 402 based on the set of instructions 452a provided by the computer 460.
When the node 480 is connected with the first robotic arm 402 (e.g., engaged with the end effector 418), the distal end 412 of the first robotic arm 402 may be positioned to accurately connect the node 480 with the part 482. To that end, the camera 408 may capture image data 450a of the marker 430. The camera 408 may provide the image data 450a to the computer 460. For example, the computer 460 may issue a request to the camera 408 to capture the image data 450a or the camera 408 may automatically capture the image data 450a (e.g., when the first robotic arm 402 is at a fixed position and not moving, when the marker 430 is within the field of view of the camera 408, when the marker 430 is activated, etc.).
Based on the image data 450a, the computer 460 may generate at least one set of instructions 452a. In some embodiments, the assembly system 400 may include other markers in addition to the marker 430, such as another marker located on the second robotic arm 404 and/or a marker located elsewhere in the assembly system 400 and/or proximate thereto and within the field of view of the camera 408 (e.g., located on a wall or ceiling of a room in which the assembly system 400 is located).
The computer 460 may determine the location of the marker 430 in the image data 450a, such as a set of numerical values indicating the location of the marker 430 and/or a set of coordinates indicating the location of the marker 430. In one embodiment, the computer 460 may determine the actual location (e.g., absolute location) of the marker 430 in space (e.g., three-dimensional space), as shown by the image data 450a. In another embodiment, the computer 460 may determine the relative location of the marker 430 in the image data 450a, e.g., with respect to the first robotic arm 402.
The computer 460 may have stored therein or may be otherwise able to access computer-aided design (CAD) data (e.g., numerical CAD data and/or a set of coordinates based on CAD data). The CAD data may indicate the position for the first robotic arm 402 in order to accurately connect the node 480 with the part 482 (e.g., the “correct” position of the distal end 412 of the first robotic arm 402 in order for the node 480 to be accurately connected with the part 482).
The computer 460 may determine the location of the marker 430 in the image data 450a relative to the CAD data. For example, the computer 460 may compare the determined location of the marker 430 to the CAD data, and the computer 460 may determine (e.g., calculate) the difference between the determined location of the marker 430 and the CAD data. The determined difference may indicate the variance between the actual position of the first robotic arm 402 in the assembly system 400 and the “correct” position of the first robotic arm 402 in the assembly system 400, wherein the “correct” position may be the position of the first robotic arm 402 that reduces (or eliminates) the variance and accurately connects the node 480 with the part 482. The determined difference may be used for positional compensation(s) and/or positional correction(s) of the first robotic arm 402, the tool flange 416, and/or the end effector 418, thereby compensating and/or correcting the position of the node 480 to be accurate (e.g., within one or more thresholds commensurate with tolerances of the assembly system 400).
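As one possible illustration of the comparison described above, the difference between a measured marker location and the nominal location indicated by CAD data may be computed as a simple correction vector and checked against a tolerance threshold. The coordinate frame, units, and tolerance value below are assumptions for the sketch.

```python
# Illustrative comparison of a measured marker location against the nominal
# ("correct") location taken from CAD data. Frames, units (mm), and the
# tolerance value are assumptions for this sketch.
import numpy as np

TOLERANCE_MM = 0.05  # assumed assembly-system tolerance threshold


def position_error(measured_xyz, nominal_xyz_from_cad):
    """Return (correction_vector, within_tolerance) for one observation."""
    delta = np.asarray(nominal_xyz_from_cad, float) - np.asarray(measured_xyz, float)
    return delta, bool(np.linalg.norm(delta) <= TOLERANCE_MM)


# Example: marker observed 0.3 mm short of nominal along X.
delta, ok = position_error([99.7, 50.0, 25.0], [100.0, 50.0, 25.0])
print(delta, ok)  # -> [0.3 0. 0.] False, so a corrective move is needed
```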
Based on the determined location of the marker 430 in the image data 450a and the CAD data (e.g., based on the determined difference between the determined location of the marker 430 to the CAD data), the computer 460 may determine a set of instructions 452a that indicates a position to which the distal end 412 of the first robotic arm 402 is to move in order to accurately connect the node 480 with the part 482. For example, the set of instructions 452a may indicate one or more movements that are to be performed by the first robotic arm 402 so that the distal end 412 of the first robotic arm 402 will be positioned for an accurate connection of the node 480 with the part 482. In one embodiment, the set of instructions 452a may indicate one or more movements of the distal end 412 of the first robotic arm 402 so that the distal end 412 of the first robotic arm 402 will be at a position corresponding to the CAD data.
The computer 460 may generate the set of instructions 452a to indicate one or more movements in one or more of six degrees of freedom (6DoF), including forward/backward (e.g., surge), up/down (e.g., heave), left/right (e.g., sway) for translation in space and including yaw, pitch, and roll for rotation in space. For one or more of the 6DoF, the tool flange 416 and/or the end effector 418 may perform one or more movements. For example, the computer 460 may generate the set of instructions 452a to cause the tool flange 416 and/or the end effector 418 to rotate and/or translate in space.
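A set of instructions spanning the 6DoF described above might be represented, for example, as a simple record carrying three translational and three rotational components. The field names and units in this Python sketch are illustrative assumptions, not a prescribed instruction format.

```python
# A hypothetical representation of one set of instructions 452a: a target
# offset in all six degrees of freedom. Field names and units are assumed.
from dataclasses import dataclass


@dataclass
class SixDofInstruction:
    surge_mm: float   # forward/backward translation
    sway_mm: float    # left/right translation
    heave_mm: float   # up/down translation
    roll_deg: float   # rotation about the forward axis
    pitch_deg: float  # rotation about the lateral axis
    yaw_deg: float    # rotation about the vertical axis


# Example: translate 0.3 mm forward and rotate the end effector 0.5 degrees.
move = SixDofInstruction(0.3, 0.0, 0.0, 0.0, 0.0, 0.5)
```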
The computer 460 may provide the generated set of instructions 452a to the first controller 442. Accordingly, the first controller 442 may drive movement of the first robotic arm 402 based on the set of instructions 452a. For example, the first controller 442 may drive movement of the first robotic arm 402, thereby causing the distal end 412 of the first robotic arm 402 to move into a position indicated by the set of instructions 452a. Thus, the first robotic arm 402 may be positioned relative to the marker 430. Correspondingly, the first robotic arm 402 may move the tool flange 416, the end effector 418, and the node 480 into a position corresponding to the set of instructions 452a.
Based on the set of instructions 452a, the first robotic arm 402 may cause the node 480 to contact the part 482 that is engaged with the end effector 428 connected with the second robotic arm 404. When the node 480 is accurately contacting the part 482 (e.g., within designed or acceptable tolerances), the node 480 may be connected with the part 482 in order to form a portion of a node-based transport structure (e.g., without any fixtures). For example, the first robotic arm 402 and/or the end effector 418 may cause the node 480 to be extended toward the part 482 and, when the node 480 is contacting the part 482, pressure may be applied until at least one feature (e.g., a male/extension feature, a female/reception feature, etc.) of the node 480 is securely engaged with at least one feature (e.g., a correspondingly opposing one of a female/reception feature, a male/extension feature, etc.) of the part 482.
As aforementioned, the computer 460 may generate a plurality of sets of instructions.
Each set of instructions may be dynamically generated in association with the movement of the first robotic arm 402. For example, the computer 460 may generate one set of instructions 452a that causes the first controller 442 to drive movement of the first robotic arm 402 into a first position. The camera 408 may capture image data 450a of the marker 430 during and/or after the first robotic arm 402 is controlled to the first position. The image data 450a may be provided to the computer 460, and the computer 460 may reevaluate the location of the marker 430. The computer 460 may compare the reevaluated location of the marker 430 to the CAD data, and generate a next set of instructions to compensate and/or correct for the position of the first robotic arm 402. The computer 460 may provide the next set of instructions to the first controller 442 to drive movement of the first robotic arm 402 to a second position corresponding to the next set of instructions. The computer 460 may iteratively generate instructions based on image data until the first robotic arm 402 is at a position at which the node 480 may be accurately connected with the part 482. The computer 460 may generate sets of instructions while the first robotic arm 402 is in motion and/or when the first robotic arm 402 is stationary between movements performed according to sets of instructions.
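The iterative capture-compare-correct behavior described above can be summarized, under assumed interfaces standing in for the camera 408, the computer 460's comparison logic, and the first controller 442, by a loop such as the following sketch.

```python
# Sketch of the iterative correction loop described above: capture image
# data, reevaluate the marker, compare against CAD data, and move until the
# variance falls within tolerance. All callables are assumed interfaces.
def position_arm(camera, compare_to_cad, controller, max_iterations=10):
    for _ in range(max_iterations):
        image = camera.capture()                  # image data 450a
        correction, within_tolerance = compare_to_cad(image)
        if within_tolerance:
            return True                           # ready to connect node/part
        controller.move(correction)               # next set of instructions
    return False  # flag for inspection rather than forcing a connection
```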
The arrangement of
In some embodiments in which the camera (e.g., the camera 408) is not disposed on a robotic arm, a computer (e.g., the computer 460) may generate a set of instructions (e.g., the set of instructions 452a) that instructs movement of one or more robotic arms, e.g., in order to correct the position of the one or more robotic arms. In some embodiments, the computer may generate such a set of instructions based on relative positions of the one or more robotic arms, such as the positions of the robotic arms relative to one another. For example, the computer may generate a set of instructions that causes movement of the first robotic arm relative to the second and/or other additional robotic arms, e.g., based on respective markers affixed to each of the first, second, and (potentially) one or more additional robotic arms.
In another embodiment, the camera 408 may be replaced or used in combination with a different metrology system, such as a laser tracking system or other device configured to capture metrology data of the assembly system 400.
The assembly system 500 may include a first robotic arm 502. The first robotic arm 502 may have a distal end 512 and a proximal end 510. The distal end 512 may be configured for movement, e.g., for operations associated with a node and/or part. The proximal end 510 may secure the first robotic arm 502, e.g., to a base 514.
The distal end 512 of the first robotic arm 502 may be connected with a tool flange 516. The tool flange 516 may be configured to connect with one or more components (e.g., tools) so that the first robotic arm 502 may connect with the one or more components and position the one or more components as the first robotic arm 502 moves.
In the illustrated embodiment, the distal end 512 of the first robotic arm 502 may be connected with an end effector 518, e.g., by means of the tool flange 516. That is, the end effector 518 may be connected with the tool flange 516, and the tool flange 516 may be connected with the distal end 512 of the first robotic arm 502. The end effector 518 may be a component configured to interface with various parts, nodes, and/or other structures. Illustratively, the end effector 518 may be configured to engage with a node 580 (however, the end effector 518 may be configured to engage with a part or other structure). Examples of an end effector 518 may include jaws, grippers, pins, or other similar components capable of engaging a node, part, or other structure.
The distal end 512 of the first robotic arm 502 may be further connected with a first camera 508a. In an aspect, the first camera 508a may be connected with the end effector 518 and, therefore, the first camera 508a may be connected with the distal end 512 of the first robotic arm 502 by means of the end effector 518 and the tool flange 516. The first camera 508a may be configured to capture image data (e.g., still images, moving images, etc.). In one aspect, the first camera 508a may be a motion-capture camera.
The field of view of the first camera 508a may be away from the distal end 512 of the first robotic arm 502 and toward the second robotic arm 504. For example, the first camera 508a may be positioned so that a distal end 522 of the second robotic arm 504 is at least partially within the field of view of the first camera 508a.
The distal end 512 of the first robotic arm 502 may be connected with at least one marker, such as a first marker 530a. In an aspect, the first marker 530a may be connected with the end effector 518 and, therefore, the first marker 530a may be connected with the distal end 512 of the first robotic arm 502 by means of the end effector 518 and the tool flange 516. While
The first marker 530a may be at least partially within the field of view of a second camera 508b, which may be connected with a distal end 522 of a second robotic arm 504 as further described herein (e.g., infra). In various embodiments, the first marker 530a may be of any suitable material and/or shape to be distinguishable in second image data 550b captured by the second camera 508b. For example, the first marker 530a may be composed of a reflective material, the first marker 530a may be a specific shape (e.g., an “X”, a cross, etc.), or of another material and/or shape that is detectable in the second image data 550b. The first marker 530a may be an active marker (e.g., the first marker 530a may be configured to change color, change luminous intensity, etc.) or a passive marker.
The assembly system 500 may further include a first controller 542. The first controller 542 may be communicatively coupled with the first robotic arm 502, e.g., via a wired connection, such as a bus, a wireless connection, or another connection interface. The first controller 542 may be configured to drive movement of the first robotic arm 502, which may include movement of the distal end 512 of the first robotic arm 502. Accordingly, the first controller 542 may drive movement of the tool flange 516, the end effector 518, and the first camera 508a when driving movement of the distal end 512 of the first robotic arm 502. In some embodiments, the first controller 542 may further directly drive movement of one or more of the tool flange 516, the end effector 518, and/or the first camera 508a (e.g., the first controller 542 may cause the tool flange 516 and/or the end effector 518 to rotate and/or translate in space without driving movement of the distal end 512 of the first robotic arm 502).
As illustrated, the assembly system 500 may further include a second robotic arm 504.
The second robotic arm 504 may have a distal end 522 and a proximal end 520. The proximal end 520 of the second robotic arm 504 may be connected with a base 524, e.g., in order to secure the second robotic arm 504. The first robotic arm 502 and the second robotic arm 504 may be located in the assembly system 500 to be approximately facing one another, e.g., so that the distal end 512 of the first robotic arm 502 extends towards the distal end 522 of the second robotic arm 504 and, correspondingly, the distal end 522 of the second robotic arm 504 extends towards the distal end 512 of the first robotic arm 502. However, the first and second robotic arms 502, 504 may be differently located in the assembly system 500 in other embodiments, e.g., according to an assembly operation that is to be performed.
Similar to the first robotic arm 502, the distal end 522 of the second robotic arm 504 may be connected with a tool flange 526, and the tool flange 526 may be connected with an end effector 528. The end effector 528 may be configured to engage with a node, part, or other structure, such as the part 582 that is to be connected with the node 580.
The distal end 522 of the second robotic arm 504 may be connected with at least one marker, such as a second marker 530b. In an aspect, the second marker 530b may be connected with the end effector 528 and, therefore, the second marker 530b may be connected with the distal end 522 of the second robotic arm 504 by means of the end effector 528 and the tool flange 526. While
The second marker 530b may be at least partially within the field of view of the first camera 508a. Accordingly, the first camera 508a may be configured to capture first image data 550a of the second marker 530b. In various embodiments, the second marker 530b may be of any suitable material and/or shape to be distinguishable in the first image data 550a. For example, the second marker 530b may be composed of a reflective material, the second marker 530b may be a specific shape (e.g., an “X”, a cross, etc.), or of another material and/or shape that is detectable in the first image data 550a. The second marker 530b may be an active marker (e.g., the second marker 530b may be configured to change color, change luminous intensity, etc.) or a passive marker.
The second robotic arm 504 may be configured for movement, e.g., so that a distal end 522 of the second robotic arm 504 moves and, correspondingly, causes movement of the tool flange 526, the end effector 528, and the second marker 530b. When the second robotic arm 504 is configured for movement, the second robotic arm 504 may be communicatively coupled with a second controller 544. The second controller 544 may be configured to drive movement of the second robotic arm 504. Similar to the movement of the first robotic arm 502, the movement of the second robotic arm 504 may be based on a set of instructions 552b generated by the computer 560. The computer 560 may be communicatively coupled with the second controller 544 via a third connection 566a, and may issue a generated set of instructions 552b to instruct the position of the second robotic arm 504 to the second controller 544 so that the second controller 544 may drive movement of the second robotic arm 504 according to the set of instructions 552b.
The distal end 522 of the second robotic arm 504 may be further connected with a second camera 508b. In an aspect, the second camera 508b may be connected with the end effector 528 and, therefore, the second camera 508b may be connected with the distal end 522 of the second robotic arm 504 by means of the end effector 528 and the tool flange 526. The second camera 508b may be configured to capture second image data 550b (e.g., still images, moving images, etc.). In one aspect, the second camera 508b may be a motion-capture camera.
The field of view of the second camera 508b may be away from the distal end 522 of the second robotic arm 504 and toward the first robotic arm 502. For example, the second camera 508b may be positioned so that a distal end 512 of the first robotic arm 502 is at least partially within the field of view of the second camera 508b.
In some embodiments, the assembly system 500 may further include one or more fixed markers 530c-d. The one or more fixed markers 530c-d may be unconnected with the robotic arms 502, 504. Instead, the fixed markers 530c-d may be located at fixed locations in the assembly system 500. For example, a first fixed marker 530c may be located on a ceiling or other overhead surface that is above the robotic arms 502, 504 in the assembly system 500. In other examples, a second fixed marker 530d may be located on another surface that is proximate to the robotic arms 502, 504, such as a surface to the side of the robotic arms 502, 504 (e.g., a wall), a surface below the robotic arms 502, 504 (e.g., a floor), or another surface (e.g., a surface elevated from the floor but below the distal ends 512, 522 of the robotic arms 502, 504).
The fixed markers 530c-d may serve as respective reference points. Because the fixed markers 530c-d may be stationary in the assembly system 500, the fixed markers 530c-d may serve as absolute position references. Accordingly, the frames of reference for each of the robotic arms 502, 504 may be provided by the fixed markers 530c-d. In some embodiments, one or more fixed markers (e.g., the fixed markers 530c-d) may provide a common frame of reference with respect to all robotic arms in an assembly system (e.g., the first and second robotic arms 502, 504 in the assembly system 500). Thus, two or more robotic arms (e.g., three, four, or potentially even more robotic arms) may be contemporaneously configured for movement in an assembly system, for example, so that a plurality of operations may be performed contemporaneously (potentially simultaneously) in a coordinated approach with a common frame of reference. This enables robots to determine their positions with respect to each other in embodiments where a plurality of robots are configured for movement during an assembly/post-processing operation.
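As a simplified illustration of such a common frame of reference, if each camera observes a fixed marker whose world-frame position is known, the computer can place both cameras (and hence both arms) in one shared frame. The sketch below ignores rotation (camera axes are assumed aligned with the world frame), which is a strong simplifying assumption; a real system would solve for full 6DoF poses.

```python
# Simplified common-frame illustration: each camera observes a fixed marker
# whose world-frame position is known, letting the computer relate both
# robotic arms through one shared frame. Rotation is ignored for brevity.
import numpy as np

FIXED_MARKER_WORLD = np.array([2.0, 3.0, 4.0])  # assumed surveyed position (m)


def camera_origin_in_world(fixed_marker_in_camera_frame):
    """Where a camera sits in the world, from its view of the fixed marker."""
    return FIXED_MARKER_WORLD - np.asarray(fixed_marker_in_camera_frame, float)


# Each arm's camera reports the fixed marker in its own frame...
origin_a = camera_origin_in_world([1.0, 0.5, 2.0])   # first camera 508a
origin_b = camera_origin_in_world([-0.8, 0.4, 2.1])  # second camera 508b
# ...so the arms can be related to each other through the common frame.
relative_offset = origin_a - origin_b
```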
The fixed markers 530c-d may be at least partially within the fields of view of the first camera 508a and/or the second camera 508b. For example, the first camera 508a may be configured to capture first image data 550a of the first fixed marker 530c (e.g., first image data 550a that also captures the second marker 530b). Similarly, the second camera 508b may be configured to capture second image data 550b of the second fixed marker 530d (e.g., the second image data 550b in which the first marker 530a is also captured). In some embodiments, at least one of the fixed markers 530c-d may be within the fields of view of both cameras 508a-b (at least at some point during an assembly operation).
In various embodiments, the fixed markers 530c-d may be of any suitable material and/or shape to be distinguishable in the image data 550a-b. For example, one or both of the fixed markers 530c-d may be composed of a reflective material, one or both of the fixed markers 530c-d may be a specific shape (e.g., an “X”, a cross, etc.), or of another material and/or shape that is detectable in the image data 550a-b. One or both of the fixed markers 530c-d may be an active marker (e.g., one or both of the fixed markers 530c-d may be configured to change color, change luminous intensity, etc.) or a passive marker.
The assembly system 500 may further include a computer 560. The computer 560 may be communicatively coupled with the first controller 542 via a first connection 564a (e.g., a wired connection, such as a bus, a wireless connection, or another connection interface). The computer 560 may be further communicatively coupled with the first camera 508a disposed on the first robotic arm 502 via a second connection 564b (e.g., a wired connection, such as a bus, a wireless connection, or another connection interface). The computer 560 may be configured to generate a first set of instructions 552a for the first robotic arm 502. The first set of instructions 552a may instruct the first robotic arm 502 to move to a position. Accordingly, the computer 560 may provide the first set of instructions 552a to the first controller 542 to drive movement of the first robotic arm 502 to the position indicated by the first set of instructions 552a.
The computer 560 may generate a plurality of sets of instructions 552a-b to instruct the positions of the first robotic arm 502 and the second robotic arm 504. For example, the computer 560 may generate a first set of instructions to instruct a first position of the first robotic arm 502 when connecting the node 580 with the part 582. While the node 580 is being connected with the part 582, the computer 560 may generate an updated first set of instructions to instruct a second position of the first robotic arm 502. That is, the computer 560 may reposition (e.g., correct the position of) the first robotic arm 502 so that the node 580 is accurately connected with the part 582. As described herein, the computer 560 may generate one or more first sets of instructions 552a based on first image data 550a obtained from the first camera 508a.
Similarly, the computer 560 may be communicatively coupled with the second controller 544 via the third connection 566a (e.g., a wired connection, such as a bus, a wireless connection, or another connection interface). The computer 560 may be further communicatively coupled with the second camera 508b disposed on the second robotic arm 504 via a fourth connection 566b (e.g., a wired connection, such as a bus, a wireless connection, or another connection interface). The computer 560 may be configured to generate a second set of instructions 552b for the second robotic arm 504. The second set of instructions 552b may instruct the second robotic arm 504 to move to a position. Accordingly, the computer 560 may provide the second set of instructions 552b to the second controller 544 to drive movement of the second robotic arm 504 to the position indicated by the second set of instructions 552b.
The computer 560 may generate a second set of instructions 552b to instruct a first position of the second robotic arm 504 when connecting the part 582 with the node 580. While the part 582 is being connected with the node 580, the computer 560 may generate an updated second set of instructions to instruct a second position of the second robotic arm 504. That is, the computer 560 may reposition (e.g., correct the position of) the second robotic arm 504 so that the part 582 is accurately connected with the node 580. As described herein, the computer 560 may generate one or more second sets of instructions 552b based on second image data 550b obtained from the second camera 508b.
When the computer 560 is communicatively coupled with both the first controller 542 and the second controller 544, the computer 560 may be connected with a switch 562. The switch 562 may control the path of the sets of instructions 552a-b. For example, when the computer 560 generates a first set of instructions 552a to instruct the position of the first robotic arm 502, the switch 562 may cause the first set of instructions 552a to be sent over the first connection 564a to the first controller 542. Similarly, when the computer 560 generates a second set of instructions 552b to instruct the position of the second robotic arm 504, the switch 562 may cause the second set of instructions 552b to be sent over the third connection 566a to the second controller 544. In some embodiments, the computer 560 may control the switch 562. For example, the computer 560 may indicate, to the switch 562, which of the connections 564a, 566a to use when sending the sets of instructions 552a-b, or the computer 560 may indicate, to the switch 562, to which of the controllers 542, 544 one of the sets of instructions 552a-b is to be sent.
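The routing role played by the switch 562 might be modeled, purely for illustration, as a mapping from controller identifiers to connections, as in the following sketch; the identifiers and the send() interface are assumptions rather than part of the disclosed system.

```python
# Hypothetical sketch of the routing role of the switch 562: each generated
# set of instructions is forwarded over the connection that leads to the
# intended controller. IDs and the send() interface are assumed.
class InstructionSwitch:
    def __init__(self):
        self.routes = {}  # controller id -> connection

    def register(self, controller_id, connection):
        self.routes[controller_id] = connection

    def dispatch(self, controller_id, instruction_set):
        # e.g., "first" -> first connection 564a; "second" -> third connection 566a
        self.routes[controller_id].send(instruction_set)
```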
During an assembly operation of a node-based transport structure, the first robotic arm 502 may engage with the node 580 through the end effector 518 and the second robotic arm 504 may engage with the part 582 through the end effector 528. In various embodiments of the assembly operation of a node-based transport structure, the node 580 may be connected with the part 582 and, therefore, the first and second robotic arms 502, 504 may perform the assembly operation. Connection of the node 580 with the part 582 is exemplary, and the operations described herein are applicable to other assembly and/or post-processing operations without departing from the scope of the present disclosure.
According to one embodiment, the first robotic arm 502 may be configured to move to bring the node 580 to the part 582 connected with the second robotic arm 504. Thus, the distal end 512 of the first robotic arm 502 may extend away from the base 514 and toward the distal end 522 of the second robotic arm 504, at which the part 582 is engaged. In one embodiment, the movement of the distal end 512 of the first robotic arm 502 may be at least approximately horizontal (e.g., parallel to the horizon); however, the movement of the distal end 512 of the first robotic arm 502 may be additionally or alternatively approximately vertical (e.g., perpendicular to the horizon).
The movement of the distal end 512 of the first robotic arm 502 toward the distal end 522 of the second robotic arm 504 may be caused by the first controller 542. The first controller 542 may cause the movement of the distal end 512 of the first robotic arm 502 based on the first set of instructions 552a provided by the computer 560.
When the node 580 is connected with the first robotic arm 502 (e.g., engaged with the end effector 518), the distal end 512 of the first robotic arm 502 may be positioned to accurately connect the node 580 with the part 582. To that end, the first camera 508a may capture first image data 550a of the second marker 530b and/or the first fixed marker 530c. In some embodiments, the first camera 508a may capture first image data 550a of the second fixed marker 530d, e.g., in addition to or instead of the first fixed marker 530c. Operations described herein with respect to the first fixed marker 530c captured in the first image data 550a may be similar when the second fixed marker 530d is captured in the first image data 550a (e.g., comparison to CAD data associated with the first robotic arm 502, etc.).
The first camera 508a may provide the first image data 550a to the computer 560. For example, the computer 560 may issue a request to the first camera 508a to capture the first image data 550a or the first camera 508a may automatically capture the first image data 550a (e.g., when the first robotic arm 502 is at a fixed position and not moving, when the second marker 530b and/or the first fixed marker 530c is/are within the field of view of the first camera 508a, when the second marker 530b and/or first fixed marker 530c is/are activated, etc.).
Based on the first image data 550a, the computer 560 may generate at least one set of instructions 552a. The computer 560 may determine the location of the second marker 530b, such as a set of numerical values indicating the location of the second marker 530b and/or a set of coordinates indicating the location of the second marker 530b in the first image data 550a. In one embodiment, the computer 560 may determine the actual location (e.g., absolute location) of the second marker 530b in space (e.g., three-dimensional space) in the first image data 550a. In another embodiment, the computer 560 may determine the relative location of the second marker 530b in the first image data 550a, e.g., with respect to the first robotic arm 502.
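One hypothetical way to determine a marker location from image data is a perspective-n-point solve on the detected corner pixels of a square fiducial marker. The disclosure does not prescribe a particular method; the sketch below assumes OpenCV, known camera intrinsics, and a marker of known side length (the camera matrix and corner pixels are made-up values):

```python
import numpy as np
import cv2

def marker_pose_from_corners(corners_px: np.ndarray,
                             camera_matrix: np.ndarray,
                             dist_coeffs: np.ndarray,
                             side_m: float):
    """Estimate a square marker's pose in the camera frame.

    corners_px: (4, 2) pixel coordinates of the marker corners, ordered
                top-left, top-right, bottom-right, bottom-left.
    Returns (rvec, tvec): rotation (Rodrigues vector) and translation of
    the marker relative to the camera, in meters.
    """
    h = side_m / 2.0
    # 3D corner coordinates in the marker's own frame (z = 0 plane),
    # in the same order as corners_px.
    obj = np.array([[-h,  h, 0], [ h,  h, 0],
                    [ h, -h, 0], [-h, -h, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj, corners_px.astype(np.float32),
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("PnP solve failed")
    return rvec, tvec

# Example with a synthetic pinhole camera and illustrative corner pixels.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
corners = np.array([[300, 200], [340, 200], [340, 240], [300, 240]], float)
rvec, tvec = marker_pose_from_corners(corners, K, np.zeros(5), side_m=0.05)
print("marker translation in camera frame (m):", tvec.ravel())
```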
From the first image data 550a, the computer 560 may determine the location of the first fixed marker 530c, such as a set of numerical values indicating the location of the first fixed marker 530c and/or a set of coordinates indicating the location of the first fixed marker 530c. In one embodiment, the computer 560 may determine the actual location (e.g., absolute location) of the first fixed marker 530c in space (e.g., three-dimensional space) in the first image data 550a. In another embodiment, the computer 560 may determine the location of the first robotic arm 502 relative to the first fixed marker 530c in the first image data 550a.
The computer 560 may have stored therein or may be otherwise able to access a set of CAD data associated with the first robotic arm 502 (e.g., numerical CAD data and/or a set of coordinates based on CAD data). The set of CAD data associated with the first robotic arm 502 may indicate the position for the first robotic arm 502 in order to accurately connect the node 580 with the part 582 (e.g., the “correct” position of the distal end 512 of the first robotic arm 502 in order for the node 580 to be accurately connected with the part 582).
In one embodiment, the computer 560 may determine the location of the second marker 530b in the first image data 550a relative to first CAD data of the set of CAD data associated with the first robotic arm 502. For example, the computer 560 may compare the determined location of the second marker 530b in the first image data 550a to the first CAD data, and the computer 560 may determine (e.g., calculate) the difference between the determined location of the second marker 530b and the first CAD data. The determined difference may indicate the variance between the actual position of the first robotic arm 502 in the assembly system 500 and the “correct” position of the first robotic arm 502, i.e., the position at which the variance is reduced (or eliminated) and the node 580 is accurately connected with the part 582. The determined difference may be used for positional compensation(s) and/or positional correction(s) of the first robotic arm 502, the tool flange 516, and/or the end effector 518, thereby compensating and/or correcting the position of the node 580 to be accurate (e.g., within one or more thresholds commensurate with designed or acceptable tolerances).
In one embodiment, the computer 560 may determine the location of the first fixed marker 530c in the first image data 550a relative to second CAD data of the set of CAD data associated with the first robotic arm 502. For example, the computer 560 may compare the determined location of the first fixed marker 530c to the second CAD data, and the computer 560 may determine (e.g., calculate) the difference between the determined location of the first fixed marker 530c and the second CAD data. The determined difference may indicate the variance between the actual position of the first robotic arm 502 in the assembly system 500 and the “correct” position of the first robotic arm 502, i.e., the position at which the variance is reduced (or eliminated) and the node 580 is accurately connected with the part 582. The determined difference may be used for positional compensation(s) and/or positional correction(s) of the first robotic arm 502, the tool flange 516, and/or the end effector 518, thereby compensating and/or correcting the position of the node 580 to be accurate (e.g., within one or more thresholds commensurate with designed or acceptable tolerances).
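A minimal sketch of the comparison described above, assuming the relevant CAD data reduces to a nominal three-dimensional marker location; the coordinates and the 5 mm tolerance are illustrative assumptions:

```python
import numpy as np

# Nominal ("correct") marker location derived from the CAD data
# associated with the first robotic arm, and the location determined
# from the first image data. Values are illustrative only.
cad_location = np.array([1.200, 0.450, 0.780])       # meters
measured_location = np.array([1.196, 0.457, 0.781])  # meters

# The determined difference indicates the variance between the actual
# and "correct" positions; its components can be applied directly as a
# positional compensation/correction.
difference = cad_location - measured_location
variance = np.linalg.norm(difference)

TOLERANCE_M = 0.005  # hypothetical acceptance threshold (5 mm)
if variance <= TOLERANCE_M:
    print("within tolerance; no correction required")
else:
    print("apply correction (m):", difference)
```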
According to one embodiment, the determined location of the first fixed marker 530c may function as a reference point in order to provide a frame of reference and/or position of the first robotic arm 502 relative to a fixed and known point (i.e., the first fixed marker 530c). That is, the location of the first fixed marker 530c in the assembly system 500 may be known (e.g., stored in or otherwise accessible by the computer 560) and, therefore, the computer 560 may use the first fixed marker 530c in order to determine a position of the first robotic arm 502 in the assembly system 500. For example, the computer 560 may determine (e.g., calculate) the location of the first fixed marker 530c in the first image data 550a, and the computer 560 may derive the position of the first robotic arm 502 in the assembly system 500 based on the location of the first fixed marker 530c in the first image data 550a (e.g., the computer 560 may calculate the distance and/or direction of the first fixed marker 530c in the first image data 550a in order to derive the position of the first robotic arm 502).
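As a sketch of this reference-point derivation, the known pose of the fixed marker in the assembly-system frame can be chained with the marker pose observed in the image data to recover the camera pose (and hence the position of the distal end that carries it) in the assembly-system frame. The 4x4 homogeneous transforms and numeric values below are illustrative assumptions:

```python
import numpy as np

def to_homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Known, surveyed pose of the fixed marker in the assembly-system frame
# (stored in or accessible by the computer); identity rotation is used
# here only for brevity.
T_world_marker = to_homogeneous(np.eye(3), np.array([2.0, 0.5, 1.2]))

# Pose of the fixed marker as observed in the camera frame, e.g., from a
# PnP solve on the first image data.
T_camera_marker = to_homogeneous(np.eye(3), np.array([0.0, 0.1, 0.8]))

# Chain the transforms: where the camera is in the assembly-system frame.
T_world_camera = T_world_marker @ np.linalg.inv(T_camera_marker)
print("camera position in assembly system:", T_world_camera[:3, 3])
```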
Based on the determined location of the second marker 530b, the determined location of the first fixed marker 530c, the first CAD data associated with the first robotic arm 502, and/or the second CAD data associated with the first robotic arm 502, the computer 560 may determine a first set of instructions 552a that indicates a position to which the distal end 512 of the first robotic arm 502 is to move in order to accurately connect the node 580 with the part 582. For example, the first set of instructions 552a may indicate one or more movements that are to be performed by the first robotic arm 502 so that the distal end 512 of the first robotic arm 502 will be positioned for an accurate connection of the node 580 with the part 582. In one embodiment, the first set of instructions 552a may indicate one or more movements of the distal end 512 of the first robotic arm 502 so that the distal end 512 of the first robotic arm 502 will be at a position at which the variance from the first CAD data and/or the second CAD data associated with the first robotic arm 502 may be satisfactorily low (e.g., within one or more thresholds of variance that is acceptable for an accurate connection of the node 580 with the part 582).
The computer 560 may generate the first set of instructions 552a to indicate one or more movements in one or more of 6DoF, including forward/backward (e.g., surge), up/down (e.g., heave), left/right (e.g., sway) for translation in space and including yaw, pitch, and roll for rotation in space. For one or more of the 6DoF, the tool flange 516 and/or the end effector 518 may perform one or more movements. For example, the computer 560 may generate the first set of instructions 552a to cause the tool flange 516 and/or the end effector 518 to rotate and/or translate in space.
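One possible (hypothetical) encoding of a single entry in such a set of instructions, pairing the three translational and three rotational degrees of freedom; the class and field names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class MoveInstruction:
    """Hypothetical encoding of one movement in a set of instructions:
    a translation (surge/sway/heave) plus a rotation (roll/pitch/yaw)
    to be applied at the tool flange / end effector."""
    surge_m: float = 0.0   # forward/backward
    sway_m: float = 0.0    # left/right
    heave_m: float = 0.0   # up/down
    roll_deg: float = 0.0
    pitch_deg: float = 0.0
    yaw_deg: float = 0.0

# E.g., advance 12 mm toward the part while correcting yaw by 0.3 degrees.
step = MoveInstruction(surge_m=0.012, yaw_deg=0.3)
print(step)
```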
The computer 560 may provide the generated first set of instructions 552a to the first controller 542. For example, the computer 560 may indicate, to the switch 562, that the first set of instructions 552a is to be provided to the first controller 542, and the switch 562 may cause the first set of instructions 552a to be sent over the first connection 564a to the first controller 542.
Accordingly, the first controller 542 may drive movement of the first robotic arm 502 based on the first set of instructions 552a. For example, the first controller 542 may drive movement of the first robotic arm 502, thereby causing the distal end 512 of the first robotic arm 502 to move into a position indicated by the first set of instructions 552a. Thus, the first robotic arm 502 may be positioned relative to the second marker 530b and/or the first fixed marker 530c. Correspondingly, the first robotic arm 502 may move the tool flange 516, the end effector 518, and the node 580 into a position corresponding to the first set of instructions 552a.
Based on the first set of instructions 552a, the first robotic arm 502 may cause the node 580 to contact the part 582 that is engaged with the end effector 528 connected with the second robotic arm 504. When the node 580 is accurately contacting the part 582 (e.g., within designed or acceptable tolerances), the node 580 may be connected with the part 582 in order to form a portion of a node-based transport structure (e.g., without any fixtures). For example, the first robotic arm 502 and/or the end effector 518 may cause the node 580 to be extended toward the part 582 and, when the node 580 is contacting the part 582, pressure may be applied until at least one feature (e.g., a male/extension feature, a female/reception feature, etc.) of the node 580 is securely engaged with at least one feature (e.g., a correspondingly opposing one of a female/reception feature, a male/extension feature, etc.) of the part 582.
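A minimal sketch of the engagement step, assuming a hypothetical axial force sensor and incremental-advance command; the function names, force target, ramp rate, and travel limit are illustrative assumptions, not part of the disclosure:

```python
def press_until_engaged(read_force_n, advance, target_force_n=40.0,
                        step_mm=0.1, max_travel_mm=5.0):
    """Advance the end effector in small increments after contact until
    the measured reaction force suggests the node's feature is securely
    seated in the part's opposing feature.

    read_force_n: callable returning the current axial force (N).
    advance:      callable commanding an advance of step_mm millimeters.
    """
    traveled = 0.0
    while read_force_n() < target_force_n:
        if traveled >= max_travel_mm:
            raise RuntimeError("features did not seat within travel limit")
        advance(step_mm)
        traveled += step_mm
    return traveled

class SimulatedJoint:
    """Toy stand-in: reaction force ramps at 10 N per mm once in contact."""
    def __init__(self):
        self.force_n = 0.0
    def read(self):
        return self.force_n
    def advance(self, mm):
        self.force_n += 10.0 * mm

joint = SimulatedJoint()
print(f"seated after {press_until_engaged(joint.read, joint.advance):.1f} mm")
```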
As aforementioned, the computer 560 may generate a plurality of first sets of instructions. Each first set of instructions may be dynamically generated in association with the movement of the first robotic arm 502. For example, the computer 560 may generate one first set of instructions 552a that causes the first controller 542 to drive movement of the first robotic arm 502 into a first position. The first camera 508a may capture first image data 550a of the second marker 530b and/or the first fixed marker 530c during and/or after the first robotic arm 502 is controlled to the first position.
The first image data 550a may be provided to the computer 560, and the computer 560 may reevaluate the location of the second marker 530b and/or the first fixed marker 530c in the first image data 550a. The computer 560 may compare the reevaluated location of the second marker 530b and/or the first fixed marker 530c to the first CAD data and the second CAD data associated with the first robotic arm 502, respectively. The computer 560 may then generate a next first set of instructions to compensate and/or correct the position of the first robotic arm 502, thereby reducing the variance of the position of the first robotic arm 502 from the correct position at which the first robotic arm 502 is to be for an accurate connection of the node 580 with the part 582.
The computer 560 may provide the next first set of instructions to the first controller 542 to drive movement of the first robotic arm 502 to a second position corresponding to the next first set of instructions. The computer 560 may iteratively generate instructions based on first image data until the first robotic arm 502 is at a position at which the node 580 may be accurately connected with the part 582. The computer 560 may generate first sets of instructions while the first robotic arm 502 is in motion and/or when the first robotic arm 502 is stationary between movements performed according to first sets of instructions.
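The iterative capture-compare-correct behavior described above can be sketched as a simple closed loop. The callables, tolerance, and simulated arm response below are illustrative assumptions:

```python
import numpy as np

def correct_until_aligned(capture_marker_location, command_move,
                          cad_location, tol_m=0.002, max_iter=20):
    """Hypothetical closed-loop positioning: capture image data,
    reevaluate the marker location, and command a corrective move until
    the variance from the CAD-indicated location is satisfactorily low.

    capture_marker_location: callable returning the marker location (3,)
                             determined from the latest image data.
    command_move:            callable taking a (3,) correction vector.
    """
    for i in range(max_iter):
        measured = capture_marker_location()
        difference = cad_location - measured
        if np.linalg.norm(difference) <= tol_m:
            return i  # within tolerance; arm is at the "correct" position
        command_move(difference)  # next set of instructions
    raise RuntimeError("did not converge within iteration budget")

class SimulatedArm:
    """Toy arm that realizes 80% of each commanded correction."""
    def __init__(self, start):
        self.pos = np.array(start, dtype=float)
    def marker_location(self):
        return self.pos.copy()
    def move(self, correction):
        self.pos += 0.8 * correction

arm = SimulatedArm([1.180, 0.460, 0.790])
target = np.array([1.200, 0.450, 0.780])
iters = correct_until_aligned(arm.marker_location, arm.move, target)
print("converged after", iters, "corrective move(s)")
```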
According to one embodiment, the second robotic arm 504 may be configured to move to bring the part 582 to the node 580 connected with the first robotic arm 502. Thus, the distal end 522 of the second robotic arm 504 may extend away from the base 524 and toward the distal end 512 of the first robotic arm 502, at which the node 580 is engaged. In one embodiment, the movement of the distal end 522 of the second robotic arm 504 may be at least approximately horizontal (e.g., parallel to the horizon); however, the movement may additionally or alternatively be approximately vertical (e.g., perpendicular to the horizon).
The movement of the distal end 522 of the second robotic arm 504 toward the distal end 512 of the first robotic arm 502 may be caused by the second controller 544. The second controller 544 may cause the movement of the distal end 522 of the second robotic arm 504 based on the second set of instructions 552b provided by the computer 560.
When the part 582 is connected with the second robotic arm 504 (e.g., engaged with the end effector 528), the distal end 522 of the second robotic arm 504 may be positioned to accurately connect the part 582 with the node 580. To that end, the second camera 508b may capture second image data 550b of the first marker 530a and/or the second fixed marker 530d. Operations described herein with respect to the second fixed marker 530d captured in the second image data 550b may be similar when the first fixed marker 530c is captured in the second image data 550b (e.g., comparison to CAD data associated with the second robotic arm 504, etc.). In some embodiments, the first and second cameras 508a-b may be positioned so that both of the first and second cameras 508a-b capture image data 550a-b of at least one of the same one of the fixed markers 530c-d. Accordingly, the same fixed marker of the fixed markers 530c-d captured in both the first and second image data 550a-b may provide a common reference point and/or common frame of reference for both of the robotic arms 502, 504 in the assembly system 500.
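A brief sketch of how a shared fixed marker yields a common frame: inverting each camera's observed marker pose expresses both cameras (and hence both distal ends) in the marker's frame, from which their relative pose follows. The transforms and values are illustrative assumptions:

```python
import numpy as np

def make_T(t):
    """Homogeneous transform with identity rotation (for brevity)."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def camera_pose_in_marker_frame(T_camera_marker: np.ndarray) -> np.ndarray:
    """Invert an observed marker pose to express the camera in the
    common frame defined by the fixed marker."""
    return np.linalg.inv(T_camera_marker)

# Each camera observes the SAME fixed marker (illustrative poses).
T_cam1_marker = make_T([0.0, 0.2, 0.9])   # from first image data 550a
T_cam2_marker = make_T([0.1, -0.3, 1.1])  # from second image data 550b

T_marker_cam1 = camera_pose_in_marker_frame(T_cam1_marker)
T_marker_cam2 = camera_pose_in_marker_frame(T_cam2_marker)

# Relative pose of the two cameras in one consistent frame -- the common
# reference that the shared fixed marker provides.
T_cam1_cam2 = np.linalg.inv(T_marker_cam1) @ T_marker_cam2
print("arm-2 camera relative to arm-1 camera:", T_cam1_cam2[:3, 3])
```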
The second camera 508b may provide the second image data 550b to the computer 560. For example, the computer 560 may issue a request to the second camera 508b to capture the second image data 550b or the second camera 508b may automatically capture the second image data 550b (e.g., when the second robotic arm 504 is at a fixed position and not moving, when the first marker 530a and/or the second fixed marker 530d is/are within the field of view of the second camera 508b, when the first marker 530a and/or second fixed marker 530d is/are activated, etc.).
Based on the second image data 550b, the computer 560 may generate at least one second set of instructions 552b. The computer 560 may determine the location of the first marker 530a, such as a set of numerical values indicating the location of the first marker 530a and/or a set of coordinates indicating the location of the first marker 530a in the second image data 550b. In one embodiment, the computer 560 may determine the actual location (e.g., absolute location) of the first marker 530a in space (e.g., three-dimensional space) in the second image data 550b. In another embodiment, the computer 560 may determine the relative location of the first marker 530a in the second image data 550b, e.g., with respect to the second robotic arm 504.
From the second image data 550b, the computer 560 may determine the location of the second fixed marker 530d, such as a set of numerical values indicating the location of the second fixed marker 530d and/or a set of coordinates indicating the location of the second fixed marker 530d. In one embodiment, the computer 560 may determine the actual location (e.g., absolute location) of the second fixed marker 530d in space (e.g., three-dimensional space) in the second image data 550b. In another embodiment, the computer 560 may determine the location of the second robotic arm 504 relative to the second fixed marker 530d in the second image data 550b.
Similar to operations described with respect to movement of the first robotic arm 502, the computer 560 may have stored therein or may be otherwise able to access a set of CAD data associated with the second robotic arm 504 (e.g., numerical CAD data and/or a set of coordinates based on CAD data). The set of CAD data associated with the second robotic arm 504 may indicate the position for the second robotic arm 504 in order to accurately connect the part 582 with the node 580 (e.g., the “correct” position of the distal end 522 of the second robotic arm 504 in order for the part 582 to be accurately connected with the node 580).
In one embodiment, the computer 560 may determine the location of the first marker 530a in the second image data 550b relative to first CAD data of the set of CAD data associated with the second robotic arm 504. For example, the computer 560 may compare the determined location of the first marker 530a in the second image data 550b to the first CAD data, and the computer 560 may determine (e.g., calculate) the difference between the determined location of the first marker 530a and the first CAD data.
The determined difference may indicate the variance between the actual position of the second robotic arm 504 in the assembly system 500 and the “correct” position of the second robotic arm 504, i.e., the position at which the variance is reduced (or eliminated) and the part 582 is accurately connected with the node 580. The determined difference may be used for positional compensation(s) and/or positional correction(s) of the second robotic arm 504, the tool flange 526, and/or the end effector 528, thereby compensating and/or correcting the position of the part 582 to be accurate (e.g., within one or more thresholds commensurate with designed or acceptable tolerances).
In one embodiment, the computer 560 may determine the location of the second fixed marker 530d in the second image data 550b relative to second CAD data of the set of CAD data associated with the second robotic arm 504. For example, the computer 560 may compare the determined location of the second fixed marker 530d to the second CAD data, and the computer 560 may determine (e.g., calculate) the difference between the determined location of the second fixed marker 530d and the second CAD data.
The determined difference may indicate the variance between the actual position of the second robotic arm 504 in the assembly system 500 and the “correct” position of the second robotic arm 504, i.e., the position at which the variance is reduced (or eliminated) and the part 582 is accurately connected with the node 580. The determined difference may be used for positional compensation(s) and/or positional correction(s) of the second robotic arm 504, the tool flange 526, and/or the end effector 528, thereby compensating and/or correcting the position of the part 582 to be accurate (e.g., within one or more thresholds commensurate with designed or acceptable tolerances).
According to one embodiment, the determined location of the second fixed marker 530d may function as a reference point in order to provide a frame of reference and/or position of the second robotic arm 504 relative to a fixed and known point (i.e., the second fixed marker 530d). That is, the location of the second fixed marker 530d in the assembly system 500 may be known (e.g., stored in or otherwise accessible by the computer 560) and, therefore, the computer 560 may use the second fixed marker 530d in order to determine a position of the second robotic arm 504 in the assembly system 500. For example, the computer 560 may determine (e.g., calculate) the location of the second fixed marker 530d in the second image data 550b, and the computer 560 may derive the position of the second robotic arm 504 in the assembly system 500 based on the location of the second fixed marker 530d in the second image data 550b (e.g., the computer 560 may calculate the distance and/or direction of the second fixed marker 530d in the second image data 550b in order to derive the position of the second robotic arm 504).
Based on the determined location of the first marker 530a, the determined location of the second fixed marker 530d, the first CAD data associated with the second robotic arm 504, and/or the second CAD data associated with the second robotic arm 504, the computer 560 may determine a second set of instructions 552b that indicates a position to which the distal end 522 of the second robotic arm 504 is to move in order to accurately connect the part 582 with the node 580. For example, the second set of instructions 552b may indicate one or more movements that are to be performed by the second robotic arm 504 so that the distal end 522 of the second robotic arm 504 will be positioned for an accurate connection of the part 582 with the node 580. In one embodiment, the second set of instructions 552b may indicate one or more movements of the distal end 522 of the second robotic arm 504 so that the distal end 522 of the second robotic arm 504 will be at a position at which the variance from the first and/or second CAD data associated with the second robotic arm 504 may be satisfactorily low (e.g., within one or more thresholds of variance that is acceptable for an accurate connection of the part 582 with the node 580).
The computer 560 may generate the second set of instructions 552b to indicate one or more movements in one or more of 6DoF, including forward/backward (e.g., surge), up/down (e.g., heave), left/right (e.g., sway) for translation in space and including yaw, pitch, and roll for rotation in space. For one or more of the 6DoF, the tool flange 526 and/or the end effector 528 may perform one or more movements. For example, the computer 560 may generate the second set of instructions 552b to cause the tool flange 526 and/or the end effector 528 to rotate and/or translate in space.
The computer 560 may provide the generated second set of instructions 552b to the second controller 544. For example, the computer 560 may indicate, to the switch 562, that the second set of instructions 552b is to be provided to the second controller 544, and the switch 562 may cause the second set of instructions 552b to be sent over the third connection 566a to the second controller 544.
Accordingly, the second controller 544 may drive movement of the second robotic arm 504 based on the second set of instructions 552b. For example, the second controller 544 may drive movement of the second robotic arm 504, thereby causing the distal end 522 of the second robotic arm 504 to move into a position indicated by the second set of instructions 552b. Thus, the second robotic arm 504 may be positioned relative to the first marker 530a and/or the second fixed marker 530d. Correspondingly, the second robotic arm 504 may move the tool flange 526, the end effector 528, and the part 582 into a position corresponding to the second set of instructions 552b.
Based on the second set of instructions 552b, the second robotic arm 504 may cause the part 582 to contact the node 580 that is engaged with the end effector 518 connected with the first robotic arm 502. When the part 582 is accurately contacting the node 580 (e.g., within designed or acceptable tolerances), the part 582 may be connected with the node 580 in order to form a portion of a node-based transport structure (e.g., without any fixtures). For example, the second robotic arm 504 and/or the end effector 528 may cause the part 582 to be extended toward the node 580 and, when the part 582 is contacting the node 580, pressure may be applied until at least one feature (e.g., a male/extension feature, a female/reception feature, etc.) of the part 582 is securely engaged with at least one feature (e.g., a correspondingly opposing one of a female/reception feature, a male/extension feature, etc.) of the node 580.
As aforementioned, the computer 560 may generate a plurality of second sets of instructions. Each second set of instructions may be dynamically generated in association with the movement of the second robotic arm 504. For example, the computer 560 may generate one second set of instructions 552b that causes the second controller 544 to drive movement of the second robotic arm 504 into a first position. The second camera 508b may capture second image data 550b of the first marker 530a and/or the second fixed marker 530d during and/or after the second robotic arm 504 is controlled to the first position.
The second image data 550b may be provided to the computer 560, and the computer 560 may reevaluate the location of the first marker 530a and/or the second fixed marker 530d in the second image data 550b. The computer 560 may compare the reevaluated location of the first marker 530a and/or the second fixed marker 530d to the first CAD data and the second CAD data associated with the second robotic arm 504, respectively. The computer 560 may then generate a next second set of instructions to compensate and/or correct the position of the second robotic arm 504, thereby reducing the variance of the position of the second robotic arm 504 from the correct position at which the second robotic arm 504 is to be for an accurate connection of the part 582 with the node 580.
The computer 560 may provide the next second set of instructions to the second controller 544 to drive movement of the second robotic arm 504 to a second position corresponding to the next second set of instructions. The computer 560 may iteratively generate instructions based on second image data until the second robotic arm 504 is at a position at which the part 582 may be accurately connected with the node 580. The computer 560 may generate second sets of instructions while the second robotic arm 504 is in motion and/or when the second robotic arm 504 is stationary between movements performed according to second sets of instructions.
The method 600 may be performed in an assembly system, such as the assembly system 400 of FIG. 4 and/or the assembly system 500 described above.
The computer may obtain, from a first camera disposed on a first robotic arm, first image data of a second marker disposed on a second robotic arm (operation 602). In some embodiments, the computer may obtain first image data of a fixed marker (e.g., in addition to the second marker), and the computer may use the fixed marker in the first image data as a reference point and/or frame of reference for the first robotic arm.
The computer may generate a first set of instructions based on a position of the second marker in the first image data (operation 604). The first set of instructions may be associated with one or more assembly and/or pre-/post-assembly operations, such as causing connection of a node engaged by the first robotic arm to a part engaged by the second robotic arm. In some embodiments, the computer may further generate the first set of instructions based on a position of the fixed marker in the first image data. For example, the computer may use the fixed marker in the first image data as a reference point and/or frame of reference with respect to the first robotic arm.
The computer may provide the first set of instructions to a first controller connected with the first robotic arm (operation 606). The first set of instructions may cause first movement of the first robotic arm in association with one or more assembly and/or pre-/post-assembly operations, such as causing connection of a node engaged by the first robotic arm to a part engaged by the second robotic arm.
In some embodiments, the computer may further obtain, from the first camera disposed on the first robotic arm, updated image data of the second marker disposed on the second robotic arm after the first movement of the first robotic arm (operation 608). In addition to the updated image data of the second marker, the computer may obtain updated image data of the fixed marker.
The computer may generate an updated set of instructions based on an updated position of the second marker in the updated image data (operation 610). In some embodiments, the computer may generate the updated set of instructions further based on the fixed marker, which may provide a reference point and/or frame of reference of the first robotic arm in the assembly system. The updated set of instructions may compensate or correct a position of the first robotic arm in association with one or more assembly and/or pre-/post-assembly operations, such as causing connection of the node engaged by the first robotic arm to the part engaged by the second robotic arm.
The computer may provide the updated set of instructions to the first controller connected with the first robotic arm (operation 612). The updated set of instructions may cause corrective movement of the first robotic arm in association with one or more assembly and/or pre-/post-assembly operations, such as causing connection of the node engaged by the first robotic arm to the part engaged by the second robotic arm.
In some embodiments, the computer may further obtain, from a second camera disposed on the second robotic arm, second image data of a first marker disposed on the first robotic arm (operation 614). In some embodiments, the computer may obtain second image data of a fixed marker (e.g., the same fixed marker captured in the first image data or a different fixed marker), and the computer may use the fixed marker in the second image data as a reference point and/or frame of reference for the second robotic arm.
The computer may generate a second set of instructions based on a position of the first marker in the second image data (operation 616). The second set of instructions may be associated with one or more assembly and/or pre-/post-assembly operations, such as causing connection of the node engaged by the first robotic arm to the part engaged by the second robotic arm. In some embodiments, the computer may further generate the second set of instructions based on a position of the fixed marker in the second image data. For example, the computer may use the fixed marker captured in the second image data as a reference point and/or frame of reference with respect to the second robotic arm.
The computer may provide the second set of instructions to a second controller connected with the second robotic arm (operation 618). The second set of instructions may cause second movement of the second robotic arm in association with one or more assembly and/or pre-/post-assembly operations, such as causing connection of the node engaged by the first robotic arm to the part engaged by the second robotic arm.
In causing the movement(s) of the first robotic arm (and the second robotic arm in some embodiments), the computer may cause the first robotic arm (and potentially the second robotic arm) to be correctly positioned for one or more assembly and/or pre-/post-assembly operations, such as causing connection of the node engaged by the first robotic arm to the part engaged by the second robotic arm. Accordingly, positional correction for at least one robotic arm in an assembly system may be achieved as described herein.
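The overall flow of operations 602-618 can be summarized in a short sketch. The stub classes and method names are hypothetical stand-ins for the camera, computer, and controller interfaces described above, not an API defined by the disclosure:

```python
class Stub:
    """Minimal stand-ins so the flow below executes; a real system would
    use the camera, computer, and controller interfaces described above."""
    def __init__(self, name):
        self.name = name
    def capture(self):
        return f"{self.name}-image"
    def generate(self, img):
        return f"instructions({img})"
    def drive(self, instr):
        print(self.name, "drives:", instr)

def run_method_600(computer, cam1, cam2, ctrl1, ctrl2):
    # 602: obtain first image data of the second marker (and fixed marker).
    img1 = cam1.capture()
    # 604: generate a first set of instructions from the marker position.
    instr1 = computer.generate(img1)
    # 606: provide the instructions to the first controller.
    ctrl1.drive(instr1)
    # 608-612: updated image data -> updated (corrective) instructions.
    ctrl1.drive(computer.generate(cam1.capture()))
    # 614-618: second image data -> second set of instructions -> controller.
    ctrl2.drive(computer.generate(cam2.capture()))

run_method_600(Stub("computer"), Stub("camera-1"), Stub("camera-2"),
               Stub("controller-1"), Stub("controller-2"))
```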
The present disclosure is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these exemplary embodiments presented throughout the present disclosure will be readily apparent to those skilled in the art, and the concepts disclosed herein may be applied to other techniques for printing nodes and interconnects. Thus, the claims are not intended to be limited to the exemplary embodiments presented throughout the disclosure, but are to be accorded the full scope consistent with the language of the claims. All structural and functional equivalents to the elements of the exemplary embodiments described throughout the present disclosure that are known or later come to be known to those of ordinary skill in the art are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the claims. No claim element is to be construed under the provisions of 35 U.S.C. § 112(f), or analogous law in applicable jurisdictions, unless the element is expressly recited using the phrase “means for” or, in the case of a method claim, the element is recited using the phrase “step for.”