Robotic manufacturing entails the use of robots to perform one or more aspects of a manufacturing process. Robotic welding is one application in the field of robotic manufacturing. In robotic welding, robots weld two or more components together along one or more seams. Because such robots automate processes that would otherwise be performed by humans or by machines directly controlled by humans, they provide significant benefits in production time, reliability, efficiency, and costs.
In various examples, a computer-implemented method of generating instructions for a welding robot is described. The computer-implemented method comprises identifying an expected position and expected orientation of a candidate seam on a part to be welded based on a Computer Aided Design (CAD) model of the part, scanning a workspace containing the part to produce a representation of the part, identifying the candidate seam on the part based on the representation of the part and the expected position and expected orientation of the candidate seam, determining an actual position and actual orientation of the candidate seam, and generating welding instructions for the welding robot based at least in part on the actual position and actual orientation of the candidate seam.
In examples, a computer-implemented method of generating welding instructions for a welding robot is described. The method comprises obtaining, via a sensor, image data of a workspace that includes a part to be welded, identifying a plurality of points on the part to be welded based on the image data, identifying a candidate seam on the part to be welded from the plurality of points, and generating welding instructions for the welding robot based at least in part on the identification of the candidate seam.
Conventional welding techniques are tedious, labor-intensive, and inefficient. Conventional welding techniques are also not adequately flexible to accommodate irregularities that are commonly encountered during manufacturing processes, leading to undesirable downtime and inefficiencies. For example, in conventional welding techniques, a skilled programmer must generate instructions by which a welding robot performs welding operations. These instructions specify the motion, path, trajectory, and welding parameters that the welding robot must use to perform a particular welding operation. The instructions are written under the assumption that a high-volume operation is to be performed in which the same welding operation is repeated many times. Thus, any aberrations encountered during the welding process (e.g., a different part) can result in misplaced welds. Misplaced welds, in turn, increase inefficiencies, costs, and other negative aspects of volume production.
In some instances, a computer-aided-design (CAD) model of parts may be useful to a welding robot to facilitate welding operations. For example, a CAD model of a part to be welded may be provided to a welding robot, and the welding robot may use the CAD model to guide its movements, for example, toward the location of a seam to be welded. The seam(s) to be welded are annotated (e.g., with annotations that include user-selected edges, where each edge represents a seam) in the CAD model, and the welding robot, after locating a seam using sensors, lays weld according to the annotations. Although such approaches may reduce or eliminate the need for a skilled programmer or manufacturing engineer, they have limitations. For instance, welding operations are very precise operations. Generally, in order to create an acceptable weld, it is desirable for the weld tip to be located within 1 mm of a target position associated with a seam. When guiding a welding robot based on a CAD model, actual seams may be more than 1 mm from the modeled location even when the part closely conforms to the CAD model, which can make it difficult or impossible for the weld tip to be accurately positioned to create an acceptable weld. In instances in which the CAD model is a simplification of the actual part (which is common in low-volume production), the actual seam may be displaced from the modeled location by one or more centimeters. Therefore, using known techniques, locating a seam precisely based on a CAD model can be challenging. Accordingly, the welding robot may create unacceptable welds, thereby creating defective parts.
Furthermore, prior solutions for controlling welding robots require a skilled operator to provide specific instructions to the welding robot to avoid collisions with other components (e.g., parts, sensors, clamps, etc.) as the welding robot (and, more specifically, the robot arm) moves within the manufacturing workspace along a path from a first point to a second point, such as a seam. Identifying a path (e.g., a path that the robot may follow to weld a seam) free from obstructions and collisions is referred to herein as path planning. Requiring a skilled operator to perform path planning for dozens or even hundreds of potential pathways for a robot arm is inefficient, tedious, and costly. Furthermore, conventional welding robots are often programmed to follow the same path, same motion, and same trajectory, repeatedly. This repeatedly performed process may be acceptable in a high-volume manufacturing setting where the manufacturing process is highly matured, but in low- or medium-volume settings, a component may be placed in an unexpected position relative to the welding robot, which may lead to collisions, misaligned parts, poor tolerances, and other problems. Accordingly, in certain settings, a skilled operator may be needed to facilitate welding.
The welding technology described herein is superior to prior welding robots and techniques because it can automatically and dynamically generate instructions useful to a welding robot to precisely and accurately identify and weld seams. Unlike prior systems and techniques, the welding technology described herein does not necessarily require CAD models of parts to be welded (although, in some examples and as described below, CAD models may be useful), nor does it necessarily require any other a priori information about the parts or the manufacturing workspace. Rather, the welding technology described herein uses movable sensors to map the manufacturing workspace (and in particular, parts and seams) in three-dimensional (3D) space, and it uses such maps to locate and weld seams with a high degree of accuracy and precision. The welding technology described herein includes various additional features that further distinguish it from prior, inferior solutions, such as the ability to identify multiple candidate seams for welding, the ability to interact with a user to select a candidate seam for welding, and the ability to dynamically change welding parameters and provide feedback on welding operations, among others. Furthermore, the welding technology described herein is configured to use data acquired by the sensors to automatically and dynamically perform path planning—that is, to automatically and dynamically identify, without a priori information, one or more paths in the manufacturing workspace along which the robot arm may travel free from collisions with other components. The welding technology described herein is also configured to use a combination of the data acquired by the sensors and a priori information (e.g., annotated CAD model) to dynamically perform path planning and welding. These and other examples are now described below with reference to the drawings.
The sensors 102 are configured to capture information about the workspace 101. In examples, the sensors 102 are image sensors that are configured to capture visual information (e.g., two-dimensional (2D) images) about the workspace 101. For instance, the sensors 102 may include cameras (e.g., cameras with built-in laser), scanners (e.g., laser scanners), etc. The sensors 102 may include sensors such as Light Detection and Ranging (LiDAR) sensors. Alternatively or in addition, the sensors 102 may be audio sensors configured to emit and/or capture sound, such as Sound Navigation and Ranging (SONAR) devices. Alternatively or in addition, the sensors 102 may be electromagnetic sensors configured to emit and/or capture electromagnetic (EM) waves, such as Radio Detection and Ranging (RADAR) devices. Through visual, audio, electromagnetic, and/or other sensing technologies, the sensors 102 may collect information about physical structures in the workspace 101. In examples, the sensors 102 collect static information (e.g., stationary structures in the workspace 101), and in other examples, the sensors 102 collect dynamic information (e.g., moving structures in the workspace 101), and in still other examples, the sensors 102 collect a combination of static and dynamic information. The sensors 102 may collect any suitable combination of any and all such information about the physical structures in the workspace 101 and may provide such information to other components (e.g., the controller 108) to generate a 3D representation of the physical structures in the workspace 101. As described above, the sensors 102 may capture and communicate any of a variety of information types, but this description assumes that the sensors 102 primarily capture visual information (e.g., 2D images) of the workspace 101, which are subsequently used en masse to generate 3D representations of the workspace 101 as described below.
To generate 3D representations of the workspace 101, the sensors 102 capture 2D images of physical structures in the workspace 101 from a variety of angles. For example, although a single 2D image of a fixture 116 or a part 114 may be inadequate to generate a 3D representation of that component, and, similarly, a set of multiple 2D images of the fixture 116 or the part 114 from a single angle, view, or plane may be inadequate to generate a 3D representation of that component, multiple 2D images captured from multiple angles in a variety of positions within the workspace 101 may be adequate to generate a 3D representation of a component, such as a fixture 116 or part 114. This is because capturing 2D images in multiple orientations provides spatial information about a component in three dimensions, similar in concept to the manner in which plan drawings of a component that include frontal, profile, and top-down views of the component provide all information necessary to generate a 3D representation of that component. Accordingly, in examples, the sensors 102 are configured to move about the workspace 101 so as to capture information adequate to generate 3D representations of structures within the workspace 101. In examples, the sensors are stationary but are present in adequate numbers and in adequately varied locations around the workspace 101 such that adequate information is captured by the sensors 102 to generate the aforementioned 3D representations. In examples where the sensors 102 are mobile, any suitable structures may be useful to facilitate such movement about the workspace 101. For example, one or more sensors 102 may be positioned on a motorized track system. The track system itself may be stationary while the sensors 102 are configured to move about the workspace 101 on the track system. In some examples, however, the sensors 102 are mobile on the track system and the track system itself is mobile around the workspace 101. In still other examples, one or more mirrors are arranged within the workspace 101 in conjunction with sensors 102 that may pivot, swivel, rotate, or translate about and/or along points or axes such that the sensors 102 capture 2D images from initial vantage points when in a first configuration and, when in a second configuration, capture 2D images from other vantage points using the mirrors. In yet other examples, the sensors 102 may be suspended on arms that may be configured to pivot, swivel, rotate, or translate about and/or along points or axes, and the sensors 102 may be configured to capture 2D images from a variety of vantage points as these arms extend through their full ranges of motion.
Additionally, or alternatively, one or more sensors 102 may be positioned on the robot 110 (e.g., on a weld head of the robot 110) and may be configured to collect image data as the robot 110 moves about the workspace 101. Because the robot 110 is mobile with multiple degrees of freedom and therefore in multiple dimensions, sensors 102 positioned on the robot 110 may capture 2D images from a variety of vantage points. In yet other examples, one or more sensors 102 may be stationary while physical structures to be imaged are moved about or within the workspace 101. For instance, a part 114 to be imaged may be positioned on a fixture 116 such as a positioner, and the positioner and/or the part 114 may rotate, translate (e.g., in x-, y-, and/or z-directions), or otherwise move within the workspace 101 while a stationary sensor 102 (e.g., either the one coupled to the robot 110 or the one decoupled from the robot 110) captures multiple 2D images of various facets of the part 114.
In some examples, some or all of the aforementioned sensor 102 configurations are implemented. Other sensor 102 configurations are contemplated and included in the scope of this disclosure.
The controller 108 controls the sensor(s) 102 and the robot 110 within the workspace 101. In some examples, the controller 108 controls the fixture(s) 116 within the workspace 101. For example, the controller 108 may control the sensor(s) 102 to move within the workspace 101 as described above and/or to capture 2D images, audio data, and/or EM data as described above. For example, the controller 108 may control the robot 110 as described herein to perform welding operations and to move within the workspace 101 according to a path planning technique as described below. For example, the controller 108 may manipulate the fixture(s) 116, such as a positioner (e.g., platform, clamps, etc.), to rotate, translate, or otherwise move one or more parts within the workspace 101. The controller 108 may also control other aspects of the system 100. For example, the controller 108 may further interact with the user interface (UI) 106 by providing a graphical interface on the UI 106 through which a user may interact with the system 100 and provide inputs to the system 100, and through which the controller 108 may provide information to and/or receive information from the user (e.g., identified seams that are candidates for welding, possible paths during path planning, welding parameter options or selections, etc.). The UI 106 may be any type of interface, including a touchscreen interface, a voice-activated interface, a keypad interface, a combination thereof, etc.
Furthermore, the controller 108 may interact with the database 112, for example, by storing data to the database 112 and/or retrieving data from the database 112. The database 112 may more generally be stored in any suitable type of storage 109 that is configured to store any and all types of information. In some examples, the database 112 can be stored in storage 109 such as a random access memory (RAM), a memory buffer, a hard drive, an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a read-only memory (ROM), Flash memory, and the like. In some examples, the database 112 may be stored on a cloud-based platform. The database 112 may store any information useful to the system 100 in performing welding operations. For example, the database 112 may store a CAD model of the part 114. As another example, the database 112 may store an annotated version of a CAD model of the part 114. The database 112 may also store a point cloud of the part 114 generated using the CAD model (also referred to herein as a CAD model point cloud). Similarly, welding instructions for the part 114 that are generated based on 3D representations of the part 114 and/or on user input provided regarding the part 114 (e.g., regarding which seams of the part 114 to weld, welding parameters, etc.) may be stored in the database 112. In examples, the storage 109 stores executable code 111, which, when executed, causes the controller 108 to perform one or more actions attributed herein to the controller 108, or, more generally, to the system 100. In examples, the executable code 111 is a single, self-contained program, and in other examples, the executable code 111 is a program having one or more function calls to other executable code which may be stored in storage 109 or elsewhere. In some examples, one or more functions attributed to execution of the executable code 111 may be implemented by hardware. For instance, multiple processors may be useful to perform one or more discrete tasks of the executable code 111.
In examples, the 3D image data can be collated by the controller 108 in a manner such that the point cloud generated from the data can have six degrees of freedom. For instance, each point in the point cloud may represent an infinitesimally small position in 3D space. As described above, the sensor(s) 102 can capture multiple 2D images of the point from various angles. These multiple 2D images can be collated by the controller 108 to determine an average image pixel for each point. The averaged image pixel can be attached to the point. For example, if the sensor(s) 102 are color cameras having red, green, and blue channels, then the six degrees of freedom can be {x-position, y-position, z-position, red-intensity, green-intensity, and blue-intensity}. If, for example, the sensor(s) 102 are black and white cameras with black and white channels, then four degrees of freedom may be generated.
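By way of illustration only, the following sketch shows one way such collation could be implemented. It assumes hypothetical inputs: a list of 3D points, a set of calibrated 2D color images, and per-view projection helpers that map a 3D point to pixel coordinates; none of these names come from the system described above.

```python
import numpy as np

def fuse_point_colors(points_xyz, images, projections):
    """Attach an averaged RGB value to each 3D point so that every point
    carries six values {x, y, z, r, g, b}.  `projections[i]` is a hypothetical
    callable mapping a 3D point to (u, v) pixel coordinates in images[i]."""
    fused = []
    for p in points_xyz:
        samples = []
        for img, project in zip(images, projections):
            u, v = project(p)                        # pixel location of p in this view
            h, w, _ = img.shape
            if 0 <= int(v) < h and 0 <= int(u) < w:  # keep views that actually see p
                samples.append(img[int(v), int(u)])
        rgb = np.mean(samples, axis=0) if samples else np.zeros(3)
        fused.append(np.concatenate([p, rgb]))       # {x, y, z, red, green, blue}
    return np.asarray(fused)
```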
The controller 108, upon executing the executable code 111, uses a neural network to perform a pixel-wise (e.g., using images captured by or based on the images captured by sensors 102) and/or point-wise (e.g., using one or more point clouds) classification to identify and classify structures within the workspace 101. For example, the controller 108 may perform a pixel-wise and/or point-wise classification to identify each imaged structure within the workspace 101 as a part 114, as a seam on the part 114 or at an interface between multiple parts 114 (referred to herein as candidate seams), as a fixture 116, as the robot 110, etc. The controller 108 may identify and classify pixels and/or points based on a neural network (e.g., a U-net model) trained using appropriate training data, in examples. The neural network can be trained on image data, point cloud data, spatial information data, or a combination thereof. Because the point cloud and/or the image data includes information captured from various vantage points within the workspace 101, the neural network can be operable to classify the fixtures 116 or the candidate seams on the part(s) 114 from multiple angles and/or viewpoints. In some examples, a neural network can be trained to operate on a set of points directly, for example a dynamic graph convolutional neural network, and the neural network may be implemented to analyze unorganized points on the point cloud. In some examples, a first neural network can be trained on point cloud data to perform point-wise classification and a second neural network can be trained on image data to perform pixel-wise classification. The first neural network and the second neural network can individually identify candidate seams and localize candidate seams. The output from the first neural network and the second neural network can be combined as a final output to determine the location and orientation of one or more candidate seams on a part 114.
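As a simplified illustration of point-wise classification (not the U-net or dynamic graph convolutional models mentioned above), the sketch below applies a small shared multilayer perceptron to every point of a colored point cloud and emits a per-point class label; the class set and layer sizes are assumptions made only for this example.

```python
import torch
import torch.nn as nn

class PointwiseSeamClassifier(nn.Module):
    """Toy point-wise classifier: a shared MLP applied independently to each
    point's (x, y, z, r, g, b) feature vector, scoring four assumed classes:
    {background, part, candidate seam, fixture}."""
    def __init__(self, in_dim=6, num_classes=4):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, points):            # points: (batch, num_points, in_dim)
        return self.mlp(points)           # logits: (batch, num_points, num_classes)

# Usage sketch: label every point of a fused point cloud.
model = PointwiseSeamClassifier()
cloud = torch.randn(1, 2048, 6)           # stand-in for a real scan
labels = model(cloud).argmax(dim=-1)      # per-point class index
```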
In some examples, if pixel-wise classification is performed, the results can be projected onto 3D point cloud data and/or a meshed version of the point cloud data, thereby providing information on a location of the fixture 116 in the workspace 101. If the input data is image data (e.g., color images), spatial information such as depth information may be included along with color data in order to perform pixel-wise segmentation. In some examples, pixel-wise classification can be performed to identify candidate seams and localize candidate seams relative to a part 114 as further described below.
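One minimal way to project pixel-wise results into 3D, assuming a standard pinhole camera model with known intrinsics (fx, fy, cx, cy) and a per-pixel depth map, is sketched below; the helper name and inputs are illustrative only.

```python
import numpy as np

def backproject_labeled_pixels(seam_mask, depth, fx, fy, cx, cy):
    """Lift pixels classified as candidate seam (seam_mask == True) into 3D
    camera coordinates using per-pixel depth and pinhole intrinsics."""
    v, u = np.nonzero(seam_mask)        # image rows (v) and columns (u) of seam pixels
    z = depth[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)  # (N, 3) seam points in the camera frame
```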
As described above, the controller 108 may identify and classify pixels and/or points as specific structures within the workspace 101, such as fixtures 116, part 114, candidate seams of the part 114, etc. Portions of the image and/or point cloud data classified as non-part and non-candidate seam structures, such as fixtures 116, may be segmented out (e.g., redacted or otherwise removed) from the data, thereby isolating data identified and classified as corresponding to a part 114 and/or candidate seam(s) on the part 114. In some examples, after identifying the candidate seams and segmenting the non-part 114 and non-candidate seam data as described above (or, optionally, prior to such segmentation), the neural network can be configured to analyze each candidate seam to determine the type of seam. For example, the neural network can be configured to determine whether the candidate seam is a butt joint, a corner joint, an edge joint, a lap joint, a tee joint, or the like. The model (e.g., a U-net model) may classify the type of seam based on data captured from multiple vantage points within the workspace 101.
If pixel-wise classification is performed using image data, the controller 108 may project the pixels of interest (e.g., pixels representing parts 114 and candidate seams on the parts 114) onto a 3D space to generate a set of 3D points representing the parts 114 and candidate seams on the parts 114. Alternatively, if point-wise classification is performed using point cloud data, the points of interest may already exist in 3D space in the point cloud. In either case, to the controller 108, the 3D points are an unordered set of points and at least some of the 3D points may be clumped together. To eliminate such noise and generate a continuous and contiguous subset of points to represent the candidate seams, a Manifold Blurring and Mean Shift (MBMS) technique or similar techniques may be applied. Such techniques may condense the points and eliminate noise. Subsequently, the controller 108 may apply a clustering method to break down the candidate seams into individual candidate seams. Stated another way, instead of having several subsets of points representing multiple seams, clustering can break down each subset of points into individual seams. Following clustering, the controller 108 may fit a spline to each individual subset of points. Accordingly, each individual subset of points can be an individual candidate seam.
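The sketch below illustrates the clustering and spline-fitting steps only, using DBSCAN and a parametric spline as stand-ins for whatever clustering and curve-fitting methods are actually employed; the MBMS denoising step is not shown, and the parameter values are placeholders.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from scipy.interpolate import splprep, splev

def split_and_fit_seams(seam_points, eps=5.0, min_samples=10):
    """Break an unordered set of candidate-seam points into individual seams
    and fit a smooth spline through each one.  seam_points: (N, 3) array."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(seam_points)
    seams = []
    for lbl in set(labels) - {-1}:                  # label -1 marks noise points
        cluster = seam_points[labels == lbl]
        order = np.argsort(cluster[:, 0])           # crude ordering along one axis
        tck, _ = splprep(cluster[order].T, s=1.0)   # parametric spline through the cluster
        seams.append(lambda u, tck=tck: np.array(splev(u, tck)).T)
    return seams                                    # one callable spline per candidate seam
```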
To summarize, and without limitation, using the techniques described above, the controller 108 receives image data captured by the sensors 102 from various locations and vantage points within the workspace 101. The controller 108 performs a pixel-wise and/or point-wise classification technique using a neural network to classify and identify each pixel and/or point as a part 114, a candidate seam on a part 114 or at an interface between multiple parts 114, a fixture 116, etc. Structures identified as being non-part 114 structures and non-candidate seam structures are segmented out, and the controller 108 may perform additional processing on the remaining points (e.g., to mitigate noise). By performing these actions, the controller 108 may produce a set of candidate seams on parts 114 that indicate locations and orientations of those seams. As is now described, the controller 108 may then determine whether the candidate seams are actually seams and may optionally perform additional processing using a priori information, such as CAD models of the parts and seams. The resulting data is suitable for use by the controller 108 to plan a path for laying weld along the identified seams, as is also described below.
In some instances, the identified candidate seams may not be seams (i.e., the identified candidate seams may be false positives). To determine whether the identified candidate seams are actually seams, the controller 108 uses the images captured by sensors 102 from various vantage points inside the workspace 101 to determine a confidence value. The confidence value represents the likelihood that the candidate seam determined from the corresponding vantage point is an actual seam. The controller 108 may then compare the confidence values for the different vantage points and eliminate candidate seams that are unlikely to be actual seams. For example, the controller 108 may determine a mean, median, maximum, or any other suitable summary statistic of the confidence values associated with a specific candidate seam. Generally, a candidate seam that corresponds to an actual seam will have consistently high (e.g., above a threshold) confidence values across the various vantage points used to capture that candidate seam. If the summary statistic of the confidence values for a candidate seam is above a threshold value, the controller 108 can designate the candidate seam as an actual seam. Conversely, if the summary statistic of the confidence values for a candidate seam is below a threshold value, the candidate seam can be designated as a false positive that is not eligible for welding.
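A minimal sketch of this confidence test, assuming the median as the summary statistic and an arbitrary threshold value, follows.

```python
import numpy as np

def confirm_seam(confidences, threshold=0.7):
    """Designate a candidate seam as an actual seam only when the summary
    statistic (median here) of its per-vantage-point confidence values clears
    the threshold; otherwise it is treated as a false positive."""
    return float(np.median(confidences)) >= threshold

# Usage: confidence values gathered from images of one candidate seam
# captured from several vantage points.
confirm_seam([0.92, 0.88, 0.95, 0.40])   # True: median 0.90 >= 0.70
```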
As mentioned above, after identifying the candidate seams that are actually seams, the controller 108 may perform additional processing, referred to herein as registration, using a priori information, such as a CAD model (or a point cloud version of the CAD model). More specifically, in some instances there may exist a difference between seam dimensions on the part and seam dimensions in the CAD model, and the CAD model should be deformed (e.g., updated) to account for any such differences, as the CAD model may be subsequently used to perform path planning as described herein. Accordingly, the controller 108 compares a first seam (e.g., a candidate seam on a part 114 that has been verified as an actual seam) to a second seam (e.g., a seam annotated, for example by an operator/user, on the CAD model that corresponds to the first seam) to determine differences between the first and second seams. Seams on the CAD model may be annotated as described above. The first seam and the second seam can be in nearly the same location, in instances in which the CAD model and/or controller 108 accurately predicts the location of the candidate seam. Alternatively, the first seam and the second seam can partially overlap, in instances in which the CAD model and/or controller 108 is partially accurate. The controller 108 may perform a comparison of the first seam and the second seam. This comparison can be based in part on the shape and the relative location in space of both seams. Should the first seam and the second seam be relatively similar in shape and proximal to each other, the second seam can be identified as being the same as the first seam. In this way, the controller 108 can account for the topography of the surfaces on the part that are not accurately represented in the CAD models. In this manner, the controller 108 can identify candidate seams and can sub-select, refine, or update candidate seams relative to the part using a CAD model of the part. Each candidate seam can be a set of updated points that represents the position and orientation of the candidate seam relative to the part.
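One possible scoring of "similar in shape and proximal to each other", assuming centroid distance as a proximity measure and mean closest-point distance as a shape measure with placeholder tolerances, is sketched below.

```python
import numpy as np

def seams_match(scanned_seam, cad_seam, proximity_tol=10.0, shape_tol=3.0):
    """Heuristic match test between a scanned seam and a CAD-annotated seam,
    both (N, 3) point sets.  Proximity compares centroids; shape compares the
    mean closest-point distance after centering both seams."""
    proximity = np.linalg.norm(scanned_seam.mean(axis=0) - cad_seam.mean(axis=0))
    a = scanned_seam - scanned_seam.mean(axis=0)
    b = cad_seam - cad_seam.mean(axis=0)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # pairwise distances
    shape = d.min(axis=1).mean()       # mean distance to the nearest CAD-seam point
    return proximity <= proximity_tol and shape <= shape_tol
```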
Irrelevant data and noise in the data (e.g., the output of the coarse registration 602) may impact registration of the parts 114. For at least this reason, it is desirable to remove as much of the irrelevant data and noise as possible. A bounding box 608 is useful to remove this irrelevant data and noise (e.g., fixtures 116) in order to limit the area upon which registration is performed. Stated another way, data inside the bounding box is retained, but all the data, 3D or otherwise, from outside the bounding box is discarded. The aforementioned bounding box may be any shape that can enclose or encapsulate the CAD model itself (e.g., either partially or completely). For instance, the bounding box may be an inflated or scaled-up version of the CAD model. The data outside the bounding box may be removed from the final registration or may still be included but weighted to mitigate its impact.
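A minimal sketch of the bounding-box cropping step, assuming an axis-aligned box inflated by a fixed scale factor, follows; a real implementation might instead weight, rather than discard, the outside data as noted above.

```python
import numpy as np

def crop_to_inflated_bbox(scan_points, cad_points, scale=1.2):
    """Keep only scan data inside a scaled-up, axis-aligned bounding box of the
    CAD model point cloud; data outside the box (fixtures, clutter) is dropped
    before registration."""
    lo, hi = cad_points.min(axis=0), cad_points.max(axis=0)
    center = (lo + hi) / 2.0
    half = (hi - lo) / 2.0 * scale                  # inflate the box by `scale`
    inside = np.all(np.abs(scan_points - center) <= half, axis=1)
    return scan_points[inside]
```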
A set of corresponding points that best support the rigid transformation between the CAD point cloud and scan point cloud models should be determined during registration. Corresponding candidates may be stored (e.g., in the database 112) as a matrix in which each element stores the confidence or the probability of a match between a point in the CAD model point cloud and a point in the scan point cloud.
Various methods are useful to find corresponding points from this matrix, for example, hard correspondence, soft correspondence, product manifold filter, graph clique, covariance, etc. After completion of the refined registration 610, the registration process is then complete (612).
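As an illustration only, the sketch below builds such a correspondence matrix from pairwise distances using a Gaussian falloff (an assumption, not the method required above) and then extracts a hard correspondence by taking the most likely match per row.

```python
import numpy as np

def correspondence_matrix(cad_points, scan_points, sigma=2.0):
    """Matrix whose (i, j) element stores the confidence that CAD point i and
    scan point j match, modeled here with a Gaussian falloff on distance and
    normalized row-wise into probabilities."""
    d = np.linalg.norm(cad_points[:, None, :] - scan_points[None, :, :], axis=-1)
    conf = np.exp(-(d ** 2) / (2.0 * sigma ** 2))
    return conf / (conf.sum(axis=1, keepdims=True) + 1e-12)

def hard_correspondence(conf):
    """Pick, for each CAD point, the single most probable scan point."""
    return conf.argmax(axis=1)
```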
As described above, in some instances, the actual location of a seam on a part 114 may differ from the seam location as determined by the controller 108 using sensor imaging (e.g., using scan point clouds) and/or as determined by a CAD model (e.g., using CAD model point clouds). In such cases, a scanning procedure (also sometimes referred to herein as a pre-scan) is useful to correct the determined seam location to more closely or exactly match the actual seam location on the part 114. In the scanning procedure, the sensors 102 that are positioned on the robot 110 (referred to herein as on-board sensors) perform a scan of the seam. In some instances, this scan may be performed using an initial motion and/or path plan generated by the controller 108 using the CAD model, the scan, or a combination thereof. For example, the sensors 102 may scan any or all areas of the workspace 101. During the performance of this initial motion and/or path plan, the sensors 102 may capture observational images and/or data. The observational images and/or data may be processed by the controller 108 to generate seam point cloud data. The controller 108 may use the seam point cloud data when processing the point cloud(s) 604 and/or 606 to correct the seam location. The controller 108 may also use the seam point cloud data in correcting path and motion planning.
In some examples, the registration techniques described above may be useful to compare and match the seams determined using the sensors 102 (i.e., the sensors in addition to the on-board sensors 102) to those identified by the on-board sensors 102. By matching the seams in this manner, the robot 110 (and, more specifically, the head of the robot 110) is positioned relative to the actual seam as desired.
In some examples, the pre-scan trajectory of the robot 110 is identical to that planned for welding along a seam. In other examples, the motion of the robot 110 during the pre-scan may be generated separately so as to limit the probability or curtail the instance of collision, to better visualize the seam or key geometry with the on-board sensor 102, or to scan geometry around the seam in question.
In some examples, the pre-scan technique may include scanning more than a particular seam or seams; it may also include scanning other geometry of the part(s) 114. The scan data may be useful for more accurate application of any or all of the techniques described herein (e.g., registration techniques) to find, locate, or detect a seam and to ensure that the head of the robot 110 will be placed and moved along the seam as desired.
In some examples, the scanning technique (e.g., scanning the actual seam using sensors/cameras mounted on the weld arm/weld head) may be useful to identify gap variability information about the seams rather than position and orientation information about the seams. For example, the scan images captured by sensor(s) 102 on the robot 110 during a scanning procedure may be useful to identify variability in gaps and to adjust the welding trajectory or path plan to account for such gaps. For example, 3D points, 2D image pixels, or a combination thereof may be useful to locate variable gaps between parts 114 to be welded. In some examples, variable gap finding is useful, in which 3D points, 2D image pixels, or a combination thereof are useful to locate, identify, and measure the variable sizes of multiple gaps between parts 114 to be welded together. In tack weld finding or general weld finding, former welds or material deposits in gaps between parts 114 to be welded may be identified using 3D points and/or 2D image pixels. Any or all such techniques may be useful to optimize welding, including path planning. In some instances, the variability in gaps may be identified within the 3D point cloud generated using the images captured by sensors 102. In yet other instances, the variability in gaps may be identified using a scanning technique (e.g., scanning the actual seam using sensors/cameras mounted on the weld arm/weld head) performed while a welding operation is being performed. In any one of these instances, the controller 108 may be configured to adapt the welding instructions dynamically (e.g., welding voltage) based on the determined location and size of the gap. For example, dynamically adjusted welding instructions for the welding robot can result in precise welding of a seam with variable gaps. Adjusting welding instructions may include adjusting one or more of: welder voltage, welder current, duration of an electrical pulse, shape of an electrical pulse, and material feed rate.
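By way of illustration, the following sketch maps a measured gap width to adjusted welding parameters. Every number and parameter name in it is a placeholder assumption, not a recommended welding setting.

```python
def adjust_for_gap(base_params, gap_mm):
    """Sketch of dynamic welding-parameter adjustment: as the measured gap
    grows, lower the voltage slightly and raise the material feed rate.
    All values are illustrative placeholders only."""
    params = dict(base_params)
    if gap_mm > 1.0:                                   # a noticeable gap was measured
        params["voltage"] = base_params["voltage"] - 0.5 * (gap_mm - 1.0)
        params["feed_rate"] = base_params["feed_rate"] * (1.0 + 0.1 * gap_mm)
    return params

# Usage along a seam: one parameter set per sampled gap measurement.
plan = [adjust_for_gap({"voltage": 24.0, "feed_rate": 8.0}, g) for g in (0.5, 1.5, 2.5)]
```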
In examples, the user interface 106 can provide the user with an option to view candidate seams. For example, the user interface 106 may provide a graphical representation of a part 114 and/or candidate seams on a part 114. In addition or alternatively, the user interface 106 may group the candidate seams based on the type of seam. As described above, the controller 108 can identify the type of seam. For instance, candidate seams identified as lap joints can be grouped under a label “lap joints” and can be presented to the user via the user interface 106 under the label “lap joints.” Similarly, candidate seams identified as edge joints can be grouped under a label “edge joints” and can be presented to the user via the user interface 106 under the label “edge joints.”
The user interface 106 can further provide the user with an option to select a candidate seam to be welded by the robot 110. For example, each candidate seam on a part 114 can be presented as a press button on the user interface 106. When the user presses on a specific candidate seam, the selection can be sent to the controller 108. The controller 108 can generate instructions for the robot 110 to perform welding operations on that specific candidate seam.
In some examples, the user can be provided with an option to update welding parameters. For example, the user interface 106 can provide the user with a list of different welding parameters. The user can select a specific parameter to be updated. Changes to the selected parameter can be made using a drop-down menu, via text input, etc. This update can be transmitted to the controller 108 so that the controller 108 can update the instructions for the robot 110.
In examples for which the system 100 is not provided with a priori information (e.g., a CAD model) of the part 114, the sensor(s) 102 can scan the part 114. A representation of the part 114 can be presented to the user via the user interface 106. This representation of the part 114 can be a point cloud and/or a mesh of the point cloud that includes projected 3D data of the scanned image of the part 114 obtained from the sensor(s) 102. The user can annotate seams that are to be welded in the representation via the user interface 106. Alternatively, the controller 108 can identify candidate seams in the representation of the part 114. Candidate seams can be presented to the user via the user interface 106. The user can select seams that are to be welded from the candidate seams. The user interface 106 can annotate the representation based on the user's selection. The annotated representation can be saved in the database 112, in some examples.
After one or more seams on the part(s) 114 have been identified and corrected to the extent possible using the techniques described above (or using other suitable techniques), the controller 108 plans a path for the robot 110 during a subsequent welding process. In some examples, graph-matching and/or graph-search techniques may be useful to plan a path for the robot 110. A particular seam identified as described above may include multiple points, and the path planning technique entails determining a different state of the robot 110 for each such point along a given seam. A state of the robot 110 may include, for example, a position of the robot 110 within the workspace 101 and a specific configuration of the arm of the robot 110 in any number of degrees of freedom that may apply. For instance, for a robot 110 that has an arm having six degrees of freedom, a state for the robot 110 would include not only the location of the robot 110 in the workspace 101 (e.g., the location of the weld head of the robot 110 in three-dimensional, x-y-z space), but it would also include a specific sub-state for each of the robot arm's six degrees of freedom. Furthermore, when the robot 110 transitions from a first state to a second state, it may change its location within the workspace 101, and in such a case, the robot 110 necessarily would traverse a specific path within the workspace 101 (e.g., along a seam being welded). Thus, specifying a series of states of the robot 110 necessarily entails specifying the path along which the robot 110 will travel within the workspace 101. The controller 108 may perform the pre-scan technique or a variation thereof after path planning is complete, and the controller 108 may use the information captured during the pre-scan technique to make any of a variety of suitable adjustments (e.g., adjustment of the X-Y-Z axes or coordinate system used to perform the actual welding along the seam).
In some examples, to determine a path plan for the robot 110 using the graph-search technique (e.g., according to the technique depicted in the diagram 700), the controller 108 may determine the shortest path from a state 704A-704D to a state corresponding to a seam point N (e.g., a state 712A-712D). By assigning a cost to each state and each transition between states, an objective function can be designed by the controller 108. The controller 108 finds the path that results in the least possible cost value for the objective function. Due to the freedom of having multiple starts and endpoints to choose from, graph search methods like Dijkstra's algorithm or A* may be implemented. In some examples, a brute force method may be useful to determine a suitable path plan. The brute force technique would entail the controller 108 computing all possible paths (e.g., through the diagram 700) and choosing the shortest one.
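A minimal Dijkstra-style search over such a layered graph is sketched below. It assumes each waypoint contributes a layer of feasible nodes and that a caller-supplied cost function scores each transition; per-state costs are omitted for brevity, and the node values and cost used in the usage lines are toy placeholders.

```python
import heapq

def cheapest_weld_path(layers, cost):
    """Dijkstra-style search over a layered graph: layers[i] lists the feasible
    nodes at waypoint i, and cost(i, a, b) is the transition cost from node a
    at waypoint i to node b at waypoint i + 1.  Any node of the first layer may
    start the path and any node of the last layer may end it."""
    frontier = [(0.0, 0, node, (node,)) for node in layers[0]]
    heapq.heapify(frontier)
    settled = {}
    while frontier:
        c, i, node, path = heapq.heappop(frontier)
        if i == len(layers) - 1:
            return c, path                      # first goal popped is the cheapest
        if settled.get((i, node), float("inf")) <= c:
            continue
        settled[(i, node)] = c
        for nxt in layers[i + 1]:
            heapq.heappush(frontier, (c + cost(i, node, nxt), i + 1, nxt, path + (nxt,)))
    return None                                 # no feasible path exists

# Toy usage: nodes are weld-head angles; cost is the rotation between them.
layers = [[0, 90, 180], [0, 90], [90, 180]]
total, path = cheapest_weld_path(layers, lambda i, a, b: abs(a - b))
```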
The controller 108 may determine whether the state at each seam point is feasible, meaning at least in part that the controller 108 may determine whether implementing the chain of states along the sequence of seam points of the seam will cause any collisions between the robot 110 and structures in the workspace 101, or even with parts of the robot 110 itself. To this end, the concept of realizing different states at different points of a seam may alternatively be expressed in the context of a seam that has multiple waypoints. First, the controller 108 may discretize an identified seam into a sequence of waypoints. A waypoint may constrain the position of the weld head connected to the robot 120 in three (spatial/translational) degrees of freedom. Typically, constraints on the orientation of the weld head of the robot 120 are provided in one or two rotational degrees of freedom about each waypoint, for the purpose of producing some desired weld of some quality; the constraints are typically relative to the surface normal vectors emanating from the waypoints and the path of the weld seam. For example, the position of the weld head can be constrained in the x-, y-, and z-axes, as well as about one or two rotational axes perpendicular to an axis of the weld wire or tip of the welder, all relative to the waypoint and some nominal coordinate system attached to it. These constraints, in some examples, may be bounds or acceptable ranges for the angles. Those skilled in the art will recognize that the ideal or desired weld angle may vary based on part or seam geometry, the direction of gravity relative to the seam, and other factors. In some examples, the controller 108 may constrain the weld head to 1F or 2F weld positions to ensure that the seam is perpendicular to gravity for one or more reasons (such as to find a balance between welding and path planning for optimization purposes). The position of the weld head can therefore be held (constrained) by each waypoint at any suitable orientation relative to the seam. Typically, the weld head will be unconstrained about a rotational axis (θ) coaxial with an axis of the weld head. For instance, each waypoint can define a position of the weld head of the welding robot 120 such that at each waypoint, the weld head is in a fixed position and orientation relative to the weld seam. In some implementations, the waypoints are discretized finely enough to make the movement of the weld head substantially continuous.
The controller 108 may divide each waypoint into multiple nodes. Each node can represent a possible orientation of the weld head at that waypoint. As a non-limiting example, the weld head can be unconstrained about a rotational axis coaxial with the axis of the weld head such that the weld head can rotate (e.g., 360 degrees) about a rotational axis θ at each waypoint. Each waypoint can be divided into 20 nodes, such that each node of each waypoint represents the weld head at an 18-degree rotation increment. For instance, a first waypoint-node pair can represent rotation of the weld head at 0 degrees, a second waypoint-node pair can represent rotation of the weld head at 18 degrees, a third waypoint-node pair can represent rotation of the weld head at 36 degrees, etc. Each waypoint can be divided into 2, 10, 20, 60, 120, 360, or any suitable number of nodes. The subdivision of nodes can represent the division of orientations in more than one degree of freedom. For example, the orientation of the welder tip about the waypoint can be defined by three angles. A weld path can be defined by linking each waypoint-node pair. Thus, the distance between waypoints and the offset between adjacent waypoint nodes can represent an amount of translation and rotation of the weld head as the weld head moves between node-waypoint pairs.
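The sketch below shows one way to enumerate such waypoint-node pairs and to cost the rotation between adjacent pairs; the 20-node/18-degree discretization follows the example above, and the resulting pairs could feed a search such as the one sketched earlier.

```python
import numpy as np

def discretize_waypoints(num_waypoints, nodes_per_waypoint=20):
    """Divide each waypoint into evenly spaced orientation nodes about the
    unconstrained weld-head axis (20 nodes gives 18-degree increments).
    Returns a list of layers, each a list of (waypoint index, angle) pairs."""
    angles = np.arange(nodes_per_waypoint) * (360.0 / nodes_per_waypoint)
    return [[(w, float(a)) for a in angles] for w in range(num_waypoints)]

def rotation_offset(pair_a, pair_b):
    """Smallest weld-head rotation between two adjacent waypoint-node pairs,
    usable as (part of) a transition cost."""
    delta = abs(pair_a[1] - pair_b[1]) % 360.0
    return min(delta, 360.0 - delta)
```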
The controller 108 can evaluate each waypoint-node pair for feasibility of welding. For instance, consider the non-limiting example of dividing a waypoint into 20 nodes. The controller 108 can evaluate whether the first waypoint-node pair, representing the weld head held at 0 degrees, would be feasible. Put differently, the controller 108 can evaluate whether the robot 110 would collide or interfere with the part, the fixture, or the welding robot itself, if placed at the position and orientation defined by that waypoint-node pair. In a similar manner, the controller 108 can evaluate whether the second waypoint-node pair, third waypoint-node pair, etc., would be feasible. The controller 108 can evaluate each waypoint similarly. In this way, all feasible nodes of all waypoints can be determined.
In some examples, a collision analysis as described herein may be performed by comparing a 3D model of the workspace 101 and a 3D model of the robot 110 to determine whether the two models overlap and, optionally, whether some or all of the triangles of the models overlap. If the two models overlap, the controller 108 may determine that a collision is likely. If the two models do not overlap, the controller 108 may determine that a collision is unlikely. More specifically, in some examples, the controller 108 may compare the models for each of a set of waypoint-node pairs (such as the waypoint-node pairs described above) and determine that the two models overlap for a subset, or even possibly all, of the waypoint-node pairs. For the subset of waypoint-node pairs with respect to which model intersection is identified, the controller 108 may omit the waypoint-node pairs in that subset from the planned path and may identify alternatives to those waypoint-node pairs. The controller 108 may repeat this process as needed until a collision-free path has been planned. The controller 108 may use a flexible collision library (FCL), which includes various techniques for efficient collision detection and proximity computations, as a tool in the collision avoidance analysis. The FCL is useful to perform multiple proximity queries on different model representations, and it may be used to perform probabilistic collision identification between point clouds. Additional or alternative resources may be used in conjunction with or in lieu of the FCL.
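As a library-agnostic stand-in for the mesh overlap test described above (a real implementation would typically rely on FCL's mesh-to-mesh queries), the sketch below performs only a coarse axis-aligned bounding-box screen between the posed robot model and workspace structures.

```python
import numpy as np

def aabb(vertices):
    """Axis-aligned bounding box (min corner, max corner) of a model's vertices."""
    return vertices.min(axis=0), vertices.max(axis=0)

def boxes_overlap(box_a, box_b):
    """True when two axis-aligned boxes intersect along all three axes."""
    (lo_a, hi_a), (lo_b, hi_b) = box_a, box_b
    return bool(np.all(hi_a >= lo_b) and np.all(hi_b >= lo_a))

def likely_collision(robot_vertices_at_pose, obstacle_vertex_sets):
    """Coarse collision screen for one waypoint-node pair: flag the pose when
    the posed robot model's box overlaps any workspace structure's box.  An
    exact triangle-level check (e.g., via FCL) would follow in practice."""
    robot_box = aabb(robot_vertices_at_pose)
    return any(boxes_overlap(robot_box, aabb(obs)) for obs in obstacle_vertex_sets)
```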
The controller 108 can generate and simulate (or evaluate; both terms are used interchangeably herein) one or more weld paths, should they be physically feasible. A weld path can be a path that the welding robot takes to weld the candidate seam. In some examples, the weld path may include all the waypoints of a seam. In some examples, the weld path may include some but not all of the waypoints of the candidate seam. The weld path can include the motion of the robot and the weld head as the weld head moves between each waypoint-node pair. Once a feasible path between node-waypoint pairs is identified, a feasible node-waypoint pair for the next sequential waypoint can be identified, should it exist. Those skilled in the art will recognize that many search trees or other strategies may be employed to evaluate the space of feasible node-waypoint pairs. As discussed in further detail herein, a cost parameter can be assigned or calculated for movement from each node-waypoint pair to a subsequent node-waypoint pair. The cost parameter can be associated with a time to move, an amount of movement (e.g., including rotation) between node-waypoint pairs, and/or a simulated/expected weld quality produced by the weld head during the movement.
In instances in which no nodes are feasible for welding for one or more waypoints and/or no feasible path exists to move between a previous waypoint-node pair and any of the waypoint-node pairs of a particular waypoint, the controller 108 can determine alternative welding parameters such that at least some additional waypoint-node pairs become feasible for welding. For example, if the controller 108 determines that none of the waypoint-node pairs for a first waypoint are feasible, thereby making the first waypoint unweldable, the controller 108 can determine an alternative welding parameter, such as an alternative weld angle, so that at least some waypoint-node pairs for the first waypoint become weldable. For example, the controller 108 can remove or relax the constraints on rotation about the x- and/or y-axis. Similarly stated, the controller 108 can allow the weld angle to vary in one or two additional rotational (angular) dimensions. For example, the controller 108 can divide a waypoint that is unweldable into two- or three-dimensional nodes. Each node can then be evaluated for welding feasibility with the welding robot and weld head in various weld angles and rotational states. The additional rotation about the x- and/or y-axes or other degrees of freedom may make the waypoints accessible to the weld head such that the weld head does not encounter any collision. In some implementations, in instances in which no nodes are feasible for welding for one or more waypoints and/or no feasible path exists to move between a previous waypoint-node pair and any of the waypoint-node pairs of a particular waypoint, the controller 108 can use the degrees of freedom provided by the positioner system in determining feasible paths between the previous waypoint-node pair and any of the waypoint-node pairs of the particular waypoint.
Based on the generated weld paths, the controller 108 can optimize the weld path for welding. (Optimal and optimize, as used herein, do not refer to determining an absolute best weld path, but generally refer to techniques by which weld time can be decreased and/or weld quality improved relative to less efficient weld paths.) For example, the controller 108 can determine a cost function that seeks local and/or global minima for the motion of the robot 110. Typically, the optimal weld path minimizes weld head rotation, as weld head rotation can increase the time to weld a seam and/or decrease weld quality. Accordingly, optimizing the weld path can include determining a weld path through a maximum number of waypoints with a minimum amount of rotation.
In evaluating the feasibility of welding at each of the divided nodes or node-waypoint pairs, the controller 108 may perform multiple computations. In some examples, each of the multiple computations may be mutually exclusive of one another. In some examples, the first computation may include a kinematic feasibility computation, which determines whether the arm of the robot 110 being employed can mechanically reach (or exist) at the state defined by the node or node-waypoint pair. In some examples, in addition to the first computation, a second computation, which may be mutually exclusive of the first computation, may also be performed by the controller 108. The second computation may include determining whether the arm of the robot 110 will encounter a collision (e.g., collide with the workspace 101 or a structure in the workspace 101) when accessing the portion of the seam (e.g., the node or node-waypoint pair in question).
The controller 108 may perform the first computation before performing the second computation. In some examples, the second computation may be performed only if the result of the first computation is positive (e.g., if it is determined that the arm of the robot 110 can mechanically reach (or exist) at the state defined by the node or node-waypoint pair). In some examples, the second computation may not be performed if the result of the first computation is negative (e.g., if it is determined that the arm of the robot 110 cannot mechanically reach (or exist) at the state defined by the node or node-waypoint pair).
The kinematic feasibility may correlate with the type of robotic arm employed. For the purposes of this description, it is assumed that the welding robot 110 includes a six-axis robotic welding arm with a spherical wrist. The six-axis robotic arm can have six degrees of freedom: three degrees of freedom in X-, Y-, and Z-Cartesian coordinates and three additional degrees of freedom because of the wrist-like nature of the robot 110. For example, the wrist-like nature of the robot 110 results in a fourth degree of freedom in a wrist-up/-down manner (e.g., the wrist moving in the +y and −y directions), a fifth degree of freedom in a wrist-side manner (e.g., the wrist moving in the −x and +x directions), and a sixth degree of freedom in rotation. In some examples, the welding torch is attached to the wrist portion of the robot 110.
To determine whether the arm of the robot 110 being employed can mechanically reach (or exist) at the state defined by the node or node-waypoint pair (i.e., to perform the first computation), the robot 110 may be mathematically modeled, for example, as the model 800. From such a model, the controller 108 may first solve for the first three joint variables (i.e., S, L, U).
After the first three joint variables (i.e., S, L, U) are computed successfully, the controller 108 may then solve for the last three joint variables (i.e., R, B, and T at 808, 810, and 812, respectively) by, for example, considering the wrist orientation as a Z-Y-Z Euler angle. The controller 108 may consider some offsets in the robot 110. These offsets may need to be considered and accounted for because of inconsistencies in the unified robot description format (URDF) file. For example, in some examples, values (e.g., a joint's X axis) of the position of a joint (e.g., an actual joint of the robot 110) may not be consistent with the value noted in its URDF file. Such offset values may be provided to the controller 108 in a table. The controller 108, in some examples, may consider these offset values while mathematically modeling the robot 110. In some examples, after the robot 110 is mathematically modeled, the controller 108 may determine whether the arm of the robot 110 can mechanically reach (or exist) at the states defined by the node or node-waypoint pair.
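For the wrist joints, the Z-Y-Z Euler treatment mentioned above amounts to decomposing the residual wrist rotation matrix into three angles. A standard decomposition is sketched below; the singular-wrist handling is simplified, and the matrix convention (R = Rz(alpha) Ry(beta) Rz(gamma)) is an assumption of this example.

```python
import numpy as np

def zyz_euler_from_rotation(R, eps=1e-9):
    """Decompose a 3x3 wrist rotation matrix into Z-Y-Z Euler angles
    (alpha, beta, gamma) with R = Rz(alpha) @ Ry(beta) @ Rz(gamma), the form
    used to recover the last three joint angles of a spherical wrist."""
    beta = np.arctan2(np.hypot(R[0, 2], R[1, 2]), R[2, 2])
    if np.sin(beta) < eps:                      # wrist singularity: only alpha + gamma defined
        alpha = 0.0
        gamma = (np.arctan2(-R[0, 1], R[0, 0]) if R[2, 2] > 0
                 else np.arctan2(R[0, 1], -R[0, 0]))
    else:
        alpha = np.arctan2(R[1, 2], R[0, 2])
        gamma = np.arctan2(R[2, 1], -R[2, 0])
    return alpha, beta, gamma
```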
As noted above, the controller 108 can evaluate whether the robot 110 would collide or interfere with the part 114, the fixture 116, or anything else in the workspace 101, including the robot 110 itself, if placed at the position and orientation defined by that waypoint-node pair. Once the controller 108 determines the states in which the robotic arm can exist, the controller 108 may perform the foregoing evaluation (e.g., regarding whether the robot would collide with something in its environment) using the second computation.
At step 904, the method 900 includes identifying a set of points on the part to be welded based on the sensor data, which may be images. The set of points can represent the possibility of a seam that is to be welded. In some examples, a neural network can perform pixel-wise segmentation on the image data to identify the set of points. Fixtures and clamps in the image data can be classified by the neural network based on image classification. The portions of the image data associated with the fixtures and/or the clamps can be segmented out such that those portions of the image data are not used to identify the set of points, which can reduce the computational resources required to identify the set of points to be welded by decreasing the search space. In such examples, the set of points can be identified from other portions of the image data (e.g., portions of the image data that are not segmented out).
At step 906, the method 900 includes identifying a candidate seam from the set of points. For example, a subset of points within the set of points can be identified as a candidate seam. A neural network can perform image classification and/or depth classification to identify the candidate seam. In some examples, the candidate seam can be localized relative to the part. For example, a position and an orientation for the candidate seam can be determined relative to the part in order to localize the candidate seam.
Additionally, method 900 further includes verifying whether the candidate seam is an actual seam. As discussed above, the sensor(s) can collect image data from multiple angles. For each image captured from a different angle, a confidence value that represents whether the candidate seam determined from that angle is an actual seam can be determined. When the confidence value is above a threshold based on views taken from multiple angles, the candidate seam can be verified as an actual seam. In some embodiments, the method 900 also includes classifying the candidate seam as a type of seam. For example, a neural network can determine if the candidate seam is a butt joint, a corner joint, an edge joint, a lap joint, a tee joint, and/or the like.
In some examples, after the candidate seam has been identified and verified, the subset of points can be clustered together to form a contiguous and continuous seam. At step 908, the method 900 includes generating welding instructions for a welding robot based on the candidate seam. For example, the welding instructions can be generated by tracing a path from one end of the subset of points to the other end of the subset of points. This can generate a path for the seam. Put differently, the weld can be made by tracing this path with the welding head. Additionally, path planning can be performed based on the identified and localized candidate seam. For example, path planning can be performed based on the path for the seam that can be generated from clustering the subset of points.
In some examples, the welding instructions can be based on the type of seam (e.g., butt joint, corner joint, edge joint, lap joint, tee joint, and/or the like). In some examples, the welding instructions can be updated based on input from a user via a user interface (e.g., the user interface 106).
In this manner, welding robots can be operated and controlled by implementing method 900 without a priori information (e.g., a CAD model) of the parts to be welded. Since the parts are scanned in order to generate welding instructions, a representation of the scanned image of the part can be annotated with one or more candidate seams (e.g., via a user interface). The annotated representation can be used to define a 3D model of the part. The 3D model of the part can be saved in a database for subsequent welding of additional instances of the part.
At step 1004, the method 1000 includes obtaining image data of a workspace (e.g., workspace 101 in
In some examples, in order to reduce the processing time to generate welding instructions, the sensors are configured to perform a partial scan. Put differently, instead of scanning the workspace from every angle, the image data is collected from a few angles (e.g., angles from which a candidate seam is expected to be visible). In such examples, the point cloud generated from the image data is a partial point cloud. Generating a partial point cloud that, for example, does not include portions of the part that the model indicates do not contain seams to be welded, can reduce scanning and/or processing time.
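As an illustration of the partial-scan idea, the sketch below discards point-cloud regions that the model indicates contain no seams. The margin value, the brute-force distance test, and the function name are assumptions made for clarity rather than the disclosed implementation.

```python
import numpy as np

def crop_to_seam_regions(cloud: np.ndarray, expected_seams, margin=0.05) -> np.ndarray:
    """Keep only points near where the model says seams should be.

    `cloud` is an N x 3 point cloud from a partial scan; `expected_seams`
    is a list of M x 3 arrays of modeled seam points. Points farther than
    `margin` (in the cloud's length units) from every modeled seam point
    are discarded, mirroring the idea of not processing regions of the
    part that contain no seams to be welded.
    """
    keep = np.zeros(len(cloud), dtype=bool)
    for seam in expected_seams:
        # Distance from every cloud point to every modeled seam point.
        d = np.linalg.norm(cloud[:, None, :] - seam[None, :, :], axis=2)
        keep |= d.min(axis=1) <= margin
    return cloud[keep]
```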
At step 1006, the method 1000 includes identifying the candidate seam based on the image data, the point cloud, and/or the partial point cloud. For example, the controller 108 (
At step 1008, the method 1000 includes identifying the actual position and the actual orientation of the candidate seam. For example, at step 1002 a first subset of points can be identified as a modeled seam. At step 1006, a second subset of points can be identified as the candidate seam. In some examples, the first subset of points and the second subset of points can be compared (e.g., using the registration techniques described above with respect to
At step 1010, the method 1000 includes generating welding instructions for the welding robot based on the actual position and the actual orientation of the candidate seam. For example, the path planning can be performed based on the actual position and the actual orientation of the candidate seam.
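One concrete, hypothetical way to carry out steps 1008 and 1010 is to register the modeled seam points to the observed candidate-seam points with a rigid transform and then map the planned waypoints through that transform. The Kabsch/SVD solution below assumes corresponding point sets; the disclosure describes the registration only at a high level, so this is a sketch, not the disclosed algorithm.

```python
import numpy as np

def rigid_align(modeled: np.ndarray, observed: np.ndarray):
    """Estimate the rigid transform taking modeled seam points to observed ones.

    `modeled` and `observed` are corresponding N x 3 point sets (e.g., the
    first and second subsets of points from steps 1002 and 1006). The
    rotation R and translation t come from the standard Kabsch/SVD
    solution for rigid registration.
    """
    mc, oc = modeled.mean(axis=0), observed.mean(axis=0)
    h = (modeled - mc).T @ (observed - oc)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))     # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = oc - r @ mc
    return r, t

def adjust_waypoints(waypoints: np.ndarray, r: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map waypoints planned on the modeled seam onto the actual seam pose."""
    return waypoints @ r.T + t
```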
Like method 900 in
Additionally or alternatively to the steps described above with respect to the method 1000, the welding robots (e.g., robot 110 in
The terms “position” and “orientation” are set out as separate terms in the disclosure above. However, the term “position,” when used in the context of a part, means “a particular way in which a part is placed or arranged.” The term “position,” when used in the context of a seam, means “a particular way in which a seam on the part is positioned or oriented.” As such, the position of the part/seam may inherently account for the orientation of the part/seam, and “position” can include “orientation.” For example, position can include the relative physical position or direction (e.g., angle) of a part or candidate seam.
Unless otherwise stated, “about,” “approximately,” or “substantially” preceding a value means +/−10 percent of the stated value. Unless otherwise stated, two objects described as being “parallel” are side by side and have a distance between them that is constant or varies by no more than 10 percent. Unless otherwise stated, two objects described as being “perpendicular” intersect at an angle ranging from 80 degrees to 100 degrees. Modifications are possible in the described examples, and other examples are possible within the scope of the claims.
This application is a continuation of U.S. patent application Ser. No. 17/902,748, filed Sep. 2, 2022, which is a continuation of U.S. patent application Ser. No. 17/680,027, filed Feb. 24, 2022, now U.S. Pat. No. 11,548,162, which claims the benefit of U.S. Provisional Patent Application No. 63/153,109, filed Feb. 24, 2021, and U.S. Provisional Patent Application No. 63/282,827, filed Nov. 24, 2021, the entire contents of each of which are incorporated herein by reference for all purposes.