The present invention relates to robotic calibration and control system tuning, and more particularly, to a system and method for use of robotic assembly systems involving a moving robot base and moving assembly base.
A variety of operations can be performed during the final trim and assembly (FTA) stage of automotive assembly, including, for example, door assembly, cockpit assembly, and seat assembly, among other types of assemblies. Yet, for a variety of reasons, only a relatively small number of FTA tasks are typically automated. For example, often during the FTA stage, while an operator is performing an FTA operation, the vehicle(s) undergoing FTA is/are being transported on a line(s) that is/are moving the vehicle(s) in a relatively continuous manner. Yet such continuous motions of the vehicle(s) can cause or create certain irregularities with respect to at least the movement and/or position of the vehicle(s), and/or the portions of the vehicle(s) that are involved in the FTA. Moreover, such motion can cause the vehicle to be subjected to movement irregularities, vibrations, and balancing issues during FTA, which can prevent, or be adverse to, the ability to accurately model or predict the location of a particular part, portion, or area of the vehicle directly involved in the FTA. Further, such movement irregularities can prevent the FTA from having a consistent degree of repeatability in terms of the movement and/or positioning of each vehicle, or its associated component, as each subsequent vehicle and/or component passes along the same area of the assembly line. Accordingly, such variances and concerns regarding repeatability can often preclude the use of traditional teach and repeat position based robot motion control in FTA operations.
Accordingly, although various robot control systems are available currently in the marketplace, further improvements are possible to provide a system and means to calibrate and tune the robot control system to accommodate such movement irregularities.
A robotic assembly operation is described for assembling parts together. During setup of the assembly operation, control parameters and a control scheme are set and changed by simulating the operation and testing whether performance requirements are met. A dry run may be performed and test data collected after running the simulation to determine if the performance requirements are satisfied during the dry run. During production, production data may also be collected and control parameters may be tuned when changes occur during production in order to maintain stable assembly. These and other aspects of the present invention will be better understood in view of the drawings and following detailed description.
The description herein makes reference to the accompanying figures wherein like reference numerals refer to like parts throughout the several views.
The foregoing summary, as well as the following detailed description of certain embodiments of the present application, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the application, there is shown in the drawings, certain embodiments. It should be understood, however, that the present application is not limited to the arrangements and instrumentalities shown in the attached drawings. Further, like numbers in the respective figures indicate like or comparable parts.
Certain terminology is used in the following description for convenience and is not intended to be limiting. Words such as “upper,” “lower,” “top,” “bottom,” “first,” and “second” designate directions in the drawings to which reference is made. This terminology includes the words specifically noted above, derivatives thereof, and words of similar import. Additionally, the words “a” and “one” are defined as including one or more of the referenced item unless specifically noted. The phrase “at least one of” followed by a list of two or more items, such as “A, B or C,” means any individual one of A, B or C, as well as any combination thereof.
According to certain embodiments, the robot station 102 includes one or more robots 106 having one or more degrees of freedom. For example, according to certain embodiments, the robot 106 can have six degrees of freedom. According to certain embodiments, an end effector 108 can be coupled or mounted to the robot 106. The end effector 108 can be a tool, part, and/or component that is mounted to a wrist or arm 110 of the robot 106. Further, at least portions of the wrist or arm 110 and/or the end effector 108 can be moveable relative to other portions of the robot 106 via operation of the robot 106 and/or the end effector 108, such as, for example, by an operator of the management system 104 and/or by programming that is executed to operate the robot 106.
The robot 106 can be operative to position and/or orient the end effector 108 at locations within the reach of a work envelope or workspace of the robot 106, which can accommodate the robot 106 in utilizing the end effector 108 to perform work, including, for example, grasp and hold one or more components, parts, packages, apparatuses, assemblies, or products, among other items (collectively referred to herein as “components”). A variety of different types of end effectors 108 can be utilized by the robot 106, including, for example, a tool that can grab, grasp, or otherwise selectively hold and release a component that is utilized in a final trim and assembly (FTA) operation during assembly of a vehicle, among other types of operations.
The robot 106 can include, or be electrically coupled to, one or more robotic controllers 112. For example, according to certain embodiments, the robot 106 can include and/or be electrically coupled to one or more controllers 112 that may, or may not, be discrete processing units, such as, for example, a single controller or any number of controllers. The controller 112 can be configured to provide a variety of functions, including, for example, being utilized in the selective delivery of electrical power to the robot 106, controlling the movement and/or operations of the robot 106, and/or controlling the operation of other equipment that is mounted to the robot 106, including, for example, the end effector 108, and/or the operation of equipment not mounted to the robot 106 but which is integral to the operation of the robot 106 and/or to equipment that is associated with the operation and/or movement of the robot 106. Moreover, according to certain embodiments, the controller 112 can be configured to dynamically control the movement of both the robot 106 itself, as well as the movement of other devices to which the robot 106 is mounted or coupled, including, for example, among other devices, movement of the robot 106 along, or, alternatively, by, a track 130 or mobile platform such as the AGV to which the robot 106 is mounted via a robot base 142, as shown in
The controller 112 can take a variety of different forms, and can be configured to execute program instructions to perform tasks associated with operating the robot 106, including to operate the robot 106 to perform various functions, such as, for example, but not limited to, the tasks described herein, among other tasks. In one form, the controller(s) 112 is/are microprocessor based and the program instructions are in the form of software stored in one or more memories. Alternatively, one or more of the controllers 112 and the program instructions executed thereby can be in the form of any combination of software, firmware and hardware, including state machines, and can reflect the output of discrete devices and/or integrated circuits, which may be co-located at a particular location or distributed across more than one location, including any digital and/or analog devices configured to achieve the same or similar results as a processor-based controller executing software or firmware based instructions. Operations, instructions, and/or commands determined and/or transmitted from the controller 112 can be based on one or more models stored in non-transient computer readable media in a controller 112, other computer, and/or memory that is accessible or in electrical communication with the controller 112.
According to the illustrated embodiment, the controller 112 includes a data interface that can accept motion commands and provide actual motion data. For example, according to certain embodiments, the controller 112 can be communicatively coupled to a pendant, such as, for example, a teach pendant, that can be used to control at least certain operations of the robot 106 and/or the end effector 108.
The robot station 102 and/or the robot 106 can also include one or more sensors 132. The sensors 132 can include a variety of different types of sensors and/or combinations of different types of sensors, including, but not limited to, a vision system 114, force sensors 134, motion sensors, acceleration sensors, and/or depth sensors, among other types of sensors. Further, information provided by at least some of these sensors 132 can be integrated, including, for example, via use of algorithms, such that operations and/or movement, among other tasks, by the robot 106 can at least be guided via sensor fusion. Thus, as shown by at least
According to the illustrated embodiment, the vision system 114 can comprise one or more vision devices 114a that can be used in connection with observing at least portions of the robot station 102, including, but not limited to, observing parts, components, and/or vehicles, among other devices or components that can be positioned in, or are moving through or by at least a portion of, the robot station 102. For example, according to certain embodiments, the vision system 114 can extract information for various types of visual features that are positioned or placed in the robot station 102, such as, for example, on a vehicle and/or on an automated guided vehicle (AGV) that is moving the vehicle through the robot station 102, among other locations, and use such information, among other information, to at least assist in guiding the movement of the robot 106, movement of the robot 106 along a track 130 or mobile platform such as the AGV (
According to certain embodiments, the vision system 114 can have data processing capabilities that can process data or information obtained from the vision devices 114a that can be communicated to the controller 112. Alternatively, according to certain embodiments, the vision system 114 may not have data processing capabilities. Instead, according to certain embodiments, the vision system 114 can be electrically coupled to a computational member 116 of the robot station 102 that is adapted to process data or information outputted from the vision system 114. Additionally, according to certain embodiments, the vision system 114 can be operably coupled to a communication network or link 118, such that information outputted by the vision system 114 can be processed by a controller 120 and/or a computational member 124 of a management system 104, as discussed below.
Examples of vision devices 114a of the vision system 114 can include, but are not limited to, one or more imaging capturing devices, such as, for example, one or more two-dimensional, three-dimensional, and/or RGB cameras that can be mounted within the robot station 102, including, for example, mounted generally above the working area of the robot 106, mounted to the robot 106, and/or on the end effector 108 of the robot 106, among other locations. Further, according to certain embodiments, the vision system 114 can be a position based or image based vision system. Additionally, according to certain embodiments, the vision system 114 can utilize kinematic control or dynamic control.
According to the illustrated embodiment, in addition to the vision system 114, the sensors 132 also include one or more force sensors 134. The force sensors 134 can, for example, be configured to sense contact force(s) during the assembly process, such as, for example, a contact force between the robot 106, the end effector 108, and/or a component being held by the robot 106 with the vehicle 136 and/or other component or structure within the robot station 102. Such information from the force sensor(s) 134 can be combined or integrated with information provided by the vision system 114 such that movement of the robot 106 during assembly of the vehicle 136 is guided at least in part by sensor fusion.
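The sensor-fusion guidance described above can be illustrated with a minimal sketch in which per-sensor position estimates are combined as a confidence-weighted average. The function name, the estimate values, and the weights below are hypothetical and not taken from the specification; they only demonstrate the general idea of weighting one sensor's reading more heavily than another's.

```python
# Illustrative sketch of weighted sensor fusion: position estimates (x, y, z)
# from, e.g., a vision system and a force-derived contact model are combined
# using confidence weights. All names and values are hypothetical.

def fuse_estimates(estimates, weights):
    """Return the weighted average of per-sensor (x, y, z) estimates."""
    if len(estimates) != len(weights) or not estimates:
        raise ValueError("need one weight per estimate")
    total = sum(weights)
    if total <= 0:
        raise ValueError("weights must sum to a positive value")
    fused = [0.0, 0.0, 0.0]
    for est, w in zip(estimates, weights):
        for i in range(3):
            fused[i] += w * est[i]
    return [c / total for c in fused]

# Example: the vision estimate is trusted three times as much as the
# force-derived estimate.
fused = fuse_estimates([(1.00, 2.00, 0.50), (1.02, 1.98, 0.50)], [3.0, 1.0])
```

In practice the weights would reflect each sensor's accuracy, noise, and delay characteristics rather than fixed constants.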
According to the exemplary embodiment depicted in
According to certain embodiments, the management system 104 can include any type of computing device having a controller 120, such as, for example, a laptop, desktop computer, personal computer, programmable logic controller (PLC), or a mobile electronic device, among other computing devices, that includes a memory and a processor sufficient in size and operation to store and manipulate a database 122 and one or more applications for at least communicating with the robot station 102 via the communication network or link 118. In certain embodiments, the management system 104 can include a connecting device that may communicate with the communication network or link 118 and/or robot station 102 via an Ethernet WAN/LAN connection, among other types of connections. In certain other embodiments, the management system 104 can include a web server, or web portal, and can use the communication network or link 118 to communicate with the robot station 102 and/or the supplemental database system(s) 105 via the internet.
The management system 104 can be located at a variety of locations relative to the robot station 102. For example, the management system 104 can be in the same area as the robot station 102, the same room, a neighboring room, same building, same plant location, or, alternatively, at a remote location, relative to the robot station 102. Similarly, the supplemental database system(s) 105, if any, can also be located at a variety of locations relative to the robot station 102 and/or relative to the management system 104. Thus, the communication network or link 118 can be structured, at least in part, based on the physical distances, if any, between the locations of the robot station 102, management system 104, and/or supplemental database system(s) 105. According to the illustrated embodiment, the communication network or link 118 comprises one or more communication links 118 (Comm link1-N in
The communication network or link 118 can be structured in a variety of different manners. For example, the communication network or link 118 between the robot station 102, management system 104, and/or supplemental database system(s) 105 can be realized through the use of one or more of a variety of different types of communication technologies, including, but not limited to, via the use of fiber-optic, radio, cable, or wireless based technologies on similar or different types and layers of data protocols. For example, according to certain embodiments, the communication network or link 118 can utilize an Ethernet installation(s) with wireless local area network (WLAN), local area network (LAN), cellular data network, Bluetooth, ZigBee, point-to-point radio systems, laser-optical systems, and/or satellite communication links, among other wireless industrial links or communication protocols.
The database 122 of the management system 104 and/or one or more databases 128 of the supplemental database system(s) 105 can include a variety of information that may be used in the identification of elements within the robot station 102 in which the robot 106 is operating. For example, as discussed below in more detail, one or more of the databases 122, 128 can include or store information that is used in the detection, interpretation, and/or deciphering of images or other information detected by a vision system 114, such as, for example, features used in connection with the calibration of the sensors 132. Additionally, or alternatively, such databases 122, 128 can include information pertaining to the one or more sensors 132, including, for example, information pertaining to forces, or a range of forces, that are expected to be detected via use of the one or more force sensors 134 at one or more different locations in the robot station 102 and/or along the vehicle 136 at least as work is performed by the robot 106. Additionally, information in the databases 122, 128 can also include information used to at least initially calibrate the one or more sensors 132, including, for example, first calibration parameters associated with first calibration features and second calibration parameters that are associated with second calibration features.
The database 122 of the management system 104 and/or one or more databases 128 of the supplemental database system(s) 105 can also include information that can assist in discerning other features within the robot station 102. For example, images that are captured by the one or more vision devices 114a of the vision system 114 can be used in identifying, via use of information from the database 122, FTA components within the robot station 102, including FTA components that are within a picking bin, among other components, that may be used by the robot 106 in performing FTA.
Additionally, while the example depicted in
The inventions described herein involve a method and system to choose a control scheme with initial control parameters for an assembly process during setup according to application requirements (e.g., with performance requirements). It usually takes a long time for engineers to tune control parameters with a selected control scheme based on empirical experience. The described inventions provide an automatic approach to arrive at a stable assembly system with a tolerant control parameter range during setup. With the initial inputs (e.g., control parameters), the system can select the best fit control scheme with suitable control parameters through simulation and dry run data analysis during the setup based on performance requirements. For multiple stages of an assembly task, the system can determine a control scheme with initial control parameter sets for each stage that are defined by way points. The combined control schemes for the whole assembly process may be tested during a dry run.
During setup, the control scheme with initial control parameters can be automatically determined based on the performance requirements. The control parameter inputs may include sensor characteristics (sampling rate, accuracy, noise, delay, etc.), robot information (robot model, tool payload, etc.) and application task specifications (path, way points, etc.). The performance requirements may include target moving speed, acceleration, assembly tolerance, cycle time, etc. Through simulation and dry run data analysis, a proper control scheme (e.g., 2D vs. 3D vision based, image based, using Kalman filter, etc.) and initial control parameters can be selected to minimize or compensate the computation, communication or/and robot response delays for fast and robust assembly. For each stage of the assembly process, a control scheme with suitable control parameter sets can be determined based on the stage specifications. The system can also change the inputs (delay time, robot motion, etc.) based on the dry run.
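The scheme-selection step described above can be sketched as a loop over candidate control schemes in which each candidate is simulated and the first one whose predicted performance meets the requirements is chosen. The scheme names, the stand-in simulation table, and the thresholds below are illustrative assumptions only, not the specification's actual schemes or values.

```python
# Hypothetical sketch of setup-time control-scheme selection: simulate each
# candidate scheme and pick the first whose predicted metrics satisfy the
# performance requirements. All names, metrics, and numbers are illustrative.

def simulate(scheme, params):
    """Stand-in for a real simulation; returns predicted performance metrics."""
    table = {
        "2d_vision":        {"cycle_time": 12.0, "tracking_error": 0.8},
        "3d_vision_kalman": {"cycle_time": 9.5,  "tracking_error": 0.3},
    }
    return table[scheme]

def meets_requirements(metrics, requirements):
    return (metrics["cycle_time"] <= requirements["max_cycle_time"]
            and metrics["tracking_error"] <= requirements["max_tracking_error"])

def select_scheme(candidates, params, requirements):
    for scheme in candidates:
        if meets_requirements(simulate(scheme, params), requirements):
            return scheme
    return None  # no candidate satisfies the requirements; revise the inputs

requirements = {"max_cycle_time": 10.0, "max_tracking_error": 0.5}
chosen = select_scheme(["2d_vision", "3d_vision_kalman"], {}, requirements)
```

A real implementation would simulate with the sensor characteristics, robot model, and task specifications named above as inputs, and could rank all passing candidates rather than taking the first.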
The inventions described herein also involve a method and system to dynamically tune control parameters with a selected control scheme during production runs of the robotic assembly operation. During a production run, the system calibration can drift over time or a minor collision may occur. In such a situation where something changes in the system, it is important to dynamically tune the control parameters accordingly to adjust and compensate for the changes to maintain a stable assembly operation. With the initial system from the setup, production data may continue to be collected over time and compared with historic data from each production run. If anything changes (e.g., contact force increase, tracking error increase, settling time increase, etc.), the system can automatically tune the control parameters during production to compensate for the change and adjust assembly performance.
During production runs of a robotic assembly operation, the system may continuously collect and store production data over time. The production data may include robot speed, acceleration, position, tracking error, delay, contact force, settling time, etc. After each production run, the system can compare the collected production data with historic production data to identify if anything has changed in the system. The system can then tune the control parameters dynamically during the production runs instead of stopping the assembly and recalibrating the system in a new setup. This feature is especially useful when something changes during production (e.g., the system drifts over time or a minor collision happens). Thus, time may be saved by not interrupting the assembly line while still maintaining stable assembly.
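The comparison of collected production data against historic data described above can be sketched as a drift check: metrics whose relative change exceeds a tolerance are flagged so that the corresponding control parameters can be retuned without stopping the line. The metric names, values, and tolerance below are illustrative assumptions, not values from the specification.

```python
# Hypothetical sketch of production-run drift detection: compare current
# production metrics against historic baselines and flag any metric whose
# relative change exceeds a tolerance. All names and numbers are illustrative.

def detect_changes(historic, current, rel_tolerance=0.10):
    """Return a dict of metric name -> relative drift for metrics that drifted."""
    changed = {}
    for name, baseline in historic.items():
        value = current.get(name)
        if value is None or baseline == 0:
            continue  # no comparison possible for this metric
        drift = abs(value - baseline) / abs(baseline)
        if drift > rel_tolerance:
            changed[name] = drift
    return changed

# Example: contact force rose 30% (e.g., after a minor collision), while
# tracking error and settling time stayed within tolerance.
historic = {"contact_force": 10.0, "tracking_error": 0.30, "settling_time": 0.50}
current  = {"contact_force": 13.0, "tracking_error": 0.31, "settling_time": 0.50}
changed = detect_changes(historic, current)
```

The flagged metrics would then drive the dynamic parameter tuning described above, rather than triggering a full stop-and-recalibrate cycle.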
As illustrated in
The method may also include the steps of operating the robotic assembly operation in a dry run if the control scheme satisfies the performance requirement (180), collecting test data on the dry run (182), and determining if the test data satisfies the performance requirement (184). The one or more control parameters may also be changed and steps (170)-(184) may also be repeated if the test data does not satisfy the performance requirement (186).
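The iterative setup flow in the steps above (simulate, then dry-run, changing parameters and repeating whenever either check fails) can be sketched as a simple loop. The evaluation callables and the single "gain" parameter below are stand-ins for real simulation and dry-run data analysis, introduced only for illustration.

```python
# Hypothetical sketch of the setup iteration: keep adjusting the control
# parameters until both the simulation check and the dry-run check pass,
# up to an iteration budget. The checks and parameter are illustrative.

def setup_loop(params, simulate_ok, dry_run_ok, adjust, max_iterations=10):
    for _ in range(max_iterations):
        if not simulate_ok(params):
            params = adjust(params)   # simulation failed: change parameters
            continue
        if dry_run_ok(params):        # simulation passed: perform the dry run
            return params             # performance requirements satisfied
        params = adjust(params)       # dry run failed: change parameters
    return None  # requirements not met within the iteration budget

# Toy example: a single "gain" parameter must reach 3 to pass simulation
# and 5 to pass the dry run.
result = setup_loop(
    {"gain": 1},
    simulate_ok=lambda p: p["gain"] >= 3,
    dry_run_ok=lambda p: p["gain"] >= 5,
    adjust=lambda p: {"gain": p["gain"] + 1},
)
```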
The method may also include the steps of operating the robotic assembly operation in a production run if the control scheme satisfies the performance requirement (188), collecting production data on the production run (190), and analyzing the production data and determining if the production data satisfies the performance requirement (192). Additionally, at step 192, machine learning or deep learning can be used to detect or predict any anomalies of the production performance change. Steps (188)-(192) may also be repeated without changing the one or more control parameters if the production data satisfies the performance requirement (194). The one or more control parameters, such as the weights applied to the fusion of multi-camera inputs 154A, force sensor input 152A, IMU sensor input 156A, etc., may also be changed and steps (188)-(192) may also be repeated if the production data does not satisfy the performance requirement (196). The production data may include robot 106 speed, acceleration, position, tracking error, contact force or settling time. The method may also include the steps of collecting production data on two iterations of the production run (190), and determining if the production data varies between the two production runs (192). The production run may be repeated without changing the one or more control parameters if the production data varies by less than a threshold (194). The one or more control parameters may also be changed and the production run repeated if the production data varies by more than the threshold (196). Additionally, at step 192, machine learning or deep learning can be used to detect or predict an anomaly of sensor inputs, such as one of the vision sensors 154A being out of calibration due to a minor collision.
As a result, the weighting applied to the out of calibration sensor 154A mounted on the robot tool can be reduced, and the weighting used for another vision sensor 154A mounted on the robot base and force sensor 152A can be increased. By tuning these parameters, the production performance can be recovered to a normal range. If desired, the out of calibration vision sensor can then be recalibrated during a break between shifts of production. The method illustrated in
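The weight re-tuning described above (reducing the out-of-calibration sensor's weight while increasing the weights of the remaining sensors) can be sketched as follows. The sensor names, the starting weights, and the reduction factor are hypothetical; the sketch simply keeps the total weight constant while shifting trust away from the suspect sensor.

```python
# Illustrative sketch of fusion-weight re-tuning: cut the suspect sensor's
# weight by a factor and scale the remaining sensors' weights up so the
# total weight is unchanged. All names and values are hypothetical.

def retune_weights(weights, suspect, reduce_factor=0.25):
    """Reduce the suspect sensor's weight and renormalize the others."""
    if suspect not in weights:
        raise KeyError(suspect)
    new = dict(weights)
    new[suspect] = weights[suspect] * reduce_factor
    total = sum(weights.values())
    others_old = total - weights[suspect]   # combined weight of other sensors
    others_new = total - new[suspect]       # what they must now sum to
    scale = others_new / others_old
    for name in new:
        if name != suspect:
            new[name] = weights[name] * scale
    return new

# Example: the tool-mounted camera is flagged as out of calibration, so its
# weight drops and the base-mounted camera and force sensor pick up the slack.
weights = {"tool_camera": 0.5, "base_camera": 0.3, "force_sensor": 0.2}
tuned = retune_weights(weights, "tool_camera")
```

After the suspect sensor is recalibrated (for example during a break between production shifts), its original weight could be restored.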
The robotic assembly operation may also include an assembly base 138 with a first part 136 thereon and a robotic base 142 with a second part 164 thereon, and the control system may assemble the second part 164 with the first part 136 while the assembly base 138 and the robotic base 142 are both moving.
While the invention has been described in connection with what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not to be limited to the disclosed embodiment(s), but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as permitted under the law. Furthermore, it should be understood that while the use of the word preferable, preferably, or preferred in the description above indicates that a feature so described may be more desirable, it nonetheless may not be necessary, and any embodiment lacking the same may be contemplated as within the scope of the invention, that scope being defined by the claims that follow. In reading the claims it is intended that when words such as “a,” “an,” “at least one” and “at least a portion” are used, there is no intention to limit the claim to only one item unless specifically stated to the contrary in the claim. Further, when the language “at least a portion” and/or “a portion” is used the item may include a portion and/or the entire item unless specifically stated to the contrary.