The disclosed technology relates in general to industrial manufacturing and fabricating systems, devices, and processes and more specifically to a system for programming manufacturing equipment remotely, also referred to as a “tele-manufacturing” or “tele-programming” system.
Industrial welding is currently challenged by a variety of factors including a decreasing number of skilled users; a lack of individuals wanting to enter what are traditionally considered to be “manual” trades; and an ever-increasing list of hazards and limitations related to welding, grinding, and other hot work activities that make finding and keeping experienced users difficult. Additionally, efforts within industry to optimize weight and space in manufacturing processes have resulted in the construction of manufacturing facilities that are essentially inaccessible to humans. Accordingly, there is an ongoing need for welding-related systems, processes, and methods that permit qualified users to enter and remain in the workforce regardless of physical limitations, age, or other perceived obstacles such as those described above.
The following provides a summary of certain example implementations of the disclosed technology. This summary is not an extensive overview and is not intended to identify key or critical aspects or elements of the disclosed technology or to delineate its scope. However, it is to be understood that the use of indefinite articles in the language used to describe and claim the disclosed technology is not intended in any way to limit the described technology. Rather, the use of “a” or “an” should be interpreted to mean “at least one” or “one or more”.
One implementation of the disclosed technology provides a first method for programming equipment used for or related to a manufacturing process. This method comprises installing the equipment in a manufacturing environment; positioning a plurality of sensors within the manufacturing environment in proximity to the equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the equipment; connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the equipment by the processor, wherein the equipment, which is physically remote from the at least one manual controller, executes the motion commands in real-time during the manufacturing process; and, by the software, saving a teachpoint in a program file.
Implementations of the method further comprise determining whether the equipment has moved a predetermined minimum distance from a previously saved teachpoint location; adding a first new teachpoint to the program file if the equipment moved the predetermined minimum distance from the previously saved teachpoint location; determining whether a predetermined minimum amount of time has passed since a previously saved teachpoint time; and adding a second new teachpoint to the program file if the predetermined minimum amount of time has passed since the previously saved teachpoint time. Implementations of the method further comprise displaying a real-time video of the manufacturing environment to the user on a human machine interface screen during the manufacturing process, wherein the user can add the teachpoint to the program file through the human machine interface screen. Implementations of the method further comprise using artificial intelligence to determine if a new teachpoint should be saved to the program file. The artificial intelligence uses projected trajectory, direction, speed, distance, or combinations thereof to determine if the new teachpoint should be saved to the program file. Implementations of the method further comprise, from at least one of the plurality of sensors, measuring a distance between an end effector on the equipment and a part surface, wherein the at least one of the plurality of sensors is a displacement data sensor; and disabling the user's control of the equipment if the distance varies from a predetermined operating distance range. Implementations of the method further comprise, by the software, reading the measured distance between the end effector and the part surface; providing a haptic feedback response to the manual controller based on the data from the plurality of sensors and the equipment; and updating the haptic feedback response to the manual controller based on the measured distance from the displacement data sensor. Implementations of the method further comprise, from at least one of the plurality of sensors, measuring a pressure applied between an end effector on the equipment and a part surface; and disabling the user's control of the equipment if the pressure varies from a predetermined operating pressure range. Implementations of the method further comprise, by the software, reading the measured pressure applied between the end effector and the part surface; providing a haptic feedback response to the manual controller based on the data from the plurality of sensors and the equipment; and updating the haptic feedback response to the manual controller based on the measured pressure. The equipment may include welding equipment, measurement equipment, inspection equipment, remote assembly equipment, or combinations thereof. The at least one manual controller may be a hand-held stylus, a computer mouse, or a joystick. The method may further comprise providing a computer network across which the processor communicates with the equipment.
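By way of non-limiting illustration only, the following sketch shows one way the distance-based and time-based teachpoint rules described above could be implemented in software. The class name, threshold values, and program-file layout are hypothetical and are not specified by the disclosure.

```python
import math
import time

# Hypothetical thresholds; the disclosed method requires only *some*
# predetermined minimum distance and minimum amount of time.
MIN_DISTANCE_MM = 5.0   # predetermined minimum distance
MIN_INTERVAL_S = 2.0    # predetermined minimum amount of time

class TeachpointRecorder:
    """Saves teachpoints to a program file per the rules described above."""

    def __init__(self):
        self.program_file = []     # saved teachpoints (the program file)
        self.last_position = None  # previously saved teachpoint location
        self.last_time = None      # previously saved teachpoint time

    def update(self, position_xyz, now=None):
        """Add a new teachpoint if the equipment has moved the minimum
        distance from, or the minimum time has passed since, the last
        saved teachpoint."""
        now = time.monotonic() if now is None else now
        if self.last_position is None:
            self._save(position_xyz, now)
            return
        moved = math.dist(position_xyz, self.last_position)
        elapsed = now - self.last_time
        if moved >= MIN_DISTANCE_MM or elapsed >= MIN_INTERVAL_S:
            self._save(position_xyz, now)

    def _save(self, position_xyz, now):
        self.program_file.append({"xyz": tuple(position_xyz), "t": now})
        self.last_position = tuple(position_xyz)
        self.last_time = now
```

In such a sketch, `update()` would be called from the same control loop that streams motion commands to the equipment, so teachpoints are captured while the user drives the machinery in real time.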
Another implementation of the disclosed technology provides a second method for programming equipment used for or related to a manufacturing process. This method comprises installing the equipment in a manufacturing environment; positioning a plurality of sensors within the manufacturing environment in proximity to the equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the equipment; connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the equipment by the processor, wherein the equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing process; and saving a teachpoint in a program file, wherein the software determines whether the equipment has moved a predetermined minimum distance from a previously saved teachpoint location; and adds the teachpoint to the program file if the equipment moved the predetermined minimum distance from the previously saved teachpoint location.
The method may further comprise determining whether a predetermined minimum amount of time has passed since a previously saved teachpoint time; and adding a new teachpoint to the program file if the predetermined minimum amount of time has passed since the previously saved teachpoint time. The method may further comprise displaying a real-time video of the manufacturing environment to the user on a human machine interface screen during the manufacturing process, wherein the user can add the teachpoint to the program file through the human machine interface screen. The method may further comprise using artificial intelligence to determine if a new teachpoint should be saved to the program file. The artificial intelligence may use projected trajectory, direction, speed, distance, or combinations thereof to determine if the new teachpoint should be saved to the program file. The equipment may include welding equipment, measurement equipment, inspection equipment, remote assembly equipment, or combinations thereof. The at least one manual controller may be a hand-held stylus, a computer mouse, or a joystick. The method may further comprise providing a computer network across which the processor communicates with the equipment.
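The disclosure does not specify the form of the artificial intelligence; purely for illustration, a simple geometric heuristic can stand in for it here, saving a teachpoint whenever the projected trajectory changes direction by more than a threshold angle. The function name and threshold below are hypothetical.

```python
import numpy as np

def should_save_teachpoint(prev_pt, curr_pt, next_pt, angle_threshold_deg=10.0):
    """Heuristic stand-in for the AI decision: flag a teachpoint when the
    projected trajectory bends by more than a threshold angle."""
    v1 = np.asarray(curr_pt, dtype=float) - np.asarray(prev_pt, dtype=float)
    v2 = np.asarray(next_pt, dtype=float) - np.asarray(curr_pt, dtype=float)
    n1, n2 = np.linalg.norm(v1), np.linalg.norm(v2)
    if n1 == 0.0 or n2 == 0.0:
        return False  # no motion, nothing to record
    cos_angle = np.clip(np.dot(v1, v2) / (n1 * n2), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle)) > angle_threshold_deg
```

A trained model could equally weigh speed and distance alongside direction; the heuristic simply makes the decision criterion concrete.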
It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the technology disclosed herein and may be implemented to achieve the benefits as described herein. Additional features and aspects of the disclosed system, devices, and methods will become apparent to those of ordinary skill in the art upon reading and understanding the following detailed description of the example implementations. As will be appreciated by the skilled artisan, further implementations are possible without departing from the scope and spirit of what is disclosed herein. Accordingly, the descriptions provided herein are to be regarded as illustrative and not restrictive in nature.
The accompanying drawings, which are incorporated into and form a part of the specification, schematically illustrate one or more example implementations of the disclosed technology and, together with the general description given above and detailed description given below, serve to explain the principles of the disclosed subject matter, and wherein:
Example implementations are now described with reference to the Figures. Reference numerals are used throughout the detailed description to refer to the various elements and structures. Although the following detailed description contains many specifics for the purposes of illustration, a person of ordinary skill in the art will appreciate that many variations and alterations to the following details are within the scope of the disclosed technology. Accordingly, the following implementations are set forth without any loss of generality to, and without imposing limitations upon, the claimed subject matter.
The examples discussed herein are examples only and are provided to assist in the explanation of the apparatuses, devices, systems, and methods described herein. None of the features or components shown in the drawings or discussed below should be taken as required for any specific implementation of any of these apparatuses, devices, systems, or methods unless specifically designated as such. For ease of reading and clarity, certain components, modules, or methods may be described solely in connection with a specific Figure. Any failure to specifically describe a combination or sub-combination of components should not be understood as an indication that any combination or sub-combination is not possible. Also, for any methods described, regardless of whether the method is described in conjunction with a flow diagram, it should be understood that unless otherwise specified or required by context, any explicit or implicit ordering of steps performed in the execution of a method does not imply that those steps must be performed in the order presented but instead may be performed in a different order or in parallel.
U.S. Patent Publication No. 2023/0112463 is relevant to the disclosed technology and the entire contents thereof are expressly incorporated by reference herein and made part of this patent application for all purposes. U.S. Patent Publication No. 2023/0112463 discloses tele-manufacturing and tele-welding systems and methods that enable the operation of industrial equipment from one or more locations that are physically remote from the environment in which manufacturing is occurring. Tele-operation is most commonly associated with robotics and mobile robots, but may be applied to an entire range of circumstances in which a device or machine is operated by a person from a distance. Tele-welding systems permit an individual to direct a welding process from a remote location by controlling welding arc on, welding arc off, welding travel speed, and welding torch angles and motions, thereby permitting the individual to make various decisions regarding the welding of a part that is not in their immediate visual or auditory range.
Tele-welding and tele-programming differ from remote welding and programming in that the welding and weld-programming machinery at the remote location (e.g., robot, manipulator, mechanization, automation, etc.) is not running an independent program or motion plan. The person (operator) who is remote from the machinery is in direct control of the welding and weld-programming machinery and makes decisions to move the machinery by use of a hand-held stylus device or similar device. Furthermore, tele-manufacturing systems differ from virtual reality (VR) in that the hand-held stylus is an actual physical device that has various degrees of rotational freedom and provides encoded position information from each axis when queried; the system converts the motion of the stylus device into motion on a live articulated machine (robot) having various degrees of rotational freedom. Regarding local manipulator (e.g., stylus) to remote machinery (e.g., robot) motion, in implementations wherein the stylus (controller) and the articulated machine (robot) both include six degrees of freedom, the robot's six degrees of freedom include X, Y, and Z translation and roll, pitch, and yaw (Rx, Ry, Rz) rotations generated with the six coordinated axes of the articulated robot. The six degrees of freedom of the stylus device include X, Y, and Z direction motion and three additional gimbaled axes that provide the remaining three degrees of freedom.
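Purely as an illustrative sketch of such a non-identical mapping, the following code scales and re-frames a 6-DOF stylus increment onto a robot pose. The calibration constants are hypothetical; only the general transformation pattern reflects the description above.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Hypothetical calibration values mapping the small stylus workspace
# into the larger robot workspace.
SCALE = 20.0                                           # stylus mm -> robot mm
STYLUS_TO_ROBOT = R.from_euler("z", 90, degrees=True)  # frame alignment

def stylus_delta_to_robot_pose(stylus_delta_xyz, stylus_delta_rpy,
                               robot_xyz, robot_rpy):
    """Map an incremental stylus motion onto the robot's current pose.
    The mapping is deliberately not one-to-one: translation is scaled
    and both translation and rotation are re-expressed in the robot frame."""
    # Translation: rotate the stylus increment into the robot frame, then scale.
    d_xyz = STYLUS_TO_ROBOT.apply(np.asarray(stylus_delta_xyz)) * SCALE
    new_xyz = np.asarray(robot_xyz) + d_xyz
    # Rotation: compose the stylus increment with the robot orientation.
    d_rot = R.from_euler("xyz", stylus_delta_rpy, degrees=True)
    new_rot = d_rot * R.from_euler("xyz", robot_rpy, degrees=True)
    return new_xyz, new_rot.as_euler("xyz", degrees=True)
```

Because the stylus and robot need not share workspace scale or even axis counts, it is this transformation, rather than a one-to-one master-follower copy, that couples the two devices.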
Tele-welding and tele-programming systems differ from off-line planning control systems in that motion control is done in real-time with only sub-second delays or latency between user movement of the hand-held stylus and motion produced on remote machinery. The difference between the disclosed control systems and a traditional master-follower control system is that the degrees of freedom of motion on the stylus do not produce the same degrees of freedom motion result on the robot, i.e., the motion output of the robot and hand-held stylus are not identical. Additionally, the number of degrees of freedom present on the hand-held stylus device (or other type of controller) does not necessarily match the number of degrees of freedom on the robot.
The disclosed technology, which is referred to as “tele-programming”, includes systems and methods that enable programming of machinery/robots from one or more locations that are physically remote from where the machinery/robot is positioned or installed. The disclosed tele-programming system permits an individual to program and save robot movement, positioning, and timing from a remote location by setting or creating system “teachpoints” in a robot program file. It is to be understood that the disclosed systems and methods can be used in various alternate implementations of robot programming, including, but not limited to, robot trajectory, direction, speed, distance, or time.
The disclosed tele-programming system includes components similar to the components included in the tele-welding system described in U.S. Patent Publication No. 2023/0112463.
Processor 100, which may be a computer or computing device that includes various control hardware, runs an overall program for communicating with remote devices over network 1000 using wired or wireless protocols. Processor 100 is connected to manual controller 400, which may be a hand-held stylus, joystick, mouse, or any other electronic or computer-compatible motion control device having at least one degree of rotational freedom. Processor 100 is connected to the Internet and opens a URL/webpage or digital media player software using an Internet browser or similar program. In some implementations, other open or closed computer networks are utilized.
Control software 200 runs on processor 100 and enables communication with and between remote machinery 300 across network 1000. In an example implementation, control software 200 uses digitized environmental data (e.g., point clouds) provided by three-dimensional scanning sensors 600 (such as, for example, laser detection and ranging (LIDAR), blue light, white light, or stereo scanning sensors) and executes mathematical transformations to convert a digital environment into haptic feedback felt by local user 50 while holding manual controller 400, in real time, thereby providing local user 50 with a physical sense or physical interpretation of an actual working environment. Control software 200 also allows local user 50 to initiate or stop communication with remote machinery 300. Remote machinery 300 can include any type of welding equipment, measurement equipment, inspection equipment, remote assembly equipment, or a combination thereof, including a robot or robot controller having a process tool end effector. Process tool end effector 800 can be any instrument or tool used in manufacturing processes, including, but not limited to, grinders or similar material removal devices, weld torches, and inspection probes. The robot and robot controller hardware are typically controlled by open-source or proprietary communication protocols.
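As a non-limiting sketch of one such mathematical transformation, the code below converts proximity to the nearest point in a scanned point cloud into a haptic force magnitude using a simple spring model. The stiffness and engagement distance are hypothetical values.

```python
import numpy as np
from scipy.spatial import cKDTree

STIFFNESS = 0.5            # force units per mm of proximity (assumed)
ENGAGE_DISTANCE_MM = 25.0  # distance at which feedback begins (assumed)

def haptic_force_from_point_cloud(cloud_xyz, tool_xyz):
    """Return a force magnitude that rises as the tool point approaches
    the nearest scanned surface point, giving the user a physical sense
    of the remote working environment."""
    tree = cKDTree(np.asarray(cloud_xyz))       # index the point cloud
    dist, _ = tree.query(np.asarray(tool_xyz))  # nearest-point distance
    if dist >= ENGAGE_DISTANCE_MM:
        return 0.0                              # free space: no feedback
    return STIFFNESS * (ENGAGE_DISTANCE_MM - dist)
```

The returned magnitude would then be sent to manual controller 400 as a commanded haptic setting, as described further below.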
Control software 200, an example of which is specific to tele-programming system 10, includes an executable application that can be run on the computer of a remote user, on a computer located in the same environment as machinery 300, or on the robot controller system. Software 200 provides a user interface, or human machine interface (HMI) screen 150, for allowing user 50 to interact with manual controller 400 and remote machinery 300 in a simultaneous manner. Software 200 provides a connection to remote machinery 300 using a local area network (LAN), intranet, or the Internet (collectively, network 1000) for the purpose of controlling remote machinery 300. Software 200 receives input from user 50 by way of the user interface or HMI screen 150 to begin or end a tele-manufacturing process (e.g., a tele-grinding process), to set or change process settings or parameters, or to set or change manual controller 400 parameters. System software 200 provides a process for communicating with at least one locally connected manual controller 400, thereby allowing user 50 to directly manipulate manual controller 400 while querying manual controller 400 positions and converting the positions to resultant movement of remote machinery 300, as in the loop sketched below. In other implementations, the process start/stop (e.g., grinding start/stop) is accomplished using buttons or other additional data input/output features included on manual controller 400.
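For illustration only, the following sketch shows the query-convert-move loop implied above. The `stylus` and `robot` objects and their methods are hypothetical stand-ins; the disclosure does not name a specific stylus SDK or robot communication protocol.

```python
import time

LOOP_PERIOD_S = 0.01  # 100 Hz polling, an assumed rate

def teleoperation_loop(stylus, robot, transform, running):
    """Poll the manual controller, transform its motion, and stream the
    resulting motion commands to the remote machinery in real time."""
    prev = stylus.read_pose()                # (xyz, rpy) of the stylus
    while running():                         # e.g., toggled via HMI screen 150
        time.sleep(LOOP_PERIOD_S)
        curr = stylus.read_pose()
        delta_xyz = [c - p for c, p in zip(curr[0], prev[0])]
        delta_rpy = [c - p for c, p in zip(curr[1], prev[1])]
        new_xyz, new_rpy = transform(delta_xyz, delta_rpy,
                                     *robot.current_pose())
        robot.move_to(new_xyz, new_rpy)      # motion command over network 1000
        prev = curr
```

Here `transform` could be a mapping such as the pose transformation sketched earlier, with the loop period kept well under the sub-second latency budget noted above.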
Controller 400 may be a manually operated device such as a hand-held stylus, a computer mouse, a joystick, or any other suitable device having at least one degree of rotational freedom that can be used to record various hand motions of user 50. Physical manipulations of manual controller 400 by user 50 are ultimately converted into the physical motions of remote machinery 300 by control software 200. Manual controller 400 may also provide a haptic feedback response to user 50, corresponding to either physical environmental objects that exist at the remote location or to virtual barriers.
Manual controller 400 may be a stylus device that is a commercially available haptic feedback system including software libraries that are imported into tele-programming software program 200. For example, tele-programming software 200 queries manual controller/stylus device 400 using stylus software library functions for current axis positions, and tele-programming software 200 sets the haptic feedback response of manual controller/stylus device 400 by sending commands to manual controller/stylus device 400 using the stylus software library functions. Manual controller/stylus device 400 applies the commanded settings from tele-programming software 200 to produce a sensation response felt by local user 50 holding manual controller/stylus device 400. The commanded settings change the servo-motor power and response characteristics, which produce sensations of touching surfaces of different densities or at differing levels of force, mass, gravity, or speed. Tele-programming software 200 determines the settings for the type of response based on the current location of remote machinery 300 and from analysis of the data queried from sensors 500, 600, and 700.
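The stylus SDK itself is commercial and unspecified; the sketch below therefore uses hypothetical library functions (`get_axis_positions`, `set_force_profile`) solely to illustrate the query/command exchange described above, with all thresholds assumed.

```python
def update_haptics(stylus_lib, distance_mm, pressure):
    """Query stylus axis positions, then command a haptic response profile
    derived from the latest sensor analysis."""
    positions = stylus_lib.get_axis_positions()  # query current axis positions
    # Stiffer, stronger resistance as the end effector nears or presses
    # against the part; free-space motion produces no resistance.
    if distance_mm < 5.0:                        # assumed proximity threshold
        stiffness = 1.0                          # "hard surface" sensation
    elif pressure > 0.0:
        stiffness = min(1.0, pressure / 50.0)    # assumed full-scale pressure
    else:
        stiffness = 0.0
    stylus_lib.set_force_profile(stiffness=stiffness)  # commanded setting
    return positions
```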
Environmental sensors 500 may include cameras, microphones, digitizers, and other types of sensing devices, and may use optical systems, devices, and methods for determining the displacement of physical objects within remote working environment 30 that encompasses remote machinery 300 and the tele-programming process that is occurring therein. Environmental sensors 500 may also use auditory systems, devices, and methods for capturing sounds within remote working environment 30 that encompasses remote machinery 300 and the tele-programming process that is occurring therein. Sensors 500, 600, and 700 may be used to collect digitized environmental data (e.g., point clouds) that is transmitted to and stored on processor 100. This digital environmental data may then be used to determine when, how, and what type of physical sensation response is to be applied to manual controller 400 for indicating the presence of, or proximity to, a physical object or virtual barrier in remote working environment 30. With regard to programming processes: (i) inexpensive digital cameras may be used to assist with proper alignment of the process tool end effector in relation to a part or surface; (ii) specialty manufacturing process cameras may be used to provide real-time movement views; (iii) microphones may be used to capture sounds useful in programming the equipment; and (iv) livestreaming camera video and audio may be used to provide low-latency real-time process data.
Three-dimensional sensors 600, which cooperate with environmental sensors 500, may be mounted to machinery and/or robot 300 to measure the displacement of objects relative to the sensors themselves and to provide a digitized topographical representation of the physical environment in which the manufacturing process is occurring. Optically-based processes such as, for example, infrared (IR), laser detection and ranging (LIDAR), blue light, white light, or laser vibrometer scanning systems can be used to create a point cloud or three-dimensional digital map of remote working environment 30. Scanning and digitizing may be completed prior to the programming process or may be completed in real-time as the programming process is occurring. With regard to the programming process, three-dimensional sensors 600 may be used to measure the displacement distance between the robot arm having the process tool end effector and the part and/or part surface.
Process sensors 700, which cooperate with sensors 500 and 600, may be mounted to the robot to measure pressure, force, and/or strain applied to the sensors 700 from the robot. Data and information from sensors 500, 600, 700 may be used to: (i) override user control of the robot if pressure, force, and/or distance exceed predetermined operating parameter ranges; (ii) adjust haptic feedback response on the manual controller/stylus device; and (iii) control and update the motion and speed of the robot in the X, Y, Z, Rx, Ry, and Rz directions. It is to be understood that the predetermined operating parameter ranges are any ranges of pressure, force, and/or distance that allow machinery/robot 300 to function without crashing.
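A minimal sketch of this override logic, assuming hypothetical operating ranges and a hypothetical `hold_motion()` call on the robot interface, is as follows.

```python
PRESSURE_RANGE = (0.0, 40.0)      # allowable applied pressure (assumed units)
DISTANCE_RANGE_MM = (2.0, 150.0)  # allowable standoff from the part surface

def check_overrides(robot, pressure, distance_mm):
    """Disable the user's control if measured pressure or displacement
    falls outside its predetermined operating range."""
    pressure_ok = PRESSURE_RANGE[0] <= pressure <= PRESSURE_RANGE[1]
    distance_ok = DISTANCE_RANGE_MM[0] <= distance_mm <= DISTANCE_RANGE_MM[1]
    if not (pressure_ok and distance_ok):
        robot.hold_motion()   # hypothetical call: stop executing user input
        return False          # user control disabled
    return True
```

In practice the same sensor readings would also feed the haptic updates described above, so the user feels the approach to a limit before the override engages.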
Tele-programming system 10 provides real-time video and audio feedback of remote working environment 30 to user 50 in local environment 20. Video, audio, or other sensor data is encoded using commercially available encoding hardware servers or an encoding software program running on a processor. The server or processor, using commercially available software, publishes the encoded video and audio or other data stream to the Internet or a LAN through a URL or web address (network 1000). The published livestream uses a low-latency protocol to stream the information to devices over the Internet or on a local area network (LAN) (network 1000). User 50 can access the livestreamed video over the Internet or LAN (network 1000) using a processor (e.g., a personal computer) and a commercial media player application that can read and play the audio or video stream.
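As one hedged example of the encode-and-publish step, the sketch below launches ffmpeg with low-latency settings on a Linux host; the capture device and ingest URL are hypothetical, and any commercially available encoder and low-latency protocol could be substituted.

```python
import subprocess

STREAM_URL = "rtmp://example.com/live/workcell"  # hypothetical publish URL

def start_livestream(device="/dev/video0"):
    """Encode camera video with low-latency settings and publish it so
    user 50 can open it in a media player over network 1000."""
    cmd = [
        "ffmpeg",
        "-f", "v4l2", "-i", device,   # capture from the camera
        "-c:v", "libx264",            # H.264 encoding
        "-preset", "ultrafast",       # minimize encoding delay
        "-tune", "zerolatency",       # low-latency rate control
        "-f", "flv", STREAM_URL,      # publish over RTMP
    ]
    return subprocess.Popen(cmd)
```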
With reference to
Again with reference to
All literature and similar material cited in this application, including, but not limited to, patents, patent applications, articles, books, treatises, and web pages, regardless of the format of such literature and similar materials, are expressly incorporated by reference in their entirety. Should one or more of the incorporated references and similar materials differ from or contradict this application, including but not limited to defined terms, term usage, described techniques, or the like, this application controls.
As previously stated and as used herein, the singular forms “a,” “an,” and “the,” refer to both the singular as well as plural, unless the context clearly indicates otherwise. The term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps. Although many methods and materials similar or equivalent to those described herein can be used, particular suitable methods and materials are described herein. Unless context indicates otherwise, the recitations of numerical ranges by endpoints include all numbers subsumed within that range. Furthermore, references to “one implementation” are not intended to be interpreted as excluding the existence of additional implementations that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, implementations “comprising” or “having” an element or a plurality of elements having a particular property may include additional elements whether or not they have that property.
The terms “substantially” and “about”, if or when used throughout this specification, describe and account for small fluctuations, such as due to variations in processing. For example, these terms can refer to less than or equal to ±5%, such as less than or equal to ±2%, such as less than or equal to ±1%, such as less than or equal to ±0.5%, such as less than or equal to ±0.2%, such as less than or equal to ±0.1%, such as less than or equal to ±0.05%, and/or 0%.
Underlined and/or italicized headings and subheadings are used for convenience only, do not limit the disclosed subject matter, and are not referred to in connection with the interpretation of the description of the disclosed subject matter. All structural and functional equivalents to the elements of the various implementations described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and intended to be encompassed by the disclosed subject matter. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the above description.
There may be many alternate ways to implement the disclosed technology. Various functions and elements described herein may be partitioned differently from those shown without departing from the scope of the disclosed technology. Generic principles defined herein may be applied to other implementations. Different numbers of a given module or unit may be employed, a different type or types of a given module or unit may be employed, a given module or unit may be added, or a given module or unit may be omitted.
Regarding this disclosure, the term “a plurality of” refers to two or more than two. Unless otherwise clearly defined, orientation or positional relations indicated by terms such as “upper” and “lower” are based on the orientation or positional relations as shown in the figures, only for facilitating description of the disclosed technology and simplifying the description, rather than indicating or implying that the referred devices or elements must be in a particular orientation or constructed or operated in the particular orientation, and therefore they should not be construed as limiting the disclosed technology. The terms “connected”, “mounted”, “fixed”, etc. should be understood in a broad sense. For example, “connected” may be a fixed connection, a detachable connection, or an integral connection; a direct connection, or an indirect connection through an intermediate medium. For one of ordinary skill in the art, the specific meaning of the above terms in the disclosed technology may be understood according to specific circumstances.
Specific details are given in the above description to provide a thorough understanding of the disclosed technology. However, it is understood that the disclosed embodiments and implementations can be practiced without these specific details. For example, circuits can be shown in block diagrams in order not to obscure the disclosed implementations in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques can be shown without unnecessary detail in order to avoid obscuring the disclosed implementations.
Implementation of the techniques, blocks, steps and means described above can be accomplished in various ways. For example, these techniques, blocks, steps and means can be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units can be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
The disclosed technology can be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart can describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations can be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in the figure. A process can correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
Furthermore, the disclosed technology can be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks can be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction can represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment can be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. can be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, ticket passing, network transmission, etc.
It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail herein (provided such concepts are not mutually inconsistent) are contemplated as being part of the disclosed technology. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the technology disclosed herein. While the disclosed technology has been illustrated by the description of example implementations, and while the example implementations have been described in certain detail, there is no intention to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the disclosed technology in its broader aspects is not limited to any of the specific details, representative devices and methods, and/or illustrative examples shown and described. Accordingly, departures may be made from such details without departing from the spirit or scope of the general inventive concept.