TELE-PROGRAMMING SYSTEM AND METHOD

Abstract
Systems and methods for programming equipment used for or related to a manufacturing process, comprising installing equipment in a manufacturing environment; positioning a plurality of sensors within the manufacturing environment in proximity to the equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the equipment; connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands, wherein the equipment, which is physically remote from the at least one controller, executes the motion commands in real-time; and, by the software, saving a teachpoint in a program file.
Description
BACKGROUND

The disclosed technology relates in general to industrial manufacturing and fabricating systems, devices, and processes and more specifically to a system for programming manufacturing equipment remotely, also referred to as a “tele-manufacturing” or “tele-programming” system.


Industrial welding is currently challenged by a variety of factors including a decreasing number of skilled users; a lack of individuals wanting to enter what are traditionally considered to be “manual” trades; and an ever-increasing list of hazards and limitations related to welding, grinding, and other hot work activities that make finding and keeping experienced users difficult. Additionally, efforts within industry to optimize weight and space in manufacturing processes have resulted in the construction of manufacturing facilities that are basically inaccessible to humans. Accordingly, there is an ongoing need for welding-related systems, processes, and methods that permit qualified users to enter and remain in the workforce regardless of physical limitations, age, or other perceived obstacles such as those described above.


SUMMARY

The following provides a summary of certain example implementations of the disclosed technology. This summary is not an extensive overview and is not intended to identify key or critical aspects or elements of the disclosed technology or to delineate its scope. However, it is to be understood that the use of indefinite articles in the language used to describe and claim the disclosed technology is not intended in any way to limit the described technology. Rather the use of “a” or “an” should be interpreted to mean “at least one” or “one or more”.


One implementation of the disclosed technology provides a first method for programming equipment used for or related to a manufacturing process. This method comprises installing the equipment in a manufacturing environment; positioning a plurality of sensors within the manufacturing environment in proximity to the equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the equipment; connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the equipment by the processor, wherein the equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing process; and by the software, saving a teachpoint in a program file.


Implementations of the method further comprise determining whether the equipment has moved a predetermined minimum distance from a previously saved teachpoint location; adding a first new teachpoint to the program file if the equipment moved the predetermined minimum distance from the previously saved teachpoint location; determining whether a predetermined minimum amount of time has passed since a previously saved teachpoint time; and adding a second new teachpoint to the program file if the predetermined minimum amount of time varied from the previously saved teachpoint time. Implementations of the method further comprise displaying a real-time video of the manufacturing environment to the user on a human machine interface screen during the manufacturing process, wherein the user can add the teachpoint to the program file through the human machine interface screen. Implementations of the method further comprise using artificial intelligence to determine if a new teachpoint should be saved to the program file. The artificial intelligence uses project trajectory, direction, speed, distance, or combinations thereof to determine if the new teachpoint should be saved to the program file. Implementations of the method further comprise, from at least one of the plurality of sensors, measuring a distance between an end effector on the equipment and a part surface, wherein the at least one of the plurality of sensors is a displacement data sensor; and disabling the user's control of the equipment if the distance varies from a predetermined operating distance range. Implementations of the method further comprise, by the software, reading the measured distance between the end effector and the part surface; providing a haptic feedback response to the manual controller based on the data from the plurality of sensors and the equipment; and updating the haptic feedback response to the manual controller based on the measured distance from the displacement data sensor. Implementations of the method further comprise, from at least one of the plurality of sensors, measuring a pressure applied between an end effector on the equipment and a part surface; and disabling the user's control of the equipment if the pressure varies from a predetermined operating pressure range. Implementations of the method further comprise, by the software, reading the measured pressure applied between the end effector and the part surface; providing a haptic feedback response to the manual controller based on the data from the plurality of sensors and the equipment; and updating the haptic feedback response to the manual controller based on the measured pressure. The equipment may include welding equipment, measurement equipment, inspection equipment, remote assembly equipment, or combinations thereof. The at least one manual controller may be a hand-held stylus, a computer mouse, or a joystick. The method may further comprise providing a computer network across which the processor communicates with the equipment.


Another implementation of the disclosed technology provides a second method for programming equipment used for or related to a manufacturing process. This method comprises installing the equipment in a manufacturing environment; positioning a plurality of sensors within the manufacturing environment in proximity to the equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the equipment; connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the equipment by the processor, wherein the equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing process; and saving a teachpoint in a program file, wherein the software determines whether the equipment has moved a predetermined minimum distance from a previously saved teachpoint location; and adds the teachpoint to the program file if the equipment moved the predetermined minimum distance from the previously saved teachpoint location.


The method may further comprise determining whether a predetermined minimum amount of time has passed since a previously saved teachpoint time; and adding a new teachpoint to the program file if the predetermined minimum amount of time varied from the previously saved teachpoint time. The method may further comprise displaying a real-time video of the manufacturing environment to the user on a human machine interface screen during the manufacturing process, wherein the user can add the teachpoint to the program file through the human machine interface screen. The method may further comprise using artificial intelligence to determine if a new teachpoint should be saved to the program file. The artificial intelligence may use project trajectory, direction, speed, distance, or combinations thereof to determine if the new teachpoint should be saved to the program file. The equipment may include welding equipment, measurement equipment, inspection equipment, remote assembly equipment, or combinations thereof. The at least one manual controller may be a hand-held stylus, a computer mouse, or a joystick. The method may further comprise providing a computer network across which the processor communicates with the equipment.


It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the technology disclosed herein and may be implemented to achieve the benefits as described herein. Additional features and aspects of the disclosed system, devices, and methods will become apparent to those of ordinary skill in the art upon reading and understanding the following detailed description of the example implementations. As will be appreciated by the skilled artisan, further implementations are possible without departing from the scope and spirit of what is disclosed herein. Accordingly, the descriptions provided herein are to be regarded as illustrative and not restrictive in nature.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated into and form a part of the specification, schematically illustrate one or more example implementations of the disclosed technology and, together with the general description given above and detailed description given below, serve to explain the principles of the disclosed subject matter, and wherein:



FIG. 1 is a simplified block diagram of an example implementation of the disclosed tele-programming system showing the basic components of the system and the location of the components relative to one another;



FIGS. 2A-2B are flow charts depicting an example method or process for using the tele-programming system depicted in FIG. 1;



FIG. 3 is a flow chart illustrating an example stepwise process for gathering and interpreting data from sensors located in a remote manufacturing environment and controlling the operation of a robot within the manufacturing environment in conjunction with the method depicted in FIGS. 2A-2B and based on the gathered data;



FIG. 4 is a flow chart illustrating an example stepwise process for updating a haptic feedback response on a local manual controller used with a remote manufacturing environment in conjunction with the method depicted in FIGS. 2A-2B;



FIG. 5 is a flow chart illustrating an example stepwise process for translating motion and speed of a local manual controller to a robot motion path in conjunction with the method depicted in FIGS. 2A-2B;



FIG. 6 is a flow chart illustrating an example stepwise process for commanding a remote robot to move in conjunction with the method depicted in FIGS. 2A-2B; and



FIG. 7 is a flow chart illustrating an example stepwise process for saving teachpoints in a robot program file in conjunction with the method depicted in FIGS. 2A-2B.





DETAILED DESCRIPTION

Example implementations are now described with reference to the Figures. Reference numerals are used throughout the detailed description to refer to the various elements and structures. Although the following detailed description contains many specifics for the purposes of illustration, a person of ordinary skill in the art will appreciate that many variations and alterations to the following details are within the scope of the disclosed technology. Accordingly, the following implementations are set forth without any loss of generality to, and without imposing limitations upon, the claimed subject matter.


The examples discussed herein are examples only and are provided to assist in the explanation of the apparatuses, devices, systems, and methods described herein. None of the features or components shown in the drawings or discussed below should be taken as required for any specific implementation of any of these apparatuses, devices, systems, or methods unless specifically designated as such. For ease of reading and clarity, certain components, modules, or methods may be described solely in connection with a specific Figure. Any failure to specifically describe a combination or sub-combination of components should not be understood as an indication that any combination or sub-combination is not possible. Also, for any methods described, regardless of whether the method is described in conjunction with a flow diagram, it should be understood that unless otherwise specified or required by context, any explicit or implicit ordering of steps performed in the execution of a method does not imply that those steps must be performed in the order presented but instead may be performed in a different order or in parallel.


U.S. Patent Publication No. 2023/0112463 is relevant to the disclosed technology and the entire contents thereof are expressly incorporated by reference herein and made part of this patent application for all purposes. U.S. Patent Publication No. 2023/0112463 discloses tele-manufacturing and tele-welding systems and methods that enable the operation of industrial equipment from one or more locations that are physically remote from the environment in which manufacturing is occurring. Tele-operation is most commonly associated with robotics and mobile robots, but may be applied to an entire range of circumstances in which a device or machine is operated by a person from a distance. Tele-welding systems permit an individual to direct a welding process from a remote location by controlling welding arc on, welding arc off, welding travel speed, and welding torch angles and motions, thereby permitting the individual to make various decisions regarding the welding of a part that is not in their immediate visual or auditory range.


Tele-welding and tele-programming differ from remote welding and programming in that the welding and weld-programming machinery at the remote location (e.g., robot, manipulator, mechanization, automation, etc.) is not running an independent program or motion plan. The person (operator) who is remote from the machinery is in direct control of the welding and weld-programming machinery and makes decisions to move the machinery by use of a hand-held stylus device or similar device. Furthermore, tele-manufacturing systems differ from virtual reality (VR) in that the hand-held stylus is an actual physical device that has various degrees of rotational freedom, provides encoded position information from each axis when queried, and converts the motion of the stylus device into motion on a live articulated machine (robot) having various degrees of rotational freedom. Regarding local manipulator (e.g., stylus) to remote machinery (e.g., robot) motion, in implementations wherein the stylus (controller) and articulated machine (robot) both include six degrees of freedom, the robot's six degrees of freedom include X, Y, and Z direction motions and roll, pitch, and yaw (Rx, Ry, Rz) motions generated with the six coordinated axes of the articulated robot. The six degrees of freedom of the stylus device include X, Y, and Z direction motion and three additional gimbaled axes that provide the additional three degrees of freedom.
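For illustration only, the following Python sketch shows one way a six-axis position queried from a stylus device could be transformed into a corresponding robot pose. The function name, the calibration transform, and the simple one-to-one angle mapping are assumptions made for the example; they are not the disclosed implementation, which need not map controller and robot degrees of freedom identically.

```python
# Illustrative sketch only; names and the angle mapping below are hypothetical.
import numpy as np

def stylus_pose_to_robot_pose(stylus_pose, calibration_transform):
    """Map a 6-DOF stylus reading (x, y, z, rx, ry, rz) into the robot frame.

    stylus_pose: sequence [x, y, z, rx, ry, rz] queried from the stylus axes.
    calibration_transform: 4x4 homogeneous transform from the stylus frame to
    the robot base frame (assumed to be established during setup).
    """
    x, y, z, rx, ry, rz = stylus_pose
    position = np.array([x, y, z, 1.0])
    # Transform the translational component into the robot's coordinate frame.
    robot_xyz = (calibration_transform @ position)[:3]
    # The gimbaled stylus angles are mapped (not necessarily copied one-to-one)
    # onto the robot's roll/pitch/yaw; a unity scaling stands in for that mapping.
    robot_rpy = np.array([rx, ry, rz]) * 1.0
    return np.concatenate([robot_xyz, robot_rpy])
```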


Tele-welding and tele-programming systems differ from off-line planning control systems in that motion control is done in real-time with only sub-second delays or latency between user movement of the hand-held stylus and motion produced on remote machinery. The difference between the disclosed control systems and a traditional master-follower control system is that the degrees of freedom of motion on the stylus do not produce the same degrees of freedom motion result on the robot, i.e., the motion output of the robot and hand-held stylus are not identical. Additionally, the number of degrees of freedom present on the hand-held stylus device (or other type of controller) does not necessarily match the number of degrees of freedom on the robot.


The disclosed technology, which is referred to as “tele-programming”, includes systems and methods that enable programming machinery/robot from one or more locations that are physically remote from where the machinery/robot is positioned or installed. The disclosed tele-programming system permits an individual to program and save robot movement, positioning, and timing from a remote location by setting or creating system “teachpoints” in a robot program file. It is to be understood that the disclosed systems and methods can be used in various alternate implementations of robot programming, including but not limited to, robot trajectory, direction, speed, distance, or time.


The disclosed tele-programming system includes components similar to the components included in the tele-welding system described in U.S. Patent Publication No. 2023/0112463. FIG. 1 provides a block diagram of an example implementation of the disclosed tele-programming system, showing the basic components thereof. With reference to FIG. 1, an example tele-programming system 10 includes various components that, when used together, allow local user 50, who is in direct contact with manual controller 400, to control weld-programming machinery or other machinery 300 that performs movements based on and coordinated with the movements of manual controller 400, despite machinery 300 being located in remote environment 30, which is physically separate from local user 50 in local environment 20. An example implementation of tele-programming system 10 includes processor 100; control software 200; remote machinery 300; manual controller 400; environmental sensors 500; three-dimensional scanning sensors 600; and process sensors 700.


Processor 100, which may be a computer or computing device that includes various control hardware, runs an overall program for communicating with remote devices over network 1000 using wired or wireless protocols. Processor 100 is connected to manual controller 400, which may be a hand-held stylus, joystick, mouse, or any other electronic or computer-compatible motion control device having at least one degree of rotational freedom. Processor 100 is connected to the Internet and opens a URL/webpage or digital media player software using an Internet browser or similar program. In some implementations, other open or closed computer networks are utilized.


Control software 200 runs on processor 100 and enables communication with and between remote machinery 300 across network 1000. In an example implementation, control software 200 uses digitized environmental data (e.g., point clouds) provided by three-dimensional scanning sensors 600 (such as, for example, laser detection and ranging (LIDAR), blue, white, or stereo scanning sensors) and executes mathematical transformations to convert a digital environment into haptic feedback felt by local user 50 while holding manual controller 400, in real time, thereby providing local user 50 with a physical sense or physical interpretation of an actual working environment. Control software 200 also allows local user 50 to initiate or stop communication with remote machinery 300. Remote machinery 300 can include any type of welding equipment, measurement equipment, inspection equipment, remote assembly equipment, or a combination thereof, including a robot or robot controller having a process tool end effector 800. Process tool end effector 800 can be any instrument or tool used in manufacturing processes including, but not limited to, grinders or similar material removal devices, weld torches, and inspection probes. The robot and robot controller hardware are typically controlled by open source or proprietary communication protocols.


Control software 200, an example of which is specific to tele-programming system 10, includes an executable application that can be run on the computer of a remote user, on a computer located in the same environment as machinery 300, or on the robot controller system. Software 200 provides a user interface, or human machine interface (HMI) screen 150, for allowing user 50 to interact with manual controller 400 and remote machinery 300 in a simultaneous manner. Software 200 provides a connection to remote machinery 300 using a local area network (LAN), intranet, or the Internet (collectively, network 1000) for the purpose of controlling remote machinery 300. Software 200 receives input from user 50 by way of the user interface or HMI screen 150 to begin or end a tele-programming process, to set or change process settings or parameters, or to set or change manual controller 400 parameters. System software 200 provides a process for communicating with at least one locally connected manual controller 400, thereby allowing user 50 to directly manipulate manual controller 400 while querying manual controller 400 positions and converting the positions to resultant movement of the remote machinery 300. In other implementations, the programming start/stop process is accomplished using buttons or other additional data input/output features included on manual controller 400.


Controller 400 may be a manually operated device such as a hand-held stylus, a computer mouse, a joystick, or any other suitable device having at least one degree of rotational freedom that can be used to record various hand motions of user 50. Physical manipulations of manual controller 400 by user 50 are ultimately converted into the physical motions of remote machinery 300 by control software 200. Manual controller 400 may also provide a haptic feedback response to user 50, corresponding to either physical environmental objects that exist at the remote location or to virtual barriers.


Manual controller 400 may be a stylus device that is a commercially available haptic feedback system including software libraries that are imported into tele-programming software program 200. For example, tele-programming software 200 queries manual controller/stylus device 400 using stylus software library functions for current axis positions and tele-programming software 200 sets the haptic feedback response of manual controller/stylus device 400 by sending commands to manual controller/stylus device 400 using the stylus software library functions. Manual controller/stylus device 400 applies the commanded settings from tele-programming software 200 to produce a sensation response felt by local user 50 holding manual controller/stylus device 400. The commanded settings change the servo-motor power and response characteristics which produce sensations of touching surfaces of different densities or at differing levels of force, mass, gravity or speed. Tele-programming software 200 determines the settings for the type of response based on the current location of remote machinery 300 and from analysis of the data queried from sensors 500, 600, and 700.
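As a non-authoritative sketch of the query/command cycle described above, the following Python loop assumes a stylus device object exposing get_axis_positions() and set_haptic_response() methods; those method names are placeholders standing in for the commercial stylus vendor's library functions, not an actual vendor API.

```python
import time

def haptic_update_loop(device, compute_feedback_settings, period_s=0.01):
    """Repeatedly read stylus axis positions and apply haptic response settings.

    device: stylus object with get_axis_positions() and set_haptic_response();
    both names are placeholders for the vendor library functions.
    compute_feedback_settings: callable that maps the current axis positions
    (and, in a fuller implementation, robot and sensor data) to the servo-motor
    power and response characteristics described in the text.
    """
    while True:
        axis_positions = device.get_axis_positions()           # query current axes
        settings = compute_feedback_settings(axis_positions)   # decide the response
        device.set_haptic_response(settings)                   # command the sensation felt by the user
        time.sleep(period_s)                                    # run at a fixed control rate
```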


Environmental sensors 500 may include cameras, microphones, digitizers, and other types of sensing devices, and may use optical systems, devices, and methods for determining the displacement of physical objects within remote working environment 30 that encompasses remote machinery 300 and the tele-programming process that is occurring therein. Environmental sensors 500 may also use auditory systems, devices, and methods for capturing sounds within remote working environment 30 that encompasses remote machinery 300 and the tele-programming process that is occurring therein. Sensors 500, 600, and 700 may be used to collect digitized environmental data (e.g., point clouds) that is transmitted to and stored on processor 100. This digital environmental data may then be used to determine when, how, and what type of physical sensation response is to be applied to the manual controller 400 for indicating the presence of a physical object in remote working environment 30 or proximity to a physical object or virtual barrier. With regard to programming processes: (i) inexpensive, digital cameras may be used to assist with proper alignment of the process tool end effector in relation to a part or surface; (ii) specialty manufacturing process cameras may be used to provide real-time movement views; (iii) microphones may be used to capture sounds useful in programming the equipment; and (iv) livestreaming camera video and audio may be used to provide low latency real-time process data.


Three-dimensional sensors 600, which cooperate with environmental sensors 500, may be mounted to machinery and/or robot 300 to measure the displacement of objects relative to the sensors themselves and provide a digitized topographical representation of the physical environment in which the tele-programming process is occurring. Optically-based processes such as, for example, infrared (IR), laser detection and ranging (LIDAR), blue, white, or laser vibrometer scanning systems can be used to create a point cloud or three-dimensional digital map of remote working environment 30. Scanning and digitizing may be completed prior to the programming process or may be completed in real-time as the programming process is occurring. With regard to the programming process, three-dimensional sensors 600 may be used to measure displacement distance between the robot arm having the process tool end effector and the part and/or part surface.
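A minimal sketch, assuming the digitized point cloud and the end effector position are expressed in the same coordinate frame, of how a displacement distance could be derived from the three-dimensional sensor data; the function and argument names are illustrative only.

```python
import numpy as np

def min_displacement_to_surface(end_effector_xyz, point_cloud_xyz):
    """Return the smallest distance from the end effector to the scanned surface.

    end_effector_xyz: (3,) position of the process tool in the scan frame.
    point_cloud_xyz: (N, 3) array of digitized surface points from the 3D sensors.
    """
    deltas = point_cloud_xyz - np.asarray(end_effector_xyz)
    return float(np.min(np.linalg.norm(deltas, axis=1)))
```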


Process sensors 700, which cooperate with sensors 500 and 600, may be mounted to the robot to measure pressure, force, and/or strain applied to the sensors 700 from the robot. Data and information from sensors 500, 600, 700 may be used to: (i) override user control of the robot if pressure, force, and/or distance exceed predetermined operating parameter ranges; (ii) adjust haptic feedback response on the manual controller/stylus device; and (iii) control and update the motion and speed of the robot in the X, Y, Z, Rx, Ry, and Rz directions. It is to be understood that the predetermined operating parameter ranges are any ranges of pressure, force, and/or distance that allow machinery/robot 300 to function without crashing.


Tele-programming system 10 provides real-time video and audio feedback of remote working environment 30 to user 50 in local environment 20. Video, audio, or other sensor data is encoded using commercially available encoding hardware servers or an encoding software program running on a processor. The server or processor, using commercially available software, publishes the encoded video and audio or other data stream to the internet or a LAN through a URL or web address 1000. The published livestream uses a low latency protocol to stream the information to devices over the internet or on a local area network (LAN) 1000. User 50 can access livestreamed video over the internet or LAN 1000 using a processor (e.g., a personal computer) and commercial media player application that can read and play the audio or video stream on a personal computer.
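As one illustrative way a local processor could read such a published stream, the sketch below uses OpenCV's VideoCapture; the stream address is a placeholder, and the actual encoding servers and media player software referenced above may differ.

```python
import cv2  # OpenCV; one common way to read a published video stream

def view_livestream(stream_url):
    """Display frames from a stream published to a URL or LAN address (placeholder)."""
    capture = cv2.VideoCapture(stream_url)
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        cv2.imshow("remote working environment", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop viewing
            break
    capture.release()
    cv2.destroyAllWindows()
```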



FIGS. 2A-2B provide flow charts of an example method or process for using an example implementation of the disclosed tele-programming system. With reference to FIG. 2A, the method begins at step 2000. A local user remote from the robot and the manufacturing environment starts a local tele-programming, or tele-ready, software program previously installed on a processor at step 2005. As previously discussed, the processor may be a computer or computing device that includes control hardware. The local tele-ready software program then connects to a remote robot using a point-to-point, LAN, or internet connection at step 2010. Remote sensors connected to the robot send data to the robot at step 2015. Remote cameras and microphones stream video and sound directly to the processor, to the LAN, or over the internet at step 2020. The local processor then connects to a livestream media software program, a web browser, or a locally installed camera viewer program at step 2025. The local tele-ready software program connects to a local manual controller that is coupled to the local processor at step 2030. As previously discussed, the local manual controller may be a hand-held stylus, a computer mouse, a joystick, or any other suitable device. The local user then views real-time video on the tele-programming software's human machine interface (HMI) screen and listens to real-time sound, both of which are livestreamed from the remote programming environment, at step 2035. Next, the local user manipulates the manual controller and responds physically to a haptic feedback response on the manual controller at step 2040.


With reference to FIG. 2B, the local tele-ready software program receives and reads information from the robot and the remote environmental sensors at step 2045. The tele-ready software program then updates the haptic feedback response on the manual controller based on the data from the remote environmental sensors at step 2050. The tele-ready software program translates the motion and speed of the manual controller to a robot motion path in either a freeform mode (rate controlled) or in-path mode (point-to-point) at step 2055. The tele-ready software program then commands the remote robot to move at step 2060. The tele-ready system determines if the remote robot is in save tele-programming mode at decision step 2065. If it is, the local tele-ready software program saves a teachpoint in a robot program file at step 2070. If it is not, the process proceeds to decision step 2075. At step 2075, the tele-programming system determines whether or not to terminate (quit) the process, and either ends the process at step 2080 or continues the process from step 2035 (see FIG. 2A).
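The FIG. 2B sequence can be summarized by the loop below; every helper and attribute name is hypothetical and merely mirrors steps 2045 through 2080, with the details of each step elaborated in FIGS. 3-7.

```python
def teleprogramming_loop(system):
    """Hypothetical outline of steps 2045-2080; 'system' is assumed to bundle the
    robot, sensors, manual controller, HMI, and program file described in the text."""
    while True:
        sensor_data = system.read_robot_and_sensor_data()    # step 2045 (see FIG. 3)
        system.update_haptic_feedback(sensor_data)           # step 2050 (see FIG. 4)
        motion_path = system.translate_controller_motion()   # step 2055 (see FIG. 5)
        system.command_robot(motion_path)                    # step 2060 (see FIG. 6)
        if system.in_save_teleprogramming_mode():            # decision step 2065
            system.save_teachpoint()                         # step 2070 (see FIG. 7)
        if system.user_requests_quit():                      # decision step 2075
            break                                            # step 2080
```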



FIG. 3 provides a flow chart of an example stepwise decision process for reading information from the remote robot and the remote sensors at step 2045 of FIG. 2B. Initially, the tele-ready software program reads data from a remote displacement data sensor (e.g., a contact probe, non-contact laser, LED, photogrammetry, capacitance, inductive proximity, or ultrasonic sensor), which measures the distance between a desired end effector on the robot's arm and the part surface at step 2100. It is to be understood that the end effector can be any instrument or process tool used in manufacturing processes including, but not limited to, grinders or similar material removal devices, weld torches, and inspection probes. At step 2110, a pressure sensor, or similar process sensor, can be used to measure pressure from a force applied from the robot to the part's surface. The system alerts the local user by displaying a video of the desired end effector attached to the robot at the distance above the part's surface on the HMI screen at step 2120. The system alerts the local user by updating the haptic feedback response on the manual controller at step 2130.


Again with reference to FIG. 3, at decision step 2140, the system evaluates whether the measured pressure and/or measured distance (step 2110) exceeds operating parameters (e.g., the robot is too close to the part surface or the robot applies too much force). If the measured pressure is too great and/or the end effector is too close to the part's surface, the system decides whether to override the local user's control to prevent a robot crash at step 2160. If the system overrides the local user's control, local user control of the robot's affected axis is disabled at step 2170. The system will enable or continue local user control of the robot at step 2150 if the measured pressure and/or distance does not exceed operating parameters at decision step 2140 or if the system decides not to override the local user's control at step 2160. The system determines whether to continue the process from step 2100 after steps 2150 and 2170.
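A minimal sketch of the decision at steps 2140-2170, assuming the predetermined operating parameter ranges are supplied as simple distance and pressure limits; the key names and the automatic-override policy shown here are assumptions for illustration.

```python
def check_operating_limits(distance_mm, pressure_n, limits):
    """Decide whether to keep or disable local user control (FIG. 3 sketch).

    limits: dict with hypothetical keys 'min_distance_mm' and 'max_pressure_n'
    representing the predetermined operating parameter ranges.
    """
    too_close = distance_mm < limits["min_distance_mm"]
    too_much_force = pressure_n > limits["max_pressure_n"]
    if too_close or too_much_force:
        # Override: disable user control of the affected axis to prevent a crash (step 2170).
        return "disable_affected_axis"
    # Within operating parameters: enable or continue local user control (step 2150).
    return "enable_control"
```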



FIG. 4 provides a flow chart of an example stepwise process for updating the haptic feedback response on the manual controller at step 2050 (see FIG. 2B). Initially, the tele-ready software program: (i) reads the measured end effector-to-surface height distance from the displacement data sensor at step 2200; (ii) reads the measured end effector-to-surface force from the pressure sensor at step 2210; and (iii) reads the current robot position data at step 2220. At optional step 2230, the tele-ready software program sends the information read in steps 2200, 2210, and 2220 to a data science algorithm to predict the robot's motion path. The system then applies the information from steps 2200, 2210, 2220, and 2230 to a force feedback correlation equation at step 2240. At step 2250, the software program adjusts the haptic feedback response on the manual controller and continues the process from step 2200.
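As a stand-in for the force feedback correlation of steps 2240-2250, the sketch below maps the measured stand-off distance and contact force to a single haptic intensity level; the linear weighting and the default limits are assumptions, not the disclosed correlation equation.

```python
def force_feedback_setting(distance_mm, force_n, max_distance_mm=50.0, max_force_n=20.0):
    """Map measured distance and force readings to a 0..1 haptic intensity level.

    The linear weighting and the default limits are illustrative assumptions
    standing in for the force feedback correlation equation of step 2240.
    """
    proximity_term = max(0.0, 1.0 - distance_mm / max_distance_mm)  # closer -> stronger
    force_term = min(1.0, force_n / max_force_n)                    # harder contact -> stronger
    # Stronger sensation when the tool is close to the surface or pressing hard.
    return min(1.0, 0.5 * proximity_term + 0.5 * force_term)
```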



FIG. 5 provides a flow chart of an example stepwise process for translating motion and speed of the manual controller to the motion path of the robot at step 2055 (see FIG. 2B). Initially, the tele-ready software program acquires the manual controller positions at step 2300 and acquires the robot positions at step 2310. The software program then determines the tool center point (TCP) normal-to-surface orientation at step 2320. The robot's motion path translation between the manual controller orientation and the current robot orientation is calculated at step 2330. At optional step 2340, the motion path translation of the robot is sent to the data science algorithm for predicting the next likely motion rate change. At step 2350, the robot's motion rate update is sent to the tele-ready software and the process then continues from step 2300.
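The translation of step 2330 could, in one simplified form, be expressed as an incremental controller displacement rotated and scaled into the robot frame; the names, the scaling factor, and the omission of the TCP orientation handling are assumptions for the sketch.

```python
import numpy as np

def translate_controller_motion(controller_delta, scale, rotation_to_robot_frame):
    """Convert an incremental controller displacement into a robot TCP move (FIG. 5 sketch).

    controller_delta: (3,) controller displacement since the last query.
    scale: motion scaling factor between controller travel and robot travel.
    rotation_to_robot_frame: 3x3 rotation aligning the controller axes with the
    robot frame; all names and the simple scaling are illustrative assumptions.
    """
    robot_delta = scale * (rotation_to_robot_frame @ np.asarray(controller_delta))
    # The commanded motion rate can be derived from this displacement per control cycle.
    return robot_delta
```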



FIG. 6 provides a flow chart of an example stepwise decision process for commanding the remote robot to move at step 2060 (see FIG. 2B). Initially, the system evaluates whether user control of all robot motions is enabled at decision step 2400. If local user control is not enabled, the system translates the motion and speed of the manual controller to the motion path of the robot as detailed in stepwise process 2055 and FIG. 5. If local user control is enabled, the system evaluates whether the local user's control is in freeform mode (user control of all other robot motions) at decision step 2410. If so, the robot's rate of motion (speed) is updated on all axes at step 2430. If the control is not in freeform mode, the system sends a command to the remote robot to update the position of all axes in path (point-to-point) mode at step 2420. The system continues the process at step 2400 based on the output of either step 2420 or step 2430.
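The freeform/path branch of steps 2410-2430 can be sketched as follows; the robot object's set_axis_rates() and move_to() methods are hypothetical names for the rate-controlled and point-to-point commands, respectively.

```python
def command_robot(robot, motion_path, user_control_enabled, freeform_mode):
    """FIG. 6 sketch: choose between rate-controlled and point-to-point commands.

    'robot' is assumed to expose set_axis_rates() and move_to(); both names
    are hypothetical stand-ins for the remote robot's command interface.
    """
    if not user_control_enabled:
        return  # control of the affected axes has been disabled (see FIG. 3)
    if freeform_mode:
        # Freeform (rate-controlled) mode: update the rate of motion on all axes (step 2430).
        robot.set_axis_rates(motion_path)
    else:
        # Path (point-to-point) mode: command an updated position for all axes (step 2420).
        robot.move_to(motion_path)
```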



FIG. 7 provides a flow chart of an example stepwise decision process for saving teachpoints in a robot program file at step 2070 of FIG. 2B. Initially, whether or not the system is already writing in an existing robot program file is determined at step 2500. If not: (i) a new robot program file is opened and created at step 2510; (ii) the teachpoint count is reset to a value of 1 at step 2520; and (iii) teachpoint 1 from step 2520 is set as the start-point in the new robot program file at step 2530. If the system is already writing in an existing robot program file at step 2500, or once teachpoint 1 is set at step 2530, the system determines at step 2540 whether the remote robot has moved a predetermined minimum distance (MIN_MOVE_DISTANCE) since the last saved teachpoint location. If the robot has moved the minimum distance, a new robot teachpoint is added to the robot program file at step 2580. If the robot has not moved the minimum distance, the system evaluates at step 2550 whether a predetermined minimum amount of time (MIN_TIME) has passed since the last saved teachpoint time. If the minimum amount of time has passed, a new robot teachpoint is added to the robot program file at step 2580. If it has not, the system proceeds to step 2560, which evaluates whether the user added a new point on the HMI screen (ADD_NEW_POINT). The user can add a new point by activating a button or similar component on the HMI screen. If the user adds a new point, a new robot teachpoint is added to the robot program file at step 2580. If the user does not add a new point, the system returns to step 2500 and/or proceeds to optional step 2570. At optional step 2570, artificial intelligence (AI) can be used to determine if a new teachpoint should be saved to the program file based on, for example, project trajectory, direction, speed, or distance. Step 2580 adds a new robot teachpoint to the robot program file if the AI determines a new teachpoint should be saved. In any of the paths within step 2070, the teachpoint count increases incrementally by a value of 1 at step 2590 when a new teachpoint is added at step 2580.
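A minimal sketch of the FIG. 7 decision logic, assuming the robot program file is represented as a simple list of saved positions; MIN_MOVE_DISTANCE, MIN_TIME, and the optional AI input appear in the figure, but the numeric values, units, and helper names used below are illustrative assumptions only.

```python
MIN_MOVE_DISTANCE = 5.0   # assumed units (e.g., mm); illustrative value only
MIN_TIME = 2.0            # assumed seconds between automatically saved points

def _distance(a, b):
    """Euclidean distance between two (x, y, z) positions."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def maybe_save_teachpoint(program, robot_position, now, last_point, last_time,
                          add_new_point_pressed=False, ai_says_save=False):
    """Decide whether to append a new teachpoint to the program file (FIG. 7 sketch).

    program: list of previously saved teachpoint positions (a stand-in for the
    robot program file); the remaining arguments mirror the inputs named in FIG. 7.
    Returns the updated teachpoint count.
    """
    moved_enough = _distance(robot_position, last_point) >= MIN_MOVE_DISTANCE  # step 2540
    waited_enough = (now - last_time) >= MIN_TIME                              # step 2550
    user_added = add_new_point_pressed                                         # step 2560 (HMI button)
    if moved_enough or waited_enough or user_added or ai_says_save:            # optional AI, step 2570
        program.append(robot_position)                                         # step 2580
    return len(program)                                                        # count increments by 1 at step 2590
```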


All literature and similar material cited in this application, including, but not limited to, patents, patent applications, articles, books, treatises, and web pages, regardless of the format of such literature and similar materials, are expressly incorporated by reference in their entirety. Should one or more of the incorporated references and similar materials differ from or contradict this application, including but not limited to defined terms, term usage, described techniques, or the like, this application controls.


As previously stated and as used herein, the singular forms “a,” “an,” and “the,” refer to both the singular as well as plural, unless the context clearly indicates otherwise. The term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps. Although many methods and materials similar or equivalent to those described herein can be used, particular suitable methods and materials are described herein. Unless context indicates otherwise, the recitations of numerical ranges by endpoints include all numbers subsumed within that range. Furthermore, references to “one implementation” are not intended to be interpreted as excluding the existence of additional implementations that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, implementations “comprising” or “having” an element or a plurality of elements having a particular property may include additional elements whether or not they have that property.


The terms “substantially” and “about”, if or when used throughout this specification describe and account for small fluctuations, such as due to variations in processing. For example, these terms can refer to less than or equal to ±5%, such as less than or equal to ±2%, such as less than or equal to ±1%, such as less than or equal to ±0.5%, such as less than or equal to ±0.2%, such as less than or equal to ±0.1%, such as less than or equal to ±0.05%, and/or 0%.


Underlined and/or italicized headings and subheadings are used for convenience only, do not limit the disclosed subject matter, and are not referred to in connection with the interpretation of the description of the disclosed subject matter. All structural and functional equivalents to the elements of the various implementations described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and intended to be encompassed by the disclosed subject matter. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the above description.


There may be many alternate ways to implement the disclosed technology. Various functions and elements described herein may be partitioned differently from those shown without departing from the scope of the disclosed technology. Generic principles defined herein may be applied to other implementations. Different numbers of a given module or unit may be employed, a different type or types of a given module or unit may be employed, a given module or unit may be added, or a given module or unit may be omitted.


Regarding this disclosure, the term “a plurality of” refers to two or more than two. Unless otherwise clearly defined, orientation or positional relations indicated by terms such as “upper” and “lower” are based on the orientation or positional relations as shown in the figures, only for facilitating description of the disclosed technology and simplifying the description, rather than indicating or implying that the referred devices or elements must be in a particular orientation or constructed or operated in the particular orientation, and therefore they should not be construed as limiting the disclosed technology. The terms “connected”, “mounted”, “fixed”, etc. should be understood in a broad sense. For example, “connected” may be a fixed connection, a detachable connection, or an integral connection; a direct connection, or an indirect connection through an intermediate medium. For one of ordinary skill in the art, the specific meaning of the above terms in the disclosed technology may be understood according to specific circumstances.


Specific details are given in the above description to provide a thorough understanding of the disclosed technology. However, it is understood that the disclosed embodiments and implementations can be practiced without these specific details. For example, circuits can be shown in block diagrams in order not to obscure the disclosed implementations in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques can be shown without unnecessary detail in order to avoid obscuring the disclosed implementations.


Implementation of the techniques, blocks, steps and means described above can be accomplished in various ways. For example, these techniques, blocks, steps and means can be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units can be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.


The disclosed technology can be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart can describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations can be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in the figure. A process can correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.


Furthermore, the disclosed technology can be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks can be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction can represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment can be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. can be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, ticket passing, network transmission, etc.


It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail herein (provided such concepts are not mutually inconsistent) are contemplated as being part of the disclosed technology. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the technology disclosed herein. While the disclosed technology has been illustrated by the description of example implementations, and while the example implementations have been described in certain detail, there is no intention to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the disclosed technology in its broader aspects is not limited to any of the specific details, representative devices and methods, and/or illustrative examples shown and described. Accordingly, departures may be made from such details without departing from the spirit or scope of the general inventive concept.

Claims
  • 1. A method for programming equipment used for or related to a manufacturing process, comprising: (a) installing equipment in a manufacturing environment;(b) positioning a plurality of sensors within the manufacturing environment in proximity to the equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the equipment;(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the equipment by the processor, wherein the equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing process; and(e) using the software to save a teachpoint in a program file.
  • 2. The method of claim 1, further comprising: (a) determining whether the equipment has moved a predetermined minimum distance from a previously saved teachpoint location;(b) adding a first new teachpoint to the program file if the equipment moved the predetermined minimum distance from the previously saved teachpoint location;(c) determining whether a predetermined minimum amount of time has passed since a previously saved teachpoint time; and(d) adding a second new teachpoint to the program file if the predetermined minimum amount of time varied from the previously saved teachpoint time.
  • 3. The method of claim 1, further comprising displaying a real-time video of the manufacturing environment to the user on a human machine interface screen during the manufacturing process, wherein the user can add the teachpoint to the program file through the human machine interface screen.
  • 4. The method of claim 1, further comprising using artificial intelligence to determine if a new teachpoint should be saved to the program file.
  • 5. The method of claim 4, wherein the artificial intelligence uses project trajectory, direction, speed, distance, or combinations thereof to determine if the new teachpoint should be saved to the program file.
  • 6. The method of claim 1, further comprising: (a) using at least one of the sensors in the plurality of sensors to measure a distance between an end effector on the equipment and a part surface, wherein the at least one of the plurality of sensors is a displacement data sensor; and(b) disabling the user's control of the equipment if the distance varies from a predetermined operating distance range.
  • 7. The method of claim 6, further comprising: (a) reading the measured distance between the end effector and the part surface;(b) providing a haptic feedback response to the manual controller based on the data from the plurality of sensors and the equipment; and(c) updating the haptic feedback response to the manual controller based on the measured distance from the displacement data sensor.
  • 8. The method of claim 1, further comprising: (a) using at least one of the sensors in the plurality of sensors to measure a pressure applied between an end effector on the equipment and a part surface; and(b) disabling the user's control of the equipment if the pressure varies from a predetermined operating pressure range.
  • 9. The method of claim 8, further comprising: (a) reading the measured pressure applied between the end effector and the part surface;(b) providing a haptic feedback response to the manual controller based on the data from the plurality of sensors and the equipment; and(c) updating the haptic feedback response to the manual controller based on the measured pressure.
  • 10. The method of claim 1, wherein the equipment includes welding equipment, measurement equipment, inspection equipment, remote assembly equipment, or combinations thereof.
  • 11. The method of claim 1, wherein the at least one manual controller is a hand-held stylus, a computer mouse, or a joystick.
  • 12. The method of claim 1, further comprising providing a computer network across which the processor communicates with the equipment.
  • 13. A method for remotely programming equipment used for or related to a manufacturing process, comprising: (a) installing equipment in a manufacturing environment;(b) positioning a plurality of sensors within the manufacturing environment in proximity to the equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the equipment;(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the equipment by the processor, wherein the equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing process; and(e) saving a teachpoint in a program file, wherein the software: (i) determines whether the equipment has moved a predetermined minimum distance from a previously saved teachpoint location; and(ii) adds the teachpoint to the program file if the equipment moved the predetermined minimum distance from the previously saved teachpoint location.
  • 14. The method of claim 13, further comprising: (a) determining whether a predetermined minimum amount of time has passed since a previously saved teachpoint time; and(b) adding a new teachpoint to the program file if the predetermined minimum amount of time varied from the previously saved teachpoint time.
  • 15. The method of claim 13, further comprising displaying a real-time video of the manufacturing environment to the user on a human machine interface screen during the manufacturing process, wherein the user can add the teachpoint to the program file through the human machine interface screen.
  • 16. The method of claim 13, further comprising using artificial intelligence to determine if a new teachpoint should be saved to the program file.
  • 17. The method of claim 16, wherein the artificial intelligence uses project trajectory, direction, speed, distance, or combinations thereof to determine if the new teachpoint should be saved to the program file.
  • 18. The method of claim 13, wherein the equipment includes welding equipment, measurement equipment, inspection equipment, remote assembly equipment, or combinations thereof.
  • 19. The method of claim 13, wherein the at least one manual controller is a hand-held stylus, a computer mouse, or a joystick.
  • 20. The method of claim 13, further comprising providing a computer network across which the processor communicates with the equipment.