TELE-GRINDING SYSTEM AND METHOD

Information

  • Publication Number
    20250235982
  • Date Filed
    January 22, 2024
  • Date Published
    July 24, 2025
Abstract
Systems and methods for controlling a material removal process used in a manufacturing environment, comprising installing equipment used for a material removal process in the manufacturing environment; positioning a plurality of sensors within the manufacturing environment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the material removal equipment; and connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands, wherein the material removal equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the material removal process.
Description
BACKGROUND

The disclosed technology relates in general to industrial manufacturing and fabricating systems, devices, and processes and more specifically to a grinding system operated from a remote location, also referred to as a “tele-manufacturing” or “tele-grinding” system.


Weld grinding is a process that is used to smooth a weld seam, reduce the size of a weld seam, or shape the weld seam. The grinding process is capable of producing very fine finishes and can achieve accurate dimensions within tight tolerances. Grinding typically includes the use of an abrasive wheel as a cutting tool to grind away excess material. As an industrial process, weld grinding (and industrial welding generally) is currently challenged by a variety of factors including a decreasing number of skilled users; a lack of individuals wanting to enter what are traditionally considered to be “manual” trades; and an ever-increasing list of hazards and limitations related to welding, grinding, and other hot work activities that make finding and keeping experienced users difficult. Additionally, efforts within industry to optimize weight and space in manufacturing processes have resulted in the construction of manufacturing facilities that are basically inaccessible to humans. Accordingly, there is an ongoing need for welding-related systems, processes, and methods that permit qualified users to enter and remain in the workforce regardless of physical limitations, age, or other perceived obstacles such as those described above.


SUMMARY

The following provides a summary of certain example implementations of the disclosed technology. This summary is not an extensive overview and is not intended to identify key or critical aspects or elements of the disclosed technology or to delineate its scope. However, it is to be understood that the use of indefinite articles in the language used to describe and claim the disclosed technology is not intended in any way to limit the described technology. Rather, the use of “a” or “an” should be interpreted to mean “at least one” or “one or more”.


One implementation of the disclosed technology provides a first method for controlling a material removal process used in a manufacturing environment. The method comprises installing equipment used for or related to a material removal process in the manufacturing environment; positioning a plurality of sensors within the manufacturing environment in proximity to the material removal equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the material removal equipment; and connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the material removal equipment by the processor, wherein the material removal equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the material removal process.


The material removal equipment may include a robot having an end effector for removing an amount of material from a weld surface, wherein the end effector includes a grinder, torch, saw, laser, sander, or a combination thereof. Implementations of the method further comprise using at least one of the plurality of sensors to measure the weld surface before the material removal process. Implementations of the method further comprise using at least one of the plurality of sensors to measure the amount of material removed from the weld surface during the material removal process; using at least one of the plurality of sensors to measure a distance between the end effector and the weld surface; and using at least one of the plurality of sensors to measure a pressure applied between the end effector and the weld surface. Implementations of the method further comprise disabling the user's control of the material removal equipment if the distance varies from a predetermined operating distance range; and disabling the user's control of the material removal equipment if the pressure varies from a predetermined operating pressure range. Implementations of the method further comprise, before the material removal process, defining a predetermined maximum amount of material to remove from the weld surface; and, during the material removal process, stopping power to the end effector if the amount of material removed from the weld surface exceeds the predetermined maximum amount of material. The at least one manual controller may be a hand-held stylus, a computer mouse, or a joystick. The software on the processor may provide a haptic feedback response to the manual controller based on the data from the plurality of sensors and the material removal equipment. Implementations of the method further comprise providing a computer network across which the processor communicates with the material removal equipment. The robot may move with at least six degrees of freedom.


Another implementation of the disclosed technology provides a second method for controlling a material removal process used in a manufacturing environment. The method comprises installing equipment used for or related to a weld grinding process in a manufacturing environment; positioning a plurality of sensors within the manufacturing environment in proximity to the weld grinding equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the weld grinding equipment; and connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the weld grinding equipment by the processor, wherein the weld grinding equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the weld grinding process.


The weld grinding equipment may include a robot having a grinder for removing an amount of material from a weld surface, wherein the grinder includes a grinding disc, flap disc, or sanding disc. Implementations of the method further comprise using at least one of the plurality of sensors to measure the weld surface before the weld grinding process. Implementations of the method further comprise using at least one of the plurality of sensors to measure the amount of material removed from the weld surface during the weld grinding process; using at least one of the plurality of sensors to measure a distance between the grinder and the weld surface; and using at least one of the plurality of sensors to measure a pressure applied between the grinder and the weld surface. Implementations of the method further comprise disabling the user's control of the weld grinding equipment if the distance varies from a predetermined operating distance range; and disabling the user's control of the weld grinding equipment if the pressure varies from a predetermined operating pressure range. Implementations of the method further comprise, before the material removal process, defining a predetermined maximum amount of material to remove from the weld surface; and, during the material removal process, stopping power to the grinder if the amount of material removed from the weld surface exceeds the predetermined maximum amount of material. The at least one manual controller may be a hand-held stylus, a computer mouse, or a joystick. The software on the processor may provide a haptic feedback response to the manual controller based on the data from the plurality of sensors and the weld grinding equipment. Implementations of the method further comprise providing a computer network across which the processor communicates with the weld grinding equipment. The robot may move with at least six degrees of freedom.


It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the technology disclosed herein and may be implemented to achieve the benefits as described herein. Additional features and aspects of the disclosed system, devices, and methods will become apparent to those of ordinary skill in the art upon reading and understanding the following detailed description of the example implementations. As will be appreciated by the skilled artisan, further implementations are possible without departing from the scope and spirit of what is disclosed herein. Accordingly, the descriptions provided herein are to be regarded as illustrative and not restrictive in nature.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated into and form a part of the specification, schematically illustrate one or more example implementations of the disclosed technology and, together with the general description given above and detailed description given below, serve to explain the principles of the disclosed subject matter, and wherein:



FIG. 1 is a simplified block diagram of an example implementation of the disclosed tele-grinding system showing the basic components of the system and the location of the components relative to one another;



FIGS. 2A-2B are flow charts depicting an example method or process for using the tele-grinding system depicted in FIG. 1;



FIG. 3 is a flow chart illustrating an example stepwise process for gathering and interpreting data from sensors located in a remote manufacturing environment and controlling the operation of a robot within the manufacturing environment in conjunction with the method depicted in FIGS. 2A-2B and based on the gathered data;



FIG. 4 is a flow chart illustrating an example stepwise process for updating a haptic feedback response on a local manual controller used with a remote manufacturing environment in conjunction with the method depicted in FIGS. 2A-2B;



FIG. 5 is a flow chart illustrating an example stepwise process for translating motion and speed of a local manual controller to a robot motion path in conjunction with the method depicted in FIGS. 2A-2B; and



FIG. 6 is a flow chart illustrating an example stepwise process for commanding a remote robot to move in conjunction with the method depicted in FIGS. 2A-2B.





DETAILED DESCRIPTION

Example implementations are now described with reference to the Figures. Reference numerals are used throughout the detailed description to refer to the various elements and structures. Although the following detailed description contains many specifics for the purposes of illustration, a person of ordinary skill in the art will appreciate that many variations and alterations to the following details are within the scope of the disclosed technology. Accordingly, the following implementations are set forth without any loss of generality to, and without imposing limitations upon, the claimed subject matter.


The examples discussed herein are examples only and are provided to assist in the explanation of the apparatuses, devices, systems, and methods described herein. None of the features or components shown in the drawings or discussed below should be taken as required for any specific implementation of any of these apparatuses, devices, systems, or methods unless specifically designated as such. For ease of reading and clarity, certain components, modules, or methods may be described solely in connection with a specific Figure. Any failure to specifically describe a combination or sub-combination of components should not be understood as an indication that any combination or sub-combination is not possible. Also, for any methods described, regardless of whether the method is described in conjunction with a flow diagram, it should be understood that unless otherwise specified or required by context, any explicit or implicit ordering of steps performed in the execution of a method does not imply that those steps must be performed in the order presented but instead may be performed in a different order or in parallel.


U.S. Patent Publication No. 2023/0112463 is relevant to the disclosed technology and the entire contents thereof are expressly incorporated by reference herein and made part of this patent application for all purposes. U.S. Patent Publication No. 2023/0112463 discloses tele-manufacturing and tele-welding systems and methods that enable the operation of industrial equipment from one or more locations that are physically remote from the environment in which manufacturing is occurring. Tele-operation is most commonly associated with robotics and mobile robots, but may be applied to an entire range of circumstances in which a device or machine is operated by a person from a distance. Tele-welding systems permit an individual to direct a welding process from a remote location by controlling welding arc on, welding arc off, welding travel speed, and welding torch angles and motions, thereby permitting the individual to make various decisions regarding the welding of a part that is not in their immediate visual or auditory range.


Tele-welding and tele-grinding differ from remote welding and grinding in that the welding and weld-grinding machinery at the remote location (e.g., robot, manipulator, mechanization, automation, etc.) is not running an independent program or motion plan. The person (operator) who is remote from the machinery is in direct control of the welding and weld-grinding machinery and makes decisions to move the machinery by use of a hand-held stylus device or similar device. Furthermore, tele-manufacturing systems differ from virtual reality (VR) in that the hand-held stylus is an actual physical device that has various degrees of rotational freedom, provides encoded position information from each axis when queried, and converts the motion of the stylus device into motion on a live articulated machine (robot) with various degrees of rotational freedom. Regarding local manipulator (e.g., stylus) to remote machinery (e.g., robot) motion in implementations wherein the stylus (controller) and articulated machine (robot) both include six degrees of freedom, the robot's six degrees of freedom include X, Y, and Z direction motions and roll, pitch, and yaw (Rx, Ry, Rz) rotations generated with the six coordinated axes of the articulated robot. The six degrees of freedom of the stylus device include X, Y, and Z direction motion and three additional gimbaled axes that provide the additional three degrees of freedom.
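
By way of illustration only, the following Python sketch shows one possible encoding of this stylus-to-robot mapping. The Pose6DOF structure, the 2× motion-scaling factor, and the direct gimbal-to-Rx/Ry/Rz assignment are assumptions made for the example and are not values or structures specified by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """Cartesian pose: X/Y/Z translation plus Rx/Ry/Rz (roll, pitch, yaw)."""
    x: float
    y: float
    z: float
    rx: float
    ry: float
    rz: float

def stylus_to_robot_pose(stylus_xyz, gimbal_angles, scale=2.0):
    """Map stylus X/Y/Z travel and three gimbaled axes to a robot pose command.

    The mapping is intentionally not identical master-follower motion:
    stylus travel is scaled (the 2x factor is a placeholder), and the
    gimbal angles are reinterpreted as Rx/Ry/Rz commands.
    """
    sx, sy, sz = stylus_xyz
    g1, g2, g3 = gimbal_angles
    return Pose6DOF(x=sx * scale, y=sy * scale, z=sz * scale,
                    rx=g1, ry=g2, rz=g3)
```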


Tele-welding and tele-grinding systems differ from off-line planning control systems in that motion control is done in real-time with only sub-second delays or latency between user movement of the hand-held stylus and motion produced on remote machinery. The difference between the disclosed control systems and a traditional master-follower control system is that the degrees of freedom of motion on the stylus do not produce the same degrees of freedom motion result on the robot, i.e., the motion output of the robot and hand-held stylus are not identical. Additionally, the number of degrees of freedom present on the hand-held stylus device (or other type of controller) does not necessarily match the number of degrees of freedom on the robot.


The disclosed technology, which is referred to as tele-grinding, includes systems and methods that enable the grinding of welded materials from one or more locations that are physically remote from the environment in which the grinding is occurring. Grinding of this nature is typically used to remove a small amount of welded material from a part surface or to correct certain weld defects. The disclosed tele-grinding technology permits an individual to direct a grinding process from a remote location to remove material from a weld by controlling grinder on, grinder off, grinding force, and grinder positions, angles, and ranges of motion. The grinder may be any suitable end effector used for grinding welds, including but not limited to, grinding discs, flap discs, and sanding discs. It is to be understood that the disclosed systems and methods can be used in various alternate implementations of weld material removal, including but not limited to, torch cutting, plasma cutting, sawing, blasting, sanding, and ablating.


The disclosed tele-grinding system includes components similar to the components included in the tele-welding system described in U.S. Patent Publication No. 2023/0112463. FIG. 1 provides a block diagram of an example implementation of the disclosed tele-grinding system, showing the basic components thereof. With reference to FIG. 1, an example tele-grinding system 10 includes various components that, when used together, allow local user 50, who is in direct contact with manual controller 400, to control weld-grinding machinery or other machinery 300, which performs movements based on and coordinated with the movements of manual controller 400 despite machinery 300 being located in remote environment 30, which is physically separate from local user 50 in local environment 20. An example implementation of tele-grinding system 10 includes processor 100; control software 200; remote machinery 300; manual controller 400; environmental sensors 500; three-dimensional scanning sensors 600; and process sensors 700.


Processor 100, which may be a computer or computing device that includes various control hardware, runs an overall program for communicating with remote devices over network 1000 using wired or wireless protocols. Processor 100 is connected to manual controller 400, which may be a hand-held stylus, joystick, mouse, or any other electronic or computer-compatible motion control device having at least one degree of rotational freedom. Processor 100 is connected to the Internet and opens a URL/webpage or digital media player software using an Internet browser or similar program. In some implementations, other open or closed computer networks are utilized.


Control software 200 runs on processor 100 and enables communication with and between remote machinery 300 across network 1000. In an example implementation, control software 200 uses digitized environmental data (e.g., point clouds) provided by three-dimensional scanning sensors 600 (such as, for example, laser detection and ranging (LIDAR), blue, white, or stereo scanning sensors) and executes mathematical transformations to convert a digital environment into haptic feedback felt by local user 50 while holding manual controller 400, in real time, thereby providing local user 50 with a physical sense or physical interpretation of an actual working environment. Control software 200 also allows local user 50 to initiate or stop communication with remote machinery 300. Remote machinery 300 can include any type of material removing equipment, including a robot and robot controller having end effector 800 for removing an amount of material from a weld surface. End effector 800 may include a grinder, torch, saw, laser, sander, or combinations thereof. The robot and robot controller hardware are typically controlled by either open source or proprietary communication protocols.
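
A minimal sketch of one such mathematical transformation follows, assuming the point cloud is an N×3 NumPy array and using a simple spring-model repulsion near scanned surfaces; the disclosure does not specify the transformation, so the radius and gain values are placeholders.

```python
import numpy as np

def haptic_force_from_cloud(tool_xyz, cloud_xyz, contact_radius=5.0, k=0.8):
    """Convert proximity to digitized surfaces into a haptic force vector.

    cloud_xyz: (N, 3) array of scanned surface points (e.g., LIDAR output).
    Returns a spring-model force that pushes the stylus away from the
    nearest surface point once the tool is within contact_radius (mm).
    """
    deltas = cloud_xyz - np.asarray(tool_xyz, dtype=float)
    dists = np.linalg.norm(deltas, axis=1)
    nearest = int(np.argmin(dists))
    if dists[nearest] >= contact_radius:
        return np.zeros(3)                      # free space: no resistance
    away = -deltas[nearest] / (dists[nearest] + 1e-9)
    return k * (contact_radius - dists[nearest]) * away
```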


Control software 200, an example of which is specific to tele-grinding system 10, includes an executable application that can be run on the computer of a remote user, on a computer located in the same environment as machinery 300, or on the robot controller system. Software 200 provides a user interface, or human machine interface (HMI) screen 150, for allowing user 50 to interact with manual controller 400 and remote machinery 300 in a simultaneous manner. Software 200 provides a connection to remote machinery 300 using a local area network (LAN), intranet, or the Internet (collectively, network 1000) for the purpose of controlling remote machinery 300. Software 200 receives input from user 50 by way of the user interface or HMI screen 150 to begin or end a tele-grinding process, to set or change process settings or parameters, or to set or change manual controller 400 parameters. System software 200 provides a process for communicating with at least one locally connected manual controller 400, thereby allowing user 50 to directly manipulate manual controller 400 while querying manual controller 400 positions and converting the positions to resultant movement of remote machinery 300. In other implementations, the grinding start/stop process is accomplished using buttons or other additional data input/output features included on manual controller 400.
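
The query-and-convert loop just described might be organized as in the following sketch; stylus.query_pose(), robot.session_active(), robot.send_pose_delta(), and the 50 Hz update period are all assumed interfaces and values, as the disclosure requires only that positions be queried and converted into remote motion in real time.

```python
import time

def tele_grinding_session(stylus, robot, transform, period_s=0.02):
    """Query controller positions and convert them to remote robot motion."""
    previous = stylus.query_pose()
    while robot.session_active():
        current = stylus.query_pose()
        delta = transform(current, previous)  # stylus motion -> robot motion
        robot.send_pose_delta(delta)          # executed with sub-second latency
        previous = current
        time.sleep(period_s)                  # ~50 Hz update rate (assumed)
```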


Controller 400 may be a manually operated device such as a hand-held stylus, a computer mouse, a joystick, or any other suitable device having at least one degree of rotational freedom that can be used to record various hand motions of user 50. Physical manipulations of manual controller 400 by user 50 are ultimately converted into the physical motions of remote machinery 300 by control software 200. Manual controller 400 may also provide a haptic feedback response to user 50, corresponding to either physical environmental objects that exist at the remote location or to virtual barriers.


Manual controller 400 may be a stylus device that is a commercially available haptic feedback system including software libraries that are imported into tele-grinding software program 200. For example, tele-grinding software 200 queries manual controller/stylus device 400 for current axis positions using stylus software library functions, and sets the haptic feedback response of manual controller/stylus device 400 by sending commands to manual controller/stylus device 400 using the stylus software library functions. Manual controller/stylus device 400 applies the commanded settings from tele-grinding software 200 to produce a sensation response felt by local user 50 holding manual controller/stylus device 400. The commanded settings change the servo-motor power and response characteristics, which produce sensations of touching surfaces of different densities or at differing levels of force, mass, gravity, or speed. Tele-grinding software 200 determines the settings for the type of response based on the current location of remote machinery 300 and from analysis of the data queried from sensors 500, 600, and 700.
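
Because the commercial stylus SDK and its library functions are not identified in the disclosure, the sketch below wraps hypothetical calls (get_axis_positions, set_force_model) solely to illustrate the query/command pattern described above.

```python
class StylusHaptics:
    """Thin wrapper over a hypothetical commercial stylus SDK."""

    def __init__(self, sdk):
        self.sdk = sdk

    def read_axes(self):
        # Query the encoded position of each axis (six values for 6-DOF).
        return self.sdk.get_axis_positions()

    def apply_sensation(self, stiffness, damping):
        # Commanded settings alter servo-motor power/response so the user
        # feels surfaces of differing density, force, mass, gravity, or speed.
        self.sdk.set_force_model(stiffness=stiffness, damping=damping)
```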


Environmental sensors 500 may include cameras, microphones, digitizers, and other types of sensing devices, and may use optical systems, devices, and methods for determining the displacement of physical objects within remote working environment 30 that encompasses remote machinery 300 and the tele-grinding process that is occurring therein. Environmental sensors 500 may also use auditory systems, devices, and methods for capturing sounds within remote working environment 30 that encompasses remote machinery 300 and the tele-grinding process that is occurring therein. Sensors 500, 600, and 700 may be used to collect digitized environmental data (e.g., point clouds) that is transmitted to and stored on processor 100. This digital environmental data may then be used to determine when, how, and what type of physical sensation response is to be applied to the manual controller 400 for indicating the presence of a physical object in remote working environment 30 or proximity to a physical object or virtual barrier. With regard to weld-grinding processes: (i) inexpensive, digital cameras may be used to assist with proper line-up and grinder placement on welded material; (ii) specialty weld-grinding process cameras may be used to provide a real-time grind view; (iii) microphones may be used to add grinding sounds to enable an experienced weld-grinder to accurately grind welded material remotely; and (iv) livestreaming camera video and audio may be used to provide low latency real-time process data.


Three-dimensional sensors 600, which cooperate with environmental sensors 500, may be mounted to machinery and/or robot 300 to measure the displacement of objects relative to the sensors themselves and provide a digitized topographical representation of the physical environment in which the weld-grinding process is occurring. Optically-based processes such as, for example, infrared (IR), laser detection and ranging (LIDAR), blue, white, or laser vibrometer scanning systems can be used to create a point cloud or three-dimensional digital map of remote working environment 30. Scanning and digitizing may be completed prior to the grinding process or may be completed in real-time as the weld-grinding process is occurring. User 50 may provide tele-grinding control software 200 with a specified maximum value or amount of weld part surface intended to be removed. With regard to the weld-grinding process, three-dimensional sensors 600 may be used to measure: (i) an amount of material on a weld surface before grinding; (ii) an amount of material removed from the weld surface during the weld-grinding process; and (iii) the distance between the robot arm holding the grinder and the weld surface.
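
One plausible way to derive the amount of material removed from before and after scans is sketched below, assuming the scans have been reduced to registered height maps on a common grid (a preprocessing step the disclosure does not detail).

```python
import numpy as np

def material_removed_mm3(before, after, cell_area_mm2):
    """Estimate removed volume from before/after surface height maps.

    before, after: 2-D arrays of surface heights (mm) on a common grid,
    as could be produced by registering pre-grind and in-process scans.
    """
    removed = np.clip(before - after, 0.0, None)  # ignore apparent growth (noise)
    return float(removed.sum() * cell_area_mm2)   # total volume in mm^3
```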


Process sensors 700, which cooperate with sensors 500 and 600, may be mounted to machinery/robot 300 to measure pressure, force, and/or strain applied between the grinder and the weld surface. Data and information from sensors 500, 600, and 700 may be used to: (i) override user control of machinery/robot 300 if pressure, force, strain, and/or distance exceed predetermined operating parameter ranges; (ii) disable grinder power to machinery/robot 300 if determined that the removed material amount is greater than a user-specified maximum value; (iii) adjust haptic feedback response on the manual controller/stylus device 400; and (iv) control and update the motion and speed of machinery/robot 300 in the X, Y, Z, Rx, Ry, and Rz directions. It is to be understood that the predetermined operating parameter ranges are any ranges of pressure, force, and/or distance that allow machinery/robot 300 to function without crashing.
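
The predetermined operating parameter ranges might be captured in a small configuration structure such as the following sketch; all numeric values are placeholders, not disclosed limits.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OperatingLimits:
    """Predetermined operating parameter ranges (placeholder values).

    Any ranges of pressure, force, and distance that let the robot
    function without crashing qualify; these numbers are illustrative.
    """
    min_distance_mm: float = 1.0     # closest allowed grinder stand-off
    max_distance_mm: float = 50.0    # farthest useful grinder stand-off
    max_pressure_n: float = 40.0     # maximum grinder-to-surface force
    max_removal_mm3: float = 500.0   # user-specified maximum (step 2045)
```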


Tele-grinding system 10 provides real-time video and audio feedback of remote working environment 30 to user 50 in local environment 20. Video, audio, or other sensor data is encoded using commercially available encoding hardware servers or an encoding software program running on a processor. The server or processor, using commercially available software, publishes the encoded video and audio or other data stream to the internet or a LAN through a URL or web address 1000. The published livestream uses a low latency protocol to stream the information to devices over the internet or on a local area network (LAN) 1000. User 50 can access livestreamed video over the internet or LAN 1000 using a processor (e.g., a personal computer) and commercial media player application that can read and play the audio or video stream on a personal computer.



FIGS. 2A-2B provide a flow chart of an example method or process for using an example implementation of disclosed tele-grinding system 10. In FIG. 2A, the method starts at step 2000. A (local) user remote from the grinding process location/environment starts a tele-grinding software program previously installed on a processor at step 2005. As previously discussed, the processor may be a computer or computing device that includes various control hardware. The local tele-grinding software program then connects to a remote robot using a point-to-point, LAN, or internet connection at step 2010. At step 2015, remote sensors connected to the robot send data to the robot. Remote cameras and microphones stream video and sound directly to the processor, to the LAN, or over the internet at step 2020. The local processor then connects to a livestream media software program, a web browser, or a locally installed camera viewer program at step 2025. At step 2030, the local tele-grinding software program connects to a local manual controller that is coupled to the local processor. As previously discussed, the local manual controller may be a hand-held stylus, a computer mouse, a joystick, or any other suitable device. At step 2035, the local user views real-time video on the tele-grinding software's human machine interface (HMI) screen and listens to real-time sound, both of which are livestreamed from the remote grinding environment. At optional step 2040, the local user may use the HMI to direct the tele-grinding software to measure the surface of a part before grinding, using a remote surface scanning sensor. At optional step 2045, the local user may use the HMI to provide the tele-grinding software with a maximum amount of part surface to remove; step 2045 ensures that grinding does not exceed this specified maximum. The local user then manipulates the manual controller and responds physically to a haptic feedback response on the manual controller at step 2050.


Further referring to FIG. 2B, the local tele-grinding software program receives and reads information from the robot and the remote sensors at step 2055. The tele-grinding software program then updates the haptic feedback response on the manual controller based on the data from the remote sensors at step 2060. At step 2065, the tele-grinding software program translates the motion and speed of the manual controller to a motion path for the robot. The tele-grinding software program then commands the robot to move at step 2070. The tele-grinding system determines whether or not to terminate (quit) the process at step 2075 and either ends the process at step 2080 or continues the process from step 2035.
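
Steps 2055 through 2080 form the core control loop, which could be organized as sketched below; software, robot, sensors, and controller stand in for interfaces the disclosure leaves unspecified.

```python
def run_control_loop(software, robot, sensors, controller):
    """Core loop of FIG. 2B (steps 2055-2080); all interfaces are assumed."""
    while True:
        data = software.read(robot, sensors)          # step 2055
        controller.update_haptics(data)               # step 2060
        path = software.translate_motion(controller)  # step 2065
        robot.move(path)                              # step 2070
        if software.should_quit():                    # decision step 2075
            break                                     # step 2080
```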



FIG. 3 provides a flow chart of an example stepwise decision process for reading information from the remote weld grinding robot and the remote sensors at step 2055 of FIG. 2B. Initially, at step 2100, the tele-grinding software program reads data from: (i) a remote surface scanning sensor that measures the amount of material removed from the part's surface; and (ii) a torch height sensor, or similar three-dimensional sensor, that affects Z-axis (grinder to part surface distance) movement of the weld grinding robot. At step 2105: (i) a pressure sensor, or similar process sensor, can be used to measure pressure from a force applied from the robot to the part's surface; and/or (ii) the torch height sensor can be used to measure a distance between the part's surface and a grinder connected to the robot's arm. The system alerts the local user by displaying, on the HMI screen, video of the grinder attached to the robot at the measured distance above the part's surface at step 2110. The system also alerts the local user by updating the haptic feedback response on the manual controller at step 2115.


Further referring to FIG. 3, at decision step 2120, the system evaluates whether the measured pressure and/or measured distance from step 2105 exceeds operating parameters (e.g., the robot is too close to the part surface or the robot applies too much force). If the measured pressure is too great and/or the grinder is too close to the surface of the part, the system decides whether to override the local user's control to prevent a robot crash at step 2130. If the system overrides the local user's control, local user control of the robot's affected axis is disabled at step 2135. The system will enable or continue local user control of the robot at step 2125 if the measured pressure and/or distance does not exceed operating parameters at decision step 2120. Next, the system evaluates, at decision step 2140, whether the amount of material removed from the part's surface is greater than the maximum amount specified by the user at step 2045 (see FIG. 2A). If the amount of material removed is greater, the system disables grinder power to the robot at step 2150 and disables user control of the robot's affected axis at step 2155. If the amount of material removed is less, the system enables or continues user control of the robot's specific axis at step 2145. The system determines whether to continue the process from step 2100 after steps 2145 and 2155.
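
The decision logic of FIG. 3 could be expressed as in the following sketch, reusing the OperatingLimits structure from the earlier sketch; the enable/disable hooks on the robot object are assumed interfaces, not a disclosed API.

```python
def safety_gate(pressure_n, distance_mm, removed_mm3, limits, robot):
    """Decision logic of FIG. 3 (steps 2120-2155); robot hooks are assumed."""
    out_of_range = (pressure_n > limits.max_pressure_n
                    or distance_mm < limits.min_distance_mm
                    or distance_mm > limits.max_distance_mm)
    if out_of_range:                             # decision step 2120
        robot.disable_user_control("Z")          # steps 2130-2135: prevent a crash
    else:
        robot.enable_user_control("Z")           # step 2125

    if removed_mm3 > limits.max_removal_mm3:     # decision step 2140
        robot.disable_grinder_power()            # step 2150
        robot.disable_user_control("Z")          # step 2155
    else:
        robot.enable_user_control("Z")           # step 2145
```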



FIG. 4 provides a flow chart of an example stepwise process for updating the haptic feedback response on the manual controller at step 2060 of FIG. 2B. Initially, the tele-grinding software program: (i) reads the measured grinder-to-surface height distance from the torch height sensor at step 2200; (ii) reads the measured grinder-to-surface force from the pressure sensor at step 2210; and (iii) reads the current robot position data at step 2220. At optional step 2230, the tele-grinding software program sends the information read in steps 2200, 2210, and 2220 to a data science algorithm to predict the motion path of the robot. The system then applies the information from steps 2200, 2210, 2220, and 2230 to a force feedback correlation equation at step 2240. At step 2250, the software program adjusts the haptic feedback response on the manual controller and continues the process from step 2200.
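
The disclosure does not state the force feedback correlation equation applied at step 2240; purely for illustration, the sketch below assumes a simple linear blend of a height-error spring term and a measured-force term, with placeholder gains and target height.

```python
def force_feedback(height_mm, force_n, target_height_mm=2.0,
                   k_height=0.5, k_force=0.05):
    """One possible force feedback correlation for step 2240 (assumed form).

    Blends a spring term on grinder-to-surface height error with a term
    proportional to the measured grinding force, yielding a resistance
    value for the stylus servos.
    """
    height_error = target_height_mm - height_mm
    return k_height * height_error + k_force * force_n
```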



FIG. 5 provides a flow chart of an example stepwise process for translating motion and speed of the manual controller to the motion path of the robot at step 2065 of FIG. 2B. Initially, the tele-grinding software program gets the manual controller positions at step 2300 and obtains the robot positions at step 2310. The software program then determines the tool center point (TCP) normal-to-surface orientation at step 2320. The robot's motion path translation between the manual controller orientation and the current robot orientation is calculated at step 2330. At optional step 2340, the robot's motion path translation is sent to a data science algorithm to predict the next likely motion rate change. At step 2350, the robot's motion rate update is sent to the tele-grinding software, and the process continues from step 2300.
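
Translating controller motion about the tool center point's surface normal could look like the following sketch; the 1.5× speed scale and the vector decomposition are illustrative assumptions rather than disclosed calculations.

```python
import numpy as np

def translate_motion(stylus_delta, surface_normal, speed_scale=1.5):
    """Decompose a scaled stylus displacement about the TCP surface normal.

    Splitting the commanded motion into a component along the
    normal-to-surface direction (grinder stand-off) and a component
    across the part surface lets the Z-axis be rate-limited separately.
    """
    n = np.asarray(surface_normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = np.asarray(stylus_delta, dtype=float) * speed_scale
    along_normal = np.dot(d, n) * n   # grinder-to-surface (Z-axis) motion
    in_plane = d - along_normal       # motion across the part surface
    return along_normal, in_plane
```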



FIG. 6 provides a flow chart of an example stepwise decision process for commanding the robot to move at step 2070 of FIG. 2B. Initially, the system evaluates whether user control of the robot's Z-axis (grinder to part surface distance) motion is enabled at decision step 2400. If user control is enabled, the system evaluates whether the amount of material removed from the part's surface is greater than the maximum amount specified by the user at decision step 2410. If the amount of material removed is greater, power to the grinder on the robot is disabled at step 2420. If the amount of material removed is less, the robot's rate of motion in the Z-axis is updated at step 2430, and grinder power continues at step 2440. At decision step 2450, the system evaluates whether user control of all other robot motions (X and Y axes) is enabled. If user control is not enabled, the system continues the process from step 2400. If user control is enabled, the robot's rate of motion (speed) is updated on all axes at step 2460. The system then continues the process from step 2400.
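
The Z-axis gating of FIG. 6 could be sketched as follows, again with assumed robot hooks rather than any disclosed API.

```python
def command_move(robot, z_enabled, xy_enabled, removed_mm3, max_mm3,
                 z_rate, xy_rate):
    """Z-axis gating of FIG. 6 (steps 2400-2460); robot hooks are assumed."""
    if z_enabled:                              # decision step 2400
        if removed_mm3 > max_mm3:              # decision step 2410
            robot.disable_grinder_power()      # step 2420
        else:
            robot.set_axis_rate("Z", z_rate)   # step 2430
            robot.keep_grinder_power()         # step 2440
    if xy_enabled:                             # decision step 2450
        robot.set_axis_rate("all", xy_rate)    # step 2460: all axes updated
```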


All literature and similar material cited in this application, including, but not limited to, patents, patent applications, articles, books, treatises, and web pages, regardless of the format of such literature and similar materials, are expressly incorporated by reference in their entirety. Should one or more of the incorporated references and similar materials differ from or contradict this application, including but not limited to defined terms, term usage, described techniques, or the like, this application controls.


As previously stated and as used herein, the singular forms “a,” “an,” and “the,” refer to both the singular as well as plural, unless the context clearly indicates otherwise. The term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps. Although many methods and materials similar or equivalent to those described herein can be used, particular suitable methods and materials are described herein. Unless context indicates otherwise, the recitations of numerical ranges by endpoints include all numbers subsumed within that range. Furthermore, references to “one implementation” are not intended to be interpreted as excluding the existence of additional implementations that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, implementations “comprising” or “having” an element or a plurality of elements having a particular property may include additional elements whether or not they have that property.


The terms “substantially” and “about”, if or when used throughout this specification, describe and account for small fluctuations, such as due to variations in processing. For example, these terms can refer to less than or equal to ±5%, such as less than or equal to ±2%, such as less than or equal to ±1%, such as less than or equal to ±0.5%, such as less than or equal to ±0.2%, such as less than or equal to ±0.1%, such as less than or equal to ±0.05%, and/or 0%.


Underlined and/or italicized headings and subheadings are used for convenience only, do not limit the disclosed subject matter, and are not referred to in connection with the interpretation of the description of the disclosed subject matter. All structural and functional equivalents to the elements of the various implementations described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and intended to be encompassed by the disclosed subject matter. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the above description.


There may be many alternate ways to implement the disclosed technology. Various functions and elements described herein may be partitioned differently from those shown without departing from the scope of the disclosed technology. Generic principles defined herein may be applied to other implementations. Different numbers of a given module or unit may be employed, a different type or types of a given module or unit may be employed, a given module or unit may be added, or a given module or unit may be omitted.


Regarding this disclosure, the term “a plurality of” refers to two or more than two. Unless otherwise clearly defined, orientation or positional relations indicated by terms such as “upper” and “lower” are based on the orientation or positional relations as shown in the figures, only for facilitating description of the disclosed technology and simplifying the description, rather than indicating or implying that the referred devices or elements must be in a particular orientation or constructed or operated in the particular orientation, and therefore they should not be construed as limiting the disclosed technology. The terms “connected”, “mounted”, “fixed”, etc. should be understood in a broad sense. For example, “connected” may be a fixed connection, a detachable connection, or an integral connection; a direct connection, or an indirect connection through an intermediate medium. For one of ordinary skill in the art, the specific meaning of the above terms in the disclosed technology may be understood according to specific circumstances.


Specific details are given in the above description to provide a thorough understanding of the disclosed technology. However, it is understood that the disclosed embodiments and implementations can be practiced without these specific details. For example, circuits can be shown in block diagrams in order not to obscure the disclosed implementations in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques can be shown without unnecessary detail in order to avoid obscuring the disclosed implementations.


Implementation of the techniques, blocks, steps and means described above can be accomplished in various ways. For example, these techniques, blocks, steps and means can be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units can be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.


The disclosed technology can be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart can describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations can be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in the figure. A process can correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.


Furthermore, the disclosed technology can be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks can be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction can represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment can be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. can be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, ticket passing, network transmission, etc.


It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail herein (provided such concepts are not mutually inconsistent) are contemplated as being part of the disclosed technology. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the technology disclosed herein. While the disclosed technology has been illustrated by the description of example implementations, and while the example implementations have been described in certain detail, there is no intention to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the disclosed technology in its broader aspects is not limited to any of the specific details, representative devices and methods, and/or illustrative examples shown and described. Accordingly, departures may be made from such details without departing from the spirit or scope of the general inventive concept.

Claims
  • 1. A method for controlling a material removal process used in a manufacturing environment, comprising: (a) installing equipment used for or related to a material removal process in a manufacturing environment; (b) positioning a plurality of sensors within the manufacturing environment in proximity to the material removal equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; (c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the material removal equipment; and (d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the material removal equipment by the processor, wherein the material removal equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the material removal process.
  • 2. The method of claim 1, wherein the material removal equipment includes a robot having an end effector for removing an amount of material from a weld surface, and wherein the end effector includes a grinder, torch, saw, laser, sander, or a combination thereof.
  • 3. The method of claim 2, further comprising using at least one sensor in the plurality of sensors to measure the weld surface before the material removal process.
  • 4. The method of claim 2, further comprising: (a) using at least one sensor in the plurality of sensors to measure the amount of material removed from the weld surface during the material removal process; (b) using at least one sensor in the plurality of sensors to measure a distance between the end effector and the weld surface; and (c) using at least one sensor in the plurality of sensors to measure a pressure applied between the end effector and the weld surface.
  • 5. The method of claim 4, further comprising: (a) disabling the user's control of the material removal equipment if the distance varies from a predetermined operating distance range; and (b) disabling the user's control of the material removal equipment if the pressure varies from a predetermined operating pressure range.
  • 6. The method of claim 5, further comprising: (a) before the material removal process, defining a predetermined maximum amount of material to remove from the weld surface; and (b) during the material removal process, stopping power to the end effector if the amount of material removed from the weld surface exceeds the predetermined maximum amount of material.
  • 7. The method of claim 1, wherein the at least one manual controller is a hand-held stylus, a computer mouse, or a joystick.
  • 8. The method of claim 1, wherein the software on the processor provides a haptic feedback response to the manual controller based on the data from the plurality of sensors and the material removal equipment.
  • 9. The method of claim 1, further comprising providing a computer network across which the processor communicates with the material removal equipment.
  • 10. The method of claim 2, wherein the robot moves with at least six degrees of freedom.
  • 11. A method for controlling a material removal process used in a manufacturing environment, comprising: (a) installing equipment used for or related to a weld grinding process in a manufacturing environment; (b) positioning a plurality of sensors within the manufacturing environment in proximity to the weld grinding equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment; (c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the weld grinding equipment; and (d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the weld grinding equipment by the processor, wherein the weld grinding equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the weld grinding process.
  • 12. The method of claim 11, wherein the weld grinding equipment includes a robot having a grinder for removing an amount of material from a weld surface, and wherein the grinder includes a grinding disc, flap disc, or sanding disc.
  • 13. The method of claim 12, further comprising using at least one sensor in the plurality of sensors to measure the weld surface before the weld grinding process.
  • 14. The method of claim 12, further comprising: (a) using at least one sensor in the plurality of sensors to measure the amount of material removed from the weld surface during the weld grinding process; (b) using at least one sensor in the plurality of sensors to measure a distance between the grinder and the weld surface; and (c) using at least one sensor in the plurality of sensors to measure a pressure applied between the grinder and the weld surface.
  • 15. The method of claim 14, further comprising: (a) disabling the user's control of the weld grinding equipment if the distance varies from a predetermined operating distance range; and (b) disabling the user's control of the weld grinding equipment if the pressure varies from a predetermined operating pressure range.
  • 16. The method of claim 15, further comprising: (a) before the material removal process, defining a predetermined maximum amount of material to remove from the weld surface; and (b) during the material removal process, stopping power to the grinder if the amount of material removed from the weld surface exceeds the predetermined maximum amount of material.
  • 17. The method of claim 11, wherein the at least one manual controller is a hand-held stylus, a computer mouse, or a joystick.
  • 18. The method of claim 11, wherein the software on the processor provides a haptic feedback response to the manual controller based on the data from the plurality of sensors and the weld grinding equipment.
  • 19. The method of claim 11, further comprising providing a computer network across which the processor communicates with the weld grinding equipment.
  • 20. The method of claim 12, wherein the robot moves with at least six degrees of freedom.