Welding has become increasingly ubiquitous across industries. While welding may be automated in certain contexts, a large number of applications continue to exist for manual welding operations performed by skilled welding technicians. However, as the average age of the skilled welder rises, the future pool of qualified welders is diminishing. Furthermore, many inefficiencies plague the welding training process, potentially injecting improperly trained students into the workforce while discouraging other prospective young welders from continuing their education. For instance, classroom demonstrations do not give all students a clear view of the welding process. Additionally, environmental constraints often prevent instructor feedback during student welds.
A system provides video data of a welding operation to a remote site. A welding helmet used in the welding operation contains a video display positioned so that a video presentation of the welding operation may be presented to the welder during the welding operation. A video camera is positioned in the helmet for generating raw unprocessed video of the welding operation, which is processed and presented on the display. A transmitter in the helmet transmits video to a remote site.
Referring to FIG. 1, there is shown an example welding system 10 in which a welder 18 wearing headwear 20 performs a welding operation on a workpiece 24 using welding equipment 12.
Optionally in any embodiment, the welding equipment 12 may be arc welding equipment that provides a direct current (DC) or alternating current (AC) to a consumable or non-consumable electrode 16 (better shown, for example, in FIG. 5).
As shown, and described more fully below, the equipment 12 and headwear 20 may communicate via a link 25, through which the headwear 20 may control settings of the equipment 12 and/or the equipment 12 may provide information about its settings to the headwear 20. Although a wireless link is shown, the link may be wireless, wired, or optical.
Referring to FIG. 2, there is shown example welding equipment 12 comprising an antenna 202, a communication port 204, communication interface circuitry 206, user interface 208, control circuitry 210 with memory 211, power supply circuitry 212, a wire feeder 214, and a gas supply 216.
Antenna 202 may be any type of antenna suited for the frequencies, power levels, etc., used by communication link 25.
Communication port 204 may comprise, for example, an Ethernet over twisted pair port, a USB port, an HDMI port, a passive optical network (PON) port, and/or any other suitable port for interfacing with a wired or optical cable.
Communication interface circuitry 206 is operable to interface control circuitry 210 to antenna 202 and/or port 204 for transmit and receive operations. For transmit operations, communication interface 206 receives data from control circuitry 210 and thereafter packetizes the data and converts the data to physical layer signals in accordance with protocols in use by communication link 25. For receive operations, communication interface 206 receives physical layer signals via antenna 202 or port 204 and thereafter recovers data from the received physical layer signals (demodulate, decode, etc.), and provides the data to control circuitry 210.
User interface 208 comprises electromechanical interface components (e.g., a screen, speakers, a microphone, buttons, a touchscreen, etc.) and associated drive circuitry. User interface 208 may generate electrical signals in response to user input (e.g., screen touches, button presses, control knob activations, mechanical switch activations, voice commands, etc.). User interface 208 includes driver circuitry to condition (e.g., amplify, digitize, etc.) the signals and send the conditioned signals to control circuitry 210. User interface 208 generates audible, visual, and/or tactile outputs (e.g., via speakers, a display, and/or motors/actuators/servos/etc.) in response to signals from control circuitry 210.
Control circuitry 210 may comprise a microcontroller and memory operable to process data from communication interface 206, user interface 208, power supply 212, wire feeder 214, and/or gas supply 216. Control circuitry 210 may output data and/or control signals to communication interface 206, user interface 208, power supply 212, wire feeder 214, and/or gas supply 216. Control circuitry 210 may store data in memory 211 or retrieve data from memory 211.
Power supply circuitry 212 comprises circuitry for generating power to be delivered to welding electrode 16 via conduit 14. Power supply circuitry 212 may comprise, for example, one or more voltage regulators, current regulators, inverters, and/or the like. The voltage and/or current output provided by power supply circuitry 212 may be controlled by a control signal from control circuitry 210. Power supply circuitry 212 may also comprise circuitry for reporting the present current value and/or voltage value to the control circuitry 210. In an example implementation, power supply circuitry 212 may comprise circuitry for measuring the voltage and/or current on conduit 14 (at either or both ends of conduit 14) such that reported voltage and/or current is actual and not simply an expected value based on calibration.
Wire feeder module 214 is configured to deliver a consumable wire electrode 16 to a weld joint, e.g., shown as reference numeral 512 in FIG. 5.
The gas supply module 216 is configured to provide shielding gas via conduit 14 for use during the welding process. The gas supply module 216 may comprise an electrically controlled valve for controlling the rate of gas flow. The valve may be controlled by a control signal from control circuitry 210 (which may be routed through the wire feeder 214 or come directly from the control 210 as indicated by the dashed line). The gas supply module 216 may also comprise circuitry for reporting the present gas flow rate to the control circuitry 210. In an example implementation, the gas supply module 216 may comprise circuitry and/or mechanical components for measuring the gas flow rate such that reported flow rate is actual and not simply an expected value based on calibration.
Referring to FIGS. 3 and 4, there is shown example headwear 20 comprising camera optics 302a and 302b, a display 304, and electromechanical user interface components 308, along with circuitry comprising an antenna 402, a communication port 404, communication interface circuitry 406, user interface driver circuitry 408, control circuitry 410 with memory 411, speaker driver circuitry 412, one or more image sensors 416, a graphics processing unit (GPU) 418, and display driver circuitry 420.
Each of camera optical components 302a and 302b comprises, for example, one or more lenses, filters, and/or other optical components for capturing electromagnetic waves in the spectrum ranging from, for example, infrared to ultraviolet. Optical components 302a and 302b serve two respective cameras and are positioned approximately centered with the eyes of a wearer of helmet 20 to capture stereoscopic images (at any suitable frame rate, from still photos to video at 30 fps, 100 fps, or higher) of the field of view of the wearer of helmet 20, as if the wearer were looking through a lens.
Display 304 may comprise, for example, an LCD, LED, OLED, E-ink, and/or any other suitable type of display operable to convert electrical signals into optical signals viewable by a wearer of helmet 20.
Electromechanical user interface 308 may comprise, for example, one or more touchscreen elements, speakers, microphones, physical buttons, switches, control knobs, etc. that generate electric signals in response to user input or user activation. For example, electromechanical user interface 308 may comprise capacitive, inductive, or resistive touchscreen sensors mounted on the back of display 304 (i.e., on the outside of helmet 20) that enable a wearer of helmet 20 to interact with user interface elements displayed on the front of display 304 (i.e., on the inside of helmet 20). In an example implementation, the optics 302, image sensors 416, and GPU 418 may operate as user interface components 308 by allowing a user to interact with the helmet 20 through, for example, hand gestures captured by the optics 302 and image sensors 416 and then interpreted by the GPU 418. For example, a gesture such as would be made to turn a knob clockwise may be interpreted to generate a first signal, while a gesture such as would be made to turn a knob counterclockwise may be interpreted to generate a second signal.
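As an illustration of the gesture interpretation just described, the following minimal Python sketch maps a sequence of hand-orientation angles (as might be estimated per frame by GPU 418 from images captured via optics 302 and image sensors 416) to the first or second signal. The function name, threshold, and sign convention are illustrative assumptions, not the implementation of this disclosure.

```python
def interpret_knob_gesture(angles_deg, threshold_deg=30.0):
    """Map a knob-turn hand gesture to a control signal.

    angles_deg: hand-orientation angles (degrees) estimated from
    successive frames by the image-processing pipeline.
    Returns +1 for a clockwise turn (the first signal), -1 for a
    counterclockwise turn (the second signal), and 0 otherwise.
    """
    if len(angles_deg) < 2:
        return 0
    net = angles_deg[-1] - angles_deg[0]
    if net <= -threshold_deg:
        return +1  # clockwise: angle decreasing in this sign convention
    if net >= threshold_deg:
        return -1  # counterclockwise
    return 0       # incidental hand motion; generate no signal

# A hand orientation sweeping from 90 degrees down to 40 reads as clockwise.
print(interpret_knob_gesture([90, 78, 63, 51, 40]))  # -> 1
```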
Antenna 402 may be any type of antenna suited for the frequencies, power levels, etc. used by communication link 25.
Communication port 404 may comprise, for example, an Ethernet over twisted pair port, a USB port, an HDMI port, a passive optical network (PON) port, and/or any other suitable port for interfacing with a wired or optical cable.
Communication interface circuitry 406 is operable to interface control circuitry 410 to the antenna 402 and port 404 for transmit and receive operations. For transmit operations, communication interface 406 receives data from control circuitry 410, and packetizes the data and converts the data to physical layer signals in accordance with protocols in use by communication link 25. The data to be transmitted may comprise, for example, control signals for controlling equipment 12. For receive operations, communication interface 406 receives physical layer signals via antenna 402 or port 404, recovers data from the received physical layer signals (demodulate, decode, etc.), and provides the data to control circuitry 410. The received data may comprise, for example, indications of current settings and/or actual measured output of equipment 12 (e.g., voltage, amperage, and/or wire speed settings and/or measurements).
User interface driver circuitry 408 is operable to condition (e.g., amplify, digitize, etc.) signals from user interface 308.
Control circuitry 410 may comprise a microcontroller and memory operable to process data from communication interface 406, user interface driver 408, and GPU 418, and to generate control and/or data signals to be output to speaker driver circuitry 412, GPU 418, and communication interface 406. Control circuitry 410 may store data in memory 411 or retrieve data from memory 411.
Signals output to communication interface 406 may comprise, for example, signals to control the settings of equipment 12. Such signals may be generated based on signals from GPU 418 and/or the user interface driver 408.
Signals from communication interface 406 comprise, for example, indications (received via antenna 402, for example) of current settings and/or actual measured output of equipment 12.
Speaker driver circuitry 412 is operable to condition (e.g., convert to analog, amplify, etc.) signals from control circuitry 410 for output to one or more speakers of user interface components 308. Such signals may, for example, carry audio to alert a wearer of helmet 20 that a welding parameter is out of tolerance, to provide audio instructions to the wearer of helmet 20, etc. For example, if the travel speed of the torch is determined to be too slow, such an alert may comprise a voice saying “too slow.”
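As a hedged sketch of the tolerance check behind such an alert, the snippet below compares a measured travel speed against an illustrative tolerance band; the band values are assumptions, and speak() is a stand-in for audio output through speaker driver circuitry 412.

```python
TRAVEL_SPEED_MIN_IPM = 8.0   # illustrative lower tolerance (inches/minute)
TRAVEL_SPEED_MAX_IPM = 14.0  # illustrative upper tolerance

def check_travel_speed(speed_ipm, speak=print):
    """Issue a voice alert when travel speed leaves its tolerance band."""
    if speed_ipm < TRAVEL_SPEED_MIN_IPM:
        speak("too slow")
    elif speed_ipm > TRAVEL_SPEED_MAX_IPM:
        speak("too fast")

check_travel_speed(6.5)  # -> "too slow"
```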
Signals to GPU 418 comprise, for example, signals to control graphical elements of a user interface presented on display 304. Signals from the GPU 418 comprise, for example, information determined based on analysis of pixel data captured by image sensors 416. Image sensor(s) 416 may comprise, for example, CMOS or CCD image sensors operable to convert optical signals from cameras 303 to digital pixel data and output the pixel data to GPU 418.
Graphics processing unit (GPU) 418 is operable to receive and process pixel data (e.g., of stereoscopic or two-dimensional images) from image sensor(s) 416. GPU 418 outputs one or more signals to the control circuitry 410, and outputs pixel data to the display 304 via display driver 420. GPU 418 may also output unprocessed and/or processed pixel data to memory 411 under control of control circuitry 410.
The processing of pixel data by GPU 418 may comprise, for example, analyzing the pixel data (e.g., a barcode, part number, time stamp, work order, etc.) to determine, in real time (e.g., with latency less than 100 milliseconds or, more preferably, less than 20 milliseconds, or more preferably still, 5 milliseconds), one or more of the following: name, size, part number, type of metal, or other characteristics of workpiece 24; name, size, part number, type of metal, or other characteristics of torch 504, electrode 16, and/or filler material; type or geometry of joint 512 to be welded; 2-D or 3-D positions of items (e.g., electrode, workpiece, etc.) in the captured field of view; and/or one or more weld parameters (e.g., such as those described below with reference to FIG. 5).
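One way to respect such a latency target, sketched below under stated assumptions: run analyzers in priority order and stop once the frame's time budget is spent, deferring whatever remains to offline processing (compare block 610 below). The analyzer registry, budget value, and placeholder functions are illustrative, not from the source.

```python
import time

LATENCY_BUDGET_S = 0.020  # e.g., the 20 ms target mentioned above

def analyze_frame(pixels, analyzers):
    """Run (name, fn) analyzers in priority order within the budget.

    Analyzers not reached before the budget expires are skipped here
    and can be re-run later without the real-time constraint.
    """
    start = time.monotonic()
    results = {}
    for name, fn in analyzers:
        if time.monotonic() - start > LATENCY_BUDGET_S:
            break  # defer the rest to offline processing
        results[name] = fn(pixels)
    results["latency_s"] = time.monotonic() - start
    return results

# Toy usage with placeholder analyzers:
frame = [[0] * 640] * 480
out = analyze_frame(frame, [
    ("torch_position", lambda px: (120, 240)),  # placeholder result
    ("barcode",        lambda px: "PN-5678"),   # placeholder result
])
```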
The information output from GPU 418 to control circuitry 410 may comprise the information determined from the pixel analysis. Such information may be stored in memory 411 by control circuitry 410.
The pixel data output from GPU 418 to display 304 may provide a mediated reality view for the wearer of helmet 20. In such a view, the wearer experiences a video presented on display 304 as if s/he is looking through a lens. The image may be enhanced and/or supplemented by an on-screen display. The enhancements (e.g., adjusted contrast, brightness, saturation, sharpness, etc.) may enable the wearer of helmet 20 to see things s/he could not see with simply a lens. The on-screen display may comprise text, graphics, etc. overlaid on the video to provide visualizations of equipment settings received from control circuitry 410 and/or visualizations of information determined from the analysis of the pixel data. The pixel data output from GPU 418 may be stored in memory 411 by control circuitry 410.
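Conceptually, the on-screen display amounts to burning text and graphics into each frame before it reaches display 304. A minimal sketch using OpenCV follows; the readout names and placement are assumptions, and a real pipeline would render on GPU 418 rather than on a CPU.

```python
import cv2
import numpy as np

def overlay_osd(frame_bgr, readouts):
    """Overlay simple text readouts in the top-left corner of a BGR frame."""
    out = frame_bgr.copy()
    for i, (name, value) in enumerate(readouts.items()):
        cv2.putText(out, f"{name}: {value}", (10, 30 + 30 * i),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return out

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for camera pixels
osd = overlay_osd(frame, {"Voltage": "24.1 V", "Wire speed": "320 ipm"})
```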
Display driver circuitry 420 is operable to generate control signals (e.g., bias and timing signals) for display 304 and to condition (e.g., level control, synchronize, packetize, format, etc.) pixel data from GPU 418 for conveyance to display 304.
In FIG. 5, several parameters of a welding operation are illustrated with respect to a torch 504 and a joint 512 of workpiece 24.
Contact-tip-to-work distance may include a vertical distance 506 from a tip of torch 504 to workpiece 24, as illustrated in FIG. 5.
The travel angle 502 is the angle of gun 504 and/or electrode 16 along the axis of travel (the X axis in the example shown in FIG. 5).
A work angle 508 is the angle of gun 504 and/or electrode 16 perpendicular to the axis of travel (the Y axis in the example shown in FIG. 5).
The travel speed is the speed at which gun 504 and/or electrode 16 moves along the joint 512 being welded.
The aim is a measure of the position of electrode 16 with respect to the joint 512 to be welded. Aim may be measured, for example, as distance from the center of the joint 512 in a direction perpendicular to the direction of travel.
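Given 3-D positions recovered from the stereoscopic images, the parameters above reduce to simple geometry. The sketch below assumes the coordinate convention described for FIG. 5 (X along the axis of travel, Y perpendicular to it, Z vertical); the function names and handling are illustrative, not the disclosure's algorithm.

```python
import math

def travel_angle_deg(torch_axis):
    """Tilt of the torch axis from vertical within the X-Z (travel) plane."""
    x, _, z = torch_axis
    return math.degrees(math.atan2(x, z))

def work_angle_deg(torch_axis):
    """Tilt of the torch axis from vertical within the Y-Z plane."""
    _, y, z = torch_axis
    return math.degrees(math.atan2(y, z))

def travel_speed(tip_prev, tip_now, dt_s):
    """Speed of the torch tip between two frames captured dt_s seconds apart."""
    return math.dist(tip_prev, tip_now) / dt_s

def aim_offset(electrode_y, joint_center_y):
    """Perpendicular (Y) offset of the electrode from the joint centerline."""
    return electrode_y - joint_center_y

# A torch axis leaning 10 degrees along the direction of travel:
axis = (math.sin(math.radians(10)), 0.0, math.cos(math.radians(10)))
print(round(travel_angle_deg(axis), 1))  # -> 10.0
```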
Referring to FIG. 6, there is shown a flowchart of an example process for capturing, displaying, and storing images of a welding operation using the system 10.
The process begins with block 601, in which one or more welds to be performed are determined by the headwear 20. The determination may be based on an identifier (e.g., a work order number, a part number, etc.) entered by the welder 18 through, for example, voice recognition and/or tactile input. Alternatively, or additionally, the welder 18 may view the workpiece to be welded from a distance and/or angle that permit the camera(s) 302 to capture an image of the workpiece from which an image processing algorithm can detect welds to be performed. For example, unique shapes, markings, and/or other features of a workpiece in the captured image view may be detected and used to retrieve an identifier associated with the workpiece.
In block 602, welder 18 initiates a welding operation. For example, welder 18 may give a voice command for welding system 10 to enter a weld mode, to which user interface 308 of helmet 20 responds. Control circuitry 410 configures the components of helmet 20 according to the voice command in order to display, on display 304, the live welding operation for viewing by the welder. The welder views the weld on display 304 and controls operation and positioning of electrode 16. Control circuitry 410 may respond to the voice command and send a signal to equipment 12 to trigger the weld mode in equipment 12. For example, control circuitry 210 disables a lockout so that power is delivered to electrode 16 via power supply 212 when a trigger on the torch is pulled by the welder. Wire feeder 214 and gas supply 216 may also be activated accordingly.
Block 602 thus represents the step of the welder placing the welding system in a weld mode so that the workpiece may be welded. Equipment 12 is configured by the welder 18 using user interface 208 based on the determined characteristics of the weld to be performed. For example, a constant current or constant voltage mode may be selected, a nominal voltage and/or nominal current may be set, a voltage limit and/or current limit may be set, and/or the like. Camera(s) 303 may be configured using electromechanical user interface 308. For example, expected brightness of the arc may be predicted (based on the equipment configuration and the characteristics of the weld to be made). The electric signals from user interface 308 may configure the darkness of a lens filter, for example.
In block 604, the operator begins welding. Workpiece 24 is placed into position, together with the electrode, relative to the field of view of camera lenses 302a, 302b. The trigger is activated by the welder, and a multimedia file is created/opened in memory and images of the weld operation begin to be captured by the camera 303 and stored to the multimedia file. The images may be stored as raw unprocessed pixel data coming from camera(s) 303. Alternatively (or additionally), the images may be stored as processed pixel data from GPU 418. In an example implementation, these events may be sequenced such that image capture starts first, allowing a few frames during which the cameras 303 and/or display 304 are calibrated (adjusting focus, brightness, contrast, saturation, sharpness, etc.) before current begins flowing to the electrode. This may ensure sufficient image quality even at the very beginning of the welding operation. The multimedia file may be stored in memory 411 of helmet 20. Alternatively (or additionally), control circuitry 410 may transmit the images (unprocessed or processed) to the communication interface 406 for transmission to a remote memory such as memory 211 in equipment 12 and/or memory in server 30.
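The start-of-weld sequencing suggested above might be realized as in the sketch below: capture begins, a few frames are devoted to calibration, and only then is weld current enabled. The frame source, file object, and enable_current callback are hypothetical stand-ins for camera 303, memory 411, and the control path to equipment 12.

```python
CALIBRATION_FRAMES = 5  # illustrative; enough frames to settle exposure, etc.

def start_weld_capture(frames, media_file, enable_current):
    """Store frames, enabling weld current only after the calibration frames."""
    for i, frame in enumerate(frames):
        media_file.append(frame)   # stand-in for writing the multimedia file
        if i == CALIBRATION_FRAMES - 1:
            enable_current()       # arc starts with cameras already tuned
        # (focus/brightness/contrast adjustment would occur on early frames)

stored = []
start_weld_capture(range(10), stored, lambda: print("current enabled"))
```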
Still in block 604, in addition to storing the captured images, the images may be displayed in real-time on the display 304 and/or on one or more other remote displays to which the captured images are transmitted in real-time via link 25. In an example implementation, different amounts of image processing may be performed on one video stream output to the display 304 and another video stream output via link 25. In this regard, higher latency may be tolerable to the remote viewer such that additional processing may be performed on the images prior to presentation on the remote display.
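A sketch of that two-stream split, under stated assumptions: the display path must never wait, so frames are dropped there rather than buffered, while the remote path may queue frames for heavier processing. Queue sizes are illustrative; the queues stand in for display driver 420 and link 25.

```python
import queue

display_q = queue.Queue(maxsize=2)  # low-latency path toward display 304
remote_q = queue.Queue()            # higher-latency path out over link 25

def fan_out(frame):
    """Feed one captured frame to both the local and remote pipelines."""
    try:
        display_q.put_nowait(frame)  # never block the display path
    except queue.Full:
        pass                         # drop the frame rather than add latency
    remote_q.put(frame)              # remote consumer may lag and buffer
```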
In block 606, as the welding operation proceeds, the captured image data is processed and may be used to determine, in real time (e.g., with latency less than 100 ms or, more preferably, less than 5 ms), present welding parameters such as those described above with reference to FIG. 5.
Still referring to block 606, as the welding operation proceeds, settings and/or measured output of the equipment 12 may be received via link 25. Control circuitry 410 may adjust the settings based on the determined parameters. In this manner, equipment settings such as voltage, current, wire speed, and/or others may be adjusted in an attempt to compensate for deviations of the parameters from their ideal values. The equipment settings and/or measured output may be stored along with the captured image data. For example, the settings and/or measured output may be synchronized with the captured images and converted to text/graphics that are overlaid on the image data by GPU 418 prior to storing the image data, and/or may be stored in metadata of the multimedia file in which the image data is stored.
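The compensation idea can be sketched as a simple proportional correction of a setting toward its target; the gain, limits, and the voltage example are illustrative assumptions rather than values from this disclosure.

```python
def adjust_setting(setting, measured, target, gain=0.1, lo=None, hi=None):
    """Nudge a setting to counter a measured deviation, clamped to limits."""
    new = setting + gain * (target - measured)
    if lo is not None:
        new = max(lo, new)
    if hi is not None:
        new = min(hi, new)
    return new

# e.g., raise the voltage setting slightly when measured arc voltage sags:
voltage = adjust_setting(24.0, measured=23.2, target=24.0, lo=18.0, hi=30.0)
print(round(voltage, 2))  # -> 24.08
```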
Still referring to block 606, as the welding operation proceeds, other information may be captured (by the camera(s) 303 and/or other sensors) and stored along with the captured images. This other data may then be synchronized to the captured images and stored with the captured images (e.g., as metadata and/or converted to text/graphics and overlaid on the images). Such data may include, for example, an overall identifier of the weld operation determined in block 601, individual part numbers of the parts being welded (e.g., barcoded such that they can be automatically detected from the captured images), timestamps, climate (temperature, humidity, etc.), and/or the like. The multimedia file containing the captured images may be indexed by any of this information for later searching and retrieval.
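Indexing by that metadata might look like the following sketch, which uses an in-memory SQLite table as a stand-in for a database at server 30; the schema and field names are assumptions for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the server 30 database
conn.execute("""CREATE TABLE welds (
    file_path   TEXT,
    work_order  TEXT,
    part_number TEXT,
    welder_id   TEXT,
    started_utc TEXT)""")
conn.execute("INSERT INTO welds VALUES (?, ?, ?, ?, ?)",
             ("welds/0001.mp4", "WO-1234", "PN-5678", "welder-18",
              "2015-02-27T10:00:00Z"))

# Later retrieval, e.g., for a quality audit (block 612):
rows = conn.execute("SELECT file_path FROM welds WHERE work_order = ?",
                    ("WO-1234",)).fetchall()
print(rows)  # -> [('welds/0001.mp4',)]
```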
In block 608, the first weld operation on workpiece 24 is completed. In block 608 the multimedia file to which the images and other data were written during blocks 604 and 606 may be closed (e.g., file headers added, checksums calculated, etc.). In some instances, the file may be transferred for long term storage (e.g., from memory 411 of the helmet 20 to a database residing in memory of server 30).
Where the captured image data is stored as raw unprocessed pixel data, such raw unprocessed pixel data may be processed externally of helmet 20. In block 610, control circuitry 410 transmits the pixel data to, for example, a memory at server 30 via antenna 402 or port 404. A processor (not shown) at server 30 processes the raw unprocessed data and stores the processed data in memory at server 30. There may be more compute power at the server 30, and greater latency may be tolerated as compared to processing in helmet 20 prior to presentation on display 304; if there is too much latency inside the helmet, the welder may become disoriented. Similarly, pixel data already processed in helmet 20 under latency constraints (e.g., to condition it for real-time presentation on the display 304) may be further processed by the helmet 20 and/or by an external processor (such as in server 30). Such additional processing may enable determining additional and/or more-detailed information about the weld that there was insufficient time and/or compute power to determine prior to real-time presentation of the captured images.
In block 612, the images captured during block 604 are transmitted from the memory of server 30 to a second remote location. For example, the images may be retrieved by an instructor or supervisor to review the work of a student or employee. As another example, the images may be reviewed by a quality control auditor as part of random quality inspections and/or as part of an investigation into a failed weld (e.g., if the welded part later fails in the field, the captured images and the information stored along with the images may be viewed to see if the weld process was the likely cause of the failure).
Referring to FIG. 7, the graphics 720, 724, 728, and 730 present to the viewer one or more welding parameters measured during the weld being performed in the image. In the example shown, the graphic 720 comprises positional coordinate axes representing work angle and travel angle. The center of the coordinate system indicates the optimal orientation of the welding torch 504 during the weld. An actual orientation of the torch is indicated by dot 722. Other graphical representations of torch angle may be used instead of the “bull's-eye” shown in FIG. 7.
A system in accordance with an example implementation of this disclosure comprises welding headwear (e.g., 20) to be worn by a welder (e.g., 18) during a live welding operation, the headwear comprising: a video camera (e.g., 303) operable to capture an image of a live welding operation; circuitry (e.g., 410 and 418) operable to analyze the captured image to determine a characteristic of the live welding operation, and associate the characteristic with the captured image; and memory (e.g., 411 and/or memory of server 30) operable to store the captured image and the associated characteristic for later retrieval. The headwear may comprise a communication interface operable to communicate with a remote server (e.g., 30). The determined characteristic may comprise a welding parameter of a welding torch in the captured image. The determined characteristic may comprise a setting, or measured output, of welding equipment that powers and/or feeds wire to a torch being used in the live welding operation, where the setting is received via a communication link between the welding equipment and the welding headwear. The determined characteristics may comprise a work order number associated with the live welding operation, an identification of a welder performing the live welding operation, and/or a part number of a workpiece appearing in the captured image. The circuitry may be operable to associate the determined characteristics with the captured image by generating a graphic indicative of the characteristic (e.g., 702, 720, 742, 728), and overlaying the graphic on the captured image (e.g., as shown in FIG. 7).
The present methods and systems may be realized in hardware, software, or a combination of hardware and software. The present methods and/or systems may be realized in a centralized fashion in at least one computing system, or in a distributed fashion where different elements are spread across several interconnected computing systems. Any kind of computing system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software may include a general-purpose computing system with a program or other code that, when being loaded and executed, controls the computing system such that it carries out the methods described herein. Another typical implementation may comprise an application specific integrated circuit or chip. Some implementations may comprise a non-transitory machine-readable (e.g., computer readable) medium (e.g., FLASH drive, optical disk, magnetic storage disk, or the like) having stored thereon one or more lines of code executable by a machine, thereby causing the machine to perform processes as described herein.
While the present method and/or system has been described with reference to certain implementations, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present method and/or system. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from its scope. Therefore, it is intended that the present method and/or system not be limited to the particular implementations disclosed, but that the present method and/or system will include all implementations falling within the scope of the appended claims.
As utilized herein the terms “circuits” and “circuitry” refer to physical electronic components (i.e., hardware) and any software and/or firmware (“code”) which may configure the hardware, be executed by the hardware, and/or otherwise be associated with the hardware. As used herein, for example, a particular processor and memory may comprise a first “circuit” when executing a first set of one or more lines of code and may comprise a second “circuit” when executing a second set of one or more lines of code. As utilized herein, “and/or” means any one or more of the items in the list joined by “and/or”. As an example, “x and/or y” means any element of the three-element set {(x), (y), (x, y)}. In other words, “x and/or y” means “one or both of x and y”. As another example, “x, y, and/or z” means any element of the seven-element set {(x), (y), (z), (x, y), (x, z), (y, z), (x, y, z)}. In other words, “x, y and/or z” means “one or more of x, y and z”. As utilized herein, the term “exemplary” means serving as a non-limiting example, instance, or illustration. As utilized herein, the terms “e.g.” and “for example” set off lists of one or more non-limiting examples, instances, or illustrations. As utilized herein, circuitry is “operable” to perform a function whenever the circuitry comprises the necessary hardware and code (if any is necessary) to perform the function, regardless of whether performance of the function is disabled or not enabled (e.g., by a user-configurable setting, factory trim, etc.).
This application claims priority to the following application(s), each of which is hereby incorporated herein by reference: U.S. provisional patent application 62/121,841 titled “A WELDING SYSTEM PROVIDING REMOTE STORAGE OF VIDEO WELD DATA” filed on Feb. 27, 2015.