Systems and methods providing a computerized eyewear device to aid in welding

Information

  • Patent Grant
  • Patent Number
    10,930,174
  • Date Filed
    Wednesday, July 26, 2017
  • Date Issued
    Tuesday, February 23, 2021
Abstract
A system to support communication and control in a welding environment is disclosed. In one embodiment the system includes an internet-of-things (IoT) technology platform configured to provide scalable, interoperable, and secure communication connections between a plurality of disparate devices within a welding environment. The system also includes a welding power source configured to communicate with the IoT technology platform. The system further includes a computerized eyewear device. The computerized eyewear device includes a control and communication circuitry configured to communicate with the welding power source via the IoT technology platform. The computerized eyewear device also includes a transparent display configured to display information received from the welding power source via the IoT technology platform while allowing a user to view a surrounding portion of the welding environment through the transparent display.
Description
TECHNICAL FIELD

Certain embodiments of the present invention relate to welding. More particularly, certain embodiments of the present invention relate to systems and methods providing visualization and communication capabilities to a welder using a welding system via a computerized eyewear device.


BACKGROUND

Providing information to a welding student in real time during a welding process (whether a real-world welding process or a simulated welding process) is important to aid the welding student in the learning process. Similarly, providing information to an expert welder in real time during a real-world welding process can aid the expert welder in the welding process. Furthermore, providing the ability for a welding student or an expert welder to easily communicate with (e.g., provide commands to) a welding system (real or simulated) can allow for a more efficient and user-friendly welding experience. Today, a welding helmet may be provided with simple light indicators representative of welding information. Because the light indicators may be within one inch of the welder's eye, the welder is not required to focus sharply on them; the welder need only perceive, for example, whether the color of a light indicator is red, green, or yellow. Thus, there is an ongoing need to improve how a welder or welding student interacts with a welding system and how information is provided and viewed in real time.


Further limitations and disadvantages of conventional, traditional, and proposed approaches will become apparent to one of skill in the art, through comparison of such systems and methods with embodiments of the present invention as set forth in the remainder of the present application with reference to the drawings.


SUMMARY

In one embodiment, a system is provided. The system includes an internet-of-things (IoT) technology platform including at least one server computer having a connection server application. The IoT technology platform is configured to be implemented as part of a real world welding environment. The IoT technology platform is also configured to provide scalable, interoperable, and secure wireless communication connections between a plurality of disparate devices within the real world welding environment. The IoT technology platform is further configured to enable protocol-independent deployment of the plurality of disparate devices within the real world welding environment. The system also includes at least one welding power source, being at least one of the plurality of disparate devices, configured to wirelessly communicate, two-way, with the IoT technology platform using the connection server application. The system further includes at least one computerized eyewear device, being at least one of the plurality of disparate devices. The computerized eyewear device includes a control and communication circuitry having a processor and a memory. The control and communication circuitry is configured to wirelessly communicate, two-way, with the welding power source via the IoT technology platform using the connection server application. The computerized eyewear device also includes a transparent display configured to display information received by the control and communication circuitry from the welding power source via the IoT technology platform using the connection server application. A user is able to view a surrounding portion of the real world welding environment through the transparent display. In one embodiment, the IoT technology platform is configured to provide the scalable, interoperable, and secure communication connections between the plurality of disparate devices via WebSockets. 
The IoT technology platform is configured to handle message routing and translation between the plurality of disparate devices. The IoT technology platform is configured to allow a developer to build, run, and grow applications to control and report data to and from any of the plurality of disparate devices. The information displayed by the transparent display may be in the form of at least one of text, a graphic, or an image and may include at least one welding parameter received from the welding power source via the IoT technology platform. In one embodiment, the computerized eyewear device includes a microphone operatively connected to the control and communication circuitry. The microphone, as operatively connected to the control and communication circuitry, is configured to receive voice-activated user command information from the user and communicate the voice-activated user command information to the welding power source via the IoT technology platform. In one embodiment, the computerized eyewear device includes a camera operatively connected to the control and communication circuitry. The camera, as operatively connected to the control and communication circuitry, is configured to capture at least one still image (picture) or moving video of the real world welding environment during a welding operation from the point-of-view of the user and communicate the still image (picture) or moving video to the IoT technology platform for recording and storage. In one embodiment, the computerized eyewear device includes a touch-sensitive user interface operatively connected to the control and communication circuitry. The touch-sensitive user interface, as operatively connected to the control and communication circuitry, is configured to allow a user to select command information and provide the command information to the welding power source via the IoT technology platform to control the welding power source. 
The welding power source may be an inverter-based welding power source that supports at least one of a gas metal arc welding (GMAW) operation, a gas tungsten arc welding (GTAW) operation, or a shielded metal arc welding (SMAW) operation.
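The message routing and translation role of the connection server application described above can be sketched as follows. This is a minimal, hypothetical Python illustration, not the patent's implementation; the class name (`ConnectionServer`), device identifiers, and message fields are all assumptions.

```python
# Minimal sketch of the connection server's routing/translation role between
# disparate devices. All names and message fields are hypothetical.

class ConnectionServer:
    """Routes messages between disparate devices registered on the platform."""

    def __init__(self):
        self.devices = {}  # device_id -> handler callable

    def register(self, device_id, handler):
        # Protocol-independent deployment: every device registers the same way,
        # regardless of its native communication protocol.
        self.devices[device_id] = handler

    def route(self, sender_id, recipient_id, payload):
        # Translate the payload into a common envelope before delivery.
        envelope = {"from": sender_id, "to": recipient_id, "payload": payload}
        handler = self.devices.get(recipient_id)
        if handler is None:
            raise KeyError(f"unknown device: {recipient_id}")
        handler(envelope)
        return envelope

# Example: a welding power source pushes a parameter to an eyewear device.
received = []
server = ConnectionServer()
server.register("eyewear-1", received.append)
server.register("power-source-1", received.append)
server.route("power-source-1", "eyewear-1", {"wire_feed_speed_ipm": 250})
```

In a deployment as described, the transport between devices and the server would be a secure WebSocket connection rather than an in-process call; the routing logic is the same in either case.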


In another embodiment, a system is provided. The system includes a plurality of disparate devices within a real world welding environment having disparate wireless communication capabilities. The system also includes an IoT technology platform including at least one server computer having a connection server application. The IoT technology platform is configured to provide scalable, interoperable, and secure wireless communication connections between the plurality of disparate devices. The IoT technology platform is also configured to enable protocol-independent deployment of the plurality of disparate devices within the real world welding environment. The plurality of disparate devices includes at least one welding power source configured to wirelessly communicate, two-way, with the IoT technology platform using the connection server application. The plurality of disparate devices further includes at least one computerized eyewear device. The computerized eyewear device includes a control and communication circuitry having a processor and a memory. The control and communication circuitry is configured to wirelessly communicate, two-way, with the welding power source via the IoT technology platform using the connection server application. The computerized eyewear device also includes a transparent display configured to display information received by the control and communication circuitry from the welding power source via the IoT technology platform using the connection server application. A user is able to view a surrounding portion of the real world welding environment through the transparent display. In one embodiment, the computerized eyewear device is configured to provide an augmented reality capability via at least the transparent display. 
The plurality of disparate devices may include, for example, at least one of at least one welding wire feeder, at least one welding torch or gun, at least one gas meter/sensor operatively connected to at least one tank of shielding gas, at least one mobile phone device (e.g., a “smart” phone), at least one welding helmet, or at least one welding fume extractor. In one embodiment, at least one of the plurality of disparate devices includes at least one sensor configured to sense at least one parameter associated with the at least one of the plurality of disparate devices and communicate the at least one parameter to the IoT technology platform using the connection server application. The at least one parameter may include, for example, at least one of a temperature parameter, a pressure parameter, a humidity parameter, a voltage parameter, a current parameter, a wire feed speed parameter, a flow rate parameter, a spatial position parameter, a spatial orientation parameter, or a travel speed parameter.
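A sensor-equipped device reporting one of the parameters listed above might package its reading as a message for the connection server application. The sketch below is a hedged illustration only; the function name, field names, and the JSON envelope are assumptions, not part of the disclosure.

```python
# Hypothetical telemetry message built by a sensor-equipped device before it
# is sent to the IoT technology platform. Field names are illustrative only.
import json
import time

VALID_PARAMETERS = {
    "temperature", "pressure", "humidity", "voltage", "current",
    "wire_feed_speed", "flow_rate", "spatial_position",
    "spatial_orientation", "travel_speed",
}

def build_telemetry(device_id, parameter, value, timestamp=None):
    """Package a sensed parameter as a JSON message for the connection server."""
    if parameter not in VALID_PARAMETERS:
        raise ValueError(f"unsupported parameter: {parameter}")
    return json.dumps({
        "device_id": device_id,
        "parameter": parameter,
        "value": value,
        "timestamp": timestamp if timestamp is not None else time.time(),
    })

# Example: a gas meter/sensor reports the shielding gas flow rate.
msg = build_telemetry("gas-meter-3", "flow_rate", 35.0, timestamp=0)
```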


In one embodiment, a system is provided. The system includes a welding power source of an arc welding system and a computerized eyewear device having a head-up display (HUD). The computerized eyewear device is configured to be worn by a user as eye glasses are worn, while the user also wears a protective welding helmet. The computerized eyewear device is further configured to wirelessly communicate with the welding power source of the arc welding system. The computerized eyewear device may receive information from the welding power source and display the information on the HUD. Furthermore, the user may provide commands to the welding power source via the computerized eyewear device (e.g., via voice activation). The welding power source and the computerized eyewear device may be cooperatively configured to provide one or more of augmented indicators indicative of a user's welding technique and sequencer functionality indicative of a next weld to be made on the HUD, for example.


In another embodiment, a system is provided. The system includes a programmable processor-based subsystem of a virtual reality welding simulation system and a computerized eyewear device having a head-up display (HUD). The computerized eyewear device is configured to be worn by a user as eye glasses are worn, while the user also wears a protective welding helmet. The computerized eyewear device is further configured to wirelessly communicate with the programmable processor-based subsystem of the virtual reality welding simulation system. The computerized eyewear device may receive information from the programmable processor-based subsystem and display the information on the HUD. Furthermore, the user may provide commands to the programmable processor-based subsystem via the computerized eyewear device (e.g., via voice activation). The programmable processor-based subsystem and the computerized eyewear device may be cooperatively configured to provide one or more of virtual reality images associated with a virtual reality welding process and virtual cues and indicators associated with a virtual reality welding process on the HUD, for example.


In accordance with an embodiment, the computerized eyewear device includes a frame configured to be worn on the head of a user, the frame including a bridge configured to be supported on the nose of the user, a brow portion coupled to and extending away from the bridge to a first end remote therefrom and configured to be positioned over a first side of a brow of the user, and a first arm having a first end coupled to the first end of the brow portion and extending to a free end, the first arm being configured to be positioned over a first temple of the user with the free end disposed near a first ear of the user, wherein the bridge is adjustable for selective positioning of the brow portion relative to an eye of the user. The computerized eyewear device also includes a transparent display (the HUD) which may be affixed to the frame and may be movable with respect to the frame through rotation about a first axis that extends parallel to the first brow portion. The computerized eyewear device also includes a housing containing control and communication circuitry affixed to the frame. As an example, the computerized eyewear device may be a Google Glass™ device configured for operation with an arc welding system or a virtual reality arc welding simulation system.


Details of illustrated embodiments of the present invention will be more fully understood from the following description and drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a diagram of an exemplary embodiment of an arc welding system and a computerized eyewear device configured to communicate with the arc welding system;



FIG. 2 illustrates a diagram of an exemplary embodiment of the computerized eyewear device of FIG. 1;



FIG. 3 illustrates a diagram of an exemplary embodiment of a virtual reality welding system and a computerized eyewear device configured to communicate with the virtual reality welding system;



FIG. 4 illustrates a diagram of an exemplary embodiment of an arc welding system and a computerized eyewear device configured to communicate with each other via an internet-of-things (IoT) technology platform;



FIG. 5 illustrates a diagram of an exemplary embodiment of an internet-of-things (IoT) technology platform having multiple server computers;



FIG. 6 illustrates a diagram of an exemplary embodiment of a server computer, of the internet-of-things (IoT) technology platform of FIG. 5, having a connection server application; and



FIG. 7 illustrates a diagram of an exemplary embodiment of a welding environment having multiple welding power sources and multiple computerized eyewear devices that are configured to communicate with each other via an internet-of-things (IoT) technology platform.





DETAILED DESCRIPTION

The following are definitions of exemplary terms that may be used within the disclosure. Both singular and plural forms of all terms fall within each meaning:


“Software” or “computer program” as used herein includes, but is not limited to, one or more computer readable and/or executable instructions that cause a computer or other electronic device to perform functions, actions, and/or behave in a desired manner. The instructions may be embodied in various forms such as routines, algorithms, modules or programs including separate applications or code from dynamically linked libraries. Software may also be implemented in various forms such as a stand-alone program, a function call, a servlet, an applet, an application, instructions stored in a memory, part of an operating system or other type of executable instructions. It will be appreciated by one of ordinary skill in the art that the form of software is dependent on, for example, requirements of a desired application, the environment it runs on, and/or the desires of a designer/programmer or the like.


“Computer” or “processing element” or “computerized device” as used herein includes, but is not limited to, any programmed or programmable electronic device that can store, retrieve, and process data. “Non-transitory computer-readable media” include, but are not limited to, a CD-ROM, a removable flash memory card, a hard disk drive, a magnetic tape, and a floppy disk.


“Computer memory”, as used herein, refers to a storage device configured to store digital data or information which can be retrieved by a computer or processing element.


“Controller”, as used herein, refers to the logic circuitry and/or processing elements and associated software or program involved in controlling a device, system, or portion of a system.


The terms “signal”, “data”, and “information” may be used interchangeably herein and may be in digital or analog form.


The term “welding parameter” is used broadly herein and may refer to characteristics of a portion of a welding output current waveform (e.g., amplitude, pulse width or duration, slope, electrode polarity), a welding process (e.g., a short arc welding process or a pulse welding process), wire feed speed, a modulation frequency, a welding travel speed, or some other parameter associated with real-world welding or simulated welding.


The term “head-up display”, as used herein, refers to a transparent display that presents information (e.g., high quality images) without requiring a user to look away from their usual viewpoints.


In one embodiment, an arc welding system is provided. The arc welding system includes a welding power source and a computerized eyewear device having a head-up display (HUD) and control and communication circuitry (CCC) operatively connected to the HUD. The computerized eyewear device is configured to be worn by a user as eye glasses are worn, while also wearing a protective welding helmet, and wirelessly communicate with the welding power source. The control and communication circuitry is configured to wirelessly receive information from the welding power source and display the information on the HUD.


In accordance with an embodiment, the computerized eyewear device includes a microphone operatively connected to the control and communication circuitry. The microphone and the control and communication circuitry are configured to receive voice-activated user command information and wirelessly transmit the voice-activated user command information to the welding power source. In accordance with an embodiment, the computerized eyewear device includes a camera operatively connected to the control and communication circuitry. The camera and the control and communication circuitry are configured to capture one or more of still pictures and moving video. In accordance with an embodiment, the control and communication circuitry is configured to access the internet through a wireless access point.


In accordance with an embodiment, the computerized eyewear device includes a frame configured to be worn on the head of a user and at least one housing affixed to the frame containing one or more of the control and communication circuitry, the microphone, and the camera. The HUD is also affixed to the frame and is movable with respect to the frame through rotation about a first axis that extends parallel to a first brow portion. Optionally, the computerized eyewear device may include at least one prescription optical lens held in place by the frame.


In accordance with an embodiment, the frame includes a bridge configured to be supported on the nose of the user, a brow portion coupled to and extending away from the bridge to a first end remote therefrom and configured to be positioned over a first side of a brow of the user, and a first arm having a first end coupled to the first end of the brow portion and extending to a free end. The first arm is configured to be positioned over a first temple of the user with the free end disposed near a first ear of the user. In accordance with an embodiment, the bridge is adjustable for selective positioning of the brow portion relative to an eye of the user.



FIG. 1 illustrates a diagram of an exemplary embodiment of an arc welding system 100 and a computerized eyewear device 150 configured to communicate with the arc welding system 100. The arc welding system 100 includes a wire feeder 110, a welding gun or tool 120, a shielding gas supply 130 (e.g., a tank of shielding gas) with a gas meter/sensor 131, and a welding power source 140. The wire feeder 110, the welding gun 120, the shielding gas supply 130, and the power source 140 are operatively connected to allow a welder to create an electric arc between a welding wire and a workpiece W to create a weld as is well known in the art.


In accordance with an embodiment, the welding power source 140 includes a switching power supply (not shown), a waveform generator (not shown), a controller (not shown), a voltage feedback circuit (not shown), a current feedback circuit (not shown), and a wireless communication circuit 145. The wire feeder 110 feeds the consumable wire welding electrode E toward the workpiece W through the welding gun (welding tool) 120 at a selected wire feed speed (WFS). The wire feeder 110, the consumable welding electrode E, and the workpiece W are not part of the welding power source 140 but may be operatively connected to the welding power source 140 via a welding output cable.


The computerized eyewear device 150 is configured to be worn by a user as eye glasses are worn, while also wearing a conventional protective welding helmet. The protective welding helmet may be a conventional welding helmet that does not have to be modified in any way to accommodate the computerized eyewear device 150. Furthermore, the computerized eyewear device 150 is configured to wirelessly communicate with the welding power source 140 via the wireless communication circuit 145 of the welding power source 140. The wireless communication circuit 145 may include a processor, computer memory, a transmitter, a receiver, and an antenna, in accordance with an embodiment.


Referring now to FIG. 1 and FIG. 2, where FIG. 2 illustrates a diagram of an exemplary embodiment of the computerized eyewear device 150 of FIG. 1, the computerized eyewear device 150 includes a frame 151 configured to be worn on the head of a user. The frame 151 includes a bridge 152 configured to be supported on the nose of the user and a brow portion 153 coupled to and extending away from the bridge 152 to first and second ends remote therefrom and configured to be positioned over the brows of the user.


The frame also includes a first arm 154 having a first end coupled to the first end of the brow portion 153 and extending to a free end, the first arm being configured to be positioned over a first temple of the user with the free end disposed near a first ear of the user. The frame 151 also includes a second arm 155 having a first end coupled to the second end of the brow portion 153 and extending to a free end, the second arm being configured to be positioned over a second temple of the user with the free end disposed near a second ear of the user. The bridge 152 may be adjustable for selective positioning of the brow portion 153 relative to the eyes of the user, in accordance with an embodiment.


The computerized eyewear device 150 includes a transparent display (e.g., a HUD) 156 affixed to the frame 151. The HUD 156 may be movable with respect to the frame 151 through rotation about a first axis that extends parallel to the brow portion 153, in accordance with an embodiment, and may be configured to display text, graphics, and images. The computerized eyewear device 150 also includes control and communication circuitry (e.g., a computer) 157 enclosed in a housing 162 and affixed to the frame 151. The control and communication circuitry 157 may include a processor and memory, for example. The memory may be coupled to the processor and store software that can be accessed and executed by the processor. The processor may be a microprocessor or a digital signal processor, for example. As an option, the computerized eyewear device 150 may include a camera 158. The HUD 156 and the control and communication circuitry 157 (and, optionally, the camera 158) are operatively connected to provide the functionality described herein. In accordance with an embodiment, the camera 158 is configured to capture still pictures (images) and moving video. In this way, a user may record the welding scenario as viewed by the user from inside the welding helmet.


In accordance with an embodiment, the control and communication circuitry 157 provides two-way communication with the wireless communication circuit 145 of the welding power source 140. Information may be provided from the welding power source 140 to the computerized eyewear device 150 and displayed on the HUD 156. Furthermore, in accordance with an embodiment, the control and communication circuitry 157 is configured to accept voice-activated commands from a user and transmit the commands to the welding power source 140. Communication between the welding power source 140 and the computerized eyewear device 150 may be accomplished by way of, for example, Bluetooth® radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), cellular technology (such as GSM, CDMA, UMTS, EVDO, WiMax, or LTE), or ZigBee® technology, among other possibilities. In accordance with an embodiment, the computerized eyewear device may also include at least one optical lens 163 that matches a user's corrective visual prescription. In accordance with a further embodiment, the computerized eyewear device may be modular and attachable to normal prescription eye glasses.


Furthermore, in accordance with an embodiment, the welding power source 140 may be accessible by the computerized eyewear device 150 via the Internet. For example, the control and communication circuitry 157 may be configured to access the Internet through a wireless hot spot (e.g., a smart phone or a wireless router) and access the welding power source 140 therethrough. Alternatively, the welding power source 140 may be configured to access the Internet and provide information obtained from the Internet to the computerized eyewear device 150.


Information that may be displayed on the HUD 156 during a real-world welding scenario that may be useful to a welder may be in the form of text, an image, or a graphic. Such information may include, for example, the arc welding process, a welding tool travel angle, a welding tool travel speed, a tip-to-work distance, a wire feed speed, a welding polarity, an output voltage level, an output current level, an arc length, a dime spacing, a whip time, a puddle time, a width of weave, a weave spacing, a tolerance window, a number score, and welding sequence steps. Other information may be displayed as well, in accordance with other embodiments. For example, in an augmented mode, instructional indicators that are used in a virtual reality training environment may be superimposed over an actual weld using the HUD 156. In this manner, a welding student who trained on a virtual reality welding system can transition to a real welding scenario and have the same instructional indicators provided via the HUD. Visual cues or indicators may be displayed to the welder on the HUD of the computerized eyewear device to indicate to the welder if a particular parameter (e.g., a welding tool travel angle) is within an acceptable range or not. Such visual cues or indicators may aid in training by helping an inexperienced welder or welding student to improve his welding technique.
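The visual cue logic described above can be sketched as a simple classification of a measured parameter against its tolerance window. The function name, indicator strings, and the example range below are hypothetical, assumed for illustration.

```python
# Sketch of the tolerance-window cue: classify a measured welding parameter
# against an acceptable range and choose a HUD indicator. The indicator
# strings and the example range are hypothetical.

def hud_cue(value, low, high):
    """Return a simple HUD indicator for a parameter and its tolerance window."""
    if low <= value <= high:
        return "green"  # within the acceptable range
    # Outside the window: distinguish which side, for corrective feedback.
    return "red-low" if value < low else "red-high"

# Example: an assumed travel-angle tolerance window of 5 to 15 degrees.
cue = hud_cue(10.0, 5.0, 15.0)
```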


The acquisition of some of the information may rely on the welding tool being spatially tracked (e.g., travel angle, travel speed, tip-to-work distance). In accordance with an embodiment, the welding tool may include an accelerometer device that is operatively connected to the welding power source to provide spatial position or movement information. Other methods of tracking the welding tool are possible as well, such as magnetic tracking techniques, for example.
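One of the tracked quantities, travel speed, can be derived from successive spatial samples of the welding tool. The sketch below assumes a sample format of (time, x, y, z); the format and units are illustrative assumptions, not part of the disclosure.

```python
# Sketch of deriving welding-tool travel speed from spatial tracking samples.
# A sample is assumed to be (time_seconds, x, y, z); units are hypothetical.
import math

def travel_speed(sample_a, sample_b):
    """Average travel speed between two tracked (t, x, y, z) samples."""
    t0, *p0 = sample_a
    t1, *p1 = sample_b
    if t1 <= t0:
        raise ValueError("samples must be in increasing time order")
    distance = math.dist(p0, p1)  # straight-line distance between positions
    return distance / (t1 - t0)

# Example: the tool moved 3 units along x in 2 seconds.
speed = travel_speed((0.0, 0.0, 0.0, 0.0), (2.0, 3.0, 0.0, 0.0))
```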


In accordance with an embodiment, the computerized eyewear device 150 includes a microphone 159 for receiving voice-activated commands from a user. The voice-activated commands, as initiated by a welder, that may be accommodated by the computerized eyewear device 150 in communication with the welding power source 140 may include, for example, commands to change a welding parameter such as a wire feed speed, a welding polarity, and a welding output current level. Other types of commands may be possible as well, in accordance with other embodiments.
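A minimal sketch of mapping such voice-activated commands to parameter changes follows. The command phrasing, parameter names, and parsing approach are assumptions for illustration; a real system would use a speech-recognition engine rather than string matching.

```python
# Hypothetical mapping from voice-activated commands to welding power source
# parameter changes. Command phrases and parameter names are illustrative.

COMMANDS = {
    "wire feed speed": "wire_feed_speed",
    "polarity": "welding_polarity",
    "current": "output_current",
}

def parse_voice_command(utterance):
    """Parse e.g. 'set wire feed speed to 250' into a (parameter, value) pair."""
    text = utterance.lower().strip()
    for phrase, parameter in COMMANDS.items():
        prefix = f"set {phrase} to "
        if text.startswith(prefix):
            return parameter, text[len(prefix):]
    raise ValueError(f"unrecognized command: {utterance}")

parameter, value = parse_voice_command("set wire feed speed to 250")
```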


In accordance with an embodiment, the computerized eyewear device 150 and/or the welding power source 140 may be programmed with one or more welding software applications configured to accommodate use of the computerized eyewear device 150 with the arc welding system 100. For example, an embodiment of one welding software application may provide a “good weld” recognition capability. Similar to a facial recognition capability, the “good weld” recognition capability may use the camera 158 to acquire an image of a weld created by the user, analyze the image, and provide feedback to the user on the HUD 156 as to the overall external quality of the weld. For example, the text “poor weld”, “fair weld”, or “good weld” may be displayed to the user. The user may have to take off his welding helmet or lift a visor on the welding helmet to acquire an image of the weld. The welding software application may reside in the computerized eyewear device 150, the welding power source 140, or a combination of both, in accordance with various embodiments.
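The final feedback step of such a “good weld” recognition capability can be sketched as a mapping from a numeric quality score to the displayed text. The score is assumed to come from the image analysis, which is not implemented here; the thresholds are hypothetical.

```python
# Sketch of the feedback step of the "good weld" recognition feature: map a
# 0-100 weld quality score (assumed to come from image analysis, not shown)
# to the text displayed on the HUD. Thresholds are hypothetical.

def weld_quality_label(score):
    """Map a 0-100 weld quality score to the HUD feedback text."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 75:
        return "good weld"
    if score >= 50:
        return "fair weld"
    return "poor weld"
```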


As another example, an embodiment of a welding software application may provide a welding sequencing capability. When welding a part or assembly with many welds, it is not desirable for a welder to miss a weld. A welding software application may step a welder through the multiple welds for the part. For example, as a welder finishes a current weld on a part or assembly requiring multiple welds, the welder may give a voice command of “next weld”. As a result, the welding software application may display to the welder on the HUD 156 an image or graphic (e.g., a 3D representation of the part) providing the location of the next weld to be performed. The type of weld and other information associated with the weld may also be displayed. In accordance with an embodiment where the computerized eyewear device 150 is being spatially tracked, as discussed later herein, the welding software application may display a graphic on the HUD such that the graphic indicator is overlaid onto the assembly at the next location to be welded. Other types of welding software applications that operate with the computerized eyewear device are possible as well, in accordance with other embodiments.
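The sequencing behavior described above can be sketched as a small state machine that advances on each “next weld” command. The class name and weld descriptions are hypothetical, assumed only for illustration.

```python
# Sketch of the welding sequencer: step through the welds of a multi-weld
# part in order as the welder issues the "next weld" voice command.
# Class name and weld descriptions are hypothetical.

class WeldSequencer:
    """Tracks the current weld in a multi-weld part and serves HUD prompts."""

    def __init__(self, welds):
        self.welds = list(welds)
        self.index = -1  # no weld started yet

    def next_weld(self):
        """Advance to the next weld; return its HUD prompt, or None when done."""
        self.index += 1
        if self.index >= len(self.welds):
            return None  # all welds complete
        weld = self.welds[self.index]
        return f"Weld {self.index + 1} of {len(self.welds)}: {weld}"

# Example: a part requiring two fillet welds.
seq = WeldSequencer(["fillet, left bracket", "fillet, right bracket"])
first = seq.next_weld()
```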


In one embodiment, a virtual reality welding system is provided. The virtual reality welding system includes a programmable processor-based subsystem and a computerized eyewear device having a head-up display (HUD) and control and communication circuitry (CCC) operatively connected to the HUD. The computerized eyewear device is configured to be worn by a user as eye glasses are worn, and to wirelessly communicate with the programmable processor-based subsystem. The control and communication circuitry is configured to wirelessly receive information from the programmable processor-based subsystem and display the information on the HUD.


In accordance with an embodiment, the computerized eyewear device further includes a microphone operatively connected to the control and communication circuitry and configured to receive voice-activated user command information and wirelessly transmit the voice-activated user command information to the programmable processor-based subsystem. Alternatively, or in addition, the computerized eyewear device may include a touch-sensitive user interface operatively connected to the control and communication circuitry and configured to allow a user to select command information and wirelessly transmit the command information to the programmable processor-based subsystem.


In accordance with an embodiment, the computerized eyewear device includes a camera operatively connected to the control and communication circuitry. The camera and the control and communication circuitry are configured to capture one or more of still pictures and moving video. In accordance with an embodiment, the control and communication circuitry is configured to access the internet through a wireless access point.


In accordance with an embodiment, the computerized eyewear device includes a frame configured to be worn on the head of a user and at least one housing affixed to the frame containing one or more of the control and communication circuitry, the microphone, and the camera. The HUD is also affixed to the frame and is movable with respect to the frame through rotation about a first axis that extends parallel to a first brow portion. Optionally, the computerized eyewear device may include at least one prescription optical lens held in place by the frame.


In accordance with an embodiment, the frame includes a bridge configured to be supported on the nose of the user, a brow portion coupled to and extending away from the bridge to a first end remote therefrom and configured to be positioned over a first side of a brow of the user, and a first arm having a first end coupled to the first end of the brow portion and extending to a free end. The first arm is configured to be positioned over a first temple of the user with the free end disposed near a first ear of the user. In accordance with an embodiment, the bridge is adjustable for selective positioning of the brow portion relative to an eye of the user.


In accordance with an embodiment, the computerized eyewear device includes at least one motion sensing device operatively connected to the control and communication circuitry and configured to provide spatial information to the programmable processor-based subsystem as a user moves his head.



FIG. 3 illustrates a diagram of an exemplary embodiment of a virtual reality arc welding system 300 and a computerized eyewear device 150 configured to communicate with the virtual reality welding system 300. The virtual reality arc welding (VRAW) system includes a programmable processor-based subsystem, a spatial tracker operatively connected to the programmable processor-based subsystem, at least one mock welding tool capable of being spatially tracked by the spatial tracker, and at least one display device operatively connected to the programmable processor-based subsystem. In accordance with an embodiment, the computerized eyewear device 150 may also be spatially tracked by the spatial tracker. The system is capable of simulating, in a virtual reality space, a weld puddle having real-time molten metal fluidity and heat dissipation characteristics. The system is also capable of displaying the simulated weld puddle on the display device in real-time.


The system 300 includes a programmable processor-based subsystem (PPS) 310. The system 300 further includes a spatial tracker (ST) 320 operatively connected to the PPS 310. The system 300 also includes a physical welding user interface (WUI) 330 operatively connected to the PPS 310 as well as the computerized eyewear device 150 in operative wireless communication with the PPS 310 via a wireless communication circuit 145 of the PPS 310. The system 300 further includes an observer display device (ODD) 340 operatively connected to the PPS 310. The system 300 also includes at least one mock welding tool (MWT) 350 operatively connected to the ST 320 and the PPS 310. The system 300 further includes a table/stand (T/S) 360 and at least one welding coupon (WC) 370 capable of being attached to the T/S 360. In accordance with an alternative embodiment of the present invention, a mock gas bottle is provided (not shown) simulating a source of shielding gas and having an adjustable flow regulator.


In accordance with an embodiment, the computerized eyewear device 150 is configured as previously described herein. However, in this embodiment, the control and communication circuitry 157 provides two-way communication with the wireless communication circuit 145 of the PPS 310. Information may be provided from the PPS 310 to the computerized eyewear device 150 and displayed on the HUD 156.


Furthermore, in accordance with an embodiment, the control and communication circuitry 157 is configured to accept voice-activated commands from a user and transmit the commands to the PPS 310. Communication between the PPS 310 and the computerized eyewear device 150 may be accomplished by way of, for example, Bluetooth® radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), cellular technology (such as GSM, CDMA, UMTS, EVDO, WiMax, or LTE), or ZigBee® technology, among other possibilities.


Furthermore, in accordance with an embodiment, the PPS 310 may be accessible by the computerized eyewear device 150 via the Internet. For example, the control and communication circuitry 157 may be configured to access the Internet through a wireless hot spot (e.g., a smart phone or a wireless router) and access the PPS 310 therethrough. Alternatively, the PPS 310 may be configured to access the Internet and provide information obtained from the Internet to the computerized eyewear device 150.


As before, the user may wear a conventional welding helmet over the computerized eyewear device 150. However, since the welding scenario is a simulated welding scenario, the conventional welding helmet may be fitted with a transparent lens instead of a protective lens that protects against the light and other radiation emitted by a real arc. As such, the user may see through the transparent lens to view the welding coupon 370 and the mock welding tool 350, for example.


In accordance with an embodiment, the computerized eyewear device 150 is configured with an accelerometer device 160 that is operatively connected to the control and communication circuitry 157. Spatial information provided by the accelerometer device as the user moves his head is communicated to the PPS 310 and then to the spatial tracker 320. In this manner, the spatial relationship between the surrounding environment and what the user is seeing through the HUD 156 of the computerized eyewear device 150 may be correlated. As the user proceeds with the virtual welding process using the system 300, anything displayed on the HUD 156 (e.g., a virtual weld puddle) will appear overlaid onto, for example, the welding coupon 370 as the user views the welding coupon through the transparent lens of the conventional welding helmet. In accordance with other embodiments, other motion sensing devices besides that of an accelerometer device may be used. A calibration procedure may be initially performed to correlate the view of the user through the HUD to the surrounding environment, in accordance with an embodiment.
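The correlation and calibration described above can be illustrated with a minimal sketch: the user looks at a known reference point (e.g., the welding coupon) during calibration, and that raw head-pose reading becomes the zero pose against which later readings are measured, keeping overlays registered with the coupon. The class and method names are illustrative assumptions; the patent describes the concept, not this implementation.

```python
class HeadPoseCorrelator:
    """Hypothetical sketch of correlating the HUD view with the
    surrounding environment using motion-sensor readings."""

    def __init__(self):
        self.offset = (0.0, 0.0)  # yaw/pitch offset captured at calibration

    def calibrate(self, raw_yaw, raw_pitch):
        # The user looks at a known reference point; the current raw
        # reading becomes the zero pose.
        self.offset = (raw_yaw, raw_pitch)

    def world_to_hud(self, raw_yaw, raw_pitch):
        # Subtract the calibration offset so that overlays (such as a
        # virtual weld puddle) stay registered with the coupon as the
        # user's head moves.
        return (raw_yaw - self.offset[0], raw_pitch - self.offset[1])
```

A full system would use a richer pose representation (quaternions, position as well as orientation), but the calibrate-then-offset pattern is the same.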


The real-time molten metal fluidity and heat dissipation characteristics of the simulated weld puddle provide real-time visual feedback to a user of the mock welding tool when displayed (e.g., on the HUD of the computerized eyewear device 150 as tracked by the spatial tracker 320), allowing the user to adjust or maintain a welding technique in real-time in response to the real-time visual feedback (i.e., helps the user learn to weld correctly). When the computerized eyewear device 150 is being spatially tracked, the weld puddle will appear at a correct location with respect to the welding coupon as viewed through the HUD.


The displayed weld puddle is representative of a weld puddle that would be formed in the real-world based on the user's welding technique and the selected welding process and parameters. By viewing a puddle (e.g., shape, color, slag, size, stacked dimes), a user can modify his technique to make a good weld and determine the type of welding being done. The shape of the puddle is responsive to the movement of the gun or stick.


The term “real-time”, as used herein with respect to a virtual reality or simulated environment, means perceiving and experiencing in time in a virtual or simulated environment in the same way that a user would perceive and experience in a real-world welding scenario. Furthermore, the weld puddle is responsive to the effects of the physical environment including gravity, allowing a user to realistically practice welding in various positions including overhead welding and various pipe welding angles (e.g., 1G, 2G, 5G, 6G).


Information that may be useful to display to a welding student on the HUD 156 during a virtual or simulated welding scenario may be in the form of text, an image, or a graphic. Such information may include, for example, the arc welding process, a welding tool travel angle, a welding tool travel speed, a tip-to-work distance, a set wire feed speed, a set welding polarity, a simulated output voltage level, a set output current level, a simulated arc length, a dime spacing, a whip time, a puddle time, a width of weave, a weave spacing, a tolerance window, a number score, and welding sequence steps. Other information may be displayed as well, in accordance with other embodiments.


In accordance with an embodiment, the computerized eyewear device 150 includes a microphone 159 that is operatively connected to the control and communication circuitry 157 for receiving voice-activated commands from a user. The voice-activated commands initiated by a welder that may be accommodated by the computerized eyewear device 150 in communication with the PPS 310 may include, for example, commands to change a welding parameter such as a simulated wire feed speed, a simulated welding polarity, and a simulated welding output current level. Other types of commands may be possible as well, in accordance with other embodiments.
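A simple way to picture the command path is a lookup from a recognized phrase to a parameter-change request forwarded to the PPS 310. The phrases, parameter names, and step sizes below are invented for illustration; the patent does not enumerate a command vocabulary.

```python
# Hypothetical mapping from recognized voice phrases to welding-parameter
# change requests (parameter name, delta). All entries are illustrative.
COMMANDS = {
    "increase wire feed speed": ("wire_feed_speed", +10),
    "decrease wire feed speed": ("wire_feed_speed", -10),
    "increase current": ("output_current", +5),
    "decrease current": ("output_current", -5),
}

def parse_voice_command(phrase):
    """Return a (parameter, delta) request for the PPS, or None if the
    phrase is not a recognized command."""
    return COMMANDS.get(phrase.strip().lower())
```

The speech-recognition front end itself is outside this sketch; only the recognized phrase-to-request step is shown.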


In accordance with an embodiment, the computerized eyewear device 150 and/or the PPS 310 may be programmed with one or more welding training software applications configured to accommodate use of the computerized eyewear device 150 with the virtual reality arc welding system 300. For example, an embodiment of one welding software application may provide a “good weld” recognition capability. Similar to a facial recognition capability, the “good weld” recognition capability may use an image of a simulated weld created by the user, analyze the image, and provide feedback to the user on the HUD 156 as to the overall external quality of the weld. For example, the text “poor weld”, “fair weld”, or “good weld” may be displayed to the user. The welding software application may reside in the computerized eyewear device 150, the PPS 310, or a combination of both, in accordance with various embodiments.
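The “good weld” feedback step can be sketched as a function mapping image-derived features to the displayed text. The features (bead-width variation, porosity count) and thresholds here are purely invented for illustration; the patent describes the feedback text but not how the image analysis is performed.

```python
def grade_weld(bead_width_variation, porosity_count):
    """Toy scoring rule mapping image-derived weld features to the
    feedback text shown on the HUD. Features and thresholds are
    assumptions made for this sketch, not from the patent."""
    if porosity_count == 0 and bead_width_variation < 0.1:
        return "good weld"
    if porosity_count <= 2 and bead_width_variation < 0.3:
        return "fair weld"
    return "poor weld"
```

In a real system the features would come from an image-analysis stage (edge detection, texture measures, or a trained model) run on the captured picture of the simulated weld.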


As another example, an embodiment of a welding software application may provide a welding sequencing capability. As a welder finishes a current simulated weld on a welding coupon requiring multiple welds, the welder may give a voice command of “next weld”. As a result, the welding software application may display to the welder on the HUD 156 an image or graphic providing the location of the next weld to be performed. The type of weld and other information associated with the weld may also be displayed. In accordance with an embodiment where the computerized eyewear device 150 is being spatially tracked, as discussed herein, the welding software application may display a graphic on the HUD such that the graphic is overlaid onto the welding coupon at the next location to be welded. Other types of welding software applications that operate with the computerized eyewear device are possible as well, in accordance with other embodiments.


The computerized eyewear device 150 may be configured to be used with other welding simulation systems in accordance with other embodiments. For example, welding simulations performed on a personal computer (PC) or a tablet computer may be communicatively and functionally integrated with the computerized eyewear device 150 to aid a welding student in learning how to weld. In some simulated and/or virtual welding environments, a welding student may not wear a welding helmet of any kind. Instead, the computerized eyewear device may be the only head gear worn. One optional embodiment of the computerized eyewear device may provide a touch-sensitive user interface (TSUI) 161 which the welding student can use instead of or in addition to voice-activated commands. Such a TSUI would be accessible to the welding student when not wearing a welding helmet, for example. In accordance with an embodiment, the TSUI 161 is operatively connected to the control and communication circuitry 157.



FIG. 4 illustrates a diagram of an exemplary embodiment of the arc welding system 100 of FIG. 1 and the computerized eyewear device 150 of FIG. 1 or FIG. 2, for example, configured to communicate with each other via an internet-of-things (IoT) technology platform 400. The arc welding system 100, having a welding power source 140, and the computerized eyewear device 150 may exist in a real world welding environment (e.g., a manufacturing facility). The term “real world” is used herein to refer to an actual welding environment as opposed to a virtual welding environment. The IoT technology platform 400 may also exist in the real world welding environment along with the welding power source 140, the computerized eyewear device 150, and possibly other devices (e.g., other welding power sources, other computerized eyewear devices, etc.). In an alternative embodiment, a portion of the IoT technology platform 400 exists within the welding environment and another portion of the IoT technology platform 400 exists externally to the welding environment (e.g. as part of a server farm remotely located from the welding environment). That is, the IoT technology platform 400 may be distributed across several real world environments.


The IoT technology platform 400 provides scalable, interoperable, and secure wired and/or wireless communication connections (e.g., via WebSockets) between multiple disparate devices within the real world welding environment. The IoT technology platform 400 enables protocol-independent deployment of the multiple disparate devices within the real world welding environment. That is, devices that may communicate using different communication protocols can be accommodated by the IoT technology platform 400, allowing the disparate devices to communicate with each other through the IoT technology platform 400. The IoT technology platform 400 is configured to handle message routing and translation between the multiple disparate devices and allow a developer to build, run, and grow applications to control and report data to and from any of the multiple disparate devices. An example of an IoT technology platform is provided by ThingWorx®.
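The routing-and-translation role described above can be sketched as a hub where each device registers translators to and from a common message form, and the hub forwards messages by device identifier. This is a minimal illustrative sketch under assumed names (`MessageRouter`, `register`, `send`); a real IoT platform such as ThingWorx provides this along with security, scaling, and application tooling.

```python
class MessageRouter:
    """Hypothetical sketch of the message routing and translation the
    patent assigns to the IoT technology platform: each disparate device
    registers translators between its native payload format and a common
    form, and the router forwards and translates messages between them."""

    def __init__(self):
        # device id -> (inbox, to_common translator, from_common translator)
        self.devices = {}

    def register(self, device_id, to_common, from_common):
        self.devices[device_id] = ([], to_common, from_common)

    def send(self, src, dst, payload):
        # Translate the sender's native payload into the common form,
        # then into the recipient's native form, and deliver it.
        _, src_to_common, _ = self.devices[src]
        inbox, _, dst_from_common = self.devices[dst]
        inbox.append(dst_from_common(src_to_common(payload)))
```

For example, a power source speaking dictionaries and an eyewear device speaking `key=value` strings can exchange messages without either knowing the other's format.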


As shown in FIG. 4, the welding power source 140 of the arc welding system 100, being one of the multiple disparate devices, is configured to wirelessly communicate (two-way) with the IoT technology platform 400. In one embodiment, the welding power source 140 is an inverter-based welding power source that supports at least one of a gas metal arc welding (GMAW) operation, a gas tungsten arc welding (GTAW) operation, or a shielded metal arc welding (SMAW) operation. The computerized eyewear device 150, being one of the multiple disparate devices, includes a control and communication circuitry 157 and a transparent display 156.


In one embodiment, the control and communication circuitry 157 is configured to wirelessly communicate (two-way) with the welding power source 140 via the IoT technology platform 400. The transparent display 156 is configured to display information received by the control and communication circuitry 157 from the welding power source 140 via the IoT technology platform 400 while allowing a user to view a surrounding portion of the real world welding environment through the transparent display. That is, the computerized eyewear device 150 is configured to provide an augmented reality capability via at least the transparent display. For example, the information displayed by the transparent display 156 may be in the form of any of text, graphics, or images and may include welding parameters received from the welding power source 140 via the IoT technology platform 400.


In one embodiment, the computerized eyewear device 150 includes a microphone 159 (see FIG. 2) operatively connected to the control and communication circuitry 157. Together, the microphone 159 and the control and communication circuitry 157 are configured to receive voice-activated user command information from the user and communicate the voice-activated user command information to the welding power source 140 via the internet-of-things (IoT) technology platform 400.


In one embodiment, the computerized eyewear device 150 includes a camera 158 operatively connected to the control and communication circuitry 157. Together, the camera 158 and the control and communication circuitry 157 are configured to capture at least one still image or moving video of the real world welding environment during a welding operation from the point-of-view of the user and communicate the at least one still image or video to the internet-of-things (IoT) technology platform 400 for recording and storage.


In one embodiment, the computerized eyewear device 150 includes a touch-sensitive user interface 161 (see FIG. 2) operatively connected to the control and communication circuitry 157. Together, the touch-sensitive user interface 161 and the control and communication circuitry 157 are configured to allow a user to select command information and provide the command information to the welding power source 140 via the internet-of-things (IoT) technology platform 400 to control the welding power source 140.



FIG. 5 illustrates a diagram of an exemplary embodiment of the internet-of-things (IoT) technology platform 400 of FIG. 4 having multiple server computers 410. The multiple server computers 410 support communication and interaction between the multiple disparate devices within the welding environment. FIG. 6 illustrates a diagram of an exemplary embodiment of one server computer 410 (of the multiple server computers), of the internet-of-things (IoT) technology platform 400 of FIG. 5, having a connection server application 420. In one embodiment, the connection server application 420 is configured to support the scalable, interoperable, and secure communication connections between the multiple disparate devices, enabling protocol-independent deployment of the multiple disparate devices within the real world welding environment. The connection server application 420 may include, for example, software or a combination of software and hardware, in accordance with various embodiments.



FIG. 7 illustrates a diagram of an exemplary embodiment of a real world welding environment 700 having multiple disparate devices (e.g., multiple welding power sources 140 and multiple computerized eyewear devices 150) that are configured to communicate with each other via an internet-of-things (IoT) technology platform 400. As illustrated in FIG. 7, communication between the IoT technology platform 400 and any of the multiple disparate devices is via wireless means. In alternative embodiments, wired means and/or a combination of wired and wireless means may be employed. The IoT technology platform 400 facilitates communication between the multiple disparate devices. For example, in one embodiment, any disparate device may communicate with any other disparate device within the welding environment 700 via the IoT technology platform.


The multiple disparate devices in the real world welding environment 700 have disparate wireless communication capabilities. Wireless communication supported by the disparate devices and the IoT technology platform 400 may be via any of, for example, Bluetooth® radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), cellular technology (such as GSM, CDMA, UMTS, EVDO, WiMax, or LTE), or ZigBee® technology, among other possibilities. Again, the IoT technology platform 400 is configured to enable protocol-independent deployment of the multiple disparate devices within the real world welding environment at least by handling message routing and translation between the multiple disparate devices.


The disparate devices in the real world welding environment 700 may also include one or more welding wire feeders 110 (see FIG. 1), one or more welding guns or torches 120 (see FIG. 1), one or more gas meters/sensors 131 operatively connected to one or more tanks of shielding gas 130 (see FIG. 1), one or more mobile phone devices 710 (“smart” phones), one or more welding helmets 720 (“smart” welding helmets), and one or more welding fume extractors 730 (“smart” fume extractors). The term “smart” is used herein to refer to devices that have data communication capability and at least some limited capability to process and/or analyze the data which is communicated. Other types of disparate devices, in the real world welding environment 700, are possible as well, in accordance with other embodiments.


Any of the multiple disparate devices in the real world welding environment 700 may include one or more sensors (e.g., gas meter/sensor 131) to sense one or more corresponding parameters associated with the multiple disparate devices. The parameters may be communicated to the IoT technology platform 400 (e.g., using the connection server application 420). The parameters may include, for example, a temperature parameter, a pressure parameter, a humidity parameter, a voltage parameter, a current parameter, a wire feed speed parameter, a flow rate parameter, a spatial position parameter, a spatial orientation parameter, or a travel speed parameter. Other types of sensors and corresponding parameters are possible as well, in accordance with other embodiments.
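A sensed parameter reaching the platform can be pictured as a small structured message. The JSON schema below (field names and encoding) is an assumption for illustration; the patent does not specify a wire format.

```python
import json

def sensor_message(device_id, parameter, value, unit):
    """Package a sensed parameter (e.g., a shielding-gas flow rate) as a
    JSON message for the IoT technology platform. The schema is an
    illustrative assumption, not specified by the patent."""
    return json.dumps({
        "device": device_id,
        "parameter": parameter,
        "value": value,
        "unit": unit,
    })
```

Each device would emit such messages over whatever transport it supports (Bluetooth, 802.11, cellular, ZigBee), with the platform handling routing and any needed translation.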


In summary, systems and methods to aid a welder or welding student are provided. A system may include a real-world arc welding system or a virtual reality arc welding system along with a computerized eyewear device having a head-up display (HUD). The computerized eyewear device may be worn by a user under a conventional welding helmet as eye glasses are worn and may wirelessly communicate with a welding power source of a real-world arc welding system or a programmable processor-based subsystem of a virtual reality arc welding system.


A system to support communication and control in a welding environment is also disclosed. In one embodiment the system includes an internet-of-things (IoT) technology platform configured to provide scalable, interoperable, and secure communication connections between a plurality of disparate devices within a welding environment. The system also includes a welding power source configured to communicate with the IoT technology platform. The system further includes a computerized eyewear device. The computerized eyewear device includes a control and communication circuitry configured to communicate with the welding power source via the IoT technology platform. The computerized eyewear device also includes a transparent display configured to display information received from the welding power source via the IoT technology platform while allowing a user to view a surrounding portion of the welding environment through the transparent display.


In the appended claims, the terms “including” and “having” are used as the plain language equivalents of the term “comprising”; the term “in which” is equivalent to “wherein.” Moreover, in the appended claims, the terms “first,” “second,” “third,” “upper,” “lower,” “bottom,” “top,” etc. are used merely as labels, and are not intended to impose numerical or positional requirements on their objects. Further, the limitations of the appended claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112, sixth paragraph, unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure. As used herein, an element or step recited in the singular and preceded by the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to “one embodiment” of the present invention are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments “comprising,” “including,” or “having” an element or a plurality of elements having a particular property may include additional such elements not having that property. Moreover, certain embodiments may be shown as having like or similar elements; however, this is merely for illustration purposes, and such embodiments need not necessarily have the same elements unless specified in the claims.


As used herein, the terms “may” and “may be” indicate a possibility of an occurrence within a set of circumstances; a possession of a specified property, characteristic or function; and/or qualify another verb by expressing one or more of an ability, capability, or possibility associated with the qualified verb. Accordingly, usage of “may” and “may be” indicates that a modified term is apparently appropriate, capable, or suitable for an indicated capacity, function, or usage, while taking into account that in some circumstances the modified term may sometimes not be appropriate, capable, or suitable. For example, in some circumstances an event or capacity can be expected, while in other circumstances the event or capacity cannot occur—this distinction is captured by the terms “may” and “may be.”


This written description uses examples to disclose the invention, including the best mode, and also to enable one of ordinary skill in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The scope of the invention is defined by the claims, and may include other examples that occur to one of ordinary skill in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differentiate from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.


While the invention of the present application has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims
  • 1. A welding system, comprising: an internet-of-things (IoT) technology platform including at least one server computer having a connection server application, wherein the internet-of-things (IoT) technology platform includes means for providing scalable, interoperable, and secure wireless communication connections between a plurality of disparate devices, and includes means for enabling protocol-independent deployment of the plurality of disparate devices; at least one welding power source, being at least one of the plurality of disparate devices, including means for wirelessly communicating, two-way, with the internet-of-things (IoT) technology platform using the connection server application, wherein the at least one welding power source is an inverter-based welding power source that includes means for supporting at least one of a gas metal arc welding (GMAW) operation, a gas tungsten arc welding (GTAW) operation, or a shielded metal arc welding (SMAW) operation; and at least one computerized eyewear device, being at least one of the plurality of disparate devices, including: a control and communication circuitry, having a processor and a memory, configured to wirelessly communicate, two-way, with the at least one welding power source via the internet-of-things (IoT) technology platform using the connection server application, and a transparent display configured to display information received by the control and communication circuitry from the at least one welding power source via the internet-of-things (IoT) technology platform using the connection server application while allowing a user to view a surrounding portion of a real world welding environment through the transparent display.
  • 2. The welding system of claim 1, wherein the means for providing scalable, interoperable, and secure communication connections between the plurality of disparate devices includes WebSockets.
  • 3. The system of claim 1, wherein the internet-of-things (IoT) technology platform includes means for handling message routing and translation between the plurality of disparate devices.
  • 4. The system of claim 1, wherein the internet-of-things (IoT) technology platform includes means for allowing a developer to build, run, and grow applications to control and report data to and from any of the plurality of disparate devices.
  • 5. The system of claim 1, wherein the information displayed by the transparent display is in the form of at least one of text, a graphic, an image, or a video.
  • 6. The system of claim 1, wherein the information displayed by the transparent display includes at least one welding parameter received from the at least one welding power source via the internet-of-things (IoT) technology platform.
  • 7. The system of claim 1, wherein the at least one computerized eyewear device includes a microphone operatively connected to the control and communication circuitry and configured to receive voice-activated user command information from the user and communicate the voice-activated user command information to the at least one welding power source via the internet-of-things (IoT) technology platform.
  • 8. The system of claim 1, wherein the at least one computerized eyewear device includes a camera operatively connected to the control and communication circuitry and configured to capture at least one still image or moving video of the real world welding environment during a welding operation from the point-of-view of the user and communicate the at least one still image or video to the internet-of-things (IoT) technology platform for recording and storage.
  • 9. The system of claim 1, wherein the at least one computerized eyewear device includes a touch-sensitive user interface operatively connected to the control and communication circuitry and configured to allow a user to select command information and provide the command information to the at least one welding power source via the internet-of-things (IoT) technology platform to control the at least one welding power source.
  • 10. A welding system, comprising: a plurality of disparate devices within a real world welding environment having disparate wireless communication capabilities; and an internet-of-things (IoT) technology platform including at least one server computer having a connection server application, wherein the internet-of-things (IoT) technology platform includes means for providing scalable, interoperable, and secure wireless communication connections between the plurality of disparate devices, and includes means for enabling protocol-independent deployment of the plurality of disparate devices within the real world welding environment, wherein the plurality of disparate devices includes at least one welding power source including means for wirelessly communicating, two-way, with the internet-of-things (IoT) technology platform using the connection server application, and wherein the plurality of disparate devices includes at least one computerized eyewear device including: a control and communication circuitry, having a processor and a memory, configured to wirelessly communicate, two-way, with the at least one welding power source via the internet-of-things (IoT) technology platform using the connection server application, and a transparent display configured to display information received by the control and communication circuitry from the at least one welding power source via the internet-of-things (IoT) technology platform using the connection server application while allowing a user to view a surrounding portion of the real world welding environment through the transparent display.
  • 11. The system of claim 10, wherein the at least one computerized eyewear device includes means for providing an augmented reality capability via at least the transparent display.
  • 12. The system of claim 10, wherein the plurality of disparate devices includes at least one welding wire feeder.
  • 13. The system of claim 10, wherein the plurality of disparate devices includes at least one welding gun or torch.
  • 14. The system of claim 10, wherein the plurality of disparate devices includes at least one gas meter operatively connected to at least one tank of shielding gas.
  • 15. The system of claim 10, wherein the plurality of disparate devices includes at least one mobile phone device.
  • 16. The system of claim 10, wherein the plurality of disparate devices includes at least one welding helmet.
  • 17. The system of claim 10, wherein the plurality of disparate devices includes at least one welding fume extractor.
  • 18. The system of claim 10, wherein at least one of the plurality of disparate devices includes at least one sensor configured to sense at least one parameter associated with the at least one of the plurality of disparate devices and communicate the at least one parameter to the internet-of-things (IoT) technology platform using the connection server application.
  • 19. The system of claim 18, wherein the at least one parameter includes at least one of a temperature parameter, a pressure parameter, a humidity parameter, a voltage parameter, a current parameter, a wire feed speed parameter, a flow rate parameter, a spatial position parameter, a spatial orientation parameter, or a travel speed parameter.
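Claims 18 and 19 describe disparate devices sensing parameters (temperature, voltage, wire feed speed, and so on) and communicating them to the IoT technology platform through its connection server application. The patent specifies no wire format, device names, or APIs, so the following is purely an illustrative sketch under assumed conventions: the device identifier, JSON encoding, and `SensorReading` structure are hypothetical, chosen only to make the claimed data flow concrete.

```python
import json
from dataclasses import dataclass, asdict

# Parameter names drawn from claim 19; the set and spellings here are
# illustrative assumptions, not part of the patent.
ALLOWED_PARAMETERS = {
    "temperature", "pressure", "humidity", "voltage", "current",
    "wire_feed_speed", "flow_rate", "spatial_position",
    "spatial_orientation", "travel_speed",
}

@dataclass
class SensorReading:
    """One parameter sensed by a disparate device (cf. claims 18-19)."""
    device_id: str   # hypothetical identifier, e.g. a wire feeder or gas meter
    parameter: str   # one of ALLOWED_PARAMETERS
    value: float
    unit: str

    def to_message(self) -> str:
        """Serialize the reading for the platform's connection server.

        JSON is an assumed encoding; the patent does not name one.
        """
        if self.parameter not in ALLOWED_PARAMETERS:
            raise ValueError(f"unknown parameter: {self.parameter}")
        return json.dumps(asdict(self))

# Example: a wire feeder reporting its wire feed speed.
reading = SensorReading("wire_feeder_01", "wire_feed_speed", 250.0, "ipm")
msg = reading.to_message()
```

A transparent-display eyewear device subscribing to such messages via the platform could then render the received parameter values while the welder views the surrounding environment, as claim 10 describes.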
CROSS-REFERENCE TO RELATED APPLICATIONS AND INCORPORATION BY REFERENCE

This application is a continuation-in-part (CIP) of U.S. application Ser. No. 15/245,535, filed on Aug. 24, 2016, entitled “SYSTEMS AND METHODS PROVIDING A COMPUTERIZED EYEWEAR DEVICE TO AID IN WELDING,” which is a continuation of U.S. application Ser. No. 14/105,758, filed Dec. 13, 2013, which claims priority to and the benefit of U.S. provisional patent application Ser. No. 61/827,248 filed on May 24, 2013. The entireties of the aforementioned applications are incorporated herein by reference. U.S. Pat. No. 9,285,592, entitled “WEARABLE DEVICE WITH INPUT AND OUTPUT STRUCTURES,” filed on Aug. 18, 2011, and which issued on Mar. 15, 2016, is incorporated by reference herein in its entirety. U.S. Pat. No. 8,747,116, entitled “SYSTEM AND METHOD PROVIDING ARC WELDING TRAINING IN A REAL-TIME SIMULATED VIRTUAL REALITY ENVIRONMENT USING REAL-TIME WELD PUDDLE FEEDBACK,” filed on Jul. 10, 2009, and which issued on Jun. 10, 2014, is incorporated by reference herein in its entirety.

US Referenced Citations (407)
Number Name Date Kind
317063 Wittenstrom May 1885 A
428459 Coffin May 1890 A
483428 Coffin Sep 1892 A
1159119 Springer Nov 1915 A
D140630 Garibay Mar 1945 S
D142377 Dunn Sep 1945 S
D152049 Welch, Jr. Dec 1948 S
2681969 Burke Jun 1954 A
D174208 Abildgaard Mar 1955 S
2728838 Barnes Dec 1955 A
D176942 Cross Feb 1956 S
2894086 Rizer Jul 1959 A
3035155 Hawk May 1962 A
3059519 Stanton Oct 1962 A
3356823 Waters et al. Dec 1967 A
3555239 Kerth Jan 1971 A
3621177 McPherson et al. Nov 1971 A
3654421 Streetman et al. Apr 1972 A
3739140 Rotilio Jun 1973 A
3866011 Cole Feb 1975 A
3867769 Schow et al. Feb 1975 A
3904845 Minkiewicz Sep 1975 A
3988913 Metcalfe et al. Nov 1976 A
D243459 Bliss Feb 1977 S
4024371 Drake May 1977 A
4041615 Whitehill Aug 1977 A
D247421 Driscoll Mar 1978 S
4124944 Blair Nov 1978 A
4132014 Schow Jan 1979 A
4237365 Lambros et al. Dec 1980 A
4280041 Kiessling et al. Jul 1981 A
4280042 Berger et al. Jul 1981 A
4280137 Ashida et al. Jul 1981 A
4314125 Nakamura Feb 1982 A
4354087 Osterlitz Oct 1982 A
4359622 Dostoomian et al. Nov 1982 A
4375026 Kearney Feb 1983 A
4410787 Kremers et al. Oct 1983 A
4429266 Tradt Jan 1984 A
4452589 Denison Jun 1984 A
D275292 Bouman Aug 1984 S
D277761 Korovin et al. Feb 1985 S
4525619 Ide et al. Jun 1985 A
D280329 Bouman Aug 1985 S
4611111 Baheti et al. Sep 1986 A
4616326 Meier et al. Oct 1986 A
4629860 Lindbom Dec 1986 A
4677277 Cook et al. Jun 1987 A
4680014 Paton et al. Jul 1987 A
4689021 Vasiliev et al. Aug 1987 A
4707582 Beyer Nov 1987 A
4716273 Paton et al. Dec 1987 A
D297704 Bulow Sep 1988 S
4867685 Brush et al. Sep 1989 A
4877940 Bangs et al. Oct 1989 A
4897521 Burr Jan 1990 A
4907973 Hon Mar 1990 A
4931018 Herbst et al. Jun 1990 A
4973814 Kojima et al. Nov 1990 A
4998050 Nishiyama et al. Mar 1991 A
5034593 Rice et al. Jul 1991 A
5061841 Richardson Oct 1991 A
5089914 Prescott Feb 1992 A
5192845 Kirmsse et al. Mar 1993 A
5206472 Myking et al. Apr 1993 A
5266930 Ichikawa et al. Nov 1993 A
5285916 Ross Feb 1994 A
5305183 Teynor Apr 1994 A
5320538 Baum Jun 1994 A
5337611 Fleming et al. Aug 1994 A
5360156 Ishizaka et al. Nov 1994 A
5360960 Shirk Nov 1994 A
5370071 Ackermann Dec 1994 A
D359296 Witherspoon Jun 1995 S
5424634 Goldfarb et al. Jun 1995 A
5436638 Bolas et al. Jul 1995 A
5464957 Kidwell et al. Nov 1995 A
D365583 Viken Dec 1995 S
5562843 Yasumoto Oct 1996 A
5662822 Tada Sep 1997 A
5670071 Ueyama et al. Sep 1997 A
5676503 Lang Oct 1997 A
5676867 Van Allen Oct 1997 A
5708253 Bloch et al. Jan 1998 A
5710405 Solomon et al. Jan 1998 A
5719369 White et al. Feb 1998 A
D392534 Degen et al. Mar 1998 S
5728991 Takada et al. Mar 1998 A
5751258 Fergason et al. May 1998 A
D395296 Kaye et al. Jun 1998 S
D396238 Schmitt Jul 1998 S
5781258 Dabral et al. Jul 1998 A
5823785 Matherne, Jr. Oct 1998 A
5835077 Dao et al. Nov 1998 A
5835277 Hegg Nov 1998 A
5845053 Watanabe et al. Dec 1998 A
5877777 Colwell Mar 1999 A
5916464 Geiger Jun 1999 A
5963891 Walker et al. Oct 1999 A
6008470 Zhang et al. Dec 1999 A
6037948 Liepa Mar 2000 A
6049059 Kim Apr 2000 A
6051805 Vaidya et al. Apr 2000 A
6114645 Burgess Sep 2000 A
6155475 Ekelof et al. Dec 2000 A
6155928 Burdick Dec 2000 A
6230327 Briand et al. May 2001 B1
6236013 Delzenne May 2001 B1
6236017 Smartt et al. May 2001 B1
6242711 Cooper Jun 2001 B1
6271500 Hirayama et al. Aug 2001 B1
6330938 Herve et al. Dec 2001 B1
6330966 Eissfeller Dec 2001 B1
6331848 Stove et al. Dec 2001 B1
D456428 Aronson, II et al. Apr 2002 S
6373465 Jolly et al. Apr 2002 B2
D456828 Aronson, II et al. May 2002 S
D461383 Blackburn Aug 2002 S
6441342 Hsu Aug 2002 B1
6445964 White et al. Sep 2002 B1
6492618 Flood et al. Dec 2002 B1
6506997 Matsuyama Jan 2003 B2
6552303 Blankenship et al. Apr 2003 B1
6560029 Dobbie et al. May 2003 B1
6563489 Latypov et al. May 2003 B1
6568846 Cote et al. May 2003 B1
D475726 Suga et al. Jun 2003 S
6572379 Sears et al. Jun 2003 B1
6583386 Ivkovich Jun 2003 B1
6621049 Suzuki Sep 2003 B2
6624388 Blankenship et al. Sep 2003 B1
D482171 Vui et al. Nov 2003 S
6647288 Madill et al. Nov 2003 B2
6649858 Wakeman Nov 2003 B2
6655645 Lu et al. Dec 2003 B1
6660965 Simpson Dec 2003 B2
6697701 Hillen et al. Feb 2004 B2
6697770 Nagetgaal Feb 2004 B1
6703585 Suzuki Mar 2004 B2
6708385 Lemelson Mar 2004 B1
6710298 Eriksson Mar 2004 B2
6710299 Blankenship et al. Mar 2004 B2
6715502 Rome et al. Apr 2004 B1
D490347 Meyers May 2004 S
6730875 Hsu May 2004 B2
6734393 Friedl et al. May 2004 B1
6744011 Hu et al. Jun 2004 B1
6750428 Okamoto et al. Jun 2004 B2
6765584 Wloka et al. Jul 2004 B1
6768974 Nanjundan et al. Jul 2004 B1
6772802 Few Aug 2004 B2
6788442 Potin et al. Sep 2004 B1
6795778 Dodge et al. Sep 2004 B2
6798974 Nakano et al. Sep 2004 B1
6857553 Hartman et al. Feb 2005 B1
6858817 Blankenship et al. Feb 2005 B2
6865926 O'Brien et al. Mar 2005 B2
D504449 Butchko Apr 2005 S
6920371 Hillen et al. Jul 2005 B2
6940037 Kovacevic et al. Sep 2005 B1
6940039 Blankenship et al. Sep 2005 B2
7021937 Simpson et al. Apr 2006 B2
7024342 Waite et al. Apr 2006 B1
7126078 Demers et al. Oct 2006 B2
7132617 Lee et al. Nov 2006 B2
7170032 Flood Jan 2007 B2
7194447 Harvey et al. Mar 2007 B2
7247814 Ott Jul 2007 B2
D555446 Picaza Ibarrondo et al. Nov 2007 S
7315241 Daily et al. Jan 2008 B1
D561973 Kinsley et al. Feb 2008 S
7353715 Myers Apr 2008 B2
7363137 Brant et al. Apr 2008 B2
7375304 Kainec et al. May 2008 B2
7381923 Gordon et al. Jun 2008 B2
7414595 Muffler Aug 2008 B1
7465230 LeMay et al. Dec 2008 B2
7478108 Townsend et al. Jan 2009 B2
D587975 Aronson, II et al. Mar 2009 S
7516022 Lee et al. Apr 2009 B2
7557327 Matthews Jul 2009 B2
7580821 Schirm et al. Aug 2009 B2
D602057 Osicki Oct 2009 S
7621171 O'Brien Nov 2009 B2
D606102 Bender et al. Dec 2009 S
7643890 Hillen et al. Jan 2010 B1
7687741 Kainec et al. Mar 2010 B2
D614217 Peters et al. Apr 2010 S
D615573 Peters et al. May 2010 S
7817162 Bolick et al. Oct 2010 B2
7853645 Brown et al. Dec 2010 B2
D631074 Peters et al. Jan 2011 S
7874921 Baszucki et al. Jan 2011 B2
7970172 Hendrickson Jun 2011 B1
7972129 O'Donoghue Jul 2011 B2
7991587 Ihn Aug 2011 B2
8069017 Hallquist Nov 2011 B2
8224881 Spear et al. Jul 2012 B1
8248324 Nangle Aug 2012 B2
8265886 Bisiaux et al. Sep 2012 B2
8274013 Wallace Sep 2012 B2
8287522 Moses et al. Oct 2012 B2
8316462 Becker et al. Nov 2012 B2
8363048 Gering Jan 2013 B2
8365603 Lesage et al. Feb 2013 B2
8512043 Choquet Aug 2013 B2
8569646 Daniel et al. Oct 2013 B2
8680434 Stoger et al. Mar 2014 B2
8747116 Zboray et al. Jun 2014 B2
8777629 Kreindl et al. Jul 2014 B2
RE45062 Maguire, Jr. Aug 2014 E
8851896 Wallace et al. Oct 2014 B2
8860760 Chen Oct 2014 B2
8915740 Zboray Dec 2014 B2
RE45398 Wallace Mar 2015 E
8992226 Leach et al. Mar 2015 B1
9011154 Kindig et al. Apr 2015 B2
9293056 Zboray et al. Mar 2016 B2
9293057 Zboray et al. Mar 2016 B2
9318026 Peters et al. Apr 2016 B2
9323056 Williams Apr 2016 B2
9522437 Pfeifer Dec 2016 B2
20010045808 Hietmann et al. Nov 2001 A1
20010052893 Jolly et al. Dec 2001 A1
20020032553 Simpson et al. Mar 2002 A1
20020046999 Veikkolainen et al. Apr 2002 A1
20020050984 Roberts May 2002 A1
20020085843 Mann Jul 2002 A1
20020175897 Pelosi Nov 2002 A1
20030000931 Ueda et al. Jan 2003 A1
20030011673 Eriksson Jan 2003 A1
20030025884 Hamana et al. Feb 2003 A1
20030075534 Okamoto et al. Apr 2003 A1
20030106787 Santilli Jun 2003 A1
20030111451 Blankenship et al. Jun 2003 A1
20030172032 Choquet Sep 2003 A1
20030186199 McCool et al. Oct 2003 A1
20030223592 Deruginsky et al. Dec 2003 A1
20030234885 Pilu Dec 2003 A1
20040020907 Zauner et al. Feb 2004 A1
20040035990 Ackeret Feb 2004 A1
20040050824 Samler Mar 2004 A1
20040088071 Kouno et al. May 2004 A1
20040140301 Blankenship et al. Jul 2004 A1
20040181382 Hu et al. Sep 2004 A1
20040217096 Lipnevicius Nov 2004 A1
20050007504 Fergason Jan 2005 A1
20050017152 Fergason Jan 2005 A1
20050029326 Henrickson Feb 2005 A1
20050046584 Breed Mar 2005 A1
20050050168 Wen et al. Mar 2005 A1
20050101767 Clapham et al. May 2005 A1
20050103766 Iizuka et al. May 2005 A1
20050103767 Kainec et al. May 2005 A1
20050109735 Flood May 2005 A1
20050128186 Shahoian et al. Jun 2005 A1
20050133488 Blankenship et al. Jun 2005 A1
20050159840 Lin et al. Jul 2005 A1
20050163364 Beck et al. Jul 2005 A1
20050189336 Ku Sep 2005 A1
20050199602 Kaddani et al. Sep 2005 A1
20050230573 Ligertwood Oct 2005 A1
20050252897 Hsu et al. Nov 2005 A1
20050275913 Vesely et al. Dec 2005 A1
20050275914 Vesely et al. Dec 2005 A1
20060014130 Weinstein Jan 2006 A1
20060076321 Maev et al. Apr 2006 A1
20060136183 Choquet Jun 2006 A1
20060142656 Malackowski et al. Jun 2006 A1
20060154226 Maxfield Jul 2006 A1
20060163227 Hillen et al. Jul 2006 A1
20060166174 Rowe et al. Jul 2006 A1
20060169682 Kainec et al. Aug 2006 A1
20060173619 Brant et al. Aug 2006 A1
20060189260 Sung Aug 2006 A1
20060207980 Jacovetty et al. Sep 2006 A1
20060213892 Ott Sep 2006 A1
20060214924 Kawamoto et al. Sep 2006 A1
20060226137 Huismann et al. Oct 2006 A1
20060252543 Van Noland et al. Nov 2006 A1
20060258447 Baszucki et al. Nov 2006 A1
20070034611 Drius et al. Feb 2007 A1
20070038400 Lee et al. Feb 2007 A1
20070045488 Shin Mar 2007 A1
20070080153 Albrecht Apr 2007 A1
20070088536 Ishikawa Apr 2007 A1
20070112889 Cook et al. May 2007 A1
20070198117 Wajihuddin Aug 2007 A1
20070209586 Ebensberger et al. Sep 2007 A1
20070211026 Ohta Sep 2007 A1
20070221797 Thompson et al. Sep 2007 A1
20070256503 Wong et al. Nov 2007 A1
20070277611 Portzgen et al. Dec 2007 A1
20070291035 Vesely et al. Dec 2007 A1
20080021311 Goldbach Jan 2008 A1
20080031774 Magnant et al. Feb 2008 A1
20080038702 Choquet Feb 2008 A1
20080061113 Seki et al. Mar 2008 A9
20080078811 Hillen et al. Apr 2008 A1
20080078812 Peters et al. Apr 2008 A1
20080117203 Gering May 2008 A1
20080120075 Wloka May 2008 A1
20080128398 Schneider Jun 2008 A1
20080135533 Ertmer et al. Jun 2008 A1
20080140815 Brant et al. Jun 2008 A1
20080149686 Daniel et al. Jun 2008 A1
20080203075 Feldhausen et al. Aug 2008 A1
20080233550 Solomon Sep 2008 A1
20080303197 Paquette et al. Dec 2008 A1
20080314887 Stoger et al. Dec 2008 A1
20090015585 Klusza Jan 2009 A1
20090021514 Klusza Jan 2009 A1
20090045183 Artelsmair et al. Feb 2009 A1
20090050612 Serruys Feb 2009 A1
20090057286 Ihara et al. Mar 2009 A1
20090152251 Dantinne et al. Jun 2009 A1
20090173726 Davidson et al. Jul 2009 A1
20090184098 Daniel et al. Jul 2009 A1
20090200281 Hampton Aug 2009 A1
20090200282 Hampton Aug 2009 A1
20090231423 Becker Sep 2009 A1
20090259444 Dolansky et al. Oct 2009 A1
20090298024 Batzler et al. Dec 2009 A1
20090325699 Delgiannidis Dec 2009 A1
20100012017 Miller Jan 2010 A1
20100012637 Jaeger Jan 2010 A1
20100048273 Wallace et al. Feb 2010 A1
20100062405 Zboray et al. Mar 2010 A1
20100062406 Zboray et al. Mar 2010 A1
20100096373 Hillen et al. Apr 2010 A1
20100121472 Babu et al. May 2010 A1
20100133247 Mazumder et al. Jun 2010 A1
20100133250 Sardy et al. Jun 2010 A1
20100176107 Bong Jul 2010 A1
20100201803 Melikian Aug 2010 A1
20100224610 Wallace Sep 2010 A1
20100276396 Cooper et al. Nov 2010 A1
20100299101 Shimada et al. Nov 2010 A1
20100307249 Lesage et al. Dec 2010 A1
20110006047 Penrod et al. Jan 2011 A1
20110060568 Goldfine et al. Mar 2011 A1
20110091846 Kreindl et al. Apr 2011 A1
20110114615 Daniel et al. May 2011 A1
20110116076 Chantry et al. May 2011 A1
20110117527 Conrardy et al. May 2011 A1
20110122495 Togashi May 2011 A1
20110183304 Wallace et al. Jul 2011 A1
20110187746 Suto et al. Aug 2011 A1
20110220616 Mehn Sep 2011 A1
20110248864 Becker et al. Oct 2011 A1
20110284500 Rappl Nov 2011 A1
20110290765 Albrecht et al. Dec 2011 A1
20110316516 Schiefermuller et al. Dec 2011 A1
20120122062 Yang et al. May 2012 A1
20120180180 Steve Jul 2012 A1
20120189993 Kindig et al. Jul 2012 A1
20120291172 Wills et al. Nov 2012 A1
20120298640 Conrardy et al. Nov 2012 A1
20130026150 Chantry et al. Jan 2013 A1
20130040270 Albrecht Feb 2013 A1
20130044042 Olsson et al. Feb 2013 A1
20130049976 Maggiore Feb 2013 A1
20130075380 Albrech Mar 2013 A1
20130182070 Peters et al. Jul 2013 A1
20130183645 Wallace et al. Jul 2013 A1
20130189657 Wallace et al. Jul 2013 A1
20130189658 Peters et al. Jul 2013 A1
20130206741 Pfeifer Aug 2013 A1
20130209976 Postlethwaite et al. Aug 2013 A1
20130215281 Hobby Aug 2013 A1
20130230832 Peters et al. Sep 2013 A1
20130231980 Elgart et al. Sep 2013 A1
20130291271 Becker Nov 2013 A1
20130327747 Dantinne et al. Dec 2013 A1
20140017642 Postlethwaite et al. Jan 2014 A1
20140038143 Daniel et al. Feb 2014 A1
20140051358 Dina et al. Feb 2014 A1
20140065584 Wallace et al. Mar 2014 A1
20140134579 Becker May 2014 A1
20140134580 Becker May 2014 A1
20140205976 Peters Jul 2014 A1
20140263224 Becker Sep 2014 A1
20140272835 Becker Sep 2014 A1
20140272836 Becker Sep 2014 A1
20140272837 Becker Sep 2014 A1
20140272838 Becker Sep 2014 A1
20140312020 Daniel Oct 2014 A1
20140315167 Kreindl et al. Oct 2014 A1
20140322684 Wallace et al. Oct 2014 A1
20140346158 Matthews Nov 2014 A1
20150056584 Boulware et al. Feb 2015 A1
20150056585 Boulware et al. Feb 2015 A1
20150056586 Penrod et al. Feb 2015 A1
20150154884 Salsich et al. Jun 2015 A1
20150209887 DeLisio Jul 2015 A1
20150228203 Kindig Aug 2015 A1
20150261015 Han et al. Sep 2015 A1
20150375323 Becker Dec 2015 A1
20160045971 Holverson Feb 2016 A1
20160148098 Barhorst et al. May 2016 A1
20160163221 Sommers et al. Jun 2016 A1
20160236303 Matthews Aug 2016 A1
20160250706 Beeson et al. Sep 2016 A1
20160267806 Hsu et al. Sep 2016 A1
20160365004 Matthews et al. Dec 2016 A1
20170046974 Becker Feb 2017 A1
20170053557 Daniel Feb 2017 A1
Foreign Referenced Citations (104)
Number Date Country
2698078 Sep 2011 CA
101209512 Jul 2008 CN
101214178 Jul 2008 CN
201083660 Jul 2008 CN
101419755 Apr 2009 CN
201229711 Apr 2009 CN
101571887 Nov 2009 CN
101587659 Nov 2009 CN
102014819 Apr 2011 CN
102165504 Aug 2011 CN
102298858 Dec 2011 CN
202684308 Jan 2013 CN
103871279 Jun 2014 CN
105057869 Nov 2015 CN
2833638 Feb 1980 DE
3046634 Jul 1982 DE
3244307 May 1984 DE
3522581 Jan 1987 DE
4037879 Jun 1991 DE
19615069 Oct 1997 DE
19739720 Oct 1998 DE
19834205 Feb 2000 DE
20009543 Aug 2001 DE
102005047204 Apr 2007 DE
102010038902 Feb 2012 DE
202012013151 Feb 2015 DE
0108599 May 1984 EP
0127299 Dec 1984 EP
0145891 Jun 1985 EP
0319623 Jun 1989 EP
0852986 Jul 1998 EP
1 010 490 Jun 2000 EP
1527852 May 2005 EP
1905533 Apr 2008 EP
2274736 May 2007 ES
1456780 Jul 1966 FR
2827066 Jan 2003 FR
2926660 Jul 2009 FR
1455972 Nov 1976 GB
1511608 May 1978 GB
2254172 Sep 1992 GB
2435838 Sep 2007 GB
2454232 May 2009 GB
2224877 Sep 1990 JP
5329645 Dec 1993 JP
07047471 Feb 1995 JP
07232270 Sep 1995 JP
8132274 May 1996 JP
8150476 Jun 1996 JP
H08505091 Jun 1996 JP
11104833 Apr 1999 JP
2000167666 Jun 2000 JP
2000-237872 Sep 2000 JP
2001071140 Mar 2001 JP
2002278670 Sep 2002 JP
2003200372 Jul 2003 JP
2003-240562 Aug 2003 JP
2003326362 Nov 2003 JP
2006006604 Jan 2006 JP
2006281270 Oct 2006 JP
2007290025 Nov 2007 JP
2009500178 Jan 2009 JP
2009160636 Jul 2009 JP
2010-019646 Jan 2010 JP
2012024867 Feb 2012 JP
20090010693 Jan 2009 KR
20140030644 Mar 2014 KR
2008108601 Sep 2009 RU
1038963 Aug 1983 SU
1651309 May 1991 SU
WO-9845078 Oct 1998 WO
WO-0112376 Feb 2001 WO
WO-0143910 Jun 2001 WO
WO-0158400 Aug 2001 WO
WO-2005102230 Nov 2005 WO
WO-2006034571 Apr 2006 WO
WO-2007009131 Jan 2007 WO
WO-2007039278 Apr 2007 WO
WO-2009060231 May 2009 WO
WO-2009120921 Oct 2009 WO
2009137379 Nov 2009 WO
WO-2009149740 Dec 2009 WO
WO-2010000003 Jan 2010 WO
WO-2010020867 Feb 2010 WO
WO-2010020870 Feb 2010 WO
WO-2010044982 Apr 2010 WO
WO-2010091493 Aug 2010 WO
WO-2011045654 Apr 2011 WO
WO-2011058433 May 2011 WO
WO-2011067447 Jun 2011 WO
WO-2011097035 Aug 2011 WO
WO-2012082105 Jun 2012 WO
WO-2012143327 Oct 2012 WO
WO-2013014202 Jan 2013 WO
2013025672 Feb 2013 WO
WO-2013061518 May 2013 WO
WO-2013114189 Aug 2013 WO
WO-2013175079 Nov 2013 WO
WO-2014007830 Jan 2014 WO
WO-2014019045 Feb 2014 WO
WO-2014020386 Feb 2014 WO
2016137578 Sep 2016 WO
2016144741 Sep 2016 WO
20121327060 Oct 2020 WO
Non-Patent Literature Citations (211)
Entry
“High Performance Computer Architectures: A Historical Perspective,” downloaded May 5, 2016, http://homepages.inf.ed.ac.uk/cgi/rni/comparch.pl?Paru/perf.html,Paru/perf-f.html,Paru/menu-76.html.
Abbas, et al., Code Aster (Software) EDF (France), 14 pages, Oct. 2001.
Abbas, et al., Code_Aster; Introduction to Code_Aster; User Manual; Booklet U1.0-: Introduction to Code_Aster; Document: U1.02.00; Version 7.4; Jul. 22, 2005.
Abida et al., “Numerical simulation to study the effect of tack welds and root gap on welding deformations and residual stresses of a pipe-flange joint”, Faculty of Mechanical Engineering, GIK Institute of Engineering Sciences and Technology, Topi, NWFP, Pakistan. Available on-line Aug. 25, 2005.
Adams, et al., “Adaptively sampled particle fluids,” ACM SIGGRAPH 2007 papers, Aug. 5-9, 2007, San Diego, California.
Agren, “Sensor Integration for Robotic Arc Welding;” 1995; vol. 5604C of Dissertations Abstracts International p. 1123; Dissertation Abs Online (Dialog® File 35): © 2012 ProQuest Info & Learning; http://dialogweb.com/cgi/dwclient?req=1331233317524; one (1) page; printed Mar. 8, 2012.
Aidun et al., Penetration in Spot GTA Welds during Centrifugation, Journal of Materials Engineering and Performance vol. 7(5) Oct. 1998—597.
Aidun, D., “Influence of Simulated High-g on the Weld Size of Al—Li Alloy,” Elsevier Science Ltd.; 2001; 4 pages.
Aiteanu et al., “Generation and Rendering of a Virtual Welding Seam in an Augmented Reality Training Environment” Proceedings of the Sixth IASTED International Conference, Aug. 2006, 8 pages.
Aiteanu, “Virtual and Augmented Reality Supervisor for a New Welding Helmet” Dissertation Nov. 15, 2005.
Aiteanu, et al., “A Step Forward in Manual Welding: Demonstration of Augmented Reality Helmet,” Institute of Automation, University of Bremen, Germany, Proceedings of the Second IEEE and ACM International Symposium on Mixed and Augmented Reality; 2003; 2 pages.
Aiteanu, et al., “Computer-Aided Manual Welding Using an Augmented Reality Supervisor,” Sheet Metal Welding Conference XII, Livonia, MI, May 9-12, 2006, 14 pages.
American Welding Society Advance Program of Programs and Events. Nov. 11-14, 2007. 31 pages. Chicago, IL.
American Welding Society Detroit Section, “Sheet Metal Welding Conference XII”, May 2006, 11 pages.
American Welding Society, “Vision for Welding Industry”; 41 pages, Estimated Jan. 1998.
American Welding Society, ANSI/AWS D10.11M/D10.11:2007, Guide for Root Pass Welding of Pipe Without Backing, 3rd Edition, American Welding Society, Oct. 13, 2006, 36 pages, ISBN: 0871716445.
American Welding Society, http://www.nsrp.org/6-presentations/WDVirtual_Welder.pdf (Virtual Reality Welder Training, Project No. SI051, Navy ManTech Program, Project Review for Ship Tech 2005); 22 pages; Biloxi, MS.
American Welding Society, https://app.aws.org/conferences/defense/live_index.html (AWS Welding in the Defense Industry conference schedule, estimated Jan. 2004); 12 pages.
American Welding Society, https://app.aws.org/w/r/www/wj/2005/03/WJ_2005_03.pdf (AWS Welding Journal, Mar. 2005 (see, e.g., p. 54)); 114 pages.
American Welding Society, https://app.aws.org/wj/2004/04/052/njc (AWS Virtual Reality Program to Train Welders for; Shipbuilding, workshop information, 2004); 7 pages.
American Welding Society, https://app.aws.org/wj/2007/11/WJ200711.pdf (AWS Welding Journal, Nov. 2007); 240 pages.
American Welding Society, Welding Handbook, Welding Science & Technology, Ninth Ed., Copyright 2001. Appendix A “Terms and Definitions”.
Antonelli et al, “A Semi-Automated Welding Station Exploiting Human-Robot Interaction,” Advanced Manufacturing Systems and Technology (2011) pp. 249-260.
ARC+—Archived Press Release from Wayback Machine, Jan. 31, 2008-Apr. 22, 2013, https://web.archive.org/web/20121006041803/http://www.123certification.com/en/article_press/index.htm, Jan. 21, 2016, 3 pages.
ARC+ simulator; http://www.123arc.com/en/depliant_ang.pdf; Estimated Jan. 2000.
Kenneth Fast; Virtual Welding—A Low Cost Virtual Reality Welder Training System, Phase II; NSRP ASE Technology Investment Agreement; Feb. 29, 2012; pp. 1-54.
ArcSentry Weld Quality Monitoring System; Native American Technologies, allegedly 2002, 5 pages.
ARS ELECTRONICA LINZ GMBH, Fronius, 2 pages, May 18, 1997.
Arvika Forum Vorstellung Projekt PAARi, BMW Group Virtual Reality Center, 4 pages, Nuernberg, 2003.
asciencetutor.com, A division of Advanced Science and Automation Corp., VWL (Virtual Welding Lab), 2 pages, 2007.
ASME Definitions, Consumables, Welding Positions, dated Mar. 19, 2001. See http://www.gowelding.com/wp/asme4.htm.
Balijepalli, et al. “Haptic Interfaces for Virtual Environment and Teleoperator Systems,” Haptics 2003, Department of Mechanical & Aerospace Engineering, State University of New York at Buffalo, NY.
Bargteil, et al., “A semi-lagrangian contouring method for fluid simulation,” ACM Transactions on Graphics, 25(1), 2006.
Bargteil, et al., “A texture synthesis method for liquid animations,” In Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation, Sep. 2006.
Bender Shipbuilding and Repair Co., Virtual Welding—A Low Cost Virtual Reality Welding Training System, Proposal submitted pursuant to MSRP Advanced Shipbuilding Enterprise Research Announcement, Jan. 23, 2008, 28 pages. See also, http://www.nsrp.org/6-Presentations/WD/020409 Virtual Welding Wilbur.pdf.
Borzecki, et al., Specialist Committee V.3 Fabrication Technology Committee Mandate, Aug. 20-25, 2006, 49 pages, vol. 2, 16th International Ship and Offshore Structures Congress, Southampton, UK.
Catalina, et al., “Interaction of Porosity with a Planar Solid/Liquid Interface” (“Catalina”), Metallurgical and Materials Transactions, vol. 35A, May 2004, pp. 1525-1538.
ChemWeb.com—Journal of Materials Engineering (printed Sep. 26, 2012) (01928041).
Chen, et al., “Self-Learning Fuzzy Neural Networks and Computer Vision for Control of Pulsed GTAW,” dated May 1997.
Chentanez, et al., “Liquid simulation on lattice-based tetrahedral meshes.” In ACM SIGGRAPH/Eurographics Symposium on Computer Animation 2007, pp. 219-228, Aug. 2007.
Chentanez, et al., “Simultaneous coupling of fluids and deformable bodies,” In ACM SIGGRAPH/Eurographics Symposium on Computer Animation, pp. 83-89, Aug. 2006.
Choquet, C., “ARC+: Today's Virtual Reality Solution for Welders” Internet Page, Jan. 1, 2008; 6 pages.
Choquet, C., “ARC+®: Today's Virtual Reality Solution for Welders”, Published in Proceedings of the IIW International Conference; Jul. 10-11, 2008; 19 pages.
Clausen, et al., “Simulating liquids and solid-liquid interactions with lagrangian meshes,” ACM Transactions on Graphics, 32(2):17:1-15, Apr. 2013. Presented at SIGGRAPH 2013.
Cooperative Research Program, Virtual Reality Welder Training, Summary Report SR 0512, 4 pages, Jul. 2005.
CS Wave, The Virtual Welding Trainer, 6 pages, estimated Jan. 2007.
CS Wave—Manual, “Virtual Welding Workbench User Manual 3.0” estimated Jan. 2007.
CUDA Programming Guide Version 1.1, Nov. 29, 2007.
Da Dalto, et al. “CS Wave, A Virtual learning tool for welding motion”, 10 pages, Mar. 14, 2008.
Da Dalto, et al. “CS Wave: Learning welding motion in a virtual environment” Published in Proceedings of the IIW International Conference, Jul. 10-11, 2008.
Desroches, X.; Code-Aster, Note of use for calculations of welding; Instruction manual U2.03 booklet: Thermomechanical; Document: U2.03.05; Oct. 1, 2003.
D'Huart, et al., “Virtual Environment for Training”, 6th International Conference, ITS 2002, Jun. 2002; 6 pages.
Dotson, “Augmented Reality Welding Helmet Prototypes How Awesome the Technology Can Get,” Sep. 26, 2012, Retrieved from the Internet: URL: http://siliconangle.com/blog/2012/09/26/augmented-reality-welding-helmet-prototypes-how-awesome-the-technology-can-get/, retrieved on Sep. 26, 2014, 1 page.
Echtler et al, “17 The Intelligent Welding Gun: Augmented Reality for Experimental Vehicle Construction,” Virtual and Augmented Reality Applications in Manufacturing (2003) pp. 1-27.
Edison Welding Institute, E-Weld Predictor, 3 pages, 2008.
Eduwelding+, Training Activities with arc+ simulator; Weld Into the Future, Online Welding Simulator—A virtual training environment; 123arc.com; 6 pages, May 2008.
Eduwelding+, Weld Into the Future; Online Welding Seminar—A virtual training environment; 123arc.com; 4 pages, 2005.
Energetics, Inc. “Welding Technology Roadmap”, Sep. 2000, 38 pages.
Fast, K. et al., “Virtual Training for Welding”, Mixed and Augmented Reality, 2004, ISMAR 2004, Third IEEE and ACM International Symposium on Arlington, VA, Nov. 2-5, 2004.
Feldman, et al., “Animating Suspended Particle Explosions”. In Proceedings of ACM SIGGRAPH 2003, pp. 708-715, Aug. 2003.
Feldman, et al., “Fluids in deforming meshes” In ACM SIGGRAPH/Eurographics Symposium on Computer Animation 2005, Jul. 2005.
Fite-Georgel, “Is there a Reality in Industrial Augmented Reality?” 10th IEEE International Symposium on Mixed and Augmented Reality (ISMAR). 10 pages, allegedly 2011.
Foster, et al., “Realistic animation of liquids,” Graphical Models and Image Processing, v.58 n.5, p. 471-483, Sep. 1996.
Foster, et al., “Practical animation of liquids,” Proceedings of the 28th annual conference on Computer graphics and interactive techniques, p. 23-30, Aug. 2001.
Garcia-Allende, et al., “Defect Detection in Arc-Welding Processes by Means of the Line-to-Continuum Method and Feature Selection,” www.mdpi.com/journal/sensors; Sensors 2009, 9, 7753-7770; DOI: 10.3390/s91007753.
Goktekin, et al., “A Method for Animating Viscoelastic Fluids”. ACM Transactions on Graphics (Proc. of ACM SIGGRAPH 2004), 23(3):463-468, 2004.
Graham, “Texture Mapping” Carnegie Mellon University Class 15-462 Computer graphics, Lecture 10 dated Feb. 13, 2003; 53 pages.
Grahn, A., “Interactive Simulation of Contrast Fluid using Smoothed Particle Hydrodynamics,” Jan. 1, 2008, Master's Thesis in Computing Science, Umeå University, Department of Computing Science, Umeå, Sweden.
Guu et al., “Technique for Simultaneous Real-Time Measurements of Weld Pool Surface Geometry and Arc Force,” Dec. 1992.
Heston, “Virtually Welding—Training in a virtual environment gives welding students a leg up,” retrieved on Apr. 12, 2010 from: http://www.thefabricator.com/article/arcwelding/virtually-welding.
Hillers, et al., “Augmented Reality—Helmet for the Manual Welding Process,” Institute of Automation, University of Bremen, Germany; 21 pages, 2004.
Hillers, et al., “Direct welding arc observation without harsh flicker,” 8 pages, allegedly FABTECH International and AWS welding show, 2007.
Hillers, et al., “Real time Arc-Welding Video Observation System.” 62nd International Conference of IIW, Jul. 12-17, 2009, 5 pages Singapore 2009.
Hillers, et al., “TEREBES: Welding Helmet with AR Capabilities”, Institute of Automation, University of Bremen; Institute of Industrial Engineering and Ergonomics, 10 pages, allegedly 2004.
Hillis, et al., “Data Parallel Algorithms”, Communications of the ACM, Dec. 1986, vol. 29, No. 12, p. 1170.
Hirche, et al., “Hardware Accelerated Per-Pixel Displacement Mapping,” University of Tubingen, Germany, Alexander Ehlert, Stefan Guthe, WSI/GRIS, & Michael Doggett, ATI Research; 8 pages.
Holmberg et al, “Efficient modeling and rendering of turbulent water over natural terrain,” In Proceedings of the 2nd international conference on Computer graphics and interactive techniques in Australasia and South East Asia (GRAPHITE '04) 2004.
Sun Yaoming; Application of Micro Computer in Robotic Technologies; Science and Technology Literature Press; Catalogue of New Books of Science and Technology; Sep. 1987, pp. 360-363.
Hu et al. “Heat and mass transfer in gas metal arc welding. Part 1: the arc” found in ScienceDirect, International Journal of Heat and Mass transfer 50 (2007) 833-846 Available on Line on Oct. 24, 2006 http://web.mst.edu/˜tsai/publications/Hu-IJHMT-2007-1-60.pdf.
Impact Welding: examples from current and archived website, trade shows, etc. See, e.g., http://www.impactwelding.com. 53 pages; estimated Jan. 2000.
Irving, et al., “Efficient simulation of large bodies of water by coupling two and three dimensional techniques,” ACM SIGGRAPH 2006 Papers, Jul. 30-Aug. 3, 2006, Boston, Massachusetts.
Jeffus, “Welding Principles and Applications” Sixth Edition, 2008, 10 pages.
Jonsson et al. “Simulation of Tack Welding Procedures in Butt Joint Welding of Plates” Research Supplement, Oct. 1985.
Juan Vicente Rosell Gonzales, “RV-Sold: simulador virtual para la formacion de soldadores” [“RV-Sold: virtual simulator for the training of welders”]; Deformacion Metalica, Es., vol. 34, No. 301, Jan. 1, 2008.
Kass, et al., “Rapid, Stable Fluid Dynamics for Computer Graphics,” Proceedings of SIGGRAPH '90, in Computer Graphics, vol. 24, No. 4, pp. 49-57, 1990.
Klingner, et al., “Fluid animation with dynamic meshes,” In Proceedings of ACM SIGGRAPH 2006, pp. 820-825, Aug. 2006.
Kobayashi, et al., “Simulator of Manual Metal Arc Welding with Haptic Display” (“Kobayashi 2001”), Proc. of the 11th International Conf. on Artificial Reality and Telexistence (ICAT), Dec. 5-7, 2001, pp. 175-178, Tokyo, Japan.
Kobayashi, et al., “Skill Training System of Manual Arc Welding by Means of Face-Shield-Like HMD and Virtual Electrode” (“Kobayashi 2003”), Entertainment Computing, vol. 112 of the International Federation for Information Processing (IFIP), Springer Science + Business Media, New York, copyright 2003, pp. 389-396.
Lincoln Global, Inc., “VRTEX 360: Virtual Reality Arc Welding Trainer” Brochure (2015) 4 pages.
Lindholm, et al., “NVIDIA Tesla: A Unified Graphics and Computing Architecture,” IEEE Computer Society, 2008.
Mahrle, A., et al.; “The influence of fluid flow phenomena on the laser beam welding process,” International Journal of Heat and Fluid Flow, vol. 23, No. 3 (2002), pp. 288-297; Institute of Fluid Dynamics and Thermodynamics, Otto-von-Guericke University Magdeburg, P.O. Box 4120, D-39016 Magdeburg, Germany.
Mann, et al., “Realtime HDR (High Dynamic Range) Video for Eyetap Wearable Computers, FPGA-Based Seeing Aids, and Glasseyes (EYETAPS),” 2012 25th IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), pp. 1-6, Apr. 29, 2012, 6 pages.
Mantinband, et al., “Autostereoscopic, field-sequential display with full freedom of movement, or let the display wear the shutter-glasses,” 3ality (Israel) Ltd., 2002.
Mavrikios, D., et al., “A prototype virtual reality-based demonstrator for immersive and interactive simulation of welding processes,” International Journal of Computer Integrated Manufacturing, Taylor and Francis, Basingstoke, GB, vol. 19, No. 3, Apr. 1, 2006, pp. 294-300.
Miller Electric Mfg. Co, “LiveArc: Welding Performance Management System” Owner's Manual, (Jul. 2014) 64 pages.
Miller Electric Mfg. Co., “LiveArc Welding Performance Management System” Brochure, (Dec. 2014) 4 pages.
Miller Electric Mfg. Co.; MIG Welding System features weld monitoring software; NewsRoom 2010 (Dialog® File 992); © 2011 Dialog. 2010; http://www.dialogweb.com/cgi/dwclient?reg=1331233430487; three (3) pages; printed Mar. 8, 2012.
Moore, “No exponential is forever: but ‘Forever’ can be delayed!,” IEEE International Solid-State Circuits Conference, 2003.
Müller, et al., “Particle-based fluid simulation for interactive applications,” Proceedings of the 2003 ACM SIGGRAPH/Eurographics symposium on Computer animation, Jul. 26-27, 2003, San Diego, California.
Müller, et al., “Point Based Animation of Elastic, Plastic and Melting Objects,” Eurographics/ACM SIGGRAPH Symposium on Computer Animation (2004).
N. A. Tech., P/NA.3 Process Modeling and Optimization, 11 pages, Jun. 4, 2008.
Nasios, “Improving Chemical Plant Safety Training Using Virtual Reality,” Thesis submitted to the University of Nottingham for the Degree of Doctor of Philosophy, Dec. 2001.
Nealen, A., “Point-Based Animation of Elastic, Plastic, and Melting Objects,” CG topics, Feb. 2005.
Nordbruch, et al., “Visual Online Monitoring of PGMAW Without a Lighting Unit,” Jan. 2005.
NSRP ASE, Low-Cost Virtual Reality Welder Training System, 1 page, 2008.
O'Brien et al., “Dynamic Simulation of Splashing Fluids,” In Proceedings of Computer Animation 95, pp. 198-205, Apr. 1995.
O'Brien, “Google's Project Glass gets some more details,” Jun. 27, 2012, Retrieved from the Internet: http://www.engadget.com/2012/06/27/googles-project-glass-gets-some-more-details/, retrieved on Sep. 26, 2014, 1 page.
P/NA.3 Process Modelling and Optimization; Native American Technologies, allegedly 2002; 5 pages.
Penrod, “New Welder Training Tools,” EWI PowerPoint presentation; 16 pages; allegedly 2008.
Pharr, “GPU Gems 2: Programming Techniques for High-Performance Graphics and General-Purpose Computation,” 2005, 12 pages.
Porter, et al., Virtual Reality Welder Trainer, Session 5: Joining Technologies for Naval Applications; earliest date Jul. 14, 2006 (http://wayback.archive.org); Edison Welding Institute; J. Allan Cote, General Dynamics Electric Boat; Timothy D. Gifford, VRSim; and Wim Lam, FCS Controls.
Porter, et al., Virtual Reality Training, Paper No. 2005-P19, 14 pages, 2005.
Porter, et al., Virtual Reality Training, vol. 22, No. 3, Aug. 2006; 13 pages.
Porter, et al., Virtual Reality Welder Training, dated Jul. 14, 2006.
Praxair Technology Inc., “The RealWeld Trainer System: Real Weld Training Under Real Conditions” Brochure (Est. Jan. 2013) 2 pages.
Premoze, et al., “Particle-based simulation of fluids,” Comput. Graph. Forum 22, 3, 401-410, 2003.
Rasmussen, et al., “Directable photorealistic liquids,” Proceedings of the 2004 ACM SIGGRAPH/Eurographics symposium on Computer animation, Aug. 27-29, 2004, Grenoble, France.
Ratnam, et al., “Automatic classification of weld defects using simulated data and an MLP neural network,” Insight, vol. 49, No. 3, Mar. 2007.
Reeves, “Particle Systems—A Technique for Modeling a Class of Fuzzy Objects,” Computer Graphics, 17:3, pp. 359-376, 1983.
Renwick, et al., “Experimental Investigation of GTA Weld Pool Oscillations” Welding Research—Supplement to the Welding Journal, Feb. 1983, 7 pages.
Rodjito, “Position tracking and motion prediction using Fuzzy Logic,” 2006, Colby College.
Russell, et al., “Artificial Intelligence: A Modern Approach,” Prentice-Hall (Copyright 1995).
Sandor, et al., “Lessons Learned in Designing Ubiquitous Augmented Reality User Interfaces,” 21 pages, allegedly from Emerging Technologies of Augmented Reality: Interfaces, Eds. Haller, M.; Billinghurst, M.; Thomas, B., Idea Group Inc., 2006.
Sandor, et al., “PAARTI: Development of an Intelligent Welding Gun for BMW,” PIA2003, 7 pages, Tokyo, 2003.
Sandter, et al., Fronius—virtual welding, FH JOANNEUM Gesellschaft mbH, University of Applied Sciences, 2 pages, May 12, 2008.
Schoder, “Design and Implementation of a Video Sensor for Closed Loop Control of Back Bead Weld Puddle Width,” Massachusetts Institute of Technology, Dept. of Mechanical Engineering, May 27, 1983.
Screen Shot of CS Wave Control Centre V3.0.0 https://web.archive.org/web/20081128081915/http:/wave.c-s.fr/images/english/snap_evolution4.jpg; Estimated Jan. 2007.
Screen Shot of CS Wave Control Centre V3.0.0 https://web.archive.org/web/20081128081817/http:/wave.c-s.fr/images/english/snap_evolution6.jpg, estimated Jan. 2007.
Screen Shot of CS Wave Exercise 135.FWPG Root Pass Level 1 https://web.archive.org/web/20081128081858/http:/wave.c-s.fr/images/english/snap_evolution2.jpg, estimated Jan. 2007.
Sim Welder, retrieved on Apr. 12, 2010 from: http://www.simwelder.com.
Simfor / Cesol, “RV-SOLD” Welding Simulator, Technical and Functional Features, 20 pages, estimated Jan. 2010.
Slater, et al., “Mechanisms and Mechanical Devices Sourcebook,” McGraw Hill, 2nd Edition, 1996.
Stam, J., “Stable fluids,” Proceedings of the 26th annual conference on Computer graphics and interactive techniques, p. 121-128, Jul. 1999.
Swantec corporate web page downloaded Apr. 19, 2016. http://www.swantec.com/technology/numerical-simulation/.
Tamasi, T., “The Evolution of Computer Graphics,” NVIDIA, 2008.
Teeravarunyou, et al, “Computer Based Welding Training System,” International Journal of Industrial Engineering (2009) 16(2): 116-125.
Terebes: examples from http://www.terebes.uni-bremen.de; 6 pages.
The Fabricator, Virtual Welding, 4 pages, Mar. 2008.
The Lincoln Electric Company, “VRTEX Virtual Reality Arc Welding Trainer,” http://www.lincolnelectric.com/en-us/equipment/training-equipment/Pages/vrtex.aspx as accessed on Jul. 10, 2015, 3 pages.
The Lincoln Electric Company, Production Monitoring 2 brochure, 4 pages, May 2009.
The Lincoln Electric Company; CheckPoint Production Monitoring brochure; four (4) pages; http://www.lincolnelectric.com/assets/en_US/products/literature/s232.pdf; Publication S2.32; Issue Date Feb. 2012.
Thurey, et al., “Real-time Breaking Waves for Shallow Water Simulations,” In Proceedings of the 15th Pacific Conference on Computer Graphics and Applications (PG '07) 2007.
Tonnesen, D., “Modeling Liquids and Solids using Thermal Particles,” Proceedings of Graphics Interface'91, pp. 255-262, Calgary, Alberta, 1991.
Tschirner, et al., “Virtual and Augmented Reality for Quality Improvement of Manual Welds” National Institute of Standards and Technology, Jan. 2002, Publication 973, 24 pages.
Tschirner, et al, “A Concept for the Application of Augmented Reality in Manual Gas Metal Arc Welding.” Proceedings of the International Symposium on Mixed and Augmented Reality; 2 pages; 2002.
Vesterlund, M., “Simulation and Rendering of a Viscous Fluid using Smoothed Particle Hydrodynamics,” Dec. 3, 2004, Master's Thesis in Computing Science, Umeå University, Department of Computing Science, Umeå, Sweden.
Viega, et al. “Simulation of a Work Cell in the IGRIP Program” dated 2006; 50 pages.
Virtual Welding: A Low Cost Virtual Reality Welder Training System, NSRP RA 07-01—BRP Oral Review Meeting in Charleston, SC at ATI, Mar. 2008.
ViziTech USA, “Changing the Way America Learns,” http://vizitechusa.com/ accessed on Mar. 27, 2014; 2 pages.
VRSim Inc. “About Us—History” www.vrsim.net/history, 2016, 1 page.
VRSim Powering Virtual Reality, www.lincolnelectric.com/en-us/equipment/training-equipment/Pages/powered-by-vrsim.aspx, 2016, 1 page.
Wade, “Human uses of ultrasound: ancient and modern”, Ultrasonics vol. 38, dated 2000.
Wahi, et al., “Finite-Difference Simulation of a Multi-Pass Pipe Weld” (“Wahi”), vol. L, paper 3/1, International Conference on Structural Mechanics in Reactor Technology, San Francisco, CA, Aug. 15-19, 1977.
Wang, et al., “Numerical Analysis of Metal Transfer in Gas Metal Arc Welding,” Departments of Mechanical and Electrical Engineering, University of Kentucky, Dec. 10, 2001.
Wang, et al., “Impingement of Filler Droplets and Weld Pool During Gas Metal Arc Welding Process” International Journal of Heat and Mass Transfer, Sep. 1999, 14 pages.
Wang, et al., “Study on welder training by means of haptic guidance and virtual reality for arc welding,” 2006 IEEE International Conference on Robotics and Biomimetics, ROBIO 2006 ISBN-10: 1424405718, p. 954-958.
Webster's II new college dictionary, 3rd ed., Houghton Mifflin Co., copyright 2005, Boston, MA, p. 1271, definition of “wake.”
White, et al., Virtual welder training, 2009 IEEE Virtual Reality Conference, p. 303, 2009.
Wu, “Microcomputer-based welder training simulator”, Computers in Industry, vol. 20, No. 3, Oct. 1992, pp. 321-325, XP000205597, Elsevier Science Publishers, Amsterdam, NL.
Wuhan Onew Technology Co Ltd, “ONEW-360 Welding Training Simulator” http://en.onewtech.com/_d276479751.htm as accessed on Jul. 10, 2015, 12 pages.
Yao, et al., “Development of a Robot System for Pipe Welding” 2010 International Conference on Measuring Technology and Mechatronics Automation. Retrieved from the Internet: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5460347&tag=1; pp. 1109-1112.
Yoder, Fletcher, Opinion RE45398 and U.S. Appl. No. 14/589,317, including Appendices; Sep. 9, 2015; 1700 pages.
United States Provisional Patent Application for “System for Characterizing Manual Welding Operations on Pipe and Other Curved Structures,” U.S. Appl. No. 62/055,724, filed Sep. 26, 2014, 35 pages.
Office Action from U.S. Appl. No. 14/526,914 dated Feb. 3, 2017.
Arc Simulation & Certification, Weld Into the Future, 4 pages, 2005, Jan. 2008.
International Search Report and Written Opinion from PCT/IB10/02913 dated Apr. 19, 2011.
International Search Report for PCT/IB2014/001796, dated Mar. 24, 2016; 8 pages.
International Search Report for PCT/IB2015/000161, dated Aug. 25, 2016; 9 pages.
International Search Report for PCT/IB2015/000777, dated Dec. 15, 2016; 11 pages.
International Search Report for PCT/IB2015/000814 dated Dec. 15, 2016; 9 pages.
International Preliminary Report from PCT/IB2015/001084 dated Jan. 26, 2017.
Petition for Inter Partes Review of U.S. Pat. No. 8,747,116; IPR 2016-00749; Apr. 7, 2016; 70 pages.
Declaration of Edward Bohnart, Apr. 27, 2016, exhibit to IPR 2016-00749.
Declaration of Dr. Michael Zyda, May 3, 2016, exhibit to IPR 2016-00749.
Trial Denied IPR Proceeding of U.S. Pat. No. 8,747,116; IPR 2016-00749; Sep. 21, 2016; 21 pages.
Petition for Inter Partes Review of U.S. Pat. No. Re. 45,398; IPR 2016-00840; Apr. 18, 2016; 71 pages.
Declaration of Axel Graeser, Apr. 17, 2016, exhibit to IPR 2016-00840; 88 pages.
Decision Denying Request for Rehearing of U.S. Pat. No. Re. 45,398; IPR 2016-00840; Nov. 17, 2016; 10 pages.
Petition for Inter Partes Review of U.S. Pat. No. 8,747,116; IPR 2016-01568; Aug. 9, 2016; 75 pages.
Decision Termination Proceeding of U.S. Pat. No. 8,747,116; IPR 2016-01568; Nov. 15, 2016; 4 pages.
Petition for Inter Partes Review of U.S. Pat. No. 9,293,056; IPR 2016-00904; May 9, 2016; 91 pages.
Declaration of Edward Bohnart, Apr. 27, 2016, exhibit to IPR 2016-00904; 22 pages.
Declaration of Dr. Michael Zyda, May 3, 2016, exhibit to IPR 2016-00904; 76 pages.
Decision Trial Denied IPR Proceeding of U.S. Pat. No. 9,293,056; IPR 2016-00904; Nov. 3, 2016; 15 pages.
Petition for Inter Partes Review of U.S. Pat. No. 9,293,057; IPR 2016-00905; May 9, 2016; 87 pages.
Declaration of Edward Bohnart, Apr. 27, 2016, exhibit to IPR 2016-00905; 23 pages.
Declaration of Dr. Michael Zyda, May 3, 2016, exhibit to IPR 2016-00905; 72 pages.
Decision Trial Denied IPR Proceeding of U.S. Pat. No. 9,293,057; IPR 2016-00905; Nov. 3, 2016; 21 pages.
Lincoln Electric Company et al v. Seabery Soluciones SL et al—1:15-cv-01575-DCN—Complaint filed Aug. 15, 2015 (Dkt 01).
Lincoln Electric Company et al v. Seabery Soluciones SL et al—1:15-cv-01575-DCN—Amended Answer filed Mar. 1, 2016 by Seabery North America (docket 44).
Lincoln Electric Company et al v. Seabery Soluciones SL et al—1:15-cv-01575-DCN—Amended Answer filed Mar. 1, 2016 by Seabery Soluciones SL (docket 45).
Lincoln Electric Company et al v. Seabery Soluciones SL et al—1:15-cv-01575-DCN—Amended Answer filed Mar. 22, 2016 by Lincoln Electric Company (docket 46).
Lincoln Electric Company et al v. Seabery Soluciones SL et al—1:15-cv-01575-DCN—Answer filed Mar. 22, 2016 by Lincoln Global Inc. (docket 47).
Exhibit B from Declaration of Morgan Lincoln in Lincoln Electric Co. et al. v. Seabery Soluciones, S.L. et al., Case No. 1:15-cv-01575-DCN, dated Dec. 20, 2016, 5 pages.
International Search Report and Written Opinion for International Application No. PCT/IB2009/006605.
Extended European Search Report from Corresponding Application No. 18185849.9; dated Jan. 30, 2019; pp. 1-8.
European Examination Report from Corresponding Application No. 14728279.2; dated Mar. 13, 2019; pp. 1-4.
The Lincoln Electric Company, Check Point Operator's Manual, 188 pages, issue date Aug. 2015.
William Huff, Khoi Nguyen, “Computer Vision Based Registration Techniques for Augmented Reality,” Colorado School of Mines, Division of Engineering, Proceedings of Intelligent Robots and Computer Vision XV, pp. 538-548, SPIE vol. 2904, Nov. 18-22, 1996, Boston, MA.
European Search Report for European Patent Application 10860823.3-1702, pp. 1-8, dated Jun. 6, 2017.
Benkai Xie, Qiang Zhou and Liang Yu; A Real Time Welding Training System Based on Virtual Reality; Onew 360; Wuhan University of Technology; IEEE Virtual Reality Conference; Mar. 23-27, 2015.
Lindh; “Strength in numbers: How the Industrial Internet of Things applies to fabricators”; thefabricator.com; http://www.thefabricator.com/article/shopmanagement/theres-strength-in-numbers-how-the-industrial-internet-of-things-applies-to-fabricators; Dated Feb. 11, 2016; pp. 1-2.
“ITAMCO Engineer Wins Prize for Google Glass Application”; http://www.fabricatingandmetalworking.com/2014/05/itamco-engineer-wins-75000-for-google-glass-application/; Dated May 6, 2014; pp. 1-2.
Bennett; “OK, Glass, take a video of me welding this pipeline”; https://www.biv.com/article/2013/8/ok-glass-take-a-video-of-me-welding-this-pipeline/; Dated Aug. 12, 2013; pp. 1-2.
Mann; “Steve Mann: My “Augmediated” Life”; https://spectrum.ieee.org/geek-life/profiles/steve-mann-my-augmediated-life; Dated Mar. 1, 2013; pp. 1-6.
Wheeler; “Understanding Augmented Reality Headsets”; https://www.engineering.com/DesignSoftware/DesignSoftwareArticles/ArticleID/12859/Understanding-Augmented-Reality-Headsets.aspx; Dated Aug. 10, 2016; pp. 1-7.
Kemppi; “Welding production management: WeldEye Welding Management Software”; https://www.kemppi.com/en-US/offering/product/welding-production-management/; Accessed Apr. 11, 2017; pp. 1-16.
“ESAB Welding & Cutting GmbH: Helping ESAB Realize an IoT Connected Vision”; PAC Innovation Register; https://www.pac-online.com/sites/pac-online.com/files/upload_path/PDFs/pac_innovation_register_case_study_esab_iot_connected_vision_16_0.pdf; Dated 2016; pp. 1-4.
“ThingWorx Delivers Industrial Innovation”; https://www.ptc.com/en/products/iot; Accessed Apr. 11, 2017; pp. 1-5.
Zhu; “Computer and Network Oriented Welding Power Source Design”; Applied Mechanics and Materials; vols. 58-60; Dated 2011; pp. 864-868.
Related Publications (1)
    Number: 2017/0323584 A1; Date: Nov. 2017; Country: US
Provisional Applications (1)
    Number: 61/827,248; Date: May 2013; Country: US
Continuations (1)
    Parent: 14/105,758, Dec. 2013, US
    Child: 15/245,535, US
Continuation in Parts (1)
    Parent: 15/245,535, Aug. 2016, US
    Child: 15/660,525, US