This invention relates in general to equipment used in welding. Devices, systems, and methods consistent with the invention relate to the monitoring of welding parameters and specifically to a welding system that provides visual and audio cues to a display located in the welder's helmet.
Welding is an important process in the manufacture and construction of various products and structures. Applications for welding are widespread and used throughout the world, for example, in the construction and repair of ships, buildings, bridges, vehicles, and pipelines, to name a few. Welding may be performed in a variety of locations, such as in a factory with a fixed welding operation or on site with a portable welder.
In manual or semi-automated welding, a user/operator (i.e., a welder) directs welding equipment to make a weld. For example, in arc welding the welder may manually position a welding rod or welding wire and produce a heat-generating arc at a weld location. In this type of welding, the spacing of the electrode from the weld location is related to the arc produced and to the achievement of optimum melting/fusing of the base and welding rod or wire metals. The quality of such a weld is often directly dependent upon the skill of the welder.
Welders generally rely upon a variety of information when welding. This information includes, for example, current and voltage. Traditionally, welders would need to look at gauges on the control panel of the welding equipment to gain this information. This would require the welder to direct their field of vision away from the welding work area and as such was undesirable. In addition, in many cases, the welding machine may not be located close to the work space. In such cases, the welding machine is operated by a cable-connected remote control that can be used to change parameters such as, e.g., welding power, polarity, arc characteristics, etc. However, before the process can be set up, the welder may need to see the display readouts that are physically located on the machine. The setting-up process may require many trips before the set-up is completed.
In the past, efforts have been made to provide welders with information during welding, such as in the method disclosed in U.S. Pat. No. 4,677,277, where current and voltage are monitored to produce an audio indication to the operator as to the condition of the arc in arc welding. However, monitors consisting only of audio arc parameter indicators are hard to hear and interpret and are not capable of achieving the desired closeness of control and quality of weld often required.
More recently, as disclosed in U.S. Pat. No. 6,242,711, an apparatus for monitoring arc welding has been developed that provides a welder with real-time voltage and current conditions of the welding arc, where information in the form of lights, illuminated bar graphs, light projections, illuminated see-through displays, or the like is placed within the visual range of the helmet-wearing operator and located in proximity to the viewing window in the helmet. However, with this apparatus a welder must still move their visual focus away from the welding work area in order to focus on the information located proximate to the viewing window, or the welder must accept the information peripherally while continuing to focus on the welding work area. In addition, related art welding devices offer limited guidance to aid the welder, whether a beginner or experienced, as the weld is being performed.
This invention relates to a welding helmet that is capable of providing an image representative of information from an associated welding operation, where the image appears on a display in the welding helmet. The display can be a face-mounted display, such as an LCD display, or a heads-up display (HUD).
Various aspects will become apparent to those skilled in the art from the following detailed description and the accompanying drawings.
The above and/or other aspects of the invention will be more apparent by describing in detail exemplary embodiments of the invention with reference to the accompanying drawings, in which:
Exemplary embodiments of the invention will now be described below by reference to the attached Figures. The described exemplary embodiments are intended to assist the understanding of the invention, and are not intended to limit the scope of the invention in any way. Like reference numerals refer to like elements throughout.
Referring now to the drawings, there is illustrated in
The welding system 14 includes welding equipment for generating a welding current and voltage, a welding control system for controlling the welding current and voltage, and a monitoring system for monitoring the welding current and voltage. That is, the welding system can be any known or commonly used welding power supply having a known construction and operation. The monitoring system may also monitor a variety of other operating parameters, such as, but not limited to, wire feed speed, the amount of wire used/amount of wire remaining, any type of welding feedback desired by the operator, and any other desired operating parameter.
The welding helmet 12 includes a main body 22 with a visual display 24 connected to the main body 22. The display 24 may be a window including a welding lens, a video monitor, such as an LCD display or LED array, or any other device suitable to allow a welder to see the welding work area 20. It must be understood that, in such an example where the display 24 is a video monitor, video processing may be utilized to enhance the images of the welding operation. Further, recording devices may optionally be included in the display, for example, to record and later play back welding operations for analysis and/or evaluation.
As shown in
As shown in
It must be understood that, among other types of information and along with a variety of other parameters, the information based upon welding current and voltage includes, but is not limited to, welding current feedback, welding voltage feedback, control settings of the welding equipment, statistical information of the welding process, benchmarks or limits including capacity representations, alerts including material shortage or low flow, a representation of an intended or desired weld, etc.
Further, in one embodiment, the camera 26 is used to calibrate the depth of the image relative to the welding work area 20. This calibrated depth can be used to determine the focus of the information displayed on the display 24. For example, if the camera 26 determines that the distance from the helmet to the work area is 2 feet, the images and/or information shown on the display 24 are displayed such that the image has a focal point 2 feet beyond the helmet. As explained above, this allows the displayed information to be displayed at the same focal length as the weld area 20 so that the welder need not change his/her eye focus during a welding operation. In another embodiment, position sensors on the welding gun may be used to calibrate the depth of the image. Such sensors can include, but are not limited to, magnetic sensors, optical sensors, acoustic sensors, and the like, which are sensed using an appropriate sensing system to allow the position of the welding gun to be determined. This data can be used to aid in determining the focal range/distance of the work area relative to the helmet. In particular applications it is highly desirable to carefully align the image and the welding work such that the information represented in the image is easy for the welder to access and such that the information in the image is readily accepted by the welder.
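To make the depth-calibration step above concrete, the following is a minimal sketch, assuming a camera (or gun-mounted sensor) that reports the helmet-to-work distance; the function name and the clamping range are illustrative, not part of the disclosed system.

```python
# Hypothetical sketch: choosing the display focal distance from a measured
# camera-to-work distance, so the overlay appears at the same depth as the weld.

def focal_distance_for_overlay(measured_distance_ft: float,
                               min_ft: float = 0.5,
                               max_ft: float = 10.0) -> float:
    """Clamp the measured work distance into the display's supported focal range."""
    return max(min_ft, min(max_ft, measured_distance_ft))

# Example: the camera reports the work area is 2 ft away, so the overlay is
# rendered with a 2 ft focal point and the welder's eyes need not refocus.
if __name__ == "__main__":
    distance = 2.0  # feet, as reported by the helmet camera or gun sensors
    print(f"Render overlay at {focal_distance_for_overlay(distance):.1f} ft")
```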
In the example where the visual display 24 is a video monitor, the information generating mechanism 28 may include an image representative of information from the monitoring system, based upon monitored parameters such as welding current and voltage, within the video images of the welding work area 20 shown on the display 24.
As indicated at 29, the information generating mechanism 28 may be in wired or wireless communication with other devices as desired.
In
In
There is shown in
In any case, the image may be an overlay of text or graphics or video feedback. Additionally, it is contemplated that in at least one embodiment the system described above may be used in a remote welding situation, including but not limited to robotic welding or underwater welding.
While principles and modes of operation have been explained and illustrated with regard to particular embodiments, it must be understood, however, that the invention may be practiced otherwise than as specifically explained and illustrated without departing from its spirit or scope.
Some exemplary embodiments of the present invention, as illustrated in
The combiner 134 reflects the image projected from projector 128 to the welder. In some embodiments, light transmitted through lens 24 is also transmitted through combiner 134. Thus, the welder will see both the projected image and the field of view behind the combiner 134 at the same time. The light transmitted through lens 24 can be, for example, light from a welding arc. In some embodiments, the lens 24 is of a type that rapidly and automatically changes from transparent to dark when the lens 24 detects that a welding arc has been initiated. The auto-darkening feature protects the welder's eyes from damage that could occur if the eye is exposed to the welding arc. The auto-darkening lens is transparent when no arc is detected and thus allows the welder to see the work space even when the welding helmet 12 is flipped down over the welder's face. With an auto-darkening lens, the light transmitted through lens 24 and combiner 134 can be either the light from the welding arc or normal room lighting, depending on whether a welding operation is taking place.
In some embodiments, the combiner 134 collimates the reflected image such that the projected image appears to be at optical infinity. Thus, the welder will not have to re-focus to see both the work space and the projected image—even during the welding process. In some embodiments, the combiner 134 is an appropriate transparent material, e.g., a flat piece of glass, that is angled such that the projected image from the projector 128 is reflected to the welder as illustrated in
In some embodiments, the combiner 134 includes a coating that reflects monochromatic light from the projector 128. For example, the coating on the combiner 134 can be such that only, e.g., green light is reflected and all other light is transmitted through. Thus, the HUD 135 will provide the welder a transparent display that allows the welder to see information on the combiner 134 in green while still allowing the welder to view the work space. Of course, other coatings that reflect other colors or even multiple colors can be used on the combiner 134. For example, the combiner 134 can be coated such that it reflects the colors green and red. While in the normal operating range, the information, e.g., welding current, may be displayed in green and when outside the normal operating range, the information, e.g., welding current, can be displayed in red. The information provided to the welder can include welding operating parameters such as, e.g., input current, input voltage, input power, welding current, welding voltage, wire feed speed, contact tip-to-work distance, arc length, mode of operation, etc.
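As an illustration of the in-range/out-of-range color behavior described above, the following sketch picks the display color for a monitored parameter; the threshold values are hypothetical.

```python
# Illustrative sketch (not from the patent): selecting the HUD display color
# for a monitored parameter, green inside the normal operating range and red
# outside it, matching the two-color combiner coating described above.

def hud_color(value: float, low: float, high: float) -> str:
    """Return the reflected color the combiner coating should show."""
    return "green" if low <= value <= high else "red"

# Example: a welding current of 210 A against an assumed 180-200 A normal range.
print(hud_color(210.0, low=180.0, high=200.0))  # -> "red"
```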
The size, shape, and placement of the combiner 134 relative to the lens 24 can vary, as desired. For example,
In some embodiments, the projector is not used. As illustrated in
The combiner 234 receives image information, e.g., in the form of a digital signal, from information generating device 229, which generates and/or processes the image based on information received from welding system 14 and/or computer system 160. In some embodiments, the combiner 234 and information generating device 229 can be integrated into a single physical unit. In some embodiments, the combiner 234 and lens 24 can be integrated into a single physical unit. In some embodiments, the combiner 234, information generating device 229, and lens 24 can be integrated into a single physical unit. In some embodiments, the computer system 160 and/or the welding system 14 generates and/or processes the image and transmits the image information directly to combiner 234, which can include or is connected to a wireless communication device.
The information generating devices 129 and 229 can each include a communication device 150 to communicate via, e.g., a wireless network 170 or a wired network with welding system 14 and/or computer system 160. The wireless network 170 can operate using, e.g., Bluetooth, WiFi (IEEE 802.11), or some other wireless protocol. In some embodiments, the welding system 14 can provide information such as, e.g., input power, input current, input voltage, welding current, welding voltage, welding power, contact tip-to-work distance, arc length, wire feed speed, etc. in real time to, e.g., aid the welder while the welding operation is going on. Alternatively, or in addition, the welding system 14 can send welding performance information after the welder has stopped welding. For example, the welding system 14 can transmit information such as, e.g., heat input, duration of welding, etc. after, e.g., the welding system 14 is turned off, indicating that the welder is done welding. Such information might be useful to the welder in order to make corrections before starting the next welding segment.
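The disclosure does not specify a message format for this communication; as a hedged sketch, the snippet below shows one plausible way the welding system 14 could broadcast real-time parameters to the helmet as JSON over UDP. The field names, address, and port are assumptions.

```python
# Assumed telemetry sketch: the welding system periodically sends a small
# JSON datagram of operating parameters to the helmet's information
# generating device over, e.g., the WiFi network 170.
import json
import socket
import time

def send_parameters(sock: socket.socket, addr: tuple) -> None:
    message = {
        "timestamp": time.time(),
        "welding_current_a": 192.4,   # example values only
        "welding_voltage_v": 23.1,
        "wire_feed_speed_ipm": 350.0,
        "ctwd_in": 0.625,             # contact tip-to-work distance
    }
    sock.sendto(json.dumps(message).encode("utf-8"), addr)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_parameters(sock, ("127.0.0.1", 9999))  # hypothetical helmet address/port
    sock.close()
```

A Bluetooth or wired transport could carry the same payload; only the socket setup would change.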
In some embodiments, the computer system 160 performs all the calculations such as, e.g., heat input, welding duration, etc. The computer system 160 can communicate with the welding system 14 and/or the welding helmet 12 via, e.g., wireless network 170 or a wired network. In some embodiments, the computer system 160 collects, stores, and/or analyzes information received from the welding system 14. In some embodiments, the computer system 160 transmits the image information to the welding helmet 12 instead of or in addition to the welding system 14. In some embodiments, the computer system is incorporated into or is integral to the welding system 14.
In some embodiments, the image information seen by the welder is configurable. For example, the computer system 160 and/or the welding system 14 can be configured with different “views” or image screens that the welder can select. For example, as illustrated in
The welder can turn the HUD 135, 235 on and off and scroll through the “views” using controls (not shown) located on the welding helmet 12. Alternatively, or in addition, the welder can control the HUD 135, 235 using voice commands. The welding helmet 12 can include a microphone system 140 (see
In another exemplary embodiment, as seen
In addition to displaying visual cues and displaying the welding environment 480, the display device 440A can also playback or display a variety of media content such as videos, documents (e.g., in PDF format or another format), audio, graphics, text, or any other media content that can be displayed or played back on a computer. This media content can include, for example, instructional information on welding that the trainee can review prior to performing a weld (e.g., general information on welding, information on the specific weld joint or weld procedure the user wishes to perform, etc.). The media content can also include information on the trainee's performance or the quality of the weld after completion of the weld or training exercise, and/or audio/visual information provided to the trainee during the weld (e.g., if the visual cues are turned off and the trainee's performance is a problem, the system can display and/or provide audio information suggesting that the visual/audio cues be turned on).
System 400 further includes sensors and/or sensor systems, which may comprise a spatial tracker 420, operatively connected to the logic processor-based subsystem 410. As discussed below, the spatial tracker 420 tracks the position of the welding tool 460 and the welding helmet 440 relative to the welding environment 480. The system 400 can also include a welding user interface 430 in communication with the logic processor-based subsystem 410 for set up and control of the system 400. Preferably, in addition to the face-mounted display device 440A, which can be connected to, e.g., the logic processor-based subsystem 410, the system 400 also includes an observer/set-up display device 430A connected to, e.g., the welding user interface 430. Each display device 440A, 430A can provide a view of the welding environment 480 that has been overlaid with visual cues. Preferably, the system 400 also includes an interface that provides communication between programmable processor-based subsystem 410 and a welding power supply 450 in order to receive welding parameters such as, e.g., voltage, current, power, etc. In some exemplary embodiments, the system 400 includes a video capture device 470 that includes one or more cameras, e.g., digital video cameras, to capture the welding environment 480. The video capture device 470 can be mounted on the welding helmet 440, similar to helmet 12 discussed above with reference to
In various exemplary embodiments, the spatial tracker 420 measures the motion of welding tool 460 and/or welding helmet 440 and gathers process data during the welding exercises. Preferably, the spatial tracker 420 uses one or more of the following tracking systems: a single or multiple camera based tracking system (e.g., based on point cloud image analysis), a magnetic-field based tracker, an accelerometer/gyroscope based tracker, an optical tracker, an infrared tracker, an acoustic tracker, a laser tracker, a radio frequency tracker, an inertial tracker, an active or passive optical tracker, and mixed reality and simulation based tracking. Still, other types of trackers may be used without departing from the intended scope of coverage of the general inventive concepts. The exemplary embodiments of the invention are applicable to a wide range of welding and related processes including, but not necessarily limited to, GMAW, FCAW, SMAW, GTAW, cladding, and cutting. For brevity and clarity, the description of system 400 will be provided in terms of welding, but those skilled in the art understand that the description will also be valid for other operations such as cutting, joining, cladding, etc.
Preferably, the logic processor-based subsystem 410 includes at least one computer for receiving and analyzing information captured by the spatial tracker 420 and image capture device 470 and welding process data transmitted by the welding power supply 450. During operation, the computer typically runs software that includes a welding regimen module, an image processing and rigid body analysis module, and a data processing module. The welding regimen module includes a variety of weld types and a series of acceptable welding process parameters associated with creating each weld type. Any number of known or AWS weld joint types and the acceptable parameters associated with these weld joint types may be included in the welding regimen module, which can be accessible and configurable by a user, a welding instructor, etc., to add, modify, or delete any information in the welding regimen module. In addition to known weld types, the logic processor-based subsystem 410 can import weld joints and/or specific welds corresponding to custom parts, devices, components, etc. into the welding regimen module. For example, the logic processor-based subsystem 410 can import design model information, e.g., in CAD format (2-D or 3-D) or another type of design format, of any complex custom part, device, component, etc. that is welded as part of the fabrication process (e.g., parts, devices, components, etc. used in any industrial, manufacturing, agricultural, or construction application or any other application). Once imported, the number and types of joints used in manufacturing the custom part, device, component, etc. can be identified and stored in the welding regimen module as a custom weld type. The acceptable parameter limits associated with these weld joint types can be input by, e.g., the instructor or, preferably, included in the design file (e.g., CAD file) from the manufacturer, and are then read by the logic processor-based subsystem 410. As an illustrative example, the logic processor-based subsystem 410 can import a design of an automobile axle, e.g., in 2-D or 3-D CAD format. The number and type of joints used in manufacturing the axle can be identified (either by, e.g., the instructor, or automatically read in as part of the file, e.g., in 2-D or 3-D CAD format) and stored in the welding regimen module as a custom weld type or types. As with the known or AWS weld types, the acceptable parameters associated with the custom weld types are also stored in the welding regimen module. Preferably, information for the custom part, device, component, etc. is also used by embodiments of the logic processor-based subsystem 410 for workpiece recognition and auto-calibration of the system as discussed below.
The weld process and/or type selected by the user prior to welding determine which acceptable welding process parameters are used for any given welding exercise. The object recognition module is operative to train the system to recognize known rigid body objects, which can include two or more point markers, and then calculate position and orientation data for, e.g., welding tool 460 and welding helmet 440 as a manual weld is completed by the user. Preferably, along with known rigid body objects, the object recognition module can also be uploaded or configured with information for recognizing custom parts, devices, components, etc. discussed above. The data processing module compares the information in the welding regimen module to the information processed by the object recognition module and outputs feedback to the user. For example, the logic processor-based subsystem 410 can provide any type of feedback to the user (typically in real time) via various means including, but not limited to, one or more of in-helmet visual feedback, visual feedback on a separate monitor, audio feedback (e.g., tones, coaching, alarms) via speakers, and additional visual, audio, or tactile feedback using the welding tool (e.g., torch, welding gun). For example, the real-time visual feedback can be provided to the user and/or an observer as the user welds on a welding coupon 480A or a workpiece 480B, each of which can have a range of configurations, including large sizes, various joint types, pipe, plate, and complex shapes and assemblies. Preferably, measured parameters, which are provided as the feedback, include, but are not limited to, aim, work angle, travel angle, tool standoff, travel speed, bead placement, weave, voltage, current, wire feed speed, arc length, heat input, gas flow (metered), contact tip to work distance (CTWD), and deposition rate (e.g., lbs./hr., in./run). Preferably, the face-mounted display 440A and/or welding user interface 430 with display device 430A allows the user and/or an observer to visualize the processed data in real time and the visualized data is operative to provide the user with useful feedback regarding the characteristics and quality of the weld. In some exemplary embodiments, the feedback data is automatically recorded and saved in a data storage device, e.g., hard disk drive, or other known storage means by logic processor-based subsystem 410.
Preferably, the logic processor-based subsystem 410 can include memory, e.g., RAM, ROM, EPROM, hard disk drive, CD ROM, removable drives, flash memory, etc., that can be pre-populated with specific welding procedures, which can include procedures that have been customized, e.g., by an experienced welder, a manufacturer, etc. The procedures can include information related to the visual and audio cues, such as the criteria for changing the attributes of the visual and audio cues. The procedures can include information on the target values and target ranges for parameters such as aim, work angle, travel angle, tool standoff, travel speed, bead placement, weave, voltage, current, wire feed speed, arc length, heat input, gas flow (metered), contact tip to work distance (CTWD), deposition rate (e.g., lbs./hr., in./run), etc. based on the type of welding process, the type of the welding gun, the type and orientation of the weld joint, the type of material being welded, the type of the electrode, the type and size of the filler wire (if any), etc. Preferably, the logic processor-based subsystem 410 can perform real-time and post-weld analyses that score the performance of the welder/user. Preferably, based on the analysis, the logic processor-based subsystem 410 can provide other information such as the potential existence of faults in the weld (e.g., porosity, incomplete fusion (not enough penetration), crack in the weld, undercut, weld profile is too thin, weld profile is too thick, etc.) and how to avoid the faults in real time and/or in a post-weld analysis. Preferably, the progress of the user is tracked over time, which can be a beneficial aid to a trainer in identifying areas where the user may need additional teaching. However, exemplary embodiments of the invention are not limited to traditional training environments where welders, whether beginners, intermediate, or experienced, practice welding on welding coupons such as, e.g., welding coupon 480A.
Exemplary embodiments of the invention can be used in actual working environments and the visual and audio cues, which are discussed further below, aid the welder in performing the weld. For example, a beginner can use the visual and audio cues to make sure the welding gun is oriented properly and the travel speed is correct. Experienced welders can also benefit from the visual and audio cues when, e.g., welding on a workpiece and/or using a filler wire whose material is new to the welder, working at an orientation that is unfamiliar, when using a non-traditional welding procedure, etc.
With reference now to
The helmet 440 operatively connects to the logic processor-based subsystem 410 and the spatial tracker 420 via wired or wireless means, e.g., in
The welding helmet 440 may further include speakers 440B, allowing the user to hear audio cues. Different sounds can be provided depending on whether certain welding or performance parameters are within tolerance or out of tolerance. For example, a predetermined tone can be provided if the travel speed is too high and a different predetermined tone can be provided if the travel speed is too low. Sound may be provided to the user via speakers 440B, which may be earbud speakers or any other type of speakers or sound generating device, mounted in the welding helmet 440 and/or mounted separately, e.g., on the welding table. Still, any manner of presenting sound to the end user while engaging in welding activity may be chosen. It is also noted here that other types of sound information may be communicated through the speakers 440B. Examples include verbal instructions from an instructor or an expert, in either real time or via prerecorded messages. Prerecorded messages may be automatically triggered by particular welding activity. Real-time instructions may be generated on site or from a remote location. Still, any type of message or instruction may be conveyed to the end user.
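As a simple illustration of the tone-based cues described above, the sketch below maps travel speed to a predetermined tone; the frequencies and tolerance band are assumed values, not taken from the disclosure.

```python
# Hypothetical sketch: choosing a predetermined tone for a travel-speed audio
# cue, one tone for too fast and a different tone for too slow.

def travel_speed_tone(speed_ipm: float, low: float = 8.0, high: float = 14.0):
    """Return a (frequency_hz, meaning) pair for the speaker system 440B."""
    if speed_ipm > high:
        return (880.0, "travel speed too high")
    if speed_ipm < low:
        return (440.0, "travel speed too low")
    return (None, "in tolerance: no tone")

print(travel_speed_tone(16.0))  # -> (880.0, 'travel speed too high')
```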
Preferably, determining the orientation of the welding tool 460 and the welding helmet 440 includes capturing images of the respective objects with one or more off-the-shelf high-speed vision cameras, which can be mounted on tracking system support 526 of stand 520 or another fixed location relative to the welding environment 480. Preferably, the processing of the captured images includes creating an image file at many (e.g., over 100) frames per second. Preferably, the one or more cameras typically capture at least two point markers located in a fixed geometric relationship to one another on each of the respective objects, welding tool 460 and welding helmet 440. Of course, other objects in the welding environment 480 such as the welding coupon 480A and workpiece 480B can include point markers so that their position and orientation can also be determined based on the images captured by the cameras. Preferably, the processing of the captured images is performed in the spatial tracker 420 and/or the logic processor-based subsystem 410.
As discussed above, tracking of the objects in 3-D space, including the welding environment 480, can be accomplished by using point markers on the objects, e.g., welding tool 460, welding helmet 440, welding coupon 480A, workpiece 480B, etc. For example, as seen in
The image processing then includes frame-by-frame point cloud analysis of the rigid bodies (i.e., the calibrated targets, which can include welding tool 460 and welding helmet 440) that include three or more point markers. Upon recognition of a known rigid body, position and orientation are calculated relative to the camera origin and the "trained" rigid body orientation. Calibrating and "training" the spatial tracker 420 to recognize the position and orientation in three-dimensional space of rigid bodies such as the welding tool 460 and welding helmet 440 is known in the relevant art and thus, for brevity, will not be discussed in detail. For example, the spatial tracker 420 can include any suitable data capturing system such as, for example, the OptiTrack Tracking Tools (provided by NaturalPoint, Inc. of Corvallis, Oreg.) or a similar commercially available or proprietary hardware/software system that provides three-dimensional marker and six degrees of freedom object motion tracking in real time. Such technologies typically utilize reflective and/or light emitting point markers arranged in predetermined patterns to create point clouds that are interpreted by system imaging hardware and system software as "rigid bodies," although other suitable methodologies are compatible with this invention. The system imaging hardware and software can be incorporated into logic processor-based subsystem 410.
Preferably, more than one camera is used to track welding tool 460 and welding helmet 440. Capturing and comparing the images from two or more cameras allows for a substantially accurate determination of the position and orientation in three-dimensional space of welding tool 460 and welding helmet 440. Images are typically processed at a rate of more than 100 times per second. One of ordinary skill in the art will appreciate that a lesser sampling rate (e.g., 10 images/sec.) or a greater sampling rate (e.g., 1,000 images/sec.) could be used. The output aspect of image processing includes creation of a data array that includes x-axis, y-axis, and z-axis positional data and roll, pitch, and yaw orientation data, as well as time stamps and software flags. Text files (including 6-D data for the welding tool 460 and welding helmet 440) may be streamed or sent by the spatial tracker 420 at a desired frequency to the programmable processor-based subsystem 410 or, in some embodiments, be generated by the logic processor-based subsystem 410. While the above exemplary embodiment of spatial tracker 420 is described as a single or multiple camera based tracking system based on point cloud image analysis, those skilled in the art understand that other types of tracking systems can be used, e.g., a magnetic-field based tracker, an accelerometer/gyroscope based tracker, an optical tracker, an infrared tracker, an acoustic tracker, a laser tracker, a radio frequency tracker, an inertial tracker, an active or passive optical tracker, and mixed reality and simulation based tracking. In addition, while the cameras 470A, B are discussed above as providing a "field of view" of the user to the system 400, the cameras 470A, B can also be configured to track objects in the welding environment 480, instead of the cameras on tracking system support 526. Of course, an appropriate alternate fixed reference point in the welding environment 480 must be used because the cameras 470A, B will move with the movement of the user.
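The 6-D data array described above (x/y/z position, roll/pitch/yaw orientation, time stamps, and software flags) might be represented as in the following sketch; the dataclass layout and CSV serialization are illustrative assumptions.

```python
# Assumed representation of one 6-D tracking sample for a tracked rigid body
# such as welding tool 460 or welding helmet 440.
from dataclasses import dataclass

@dataclass
class TrackingSample:
    t: float        # time stamp (s)
    x: float        # position (mm)
    y: float
    z: float
    roll: float     # orientation (deg)
    pitch: float
    yaw: float
    flags: int = 0  # software flags, e.g., a marker-occluded bit

    def to_csv(self) -> str:
        """Serialize one sample as a text-file row for streaming."""
        return (f"{self.t:.4f},{self.x:.2f},{self.y:.2f},{self.z:.2f},"
                f"{self.roll:.2f},{self.pitch:.2f},{self.yaw:.2f},{self.flags}")

# Example: one sample for the welding tool, streamed at over 100 samples/s.
print(TrackingSample(t=0.01, x=120.5, y=-34.2, z=88.0,
                     roll=12.0, pitch=-45.0, yaw=3.5).to_csv())
```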
In an exemplary embodiment,
In block 550, the digital image data (e.g., video) from one or more cameras mounted, e.g., on the welding helmet, is captured for processing in, e.g., data processing block 540. In block 560, welding process parameters such as those transmitted by welding equipment, e.g., welding power supplies (current, voltage, etc.), wire feeders (wire feed speed, etc.), hot-wire power supplies (current, voltage, temperature, etc.), or some other piece of welding equipment, are captured for processing in block 540. Preferably, the welding process parameters can also include heat input (which can be calculated, e.g., using welding current, welding voltage, and travel speed), arc length (which can be determined based on welding voltage), and other calculated welding parameters.
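For the calculated parameters mentioned above, heat input is conventionally derived from welding voltage, current, and travel speed. The sketch below uses the common formulation heat input = (60 × V × I) / (1000 × travel speed); the disclosure itself does not give a formula, so this is an assumed, standard calculation.

```python
# Standard heat-input calculation (kJ per inch of weld), assumed here as one
# way block 540/560 could compute the calculated parameter named above.

def heat_input_kj_per_in(voltage_v: float, current_a: float,
                         travel_speed_ipm: float) -> float:
    if travel_speed_ipm <= 0:
        raise ValueError("travel speed must be positive")
    return (60.0 * voltage_v * current_a) / (1000.0 * travel_speed_ipm)

# Example: 23 V, 190 A, 10 in/min -> about 26.2 kJ/in.
print(f"{heat_input_kj_per_in(23.0, 190.0, 10.0):.1f} kJ/in")
```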
Preferably, the data processing in block 540 includes correlating the view of the welding environment 480 from the helmet-mounted cameras to the tracking data processed in block 530, i.e., the 6-D data of the tracked welding tool and the tracked welding helmet. For example, the position and orientation information of objects such as the welding tool 460, welding coupon 480A and workpiece 480B are mapped to the image data captured by image capture device 470. Preferably, the welding process parameters from block 560 are processed with the mapped image and tracking data to create an image stream showing the welding environment with visual cues.
In some exemplary embodiments, audio cues based on the welding process parameters and the tracked data are also generated in block 540 to create an audio stream that can be transmitted to the welding helmet 440 along with the image stream. Preferably, the image stream and audio stream are combined to create a composite audiovisual stream that is sent to welding helmet 440. Generally, the data processing step in block 540 includes algorithms to generate target values and target ranges for welding parameters such as, e.g., aim, work angle, travel angle, tool standoff, travel speed, bead placement, weave, voltage, current, wire-feed speed, arc length, heat input, gas flow (metered), contact tip to work distance (CTWD), deposition rate, frequency of TIG filler addition, using algorithms specific to a selected welding process, joint type, material being welded, joint orientation, etc. The data processing step in block 540 also includes algorithms to generate visual and audio cues based on the welding process parameters and tracked data and algorithms to properly place the visual and audio cues on the in-helmet display as discussed below.
As seen in step 570, the digital image data and audio data processed in block 540 are transmitted to external devices. The image data can be viewed on, e.g., a monitor, in-helmet display, heads-up display, or combinations thereof, and audio data (e.g., audio coaching, alarms, tones, etc.) may be directed to external speakers, in-helmet speakers, ear buds, etc. For example, the digital image data and audio data can be transmitted to the face-mounted display 440A and speakers 440B of welding helmet 440 and optionally to display device 430A of welding user interface 430. The digital image data and audio data include visual and audio cues, respectively, that relate to welding data. For example, the visual and audio cues relate to, but are not limited to, aim, work angle, travel angle, tool standoff, travel speed, bead placement, weave, voltage, current, wire-feed speed, arc length, heat input, gas flow (metered), contact tip to work distance (CTWD), deposition rate, and frequency of TIG filler addition. Based on the input data, the logic processor-based subsystem 410 can also generate tactile feedback that is sent to the welding tool 460 if so configured. For example, a vibrator on the welding tool 460 can be triggered if the welding process goes into an alarm condition. In addition, the welding tool 460 can provide visual feedback using, e.g., LCDs, LEDs, etc. Preferably, the feedback, whether visual, audio, or tactile, is presented to the user in real time as the user is performing a weld on, e.g., the real and/or simulated welding coupon 480A or workpiece 480B.
In preferred embodiments, the visual and audio cues can be provided to the user to aid the user in the welding process. The visual cues can appear as alphanumeric characters, symbols, graphics, icons, colors, etc. The audio cues can be in the form of tones, alarms, buzzers, audio instructions (either live or pre-recorded and either a human voice or a computer-generated voice). The visual cues can be displayed in any desired location on the display of the face-mounted display device 440A. For example,
In some embodiments, as shown in
The tracked parameters such as the position, orientation, and movement of welding tool 460 and welding process parameters such as welding voltage, welding current, wire feed speed, etc. can be compared to upper and lower target thresholds, target values, or preferred variations for the type of welding process (e.g., GMAW, FCAW, SMAW, GTAW), the type and orientation of the weld joint, the types of materials, etc. The upper and lower thresholds or preferred variations can be based on the motions of an expert welder, based on computer modeling, testing of similar prior welds, etc. For example, when a welder performs a weld (e.g., an expert welder, instructor, a trainee, etc.), the position, orientation, and movement of welding tool 460 of the welder and welding process parameters such as welding voltage, welding current, wire feed speed, etc. are recorded. After completing the weld, the welder can select an appropriate menu item that "clones" the procedure. The "cloned" procedure is then stored and can serve as a reference for future welding procedures. Preferably, the upper and lower target thresholds, target values, or preferred variations can be entered manually by the welder and, more preferably, are automatically entered using default values, e.g., ±5% or some other appropriate value. Preferably, the upper and lower target thresholds, target values, or preferred variations are configurable by the user.
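A minimal sketch of the threshold comparison described above, combined with the green/yellow/red attribute change described in the following paragraph; the band percentages are assumed, configurable defaults (cf. the ±5% default mentioned above).

```python
# Illustrative mapping from deviation-from-target to a visual-cue color.

def cue_state(value: float, target: float,
              warn_pct: float = 5.0, alarm_pct: float = 10.0) -> str:
    """Map deviation from the target value to a visual-cue color."""
    deviation_pct = abs(value - target) / abs(target) * 100.0
    if deviation_pct <= warn_pct:
        return "green"    # within preferred variation
    if deviation_pct <= alarm_pct:
        return "yellow"   # warning state
    return "red"          # alarm state

# Example: a 49 deg work angle against a 45 deg target (~8.9% deviation).
print(cue_state(49.0, target=45.0))  # -> "yellow"
```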
Preferably, when the position, orientation, or movement of the welding tool 460 and/or the welding process parameters fall outside the upper and lower target thresholds, target values, or preferred variations, the programmable processor-based subsystem 410 changes an attribute, e.g., color, shape, size, intensity or brightness, position, and/or some other characteristic, of the appropriate visual cue. For example, the programmable processor-based subsystem 410 can change the color of the visual cue from green to yellow to red depending on the amount of deviation, and/or the visual cues can graphically show the amount of deviation from the target. For example, as seen in section A of
Visual cues 700 are not limited to just providing warnings and alarms. In exemplary embodiments of the invention, visual cues 700 can aid in identifying to the user the weld start position, the weld stop position, and the total length of the weld. For example, in cases where the entire joint between two workpieces is not welded but only certain portions, the visual cues 700 can aid the user in identifying the start position by having a marker such as, e.g., a green spot at the start position and a second marker such as, e.g., a red spot at the stop position. In some embodiments, the visual cues aid in achieving the proper weld length. For example, an indication, e.g., a green spot, is turned on at the start position (or any other desired location on display 441) to start the weld and the user welds until the indication is turned off or changes color to achieve the desired weld length. Preferably, the visual cues 700 aid in the welding sequence to be followed. For example, if the welding should be done from the center of the workpiece and outward to the ends in order to reduce stresses in the workpiece, the visual cues 700 can alert the user, e.g., by displaying a message in text, e.g., in a corner of the display 441, or by using visual cues 700 to identify the start position, stop position, and weld length as discussed above. In some embodiments, the visual cues 700 provide an indication of the welding progression. For example, a visual cue can provide the percent completion of a weld joint in, e.g., a corner of the display 441 or some other desired location, and/or provide an indication of the pass number in a multi-pass weld. Preferably, the visual cues 700 also aid the user in performing the weld. For example, in a weld procedure requiring a weave pattern, a visual cue can provide an indication, e.g., a bright spot that cycles in the preferred weave pattern at the end of the welding tool tip, for the user to copy while welding. Similarly, when performing a TIG welding operation, the visual cues can indicate the proper frequency at which the filler wire should be added to the weld puddle. For example, a pulsating marker, e.g., a green spot that changes brightness, can be displayed at the end of the welding tool tip and the user knows to add the filler wire at the frequency of the pulsations, e.g., dip the filler wire in the weld puddle whenever the marker is ON or brighter (or when the marker is OFF or dimmer).
Audio cues can also be used to aid the user. For example, similar to visual cues, audio warning and alarms can be sent to speakers in the welding helmet 440 to alert the user if the position, orientation or movement of the welding tool 460 and/or the welding process parameters fall outside the upper and lower target thresholds, target value (e.g., outside an acceptable tolerance) or outside of preferred variations for the type of welding process. The warnings can be in any audio format, e.g., tones of different frequencies, buzzers, and voice alerts. Different frequencies and/or pitches can indicate to the user which parameter is not on target and the amount of deviation. Preferably, the logic processor-based subsystem 410 provides voice alerts to inform the user of problems. For example, the voice alert can instruct the user that the travel speed is too fast or too slow. Voice alerts can provide instructions or suggestions to the user in making any necessary corrections. Similar to the visual cues discussed above, voice alerts can also aid the user in making the weld, e.g., weave pattern, TIG welding, start and stop indicators, weld length, sequencing, and/or welding progression. Audio cues can be provided in place of the visual cues discussed above. Preferably, a combination of audio cues and visual cues are provided by logic processor-based subsystem 410.
Preferably, the visual and audio cues can be selectively turned ON or OFF by the user. In addition, with respect to the visual cues, the user can select whether to fixedly locate the visual cues 700 in a desired location on display 441 such as, e.g., a corner or along one of the sides of display 441, or "attach" the visual cues 700 to an object on the display 441 such as, e.g., welding tool 460. In some embodiments, the user can individually select the visual cue 700 or audio cue to turn ON or OFF. Preferably, the visual cues 700 and/or the audio cues are grouped so that the user can select a group of cues to turn ON or OFF. For example, cues related to information from the welding power supply 450 can be grouped together and information related to welding tool 460 can be grouped together. The grouping can be preprogrammed or custom programmed, e.g., by the user. In some embodiments, the user can use the display 441 and user controls on, e.g., welding tool 460 or a remote control device, to activate or deactivate the visual and audio cues. Preferably, the user can use welding user interface 430 for activating and deactivating the cues. In some embodiments, the user can activate and deactivate the cues using voice commands. For example, the user can say "GROUP 1 ON" to activate the visual cues and/or audio cues related to the welding tool 460. To "attach" the GROUP 1 visual cues to the welding tool 460, the user can say "GROUP 1 ATTACH". In some embodiments, eye tracking, as discussed above, can be used to highlight and select an appropriate menu item(s) to activate and deactivate the cues.
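A hedged sketch of the voice-command grouping described above, using the example commands "GROUP 1 ON" and "GROUP 1 ATTACH" from the text; the group contents and dispatch structure are illustrative assumptions.

```python
# Illustrative cue groups; per the text, GROUP 1 relates to the welding tool
# 460 while another group could relate to the welding power supply 450.
GROUPS = {
    "GROUP 1": {"cues": ["work angle", "travel angle", "aim"], "on": False,
                "attached_to": None},
    "GROUP 2": {"cues": ["voltage", "current", "wire feed speed"], "on": False,
                "attached_to": None},
}

def handle_voice_command(command: str) -> None:
    """Parse commands of the form '<GROUP NAME> <ON|OFF|ATTACH>'."""
    name, _, action = command.upper().rpartition(" ")
    group = GROUPS.get(name)
    if group is None:
        return  # unrecognized group; ignore
    if action == "ON":
        group["on"] = True
    elif action == "OFF":
        group["on"] = False
    elif action == "ATTACH":
        group["attached_to"] = "welding tool 460"  # assumed attach target

handle_voice_command("GROUP 1 ON")
handle_voice_command("GROUP 1 ATTACH")
print(GROUPS["GROUP 1"])
```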
In preferred exemplary embodiments, welding data related to the position, orientation and movement of welding tool 460 and/or welding process parameters is stored as the user performs the weld. The stored data can be retrieved for subsequent review for training or certification purposes. For example, the programmable processor-based subsystem 410 can store the welding data (e.g., as a *.dat file), for reviewing the progress and/or performance of the welder at a later time. In some embodiments, the stored weld data can be used as the target values for the visual and audio cues. The stored target weld data can be that of a successful weld by an expert welder or even a successful prior welding run by the user. In some embodiments, the stored target weld data is based on computer modeling for the specific type of weld and/or testing of similar prior welds. Preferably, the stored target weld data includes information related to the weld weave pattern, TIG welding information such as filler frequency, welding start and stop indicators, weld length, welding sequencing, and welding progression. When a welding operation is started, visual and audio cues based on the stored target weld data will aid the user in creating the new weld. By using the target weld data, even experienced welders can benefit as the visual and audio cues can provide guidance in performing an unfamiliar weld sequence, an unusual weld joint configuration, etc.
Preferably, welding system training software to train welders, whether beginners or experienced, can be loaded into and executed by the logic processor-based subsystem 410. The user selects the appropriate welding training procedure and starts to perform the welding. The training software monitors and records the performance of the user as the user performs a weld. Generally, in related art real-weld training systems, welding equipment such as the power supply, wire feeder, hot-wire power supply, etc. do not communicate with the welding training computer. This means that, before or after the user selects a welding training procedure to run, the user must set up the power supply and possibly other welding equipment such as the wire feeder and/or a hot-wire power supply based on the selected welding training session. Typically, the welding equipment, e.g., the power supply, is located remotely from the welding training area. This means that any changes to the settings or verification of the settings on the welding equipment, e.g., the power supply, will require the user to exit the training area, and the trips to the welding equipment can occur multiple times during a welding training session, which can become cumbersome.
Because the tracking and monitoring of the welding tool 460 and/or the welding helmet 440 as the user performs the welding are discussed above with respect to system 400, for brevity, the tracking and monitoring will not be repeated. In addition, the following exemplary embodiments will be described in terms of the welding training system communicating with a power supply. However, those skilled in the art will understand that the welding training system can also communicate with other welding equipment such as wire feeders, hot-wire power supplies, etc.
In exemplary embodiments, the user can communicate with welding equipment such as the power supply 450 using either the face-mounted display device 440A or the observer/set-up display device 430A connected to the welding user interface 430. Preferably, along with monitoring outputs of the power supply 450 such as voltage and current as discussed above, the user can also set up, view, and change settings on the power supply 450. Preferably, the user uses display 430A and input device 430B (e.g., keyboard, mouse, or any other known inputting device) of the welding user interface 430 to communicate with the power supply 450 via the logic processor-based subsystem 410. Preferably, after providing login information on a login screen, the user can select a setup screen for setting parameters of the welding equipment.
In some exemplary embodiments, the face-mounted display 440A can be used to view, set up, and change the settings on the welding equipment such as the power supply 450, wire feeder, hot-wire power supply, etc. The user can be shown the display prior to the welding operation and the user can navigate through the selection screen using, e.g., voice commands, eye tracking, and/or other input devices. In some embodiments, a manual remote control device can be used. The manual remote control device can be especially useful when used with a welding helmet 440 where the user can see through the display 440A. In either type of face-mounted display 440A, the user can navigate through the setting screen(s) and verify the settings prior to starting the weld training session. If the settings need to be modified for any reason, the user can immediately change the settings rather than having to stop and walk over to the welding equipment to change the settings.
The welding equipment, e.g., power supply 450, and/or the logic processor-based subsystem 410 can include the software that transmits the screen 800 to the display 440A and/or 430A. For example, the programmable processor-based subsystem 410 can include software that receives the requests for information from the user and retrieves the information from the welding equipment. Preferably, the screen 800 is a web page transmitted by a webserver hosted on the programmable processor-based subsystem 410 and/or power supply 450 or some other welding equipment. In some embodiments, an app-based system can be used. Such client-server type software is known to those skilled in the art and thus will not be further discussed except as needed to describe exemplary embodiments of the invention.
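The client-server exchange just described is not detailed in the disclosure; as a sketch under assumed names, the snippet below shows a settings payload that such software might send to the power supply when a training procedure is selected (cf. the automatic setup described next). The procedure identifiers and preset values are hypothetical.

```python
# Assumed settings-payload sketch for configuring the power supply 450 when a
# training procedure is selected, instead of walking to the machine.
import json

PROCEDURES = {  # hypothetical procedure presets
    "GMAW_TEE_2F": {"voltage_v": 22.0, "wfs_ipm": 320.0, "polarity": "DCEP"},
    "GTAW_BUTT_1G": {"current_a": 110.0, "polarity": "DCEN"},
}

def setup_message(procedure_id: str) -> bytes:
    """Build a settings payload for the selected training procedure."""
    settings = PROCEDURES[procedure_id]
    return json.dumps({"cmd": "SETUP", "procedure": procedure_id,
                       "settings": settings}).encode("utf-8")

# In a real system this payload would be sent to the power supply over
# network 475; here we just show the encoded message.
print(setup_message("GMAW_TEE_2F").decode())
```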
Preferably, the power supply 450 and/or other welding equipment are automatically set up based on the welding training session selected by the user. For example, as seen in
Preferably, the programmable processor-based subsystem 410 and the power supply 450 (and/or some other welding equipment) communicate via the network 475 (see
In preferred embodiments of the invention, when the user presses the trigger of the welding tool 460, the power supply 450 sends a signal to the processor-based subsystem 410 indicating that the welding process (whether real or simulated) has started. Of course, when the weld is simulated, the power supply 450 does not provide a real-world voltage or current output and any voltage, current, and power readings are simulated along with a simulated weld bead. The trigger signal can be used by the processor-based subsystem 410 to start recording the performance of the user. For example, once the trigger signal is received by the processor-based subsystem 410 indicating the welding has started, the processor-based subsystem 410 can start recording performance parameters of the user such as CTWD, travel speed, work angle, travel angle, and aim. In addition, the logic processor-based subsystem 410 can include parameters from the welding equipment such as welding voltage, welding current, wire feed speed, hot-wire current, etc. Preferably, as seen in
In some embodiments, the welding equipment itself may provide a score for the welding session when performing a weld. For example, Lincoln Electric PowerWave™ welding power supplies provide a weld score based on the current value, voltage value and the stability of the current and voltage values throughout the welding session. Preferably, the weld score is incorporated into the total score. Preferably, a real-time indication of the weld score is provided to the user via the face-mounted display 440A, e.g., as a visual cue as discussed above. For simulated welding, the user can practice setting up the welding power supply 450 and the processor-based subsystem 410 can communicate with the welding power supply 450 to determine if the user properly set up the machine. Preferably, if the welding power supply 450 is set up incorrectly the processor-based subsystem 410 can be configured such that the user will be unable to proceed and/or the score will be negatively impacted.
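The snippet below is a hedged, illustrative stand-in for a stability-style weld score like the one described above: it penalizes both offset from the target value and variation across the session. It is not the PowerWave scoring algorithm, which is proprietary; the penalty weights are assumed.

```python
# Illustrative weld-score sketch: closeness to target plus low variation.
from statistics import mean, pstdev

def weld_score(samples: list, target: float) -> float:
    """Score 0-100 from mean offset and spread of a monitored parameter."""
    m, sd = mean(samples), pstdev(samples)
    offset_penalty = min(50.0, abs(m - target) / target * 100.0)
    stability_penalty = min(50.0, sd / target * 100.0)
    return max(0.0, 100.0 - offset_penalty - stability_penalty)

voltages = [22.8, 23.1, 23.0, 22.9, 23.3]  # example session samples
print(f"voltage score: {weld_score(voltages, target=23.0):.1f}")
```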
Preferably, one or more of the plotted parameters or the scores are provided to the user in real-time via the face-mounted display 440A as the user is welding, e.g., as a plotted graph or as a visual cue as discussed above. In some embodiments, the attributes of these parameters can be changed as discussed above whenever the parameter goes into a warning state or an alarm state based on the upper and lower threshold values and/or deviation from the optimum value.
In another exemplary embodiment, along with the visual and audio cues discussed above, the system 400 can overlay virtual objects on or over the real-world objects in the welding environment in real time to provide a mixed reality and simulated scene of the welding environment. Preferably, the virtual objects are generated by the logic processor-based subsystem 410. The virtual objects can include any object that can aid or train the user when performing a weld (e.g., real-world, simulated, or a combination thereof) whether in a training environment or out in the field, e.g., a manufacturing site or a construction site. Preferably, the logic processor-based subsystem 410 generates virtual objects to be overlaid on the welding environment 480 in real time such that the user can see the virtual objects on display 441 during the welding process. In some embodiments, the generated virtual objects can also be transmitted to display 430A to be seen by other observers such as a welding instructor.
For example, as seen in
For example, the logic processor-based subsystem 410 can generate virtual weld objects 904 that visually show where the user should place the welds on a coupon/workpiece 480 A, B. Similar to the weld start/stop visual aids discussed above, the weld objects 904 show where the user should place the weld and how long the weld should be. For example, as seen in
Preferred embodiments can also generate and display other virtual objects that can aid or train the user. For example, as seen in
In addition to aiding the user in real time during the welding process, virtual objects 900 can also provide information on the performance of the user and/or the quality of the weld. For example, the weld tool position, orientation, and motion, e.g., CTWD, travel speed, work angle, travel angle, and aim, can be used to determine if there is a potential flaw (e.g., porosity, incomplete fusion (not enough penetration), crack in the weld, undercut, weld profile is too thin, weld profile is too thick, etc.) or other problem in the weld. Preferably, the position, orientation, and motion, e.g., CTWD, travel speed, work angle, travel angle, and/or aim, of the welding tool 460 are compared to upper and lower target thresholds, target values, or preferred variations for the type of welding process (e.g., GMAW, FCAW, SMAW, GTAW), the type and orientation of the weld joint, the types of materials, etc., to determine whether a potential flaw or other problem in the weld exists. In addition to the user's performance with respect to the position, orientation, and motion of welding tool 460, other system parameters such as voltage, current, wire feed speed, arc length, heat input, gas flow (metered), and/or deposition rate (e.g., lbs./hr., in./run) can be analyzed to determine whether the weld 480D is acceptable. For example, based on one or more of the user's performance parameters and/or one or more of the other system parameters, the system 400, e.g., logic processor-based subsystem 410, can include algorithms that determine areas in the weld 480D that can have potential flaws or other problems such as, e.g., porosity, incomplete fusion (not enough penetration), crack in the weld, undercut, weld profile too thin, weld profile too thick, etc. Algorithms to determine potential problems such as, e.g., porosity, incomplete fusion, crack in the weld, undercut, and/or weld profile are known in the art and thus will not be discussed further except as needed to describe the preferred embodiments. The algorithms compare one or more of the user's performance parameters and/or the other system parameters to set point values, e.g., upper and lower threshold values, optimum values, and/or preferred variations, to see if the set point values are exceeded by a predetermined amount at any point on the weld 480D. If so, the system 400, e.g., logic processor-based subsystem 410, keeps track of the point in the weld at which the set point values were exceeded. Based on whether the set point values were exceeded, the attribute, e.g., color, intensity, etc., at the appropriate point on the virtual weld object 902 can be changed to indicate the potential flaw or problem in the weld.
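As a minimal sketch of the set-point comparison just described (the names, units, and margin are assumptions), the function below walks recorded samples along the weld and returns the positions where a parameter exceeded its band; these positions could then drive the attribute change on virtual weld object 902.

```python
# Illustrative flaw-flagging sketch: flag weld positions where a parameter
# exceeds its acceptable band by more than a configured margin.

def flag_potential_flaws(samples, low, high, margin_pct=5.0):
    """samples: list of (position_mm, value). Returns flagged positions."""
    band = (high - low) * margin_pct / 100.0
    flagged = []
    for position_mm, value in samples:
        if value < low - band or value > high + band:
            flagged.append(position_mm)
    return flagged

# Example: travel speed (in/min) sampled along the weld, 8-14 in/min band.
speed_log = [(0, 10.2), (25, 11.0), (50, 16.8), (75, 12.1)]
print(flag_potential_flaws(speed_log, low=8.0, high=14.0))  # -> [50]
```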
Preferably, ideal weld objects are pre-loaded and/or can be custom built and stored in the system 400, e.g., in the logic processor-based subsystem 410. The ideal weld objects can represent a variety of ideal weld profiles for the types of welding procedures and training procedures that can be selected in the system 400. Preferably, the ideal weld profile is based on the type of weld joint (e.g., straight, orbital, curved, etc.), the orientation of the weld joint (horizontal, vertical, out-of-position, etc.), the type of welding operation (e.g., GMAW, FCAW, SMAW, GTAW), etc. The ideal weld object profiles (e.g., width and thickness of the weld seam) can be based on computer-generated models, on data from analysis of test welds, on a weld profile of an expert welder or another welder, and/or on weld profiles known to be acceptable in the industry. In some embodiments, the virtual weld object can be based on a user's previous weld history in order to, e.g., compare the current weld with a previous weld. In some embodiments, a computer-generated model of the actual weld profile is created during the welding process from one or more of the user's performance parameters, e.g., CTWD, travel speed, work angle, travel angle, and/or aim, and/or one or more of the other system parameters, e.g., voltage, current, wire feed speed, arc length, heat input, gas flow (metered), and/or deposition rate (e.g., lbs./hr., in./run). The computer-generated model of the actual weld profile can then be overlaid on weld 480D with indications of any potential problem areas as discussed above. By seeing the computer-generated model, the user may see additional features in the generated weld that are difficult or impossible to see in the actual weld 480D, e.g., a potential flaw in the center of the weld. Preferably, in the above exemplary embodiments, the user has the option to turn the virtual objects 900 ON and OFF, e.g., individually, all at one time, and/or by groups (e.g., as discussed above with respect to visual cues).
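As a simplified illustration of comparing a computer-generated model of the actual weld profile against a stored ideal profile, the sketch below samples each profile as (width, thickness) pairs along the weld and reports the sample indices whose deviation exceeds a tolerance; all values, and the tolerance itself, are hypothetical.

```python
# Minimal sketch (illustrative values) of comparing an actual weld profile
# model against an ideal profile. Samples whose width or thickness deviates
# from the ideal by more than a tolerance would be marked on the overlay.

from typing import List, Tuple

Profile = List[Tuple[float, float]]  # (width_mm, thickness_mm) per sample

def deviations(actual: Profile, ideal: Profile, tol: float = 0.5) -> List[int]:
    """Indices where actual width or thickness differs from ideal by > tol."""
    return [
        i for i, ((aw, at), (iw, it)) in enumerate(zip(actual, ideal))
        if abs(aw - iw) > tol or abs(at - it) > tol
    ]

ideal = [(6.0, 3.0)] * 4
actual = [(6.1, 3.0), (6.2, 2.9), (7.2, 3.1), (6.0, 2.1)]  # sample 2 too wide, 3 too thin
print(deviations(actual, ideal))  # -> [2, 3]
```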
In some exemplary embodiments, the coupon/workpiece 480A, B can be automatically calibrated by the system 400, e.g., by the logic processor-based subsystem 410. Typically, in order for a system such as system 400 to know where the actual weld path is (e.g., a fillet weld path, a lap weld path, or a weld groove path), a person must first calibrate the system, especially the tracking subsystem such as spatial tracker 420, so that the system knows the location, orientation, and length of the weld path on the welding coupon or workpiece. The calibration typically involves placing the welding coupon or workpiece in a known calibration block that includes two or more markers (e.g., active or passive markers as discussed above) and allowing the system to capture the positions of the markers, e.g., similar to the tracking discussed above with respect to spatial tracker 420 and processor-based subsystem 410. After the markers have been located, the system calculates the weld path location, orientation, and length in the 3-D space of the welding environment. This allows the system to determine parameters such as aim, CTWD, etc. However, this calibration requires human intervention and takes time, and it must be repeated every time a different type of coupon or workpiece is used.
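The geometry behind such a calibration can be illustrated with a minimal two-marker case: once the tracker has captured the 3-D positions of two markers on the calibration block, the weld path's location, orientation, and length follow from simple vector arithmetic. The coordinates below are hypothetical.

```python
# Minimal sketch (hypothetical marker coordinates) of deriving the weld path
# location, orientation, and length from two tracked marker positions.

import math
from typing import Tuple

Point = Tuple[float, float, float]

def weld_path_from_markers(m1: Point, m2: Point):
    """Return the path start (location), unit direction (orientation), and
    length in the 3-D space of the welding environment."""
    d = tuple(b - a for a, b in zip(m1, m2))
    length = math.sqrt(sum(c * c for c in d))
    direction = tuple(c / length for c in d)
    return m1, direction, length

start, direction, length = weld_path_from_markers((0.0, 0.0, 0.0), (300.0, 0.0, 40.0))
print("start:", start)
print("direction:", tuple(round(c, 3) for c in direction))
print("length (mm):", round(length, 1))
```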
In exemplary embodiments, the system 400 can include a coupon/workpiece recognition device. The coupon/workpiece recognition device can include a transmitter-receiver 960.
While the invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to these embodiments. It will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the above embodiments.
This U.S. Non-Provisional Patent Application claims priority to U.S. Provisional Application Ser. No. 62/418,737, filed Nov. 7, 2016, which is incorporated herein by reference in its entirety. The specifications of U.S. application Ser. No. 14/682,340, filed Apr. 9, 2015; U.S. application Ser. No. 14/037,699, filed Sep. 26, 2013; U.S. application Ser. No. 12/577,824, filed on Oct. 13, 2009, which issued as U.S. Pat. No. 8,569,655; and U.S. Provisional Patent Application No. 61/977,275, filed Apr. 9, 2014, are incorporated herein by reference in their entirety.