The present disclosure generally relates to weld training simulations and, more particularly, to weld training simulations using mobile devices, modular workpieces, and simulated welding equipment.
The welding industry has a shortage of experienced and skilled operators. Additionally, it is difficult and expensive to train new operators using live welding equipment. Further, even experienced welders often have difficulty maintaining important welding techniques throughout welding processes. Thus, there is a demand for affordable training tools and equipment that help operators develop, maintain, and/or refine welding skills.
Simulated welding tools make it possible for both experienced and inexperienced weld operators to practice producing high-quality welds prior to actually using the real welding equipment. Additionally, welding operators can test out different welding tools in a simulated environment prior to actually purchasing a particular welding tool. However, conventional systems and methods for simulating joining operations require substantial investments in equipment (e.g., processors, displays, practice workpieces, welding tool(s), sensor(s), etc.).
Limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present disclosure as set forth in the remainder of the present application with reference to the drawings.
The present disclosure is directed to weld training simulations using mobile devices, modular workpieces, and simulated welding equipment, substantially as illustrated by and/or described in connection with at least one of the figures, and as set forth in the claims.
These and other advantages, aspects and novel features of the present disclosure, as well as details of an illustrated example thereof, will be more fully understood from the following description and drawings.
The figures are not necessarily to scale. Where appropriate, the same or similar reference numerals are used in the figures to refer to similar or identical elements. For example, reference numerals utilizing lettering (e.g., workpiece 900a, workpiece 900b) refer to instances of the same reference numeral that does not have the lettering (e.g., workpieces 900).
Some examples of the present disclosure relate to simulating (e.g., via augmented, mixed, and/or virtual reality) joining operations (e.g., welding, brazing, adhesive bonding, and/or other joining operations). While the following disclosure sometimes refers to welding and/or weld training as a shorthand, the disclosure is equally applicable to other joining operations.
Some examples of the present disclosure relate to using mobile devices (e.g., smartphones, tablets, personal digital assistants, electronic book readers, iPods, etc.) for conducting welding simulations, such as for purposes of training. In some examples, it may be advantageous to use mobile devices due to their availability, relative affordability, and/or technical power. The disclosure further contemplates automatically detecting whether an orientation of the mobile device is proper for the simulation, and notifying the user if not.
The present disclosure additionally contemplates using modular workpieces for conducting welding simulations. In some examples, the modular workpieces may be configured to tool-lessly connect to, and/or disconnect from, other modular workpieces to form various workpiece assemblies. In some examples, tool-less connectors may be advantageous because they can be easily connected to and/or engaged with other connectors without the need for auxiliary tools (e.g., screwdrivers, hammers, etc.). Tool-less connectors may also be advantageous over adhesives, as the tool-less connectors may be continually connected, disconnected, and reconnected with negligible change to their effectiveness, unlike adhesives. In some examples, the welding simulation may further be configured to recognize different joints formed by the modular workpieces, and conduct the welding simulation accordingly.
The present disclosure further contemplates using simulated equipment interfaces that replicate the appearance of actual equipment interfaces of actual welding-type equipment. In some examples, this replication may help orient a user who is already familiar with a particular piece of welding-type equipment and/or its actual equipment interface, thereby making them more comfortable with the welding simulation. In some examples, the replication may help users who are unfamiliar with a particular piece of welding-type equipment become familiar with the welding-type equipment (and/or its interface). Additionally, the present disclosure contemplates simulating certain welding effects in accordance with the way the effects might occur in the real world when real welding is performed using the real world welding-type equipment.
Some examples of the present disclosure relate to a mock workpiece for use with a mobile electronic device conducting a welding simulation, comprising: an object comprising: a marker configured for recognition or detection by the mobile electronic device; and a connector configured for tool-less connection to a complementary connector of a complementary mock workpiece.
In some examples, the connector comprises a magnet, a hook fastener, a loop fastener, a snap fastener, a button, a clamping fastener, a prong, a stud, or a socket. In some examples, the connector comprises an array of connectors positioned along an edge or middle of the object. In some examples, the array of connectors is arranged asymmetrically in a poka-yoke configuration to prevent incorrect connection to the complementary connector.
In some examples, the marker is positioned over the connector, hiding the connector. In some examples, the connection of the connector and complementary connector creates a joint at an intersection of the mock workpiece and the complementary mock workpiece, the joint comprising a lap joint, a butt joint, a corner joint, a T joint, an edge joint, or a pipe joint. In some examples, the connector is further configured for removable connection to a complementary connector of a fixturing system.
Some examples of the present disclosure relate to a weld training system, comprising: a first workpiece having a first connector; a second workpiece having a second connector configured to tool-lessly engage the first connector to secure the first workpiece to the second workpiece; and a mobile electronic device configured to conduct a weld training simulation, the mobile electronic device comprising: a sensor configured to detect data relating to the first workpiece and second workpiece, processing circuitry, and memory circuitry comprising computer readable instructions which, when executed by the processing circuitry, cause the processing circuitry to: determine a spatial relationship between the first workpiece and the second workpiece based on the data detected by the sensor.
In some examples, the spatial relationship comprises a type of joint defined by an intersection of the first workpiece and second workpiece, the type of joint comprising a lap joint, a butt joint, a corner joint, a T joint, an edge joint, or a pipe joint. In some examples, the memory circuitry further comprises computer readable instructions which, when executed by the processing circuitry, cause the processing circuitry to output a notification in response to determining the spatial relationship is different than an expected spatial relationship. In some examples, the notification comprises instructions for transitioning from the spatial relationship determined by the processing circuitry to the expected spatial relationship.
In some examples, the expected spatial relationship is based on a parameter of the weld training simulation, the parameter comprising a selected exercise, a selected part, or a selected joint type. In some examples, the memory circuitry further comprises computer readable instructions which, when executed by the processing circuitry, cause the processing circuitry to determine a training score based on a difference between the spatial relationship determined by the processing circuitry and the expected spatial relationship. In some examples, the memory circuitry further comprises computer readable instructions which, when executed by the processing circuitry, cause the processing circuitry to conduct the weld training simulation based on the spatial relationship of the first workpiece and second workpiece.
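For illustration only, the following minimal Python sketch shows one way processing circuitry might compare a determined spatial relationship (e.g., a joint type) to an expected spatial relationship and produce a notification and a simple training score; the function name, default value, and scoring rule are illustrative assumptions and not part of the disclosed implementation.

def evaluate_spatial_relationship(determined_joint, expected_joint="T joint"):
    """Return a (score, notification) pair for a determined joint type (sketch only)."""
    if determined_joint == expected_joint:
        return 100, None  # matches the expected relationship; no corrective notification needed
    notification = ("Detected a {0}, but the selected exercise expects a {1}. "
                    "Rearrange the workpieces.".format(determined_joint, expected_joint))
    return 0, notification

if __name__ == "__main__":
    print(evaluate_spatial_relationship("lap joint"))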
Some examples of the present disclosure relate to a mock workpiece assembly for use with a mobile electronic device conducting a welding simulation, comprising: a first mock workpiece, comprising: a first marker configured for recognition or detection by the mobile electronic device, and a first connector; and a second mock workpiece comprising: a second marker configured for recognition or detection by the mobile electronic device, a second connector configured for tool-less connection to the first connector in a first joint arrangement, and a third connector configured for tool-less connection to the first connector in a second joint arrangement that is different than the first joint arrangement.
In some examples, the first connector, second connector, and third connector comprise a first connector array, second connector array, and third connector array, respectively. In some examples, the first joint arrangement or second joint arrangement comprise a lap joint, a butt joint, a corner joint, a T joint, or an edge joint. In some examples, the second connector and third connector are further configured for tool-less disconnection from the first connector. In some examples, the first connector, second connector, or third connector comprises a magnet, a hook fastener, a loop fastener, a snap fastener, a button, a clamping fastener, a prong, a stud, or a socket. In some examples, the mock workpiece assembly further comprises a third mock workpiece comprising: a third marker configured for recognition or detection by the mobile electronic device, and a fourth connector configured for tool-less connection to the first connector in a third joint arrangement.
Some examples of the present disclosure relate to a mock workpiece for use with a desktop electronic device conducting a welding simulation, comprising: an object comprising: a marker configured for recognition or detection by the desktop electronic device; and a connector configured for tool-less connection to a complementary connector of a complementary mock workpiece.
In some examples, the connector comprises a magnet, a hook fastener, a loop fastener, a snap fastener, a button, a clamping fastener, a prong, a stud, or a socket. In some examples, the connector comprises an array of connectors positioned along an edge or middle of the object. In some examples, the array of connectors is arranged asymmetrically in a poka-yoke configuration to prevent incorrect connection to the complementary connector.
In some examples, the marker is positioned over the connector, hiding the connector. In some examples, the connection of the connector and complementary connector creates a joint at an intersection of the mock workpiece and the complementary mock workpiece, the joint comprising a lap joint, a butt joint, a corner joint, a T joint, an edge joint, or a pipe joint. In some examples, the connector is further configured for removable connection to a complementary connector of a fixturing system.
Some examples of the present disclosure relate to a weld training system, comprising: a first workpiece having a first connector; a second workpiece having a second connector configured to tool-lessly engage the first connector to secure the first workpiece to the second workpiece; and a desktop electronic device configured to conduct a weld training simulation, the desktop electronic device comprising: a sensor configured to detect data relating to the first workpiece and second workpiece, processing circuitry, and memory circuitry comprising computer readable instructions which, when executed by the processing circuitry, cause the processing circuitry to: determine a spatial relationship between the first workpiece and the second workpiece based on the data detected by the sensor.
In some examples, the spatial relationship comprises a type of joint defined by an intersection of the first workpiece and second workpiece, the type of joint comprising a lap joint, a butt joint, a corner joint, a T joint, an edge joint, or a pipe joint. In some examples, the memory circuitry further comprises computer readable instructions which, when executed by the processing circuitry, cause the processing circuitry to output a notification in response to determining the spatial relationship is different than an expected spatial relationship. In some examples, the notification comprises instructions for transitioning from the spatial relationship determined by the processing circuitry to the expected spatial relationship.
In some examples, the expected spatial relationship is based on a parameter of the weld training simulation, the parameter comprising a selected exercise, a selected part, or a selected joint type. In some examples, the memory circuitry further comprises computer readable instructions which, when executed by the processing circuitry, cause the processing circuitry to determine a training score based on a difference between the spatial relationship determined by the processing circuitry and the expected spatial relationship. In some examples, the memory circuitry further comprises computer readable instructions which, when executed by the processing circuitry, cause the processing circuitry to conduct the weld training simulation based on the spatial relationship of the first workpiece and second workpiece.
Some examples of the present disclosure relate to a mock workpiece assembly for use with a desktop electronic device conducting a welding simulation, comprising: a first mock workpiece, comprising: a first marker configured for recognition or detection by the desktop electronic device, and a first connector; and a second mock workpiece comprising: a second marker configured for recognition or detection by the desktop electronic device, a second connector configured for tool-less connection to the first connector in a first joint arrangement, and a third connector configured for tool-less connection to the first connector in a second joint arrangement that is different than the first joint arrangement.
In some examples, the first connector, second connector, and third connector comprise a first connector array, second connector array, and third connector array, respectively. In some examples, the first joint arrangement or second joint arrangement comprise a lap joint, a butt joint, a corner joint, a T joint, or an edge joint. In some examples, the second connector and third connector are further configured for tool-less disconnection from the first connector. In some examples, the first connector, second connector, or third connector comprises a magnet, a hook fastener, a loop fastener, a snap fastener, a button, a clamping fastener, a prong, a stud, or a socket. In some examples, the mock workpiece assembly further comprises a third mock workpiece comprising: a third marker configured for recognition or detection by the desktop electronic device, and a fourth connector configured for tool-less connection to the first connector in a third joint arrangement.
In the example of
While the device mount 102 is shown as a clamshell case in the example of
In some examples, the device mount 102 may be removably secured such that the device mount 102 may be tool-lessly separated from one helmet shell 104 and then tool-lessly secured to a different helmet shell 104. In some examples, the device mount 102 may be configured for attachment to the helmet shell 104 in multiple different orientations (e.g., left and right landscape orientations). In such an example, the orientation of the mobile device 200 may be adjusted by adjusting the attachment orientation of the device mount 102 to the helmet shell 104.
In the example of
While the below disclosure focuses on the mobile device 200 of
In the examples of
In the examples of
In the examples of
In some examples, the welding tool 700 may include markers 112 on other portions of the welding tool 700 (e.g., handle 704, gooseneck 708, communication module 710, and/or trigger 706). While shown as pattern markers in the example of
In some examples, the mobile device 200 may capture sensor data (e.g., images) relating to the welding tool 700 and/or workpiece(s) 900. In some examples, the mobile device 200 may determine a position, orientation, motion, configuration, and/or other characteristic(s) of the welding tool 700 and/or workpiece(s) 900 based on an analysis of the sensor data. In some examples, the markers 112 may assist in this analysis. For example, one or more characteristics of the markers 112 may be recognized and/or interpreted to help determine the position, orientation, motion, configuration, and/or other characteristic of the welding tool 700 and/or workpiece(s) 900. In some examples, the mobile device 200 may be configured to conduct a welding simulation using the sensor data, and/or positions, orientations, motions, configurations, and/or other characteristics of the welding tool 700 and/or workpiece(s) 900. In some examples, image recognition techniques may be utilized in recognizing and/or interpreting the markers 112, welding tool 700, and/or workpiece(s) 900. In some examples, the welding tool 700 and/or workpiece(s) 900 may be markerless, and the weld training system 100 may use markerless techniques to determine position, orientation, configuration, and/or other characteristics of the welding tool 700 and/or workpiece(s) 900.
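As a purely illustrative sketch (assumed pixel coordinates and simplified two-dimensional math, not the disclosed tracking pipeline), marker centroids recognized in an image frame might be converted into a coarse orientation estimate of the welding tool 700 as follows:

import math

def tool_axis_angle(nozzle_marker_xy, handle_marker_xy):
    """Estimate the in-plane angle (degrees) of the tool axis from the pixel
    centroids of two recognized markers; a real system would use full 3-D
    pose estimation from the sensor data."""
    dx = nozzle_marker_xy[0] - handle_marker_xy[0]
    dy = nozzle_marker_xy[1] - handle_marker_xy[1]
    return math.degrees(math.atan2(dy, dx))

if __name__ == "__main__":
    # Hypothetical centroids returned by an image-recognition step.
    print(round(tool_axis_angle((640, 360), (400, 520)), 1))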
In the examples of
In some examples, the components of the mobile device 200 may reside on one or more printed circuit boards (PCBs) and/or flex circuits. While not shown in the example of
In some examples, the camera sensor(s) 208 may include one or more adjustable lenses, filters, and/or other optical components for capturing electromagnetic waves in one or more spectra, such as, for example, infrared, visible, and/or ultraviolet. In some examples, two or more of the camera sensors 208 may implement stereoscopic tracking and/or capture stereoscopic images. In some examples, one or more of the camera sensors 208 and one or more of the mounted sensors 106 may implement stereoscopic tracking and/or capture stereoscopic images. In some examples, one or more of the other mobile sensors 206 may comprise temperature sensors, accelerometers, magnetometers, gyroscopes, proximity sensors, pressure sensors, light sensors, motion sensors, position sensors, ultrasonic sensors, infrared sensors, Bluetooth sensors, and/or near field communication (NFC) sensors.
In some examples, the communication circuitry 210 may be configured for wireless communication with the communication module 710 of the welding tool 700, remote server(s) 114, and/or remote display(s) 116 via one or more wireless communication protocols. For example, the one or more wireless communication protocols may include NFC protocols, cellular protocols (e.g., GSM, IS-95, UMTS, CDMA, LTE, etc.), IEEE 802.15.4 based protocols in the 2.4 GHz industrial, scientific, and medical (ISM) radio band (commonly known as Zigbee), low frequency magnetic signal protocols transmitted at a frequency of approximately 131-134 kHz in conformance with the IEEE 1902.1 standard (commonly known as RuBee), short wavelength ultra high frequency radio communication protocols in the 2.400 to 2.485 GHz ISM band in conformance with the IEEE 802.15.1 standard (commonly known as Bluetooth), communication protocols in conformance with the IEEE 802.11 standard (commonly known as Wi-Fi), and/or other appropriate communication protocols. Though not shown in the example of
In some examples, the audio circuitry 220 may include circuitry configured to drive the one or more speakers 214. In some examples, the graphics circuitry 224 may include one or more graphical processing units (GPUs), graphical driver circuitry, and/or circuitry configured to drive graphical display on the display screen 204. In some examples, the graphics circuitry 224 may be configured to generate one or more simulation (e.g., augmented reality, mixed reality, and/or virtual reality) images on the display screen 204 during a welding simulation.
In some examples, the processing circuitry 222 may include one or more processors. In the example of
In some examples, a simulation exercise may comprise a predefined activity, test, and/or task for a user to complete during a welding simulation. In some examples, a simulation exercise may be automatically determined and/or selected by the simulation program 300, such as, for example, based on a selected/determined joint type and/or other simulation parameter. In some examples, a simulation exercise may be a freeform exercise, where there is no predefined task, and a user is instead given free rein to weld in whatever manner they wish.
In some examples, a joint type may comprise a type of joint defined by an intersection of two workpieces 900 in a workpiece assembly 1000. In some examples, a joint type may comprise, for example, a lap joint, a butt joint, a corner joint, a T joint, an edge joint, and/or a pipe joint. In some examples, a joint type may be automatically determined and/or selected by the simulation program 300, such as, for example, based on sensor data, a selected simulation exercise, and/or some other simulation parameter.
In some examples, a tutorial may be an audio, pictorial, and/or video tutorial that is output to a user through appropriate mechanisms of the mobile device 200. In some examples, a selected tutorial may be output prior to and/or during a welding simulation. In some examples, a tutorial may be interactive, requiring some input from the user to complete. In some examples, a tutorial may be automatically determined and/or selected by the simulation program 300, such as, for example, based on a selected/determined exercise, joint type, goal, difficulty, feedback, realism, and/or other simulation parameters.
In some examples, a goal may be an objective and/or target grade and/or score for a user to achieve during a welding simulation. In some examples, the goal may be automatically determined and/or selected by the simulation program 300, such as, for example, based on a selected/determined exercise, joint type, difficulty, realism, mode, and/or other simulation parameter(s). In some examples, a difficulty (e.g., very easy, easy, normal, hard, very hard, etc.) may refer to how ambitious a goal may be, and/or how strict and/or stringent the scoring of the welding simulation may be. In some examples, the difficulty may be automatically determined and/or selected by the simulation program 300, such as, for example, based on a selected/determined exercise, realism, mode, and/or other simulation parameter(s).
In some examples, a feedback setting may indicate the means by which feedback should be provided to a user during the welding simulation. For example, feedback may be provided through audio, visual, vibration, and/or other means. In some examples, a feedback setting may indicate how much and/or how little feedback should be provided to the user during the welding simulation. For example, feedback may be provided with respect to all or some equipment parameters and/or welding technique parameters (e.g., tool angle, tool aim, tool speed, tool position, contact tip to work distance, workpiece position, workpiece orientation, workpiece configuration, equipment parameters, etc.). In some examples, a feedback setting may allow suppression of feedback with respect to some or all equipment parameters and/or welding technique parameters. In some examples, a feedback setting may allow suppression of feedback with respect to all but one equipment parameter and/or welding technique parameter. In some examples, a feedback setting may be automatically determined and/or selected by the simulation program 300, such as, for example, based on a selected/determined simulation exercise, joint type, tutorial, goal, difficulty, realism, and/or other appropriate simulation settings and/or parameters.
In some examples, a realism setting (e.g., low, medium, high, etc.) may indicate how closely the welding simulation attempts to adhere to reality. For example, the welding simulation may simulate or omit certain things that sometimes occur during real-life welding (e.g., sounds, smoke, fumes, lights, vibrations, resistance, anomalies, impurities, burn through, etc.) based on a realism setting. In some examples, the realism setting may impact certain performance quality settings (e.g., of the display screen 204, graphics circuitry 224, etc.). In some examples, a realism setting may be automatically determined and/or selected by the simulation program 300, such as, for example, based on a selected/determined simulation exercise, goal, difficulty, and/or other appropriate simulation settings and/or parameters.
In some examples, sensor settings may be settings pertaining to the camera sensor(s) 208 and/or mobile sensors 206 of the mobile device 200, and/or the mounted sensors 106 of the device mount 102. In some examples, sensor settings may include autofocus and/or auto-tracking settings of the camera sensor(s) 208. In some examples, sensor settings may include a calibration of one or more of the camera sensors 208 and/or mobile sensors 206 (e.g., accelerometers and/or gyroscopes). In some examples, lighting settings may include settings pertaining to the lights 202 of the mobile device, such as, for example, brightness, intensity, when to be on/off, how long to stay on/off, and/or other appropriate settings. In some examples, certain lighting settings may be automatically determined and/or selected by the simulation program 300, such as, for example, based on a selected/determined simulation exercise, goal, difficulty, realism, and/or other appropriate settings and/or parameters.
In some examples, input and/or output device settings may be settings pertaining to the input and/or output devices of the mobile device 200 (e.g., input devices 218, display screen 204, speaker(s) 214, etc.). For example, an input device setting may turn on/off a microphone and/or touch screen sensitivity of the display screen 204. As another example, an output device setting may be a volume of the speaker 214 and/or a brightness, color, resolution, and/or graphics quality of the display screen 204. In some examples, certain input and/or output device settings may be automatically determined and/or selected by the simulation program 300, such as, for example, based on a selected/determined exercise, tutorial, mode, feedback, realism, and/or other appropriate settings and/or parameters.
In some examples, communication settings may be settings pertaining to the communication circuitry 210 of the mobile device 200. For example, the communication settings may control and/or impact the connection between the mobile device 200 and the communication module 710 of the welding tool 700, the remote server(s) 114, and/or the remote display(s) 116. For example, the communication settings may control and/or impact the communication protocols used by the mobile device 200 to communicate with the communication module 710 of the welding tool 700, the remote server(s) 114, and/or the remote display(s) 116. In some examples, the communication settings may include a unique identifier of the communication module 710 and/or welding tool 700, to enable communication between the mobile device 200 and welding tool 700.
In some examples, simulation modes may set different modes of operation for the welding simulation. For example, selecting a normal mode of operation may lead to a normal simulation that overlays simulation images onto the welding tool 700, workpiece assemblies 1000, and/or other objects in the FOV 108 of the user (e.g., via the mobile device 200) while the user is wearing the welding helmet shell 104.
In some examples, selecting a tool-less mode of operation may lead to a more simplified welding simulation that does not use the welding tool 700 and/or workpieces 900. Instead of using a welding tool 700, in some examples, a user may use their finger(s) and/or stylus to deliver touch screen inputs and/or perform the welding simulation during a tool-less mode of operation.
In some examples, selecting a helmet-less mode of operation may configure the welding simulation program 300 for operation without a helmet shell 104. In such an example, the mobile device 200 may be secured to the welding tool 700 instead of the helmet shell 104, such as via the device mount 102 and/or a torch mount 450.
In some examples, a fixture parameter may be a location, configuration, and/or orientation of the fixturing system 1100. In some examples, one or more fixture parameters may be automatically determined and/or selected by the simulation program 300 via a calibration process. In some examples, an equipment type may include a type and/or model of a welding tool 700, a welding power supply, a wire feeder, a gas supply, and/or a gas valve. In some examples, an equipment parameter may be a parameter of a piece of welding-type equipment (e.g., power supply, gas supply valve, wire feeder, welding tool 700, etc.). Examples of equipment parameters include a welding process, current, voltage, pulse frequency, wire type, wire diameter, wire feed speed, pressure, workpiece material type, and/or workpiece material thickness. In some examples, a threshold may be an upper or lower limit on some parameter, such as, for example, a temperature and/or remaining power of the mobile device 200.
In some examples, a product credential may be a unique identifier (e.g., serial number) of the weld training system 100 and/or a component of the weld training system 100 (e.g., mobile device 200, simulation program 300, helmet shell 104, torch 700, etc.). In some examples, a user credential may be a username, unique identifier, and/or password of a user. In some examples, product credentials and/or user credentials may be sent to and/or verified by the remote server(s) 114.
In some examples, user characteristics may include, for example, one or more preferred simulation parameters, dominant hand, height, experience, qualifications, completed exercises, assigned exercises, scores, and/or other characteristics of a user. In some examples, user characteristics may be received by the mobile device 200 from the remote server(s) 114, such as in response to sending user credentials. In some examples, upload settings may include information pertaining to what, when, where, and/or how the simulation program 300 should upload data to the remote server(s) 114. In some examples, screen mirroring settings may include information pertaining to what, when, where, and/or how the simulation program 300 should send to and/or display on the remote display(s) 116.
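For illustration only, the simulation parameters described above could be grouped into a single configuration object, as in the following Python sketch; the field names, defaults, and values are illustrative assumptions rather than parameters defined by the disclosure.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SimulationConfig:
    exercise: str = "freeform"            # predefined task or freeform practice
    joint_type: str = "T joint"           # lap, butt, corner, T, edge, pipe
    tutorial: Optional[str] = None        # identifier of an audio/pictorial/video tutorial
    goal_score: int = 80                  # target grade for the exercise
    difficulty: str = "normal"            # very easy .. very hard
    feedback: set = field(default_factory=lambda: {"tool angle", "tool speed"})
    realism: str = "medium"               # low / medium / high
    mode: str = "normal"                  # normal, tool-less, helmet-less
    equipment_type: str = "MIG power supply"
    equipment_params: dict = field(default_factory=dict)  # voltage, wire feed speed, ...
    temperature_limit_c: float = 45.0     # placeholder threshold for the mobile device

if __name__ == "__main__":
    print(SimulationConfig(difficulty="hard", realism="high"))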
In the example of
In the example of
In some examples, the instructions (and/or guidance) may be tailored to the user and/or simulation using one or more parameters of the simulation program 300. For example, the simulation program 300 may output instructions (and/or guidance) as to how to secure the mobile device 200 to the helmet shell 104 in a normal mode of operation, and output instructions (and/or guidance) as to how to secure the mobile device 200 to the torch 700 in a helmet-less mode of operation. In some examples, instructions (and/or guidance) as to how to secure the mobile device 200 to the helmet shell 104 and/or torch 700 may only be provided if the user selects the icon displayed on the screen 204 to start the simulation 300 at block 306.
At block 308, the simulation program 300 captures sensor data via the camera sensor(s) 208, mobile sensors 206, and/or mounted sensors 106. For example, image, audio, thermal, position, movement, angle, and/or other data may be captured. Additionally, at block 308, the simulation program 300 captures data from the welding tool 700. In some examples, this may comprise receiving one or more signals from the communication module 710 of the welding tool 700. In some examples, the communication module 710 may be in electrical and/or mechanical communication with the trigger 706 of the welding tool 700, and/or send one or more signals indicative of whether the trigger 706 has been and/or is being activated. In some examples, the simulation program 300 may additionally, or alternatively, determine whether the trigger has been and/or is being activated via an analysis of the sensor data (e.g., distance between and/or presence of certain markers 112). Finally, at block 308, the simulation program 300 captures input data from the input devices 218 and/or display screen 204 of the mobile device 200.
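A minimal sketch of the data capture at block 308 is shown below, assuming the camera sensor(s), the communication link to the welding tool, and queued touch input are polled once per cycle; the objects and method names are hypothetical stand-ins, not the disclosed interfaces.

def capture_cycle(camera, tool_link, touch_events):
    """Gather one cycle of simulation input data (sketch only)."""
    frame = camera.read()                       # image/sensor data
    trigger_active = tool_link.trigger_state()  # signal from the tool's communication module
    touches = list(touch_events)                # queued touch/input events
    return {"frame": frame, "trigger": trigger_active, "touches": touches}

class _StubCamera:
    def read(self):
        return "frame-bytes"  # placeholder for an image frame

class _StubToolLink:
    def trigger_state(self):
        return False  # trigger not currently activated

if __name__ == "__main__":
    print(capture_cycle(_StubCamera(), _StubToolLink(), [(120, 340)]))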
In the example of
In the example of
In the example of
In the example of
In some examples, certain properties of the simulated effects may be based, at least in part, on the simulation parameters. For example, the simulation program 300 may simulate certain welding effects (e.g., welding arcs, weld puddles, weld beads, welding sounds, welding fumes, vibration) differently depending on a type and/or model of welding-type equipment (e.g., welding-type power supply, wire feeder, gas supply, and/or welding tool 700) selected for the simulation, and/or the selected equipment parameters. In some examples, the simulation program 300 may configure effect properties to be similar to the properties of environmental effects that occur in the real world when welding using the selected equipment with the selected equipment parameters. This may provide a user with a welding experience that more closely adheres to a welding experience that they may experience in the real world using equipment they are familiar with and/or own. In some examples, the realism of the effects may also be impacted by a realism setting.
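The following sketch illustrates, with placeholder values, how simulated effect properties might be keyed to a selected welding process, an equipment parameter, and a realism setting; it is an assumption for illustration and does not model any real welding-type equipment.

def arc_effect_properties(process, voltage, realism):
    """Return display properties for the simulated arc/puddle (sketch only)."""
    base = {"MIG": {"arc_sound": "buzz", "spatter": "low"},
            "stick": {"arc_sound": "crackle", "spatter": "high"}}.get(
        process, {"arc_sound": "generic", "spatter": "medium"})
    # Simplified assumption: higher voltage -> brighter simulated arc.
    base["arc_brightness"] = min(1.0, voltage / 30.0)
    base["render_smoke"] = realism in ("medium", "high")
    return base

if __name__ == "__main__":
    print(arc_effect_properties("MIG", voltage=22.0, realism="high"))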
As another example, the simulation program 300 may simulate the properties of the feedback effects and/or other effects (e.g., reticles, targets, guides, instructions, and/or markings) differently based on a selected exercise, joint type, tutorial, goal, difficulty, feedback setting, realism, mode, and/or marking setting. In some examples, different exercises and/or tutorials may entail welding at different locations with different equipment parameters and/or welding techniques. The simulation program 300 may simulate feedback effects differently to reflect this, such as, for example, by changing reticles, targets, guides, instructions, and/or markings to indicate to the user the required and/or recommended equipment parameters, welding techniques, and/or positions, orientations, and/or configurations of the workpiece(s) 900 and/or welding tool 700.
In the example of
In the example of
In some examples, the simulation program 300 may implement changes to the simulation configurations at block 316 if the simulation program 300 determines the simulation should continue. For example, the user may provide one or more inputs indicative of a desire and/or command to change one or more simulation configurations (e.g., exercise, equipment parameters, goals, difficulty, realism, etc.) during the welding simulation. As another example, the simulation program 300 may automatically decide to change one or more simulation parameters. In such examples, the simulation program 300 may implement those changes at block 316 if the simulation program 300 determines the simulation should continue, before returning to block 308.
In the example of
In some examples, different touch inputs may be interpreted differently by the simulation program 300. For example, a one-finger input may be interpreted as a command to move the simulated welding tool 407 to a selected portion of the display screen 204. On the other hand, a two-finger input may be interpreted as a command to begin welding (e.g., activate the simulated welding tool 407), such as, for example, where the simulated welding tool 407 is already positioned, or at the selected portion of the display screen 204.
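As an illustrative assumption (not the disclosed input handling), the one-finger/two-finger interpretation might be expressed as follows:

def interpret_touch(touch_points):
    """touch_points: list of (x, y) screen coordinates currently touched (sketch only)."""
    if len(touch_points) == 1:
        return ("move_tool", touch_points[0])   # reposition the simulated welding tool
    if len(touch_points) >= 2:
        return ("start_weld", touch_points[0])  # begin simulated welding at the touched location
    return ("idle", None)

if __name__ == "__main__":
    print(interpret_touch([(150, 300)]))              # move the simulated tool
    print(interpret_touch([(150, 300), (180, 320)]))  # begin simulated welding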
In some examples, a user may hold the mobile device 200 in their hand, during a tool-less mode of operation, rather than the mobile device 200 being held by the mobile device mount 102. In some examples, one or more physical workpieces 900 may still be used during the tool-less mode of operation. In some examples, no workpiece(s) 900 or workpiece assemblies 1000 may be used during the tool-less mode of operation, and the simulation program 300 may simply generate one or more simulated workpiece assemblies 410 on its own.
In the example of
While not shown due to the perspective of the drawing, in some examples, the cradle 454 may further include a base configured to support the mobile device 200. While not shown due to the perspective of the drawing, in some examples, the cradle 454 (e.g., at the base) may be attached to the clamp 452 via a mechanical link. In some examples, the mechanical link may comprise a flexible cable, a gooseneck, an arm, a joint (e.g., a ball joint), a ratcheting mechanism, and/or other means by which to movably connect the cradle 454 to the clamp 452. In some examples, the mechanical link is configured to allow the cradle 454 to be repositioned with respect to the clamp 452 and/or welding tool 700, so that the position, orientation, and/or FOV 108 of the mobile device 200 may be adjusted.
In some examples, the simulation program 300 may provide a preview of the impact of certain feedback setting(s) and/or other simulation parameters. For example, the display screen 204 may show a preview 499 of feedback effects that might be shown during the simulation program 300 under the selected feedback setting(s). In some examples, such a preview 499 might be shown when setting and/or changing feedback settings and/or other simulation parameters (e.g., at blocks 302 and/or 316).
In the examples of
In the example of
In
In the examples of
In the example of
In the example of
In the example of
At block 506, the temperature detection process 500 sets (or returns) the mobile device 200 and/or simulation program 300 (and/or related settings) to regular, default, and/or peak operation. In some examples, this may comprise setting, resetting, and/or increasing one or more performance and/or graphical settings of the mobile device 200 and/or simulation program 300, and/or one or more related settings (e.g., realism, resolution, etc.). In some examples, this may comprise enabling and/or resuming uploads to the remote server(s) 114, mirroring done by the remote display(s) 116, the welding simulation blocks 308 and/or 316, and/or the simulation program 300 in general. As shown, the temperature detection process 500 ends after block 506, though, in some examples, the temperature detection process 500 may return to block 502 instead of ending.
In the example of
In the example of
In the example of
In the example of
At block 512, the temperature detection process 500 determines whether the one or more temperatures measured at block 502 are greater than one or more third temperature thresholds. In some examples, the third temperature threshold(s) may be the same as, or higher than, the second temperature threshold(s). In some examples, the third temperature threshold(s) may be representative of one or more temperatures above which there is significant and/or immediate risk of thermal damage to the mobile device 200. In some examples, one or more of the third temperature thresholds may be predetermined and/or stored in the memory circuitry 226. In some examples, one or more of the third temperature thresholds may be set by a user, such as, for example, during block 302 of the welding simulation program 300. In some examples, the temperature detection process 500 may consider multiple third temperature thresholds at block 512. For example, the memory circuitry 226 may store different third temperature thresholds for the mobile device 200 as a whole and the individual components of the mobile device 200 (e.g., the processing circuitry 222, the graphics circuitry 224).
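For illustration, the tiered behavior of the temperature detection process 500 might be sketched as follows, with placeholder threshold values rather than values from the disclosure:

FIRST_C, SECOND_C, THIRD_C = 40.0, 45.0, 50.0  # illustrative ascending thresholds

def thermal_action(device_temp_c):
    if device_temp_c > THIRD_C:
        return "terminate simulation"          # immediate risk of thermal damage
    if device_temp_c > SECOND_C:
        return "pause uploads and mirroring"   # shed non-essential work
    if device_temp_c > FIRST_C:
        return "reduce performance/graphics settings"
    return "restore default operation"

if __name__ == "__main__":
    for t in (38.0, 42.5, 47.0, 52.0):
        print(t, "->", thermal_action(t))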
In the example of
In the example of
In the example of
In some examples, the mobile device 200 may undergo a calibration step prior to the orientation configuration process 600, where sensor data from the camera sensor(s) 208, mobile sensor(s) 206, and/or mount sensor(s) 106 is evaluated in different orientations of the mobile device 200 and/or associated with the different orientations of the mobile device 200 when stored in memory circuitry 226. In such an example, the orientation configuration process 600 may compare instantaneous data from the camera sensor(s) 208, mobile sensor(s) 206, and/or mount sensor(s) 106 with the stored data to determine the most likely orientation of the mobile device 200. In some examples, the sensor data and orientation association(s) may be predefined and/or predetermined. For example, the sensor data and orientation association(s) may be downloaded from the remote server(s) 114 and/or queried from memory circuitry 226 (e.g., based on some identifying information of the mobile device 200, such as a make, model, serial number, etc.).
In some examples, the orientation configuration process 600 may evaluate sensor data from interactions and/or communications between the mobile sensor(s) 206 and/or mount sensor(s) 106 to determine an orientation of the mobile device 200. For example, the mobile device mount 102 may include one or more mounted sensors 106 (e.g., NFC and/or RFID sensors) positioned at different portions of the device mount 102. In such an example, the mounted sensor(s) 106 may be configured to sense, detect, communicate with, and/or otherwise interface with one or more mobile sensors 206 of the mobile device 200 when the mobile sensor(s) 206 and mounted sensor(s) 106 are in proximity to one another. In some examples, certain mobile sensors 206 and mounted sensors 106 may only be in such proximity when the mobile device 200 is in a particular orientation. In some examples, a calibration step and/or loading of calibration data may be performed prior to this sort of orientation determination, similar to that discussed above.
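As a simplified sketch (the sensor identifiers and signature table are assumptions, not disclosed data), the proximity pattern between mounted sensor(s) 106 and mobile sensor(s) 206 could be mapped to a mounting orientation like this:

ORIENTATION_SIGNATURES = {
    frozenset({"mount_left"}):  "landscape-left",
    frozenset({"mount_right"}): "landscape-right",
}

def classify_orientation(detected_mount_sensors):
    """Map the set of mounted sensors currently detecting the mobile device to an orientation."""
    return ORIENTATION_SIGNATURES.get(frozenset(detected_mount_sensors), "unknown")

if __name__ == "__main__":
    print(classify_orientation({"mount_right"}))  # -> landscape-right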
In the example of
In some examples, the user characteristic(s) may be automatically determined by the orientation configuration process 600. For example, the orientation configuration process 600 may determine the user characteristic(s) based on certain user behaviors observed during the welding simulation. In some examples, data from the mounted sensors 106 and/or the mobile sensors 206 may show that a user exhibits welding behavior indicative of one or more particular user characteristics. For example, data from the mounted sensors 106 and/or the mobile sensors 206 may show that a user positions the welding tool 700 relative to the workpiece assembly 1000 in a certain way and/or a certain orientation at the start and/or end of a particular type of welding that is indicative of a particular user characteristic. For example, the orientation configuration process 600 may determine that a user is right handed if data from the mounted sensor(s) 106, camera sensor(s) 208, and/or mobile sensor(s) 206 show that the user positions the welding tool 700 to the right of the workpiece assembly 1000 when beginning a push welding technique, and/or positions the welding tool 700 to the left of the workpiece assembly 1000 when beginning a drag welding technique.
In some examples, the orientation configuration process 600 may determine the user characteristic(s) based on data from the mounted sensor(s) 106, camera sensor(s) 208, and/or mobile sensor(s) 206 relating to the welding tool 700, and/or markers 112 on the welding tool 700. For example, the orientation configuration process 600 may analyze and/or evaluate (e.g., image) data captured by the mounted sensor(s) 106, camera sensor(s) 208, and/or mobile sensor(s) 206 to determine whether the markers 112 on the welding tool 700 are relatively discernible, clear, and/or perpendicular to the camera sensor(s) 208. In some examples, the orientation configuration process 600 may further consider the current orientation of the mobile device 200 determined at block 602 when determining the user characteristic(s) and/or operational orientation. For example, the orientation configuration process 600 may analyze and/or evaluate the sensor data and determine that the markers 112 on the welding tool 700 are not discernible, clear, and/or perpendicular to the camera sensor(s) 208. The orientation configuration process 600 may further determine that the current mobile device 200 orientation (determined at block 602), in conjunction with the determination that the markers 112 are less than discernible, clear, and/or perpendicular, suggests a particular user characteristic (e.g., right handed). Further, the orientation configuration process 600 may determine that, in view of the user characteristic and the current orientation of the mobile device 200, the operational orientation of the mobile device 200 during the welding simulation should be a different orientation.
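A rough sketch of the handedness heuristic described above is given below; the coordinate convention and decision rules are illustrative assumptions only:

def infer_dominant_hand(tool_start_x, workpiece_center_x, technique):
    """Infer a likely dominant hand from where the tool starts relative to the
    workpiece for a given technique (sketch only)."""
    starts_right_of_work = tool_start_x > workpiece_center_x
    if technique == "push":
        return "right" if starts_right_of_work else "left"
    if technique == "drag":
        return "left" if starts_right_of_work else "right"
    return "unknown"

if __name__ == "__main__":
    print(infer_dominant_hand(820, 640, "push"))  # -> right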
In the example of
In some examples, the memory circuitry 226 of the mobile device 200 may store information relating to the markers 112 of the welding tool 700 (e.g., number, shape, size, pattern, position, etc.). In some examples, the memory circuitry 226 may store other data relating to the welding tool 700, such as, for example, one or more images, models, and/or diagrams of the welding tool 700 and/or its shape, features, dimensions, and/or other characteristics. In some examples, the orientation configuration process 600 may compare the stored information to the information obtained from the mounted sensor(s) 106, camera sensor(s) 208, and/or other mobile sensor(s) 206 to determine the user characteristic.
For example, the orientation configuration process 600 may determine that the welding tool 700 is oriented similarly to
In the example of
In some examples, the notification(s) output at block 610 may be output via the speaker(s) 214, display screen 204, and/or output device(s) 216 of the mobile device 200. In some examples, the notification(s) output at block 610 may be output via a speaker and/or vibration device of the welding tool 700. In some examples, the notification(s) may include one or more arrows, icons, messages (e.g., visual and/or audio), animations, vibrations, and/or light flashes. For example, the welding tool 700 and/or mobile device 200 may vibrate to indicate that the orientation should change, and/or speech may play from the welding tool 700 and/or mobile device 200 telling the user that the orientation should be changed and/or providing instructions on how to change the orientation. As another example, an icon, arrow, text message, one or more pictures, a video, and/or an animation may be shown via the display screen 204 of the mobile device telling the user that the orientation should be changed and/or providing instructions on how to change the orientation. In some examples, the notification may include an output (such as discussed above) indicating that the welding simulation will be terminated, disabled, and/or prevented from running until the orientation is changed. In some examples, the orientation configuration process 600 may interface with the simulation program 300 to prevent execution of the welding simulation until the orientation is changed. In some examples, the notification(s) may indicate that (and/or how) an orientation (and/or other configuration) of the device mount 102 may be changed in order to change an orientation of the mobile device 200.
In the example of
In the example of
In the example of
In some examples, the determination of the type(s) of joint(s) and/or intersection(s) may be based on data from the camera sensor(s) 208, mobile sensor(s) 206, and/or mounted sensor(s) 106 relating to features and/or characteristics of the workpieces 900. In some examples, the determination of the type(s) of joint(s) and/or intersection(s) may additionally be based on data stored in memory circuitry 226 relating to features and/or characteristics of known workpieces 900, workpiece assemblies 1000, and/or joints formed between workpieces 900 to form one or more workpiece assemblies 1000. For example, the workpiece configuration process 800 may analyze and/or evaluate the sensor data collected by the camera sensor(s) 208, mobile sensor(s) 206, and/or mounted sensor(s) 106 and compare that sensor data to the data stored in memory circuitry 226 in an attempt to recognize one or more types of joints and/or intersections. In some examples, the stored data may be stored by and/or retrieved from the remote server(s) 114 instead of, or in addition to, the memory circuitry 226.
In some examples, the stored data may include, for example, images, models, diagrams, and/or other data relating to features and/or characteristics of known workpieces 900, workpiece assemblies 1000, and/or joints. In some examples, the features and/or characteristics may include the presence and/or absence of one or more markers 112. In some examples, the features and/or characteristics may include types, positions, orientations, patterns, shapes, dimensions, numbers, arrangements, colors, and/or other properties of the markers 112 on the workpieces 900. In some examples, the features and/or characteristics may include dimensions, profiles, shapes, and/or other properties of the workpieces 900 themselves. In some examples, the features and/or characteristics may include dimensions, profiles, shapes, and/or other properties of various workpiece assemblies 1000 that may be formed by combinations of workpieces 900. In some examples, the features and/or characteristics may include dimensions, profiles, shapes, and/or other properties of various joints that may be formed between workpieces 900 to create the workpiece assemblies 1000.
In the example of
In some examples, block 808 is satisfied if there is at least one joint type determined at block 806 for each expected joint type. In some examples, the number of determined joint types must exactly match the number of expected joint types (e.g., 6 lap joints = 6 expected lap joints) for block 808 to be satisfied. In some examples, block 808 may also be satisfied if the number of determined joint types is greater than the number of expected joint types (e.g., 8 lap joints > 6 expected lap joints).
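For illustration only, the comparison at block 808 might be expressed as the following sketch, in which every expected joint type must be found among the joint types determined at block 806, optionally with exactly matching counts; the names and structure are illustrative assumptions.

from collections import Counter

def joints_match(detected, expected, require_exact_counts=False):
    """Compare detected joint types against expected joint types (sketch only)."""
    det, exp = Counter(detected), Counter(expected)
    if require_exact_counts:
        return det == exp
    # Satisfied if at least as many of each expected joint type were detected.
    return all(det[j] >= n for j, n in exp.items())

if __name__ == "__main__":
    detected = ["lap joint"] * 8
    expected = ["lap joint"] * 6
    print(joints_match(detected, expected))                            # True
    print(joints_match(detected, expected, require_exact_counts=True))  # False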
In the example of
In the example of
In some examples, the notification(s) output at block 810 and/or 812 may be output via the speaker(s) 214, display screen 204, and/or output device(s) 216 of the mobile device 200. In some examples, the notification(s) may be output via a speaker and/or vibration device of the welding tool 700. In some examples, the notification(s) may include one or more arrows, icons, messages (e.g., visual and/or audio), animations, vibrations, and/or light flashes. For example, the welding tool 700 and/or mobile device 200 may vibrate to indicate that there are no recognized joints or that one or more of the recognized joints are different than the expected joint(s). As another example, speech may play from the welding tool 700 and/or mobile device 200 telling the user that the workpieces 900 should be rearranged (and/or how they should be rearranged) to produce an expected joint and/or workpiece assembly 1000. As another example, an icon, arrow, text message, one or more pictures, a video, and/or an animation may be shown via the display screen 204 of the mobile device 200 telling the user that the workpieces 900 should be rearranged (and/or how they should be rearranged). In some examples, the notification may include an output (such as discussed above) indicating that the welding simulation will be terminated, disabled, and/or prevented from running until the workpieces 900 are rearranged. In some examples, the workpiece configuration process 800 may interface with the simulation program 300 to prevent execution of the welding simulation until the workpieces 900 are rearranged.
In some examples, a connector 902 may be a magnet (north or south polarity), an electromagnet, a ferromagnetic material, a hook fastener, a loop fastener, a snap fastener, a button, a clamping fastener, a prong, a stud, an aperture, a socket, and/or some other type of tool-less connector. In some examples, tool-less connectors 902 may be advantageous because they can be easily connected to and/or engaged with other connectors 902 without the need for auxiliary tools (e.g., screwdrivers, hammers, etc.). Tool-less connectors 902 may also be advantageous over adhesives, as the tool-less connectors 902 may be continually connected, disconnected, and reconnected with negligible change to their effectiveness, unlike adhesives.
In the example of
In the example of
In the example of
However, unlike the workpiece 900a, the workpiece 900b has no markers 112 across an approximate middle of the workpiece 900b in the example of
In the example of
Given the complementary arrangement of connectors 902 in workpiece 900b and workpiece 900c, in some examples, the two workpieces 900 may connect together to form a T joint workpiece assembly 1000b. Such a T joint workpiece assembly 1000b is shown, for example, in
While
In the examples of
In the example of
In the example of
In the example of
In some examples, the equipment configuration process 1200 may allow a user to select the welding-type equipment using the welding tool 700, display screen 204, one or more input devices 218, mobile sensors 206, camera sensors 208, and/or other appropriate mechanisms. In some examples, the equipment configuration process 1200 may allow a user to select the welding-type equipment via a dropdown menu 1302 displayed to the user, such as shown in
In the example of
In the example of
In some examples, the equipment configuration process 1200 may additionally provide one or more recommendations to the user (e.g., via the display screen 204 and/or speaker(s) 214) based on the selected welding-type equipment. For example, the equipment configuration process 1200 may recommend equipment parameters (e.g., gas type, wire type, etc.) and/or complementary welding-type equipment based on the selected welding-type equipment. In some examples, the equipment configuration process 1200 may store (e.g., in memory circuitry 226) recommended equipment parameters associated with certain welding-type equipment and/or other simulation parameters (e.g., exercise, realism, difficulty, goals, etc.), and query the stored recommendations. In some examples, the equipment configuration process 1200 may receive recommendations from the remote server(s) 114 (e.g., in response to one or more similar queries and/or signals). In the example of
In the example of
In the example of
In the example of
In the example of
The present disclosure contemplates using mobile devices 200 (and/or desktop devices 250) to conduct welding simulations. In some examples, it may be advantageous to use mobile devices 200 due to their availability, relative affordability, and/or technical power. The disclosure further contemplates automatically detecting whether an orientation of the mobile device 200 is proper for the simulation, and notifying the user if not.
The present disclosure additionally contemplates using modular workpieces 900 for conducting welding simulations. In some examples, the modular workpieces 900 may be configured to tool-lessly connect to, and/or disconnect from, other modular workpieces 900 to form various workpiece assemblies 1000. In some examples, tool-less connectors 902 may be advantageous because they can be easily connected to and/or engaged with other connectors 902 without the need for auxiliary tools (e.g., screwdrivers, hammers, etc.). Tool-less connectors 902 may also be advantageous over adhesives, as the tool-less connectors 902 may be continually connected, disconnected, and reconnected with negligible change to their effectiveness, unlike adhesives. In some examples, the welding simulation may further be configured to recognize different joints formed by the modular workpieces 900, and conduct the welding simulation accordingly.
The present disclosure further contemplates using simulated equipment interfaces 1304 that replicate the appearance of actual equipment interfaces 1404 of actual welding-type equipment. In some examples, this replication may help orient a user who is already familiar with a particular piece of welding-type equipment and/or its actual equipment interface 1404, thereby making them more comfortable with the welding simulation. In some examples, the replication may help users who are unfamiliar with a particular piece of welding-type equipment become familiar with the welding-type equipment (and/or its actual equipment interface 1404). Additionally, the present disclosure contemplates simulating certain welding effects in accordance with the way the effects might occur in the real world when real welding is performed using the real world welding-type equipment.
The present method and/or system may be realized in hardware, software, or a combination of hardware and software. The present methods and/or systems may be realized in a centralized fashion in at least one computing system, or in a distributed fashion where different elements are spread across several interconnected computing or cloud systems. Some examples may comprise a non-transitory machine-readable (e.g., computer readable) medium (e.g., FLASH drive, optical disk, magnetic storage disk, or the like) having stored thereon one or more lines of code executable by a machine, thereby causing the machine to perform processes as described herein.
While the present method and/or system has been described with reference to certain examples, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present method and/or system. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from its scope. Therefore, it is intended that the present method and/or system not be limited to the particular examples disclosed, but that the present method and/or system will include all implementations falling within the scope of the appended claims.
As used herein, “and/or” means any one or more of the items in the list joined by “and/or”. As an example, “x and/or y” means any element of the three-element set {(x), (y), (x, y)}. In other words, “x and/or y” means “one or both of x and y”. As another example, “x, y, and/or z” means any element of the seven-element set {(x), (y), (z), (x, y), (x, z), (y, z), (x, y, z)}. In other words, “x, y and/or z” means “one or more of x, y and z”.
As utilized herein, the terms “e.g.,” and “for example” set off lists of one or more non-limiting examples, instances, or illustrations.
As used herein, the terms “coupled,” “coupled to,” and “coupled with,” each mean a structural and/or electrical connection, whether attached, affixed, connected, joined, fastened, linked, and/or otherwise secured. As used herein, the term “attach” means to affix, couple, connect, join, fasten, link, and/or otherwise secure. As used herein, the term “connect” means to attach, affix, couple, join, fasten, link, and/or otherwise secure.
As used herein, “mobile device” or “mobile electronic device” refers to a handheld electronic computing apparatus having a casing that houses a camera, a display screen, processing circuitry, and communication circuitry in a single unit.
As used herein, “desktop device” or “desktop electronic device” refers to a non-handheld electronic computing apparatus that houses processing circuitry, communication circuitry, and possibly a display in a single unit, while also controlling (and/or powering) a camera and a display that are housed in a separate unit (e.g., a helmet shell) outside of the single unit of the non-handheld electronic computing apparatus.
As used herein, the terms “circuits” and “circuitry” refer to physical electronic components (i.e., hardware) and any software and/or firmware (“code”) which may configure the hardware, be executed by the hardware, and/or otherwise be associated with the hardware. As used herein, for example, a particular processor and memory may comprise a first “circuit” when executing a first one or more lines of code and may comprise a second “circuit” when executing a second one or more lines of code. As utilized herein, circuitry is “operable” and/or “configured” to perform a function whenever the circuitry comprises the necessary hardware and/or code (if any is necessary) to perform the function, regardless of whether performance of the function is disabled or enabled (e.g., by a user-configurable setting, factory trim, etc.).
As used herein, a control circuit may include digital and/or analog circuitry, discrete and/or integrated circuitry, microprocessors, DSPs, etc., software, hardware and/or firmware, located on one or more boards, that form part or all of a controller, and/or are used to control a welding process, and/or a device such as a power source or wire feeder.
As used herein, the term “processor” means processing devices, apparatus, programs, circuits, components, systems, and subsystems, whether implemented in hardware, tangibly embodied software, or both, and whether or not it is programmable. The term “processor” as used herein includes, but is not limited to, one or more computing devices, hardwired circuits, signal-modifying devices and systems, devices and machines for controlling systems, central processing units, programmable devices and systems, field-programmable gate arrays, application-specific integrated circuits, systems on a chip, systems comprising discrete elements and/or circuits, state machines, virtual machines, data processors, processing facilities, and combinations of any of the foregoing. The processor may be, for example, any type of general purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, an application-specific integrated circuit (ASIC), a graphic processing unit (GPU), a reduced instruction set computer (RISC) processor with an advanced RISC machine (ARM) core, etc. The processor may be coupled to, and/or integrated with a memory device.
As used herein, the term “memory” and/or “memory circuitry” means computer hardware or circuitry to store information for use by a processor and/or other digital device. The memory and/or memory circuitry can be any suitable type of computer memory or any other type of electronic storage medium, such as, for example, read-only memory (ROM), random access memory (RAM), cache memory, compact disc read-only memory (CDROM), electro-optical memory, magneto-optical memory, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically-erasable programmable read-only memory (EEPROM), a computer-readable medium, or the like. Memory can include, for example, a non-transitory memory, a non-transitory processor readable medium, a non-transitory computer readable medium, non-volatile memory, dynamic RAM (DRAM), volatile memory, ferroelectric RAM (FRAM), first-in-first-out (FIFO) memory, last-in-first-out (LIFO) memory, stack memory, non-volatile RAM (NVRAM), static RAM (SRAM), a cache, a buffer, a semiconductor memory, a magnetic memory, an optical memory, a flash memory, a flash card, a compact flash card, memory cards, secure digital memory cards, a microcard, a minicard, an expansion card, a smart card, a memory stick, a multimedia card, a picture card, flash storage, a subscriber identity module (SIM) card, a hard drive (HDD), a solid state drive (SSD), etc. The memory can be configured to store code, instructions, applications, software, firmware and/or data, and may be external, internal, or both with respect to the processor.
As used herein, welding-type refers to welding, cladding, brazing, plasma cutting, induction heating, carbon arc cutting, and/or hot wire welding/preheating (including laser welding and laser cladding), carbon arc cutting or gouging, and/or resistive preheating.
As used herein, welding-type power refers to power suitable for welding, cladding, brazing, plasma cutting, induction heating, carbon arc cutting, and/or hot wire welding/preheating (including laser welding and laser cladding), carbon arc cutting or gouging, and/or resistive preheating.
As used herein, a welding-type power supply and/or power source refers to any device capable of, when power is applied thereto, supplying welding, cladding, brazing, plasma cutting, induction heating, laser (including laser welding, laser hybrid, and laser cladding), carbon arc cutting or gouging, and/or resistive preheating, including but not limited to transformer-rectifiers, inverters, converters, resonant power supplies, quasi-resonant power supplies, switch-mode power supplies, etc., as well as control circuitry and other ancillary circuitry associated therewith.
Disabling of circuitry, actuators, hardware, and/or software may be done via hardware, software (including firmware), or a combination of hardware and software, and may include physical disconnection, de-energization, and/or a software control that restricts commands from being implemented to activate the circuitry, actuators, hardware, and/or software. Similarly, enabling of circuitry, actuators, hardware, and/or software may be done via hardware, software (including firmware), or a combination of hardware and software, using the same mechanisms used for disabling.
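A minimal sketch, assuming a purely software gate, of how such a disable/enable control might restrict activation commands is shown below; the GatedActuator class and its method names are illustrative only and are not part of the claimed subject matter.

```python
# Minimal sketch (an assumption, not the claimed mechanism): a software control
# that restricts activation commands while a feature is disabled.
class GatedActuator:
    def __init__(self) -> None:
        self.enabled = False

    def enable(self) -> None:
        self.enabled = True

    def disable(self) -> None:
        self.enabled = False

    def activate(self) -> str:
        # The activation command is simply not implemented while disabled.
        return "activated" if self.enabled else "command ignored (disabled)"

actuator = GatedActuator()
print(actuator.activate())  # -> "command ignored (disabled)"
actuator.enable()
print(actuator.activate())  # -> "activated"
```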
The present application claims the benefit of, and priority to, Provisional Patent Application No. 62/940,111, entitled “WELD TRAINING SIMULATIONS USING MOBILE DEVICES, MODULAR WORKPIECES, AND SIMULATED WELDING EQUIPMENT,” filed Nov. 25, 2019, the entire contents of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
3555239 | Kerth | Jan 1971 | A |
3652824 | Okada | Mar 1972 | A |
4021840 | Ellsworth | May 1977 | A |
4280137 | Ashida | Jul 1981 | A |
4453085 | Pryor | Jun 1984 | A |
4477712 | Lillquist | Oct 1984 | A |
4482960 | Pryor | Nov 1984 | A |
4577796 | Powers | Mar 1986 | A |
4602163 | Pryor | Jul 1986 | A |
4641292 | Tunnell | Feb 1987 | A |
4654949 | Pryor | Apr 1987 | A |
4707647 | Coldren | Nov 1987 | A |
4733051 | Nadeau | Mar 1988 | A |
4753569 | Pryor | Jun 1988 | A |
4769700 | Pryor | Sep 1988 | A |
4788440 | Pryor | Nov 1988 | A |
4812614 | Wang | Mar 1989 | A |
5148591 | Pryor | Sep 1992 | A |
5275327 | Watkins | Jan 1994 | A |
5380978 | Pryor | Jan 1995 | A |
5506682 | Pryor | Apr 1996 | A |
5572102 | Goodfellow | Nov 1996 | A |
5580475 | Sakai | Dec 1996 | A |
5602967 | Pryor | Feb 1997 | A |
5608847 | Pryor | Mar 1997 | A |
5923555 | Bailey | Jul 1999 | A |
5932123 | Marhofer | Aug 1999 | A |
5956417 | Pryor | Sep 1999 | A |
5978090 | Burri | Nov 1999 | A |
6044183 | Pryor | Mar 2000 | A |
6051805 | Vaidya | Apr 2000 | A |
6107601 | Shimagama | Aug 2000 | A |
6122042 | Wunderman et al. | Sep 2000 | A |
6163946 | Pryor | Dec 2000 | A |
6167607 | Pryor | Jan 2001 | B1 |
6186855 | Bauer | Feb 2001 | B1 |
6230327 | Briand | May 2001 | B1 |
6240253 | Yamaguchi | May 2001 | B1 |
6242711 | Cooper | Jun 2001 | B1 |
6271500 | Hirayama | Aug 2001 | B1 |
6301763 | Pryor | Oct 2001 | B1 |
6314631 | Pryor | Nov 2001 | B1 |
6315186 | Friedl | Nov 2001 | B1 |
6317953 | Pryor | Nov 2001 | B1 |
6441342 | Hsu | Aug 2002 | B1 |
6476354 | Jank | Nov 2002 | B1 |
6479793 | Wittmann | Nov 2002 | B1 |
6572379 | Sears | Jun 2003 | B1 |
6587186 | Bamji | Jul 2003 | B2 |
6734393 | Friedl | May 2004 | B1 |
6750428 | Okamoto | Jun 2004 | B2 |
6754518 | Lloyd | Jun 2004 | B1 |
7358458 | Daniel | Apr 2008 | B2 |
7523069 | Friedl et al. | Apr 2009 | B1 |
7534005 | Buckman | May 2009 | B1 |
7926118 | Becker | Apr 2011 | B2 |
7962967 | Becker | Jun 2011 | B2 |
7987492 | Liwerant | Jul 2011 | B2 |
8144193 | Melikian | Mar 2012 | B2 |
8224029 | Saptharishi | Jul 2012 | B2 |
8274013 | Wallace | Sep 2012 | B2 |
8275201 | Rangwala | Sep 2012 | B2 |
8316462 | Becker et al. | Nov 2012 | B2 |
8428926 | Choquet | Apr 2013 | B2 |
8502866 | Becker | Aug 2013 | B2 |
8512043 | Choquet | Aug 2013 | B2 |
8569646 | Daniel | Oct 2013 | B2 |
8569655 | Cole | Oct 2013 | B2 |
8605008 | Prest | Dec 2013 | B1 |
8648903 | Loipetsberger | Feb 2014 | B2 |
8657605 | Wallace | Feb 2014 | B2 |
8680432 | Uecker | Mar 2014 | B2 |
8680434 | Stoger et al. | Mar 2014 | B2 |
8747116 | Zboray et al. | Jun 2014 | B2 |
8777629 | Kreindl | Jul 2014 | B2 |
8808164 | Hoffman | Aug 2014 | B2 |
8826357 | Fink | Sep 2014 | B2 |
8834168 | Peters | Sep 2014 | B2 |
8851896 | Wallace | Oct 2014 | B2 |
8884177 | Daniel | Nov 2014 | B2 |
8911237 | Postlethwaite | Dec 2014 | B2 |
8915740 | Zboray | Dec 2014 | B2 |
8934029 | Nayar | Jan 2015 | B2 |
8957835 | Hoellwarth | Feb 2015 | B2 |
8964298 | Haddick | Feb 2015 | B2 |
RE45398 | Wallace | Mar 2015 | E |
8987628 | Daniel et al. | Mar 2015 | B2 |
8992226 | Leach | Mar 2015 | B1 |
9011154 | Kindig | Apr 2015 | B2 |
9012802 | Daniel | Apr 2015 | B2 |
9050678 | Daniel | Jun 2015 | B2 |
9050679 | Daniel | Jun 2015 | B2 |
9056365 | Hoertenhuber | Jun 2015 | B2 |
9073138 | Wills | Jul 2015 | B2 |
9089921 | Daniel | Jul 2015 | B2 |
9097891 | Border | Aug 2015 | B2 |
9101994 | Albrecht | Aug 2015 | B2 |
9104195 | Daniel | Aug 2015 | B2 |
9196169 | Wallace | Nov 2015 | B2 |
9218745 | Choquet | Dec 2015 | B2 |
9221117 | Conrardy | Dec 2015 | B2 |
9230449 | Conrardy | Jan 2016 | B2 |
9235051 | Salter | Jan 2016 | B2 |
9244539 | Venable | Jan 2016 | B2 |
9269279 | Penrod et al. | Feb 2016 | B2 |
9280913 | Peters | Mar 2016 | B2 |
9293056 | Zboray | Mar 2016 | B2 |
9293057 | Zboray | Mar 2016 | B2 |
9318026 | Peters | Apr 2016 | B2 |
9330575 | Peters | May 2016 | B2 |
9336686 | Peters | May 2016 | B2 |
9352411 | Batzler | May 2016 | B2 |
9368045 | Becker | Jun 2016 | B2 |
9468988 | Daniel | Oct 2016 | B2 |
9483959 | Wallace | Nov 2016 | B2 |
9583014 | Becker | Feb 2017 | B2 |
9583023 | Becker et al. | Feb 2017 | B2 |
9589481 | Becker et al. | Mar 2017 | B2 |
9666160 | Patel | May 2017 | B2 |
9977242 | Patel | May 2018 | B2 |
10909872 | Albrecht | Feb 2021 | B2 |
20020017752 | Levi | Feb 2002 | A1 |
20040034608 | de Miranda et al. | Feb 2004 | A1 |
20040189675 | Pretlove | Sep 2004 | A1 |
20050001155 | Fergason | Jan 2005 | A1 |
20050099102 | Arthur | May 2005 | A1 |
20050103767 | Kainec | May 2005 | A1 |
20050161357 | Allan | Jul 2005 | A1 |
20050199605 | Furman | Sep 2005 | A1 |
20060087502 | Karidis | Apr 2006 | A1 |
20060090135 | Fukuda | Apr 2006 | A1 |
20060176467 | Rafii | Aug 2006 | A1 |
20060207980 | Jacovetty | Sep 2006 | A1 |
20060213892 | Ott | Sep 2006 | A1 |
20060281971 | Sauer | Dec 2006 | A1 |
20070187378 | Karakas | Aug 2007 | A1 |
20080083351 | Lippert | Apr 2008 | A1 |
20080158502 | Becker | Jul 2008 | A1 |
20080187235 | Wakazono | Aug 2008 | A1 |
20080314887 | Stoger | Dec 2008 | A1 |
20090014500 | Cho et al. | Jan 2009 | A1 |
20090134203 | Domec et al. | May 2009 | A1 |
20090231423 | Becker | Sep 2009 | A1 |
20090276930 | Becker | Nov 2009 | A1 |
20090298024 | Batzler | Dec 2009 | A1 |
20100036624 | Martin | Feb 2010 | A1 |
20100048273 | Wallace | Feb 2010 | A1 |
20100062406 | Zboray | Mar 2010 | A1 |
20100079356 | Hoellwarth | Apr 2010 | A1 |
20100206851 | Nakatate | Aug 2010 | A1 |
20100223706 | Becker | Sep 2010 | A1 |
20100262468 | Blankenship | Oct 2010 | A1 |
20110006047 | Penrod | Jan 2011 | A1 |
20110083241 | Cole | Apr 2011 | A1 |
20110091846 | Kreindl | Apr 2011 | A1 |
20110108536 | Inada | May 2011 | A1 |
20110117527 | Conrardy | May 2011 | A1 |
20110187859 | Edelson | Aug 2011 | A1 |
20110220616 | Mehn | Sep 2011 | A1 |
20110220619 | Mehn | Sep 2011 | A1 |
20110227934 | Sharp | Sep 2011 | A1 |
20110309236 | Tian | Dec 2011 | A1 |
20120006800 | Ryan | Jan 2012 | A1 |
20120012561 | Wiryadinata | Jan 2012 | A1 |
20120074114 | Kawamoto | Mar 2012 | A1 |
20120122062 | Yang et al. | May 2012 | A1 |
20120152923 | Sickels | Jun 2012 | A1 |
20120176659 | Hsieh | Jul 2012 | A1 |
20120180180 | Steve | Jul 2012 | A1 |
20120189993 | Kindig | Jul 2012 | A1 |
20120229632 | Hoertenhuber | Sep 2012 | A1 |
20120241429 | Knoener | Sep 2012 | A1 |
20120249400 | Demonchy | Oct 2012 | A1 |
20120262601 | Choi | Oct 2012 | A1 |
20120291172 | Wills | Nov 2012 | A1 |
20120298640 | Conrardy | Nov 2012 | A1 |
20130050432 | Perez | Feb 2013 | A1 |
20130081293 | Delin | Apr 2013 | A1 |
20130112678 | Park | May 2013 | A1 |
20130163090 | Yu | Jun 2013 | A1 |
20130189657 | Wallace | Jul 2013 | A1 |
20130189658 | Peters | Jul 2013 | A1 |
20130200882 | Almalki | Aug 2013 | A1 |
20130206740 | Pfeifer | Aug 2013 | A1 |
20130206741 | Pfeifer et al. | Aug 2013 | A1 |
20130208569 | Pfeifer | Aug 2013 | A1 |
20130215281 | Hobby | Aug 2013 | A1 |
20130229485 | Rusanovskyy | Sep 2013 | A1 |
20130234935 | Griffith | Sep 2013 | A1 |
20130252214 | Choquet | Sep 2013 | A1 |
20130288211 | Patterson | Oct 2013 | A1 |
20130291271 | Becker | Nov 2013 | A1 |
20130321462 | Salter | Dec 2013 | A1 |
20130345868 | One | Dec 2013 | A1 |
20140014637 | Hunt | Jan 2014 | A1 |
20140014638 | Artelsmair | Jan 2014 | A1 |
20140017642 | Postlethwaite | Jan 2014 | A1 |
20140020147 | Anderson | Jan 2014 | A1 |
20140042135 | Daniel et al. | Feb 2014 | A1 |
20140042136 | Daniel et al. | Feb 2014 | A1 |
20140042137 | Daniel et al. | Feb 2014 | A1 |
20140059730 | Kim | Mar 2014 | A1 |
20140063055 | Osterhout | Mar 2014 | A1 |
20140065584 | Wallace | Mar 2014 | A1 |
20140092015 | | Apr 2014 | A1 |
20140097164 | Beistle | Apr 2014 | A1 |
20140134579 | Becker | May 2014 | A1 |
20140134580 | Becker | May 2014 | A1 |
20140144896 | Einav | May 2014 | A1 |
20140159995 | Adams | Jun 2014 | A1 |
20140183176 | Hutchison | Jul 2014 | A1 |
20140184496 | Gribetz | Jul 2014 | A1 |
20140185282 | Hsu | Jul 2014 | A1 |
20140205976 | Peters | Jul 2014 | A1 |
20140220522 | Peters | Aug 2014 | A1 |
20140232825 | Gotschlich | Aug 2014 | A1 |
20140234813 | Peters | Aug 2014 | A1 |
20140263224 | Becker | Sep 2014 | A1 |
20140263227 | Daniel et al. | Sep 2014 | A1 |
20140263249 | Miller | Sep 2014 | A1 |
20140272835 | Becker | Sep 2014 | A1 |
20140272836 | Becker | Sep 2014 | A1 |
20140272837 | Becker | Sep 2014 | A1 |
20140272838 | Becker | Sep 2014 | A1 |
20140315167 | Kreindl | Oct 2014 | A1 |
20140320529 | Roberts | Oct 2014 | A1 |
20140322684 | Wallace | Oct 2014 | A1 |
20140326705 | Kodama | Nov 2014 | A1 |
20140346158 | Matthews | Nov 2014 | A1 |
20140349256 | Connor | Nov 2014 | A1 |
20150009316 | Baldwin | Jan 2015 | A1 |
20150034618 | Langeder | Feb 2015 | A1 |
20150056584 | Boulware | Feb 2015 | A1 |
20150056585 | Boulware | Feb 2015 | A1 |
20150072323 | Postlethwaite | Mar 2015 | A1 |
20150125836 | Daniel | May 2015 | A1 |
20150154884 | Salsich | Jun 2015 | A1 |
20150170539 | Barrera | Jun 2015 | A1 |
20150190875 | Becker | Jul 2015 | A1 |
20150190876 | Becker | Jul 2015 | A1 |
20150190887 | Becker | Jul 2015 | A1 |
20150190888 | Becker | Jul 2015 | A1 |
20150194072 | Becker | Jul 2015 | A1 |
20150194073 | Becker | Jul 2015 | A1 |
20150209887 | DeLisio | Jul 2015 | A1 |
20150228203 | Kindig | Aug 2015 | A1 |
20150235565 | Postlethwaite | Aug 2015 | A1 |
20150248845 | Postlethwaite | Sep 2015 | A1 |
20150264992 | Happel | Sep 2015 | A1 |
20150268663 | Daniel et al. | Sep 2015 | A1 |
20150304538 | Huang | Oct 2015 | A1 |
20150320601 | Gregg | Nov 2015 | A1 |
20150325153 | Albrecht | Nov 2015 | A1 |
20150348439 | Zboray | Dec 2015 | A1 |
20150348441 | Zboray | Dec 2015 | A1 |
20150352653 | Albrecht | Dec 2015 | A1 |
20150356888 | Zboray | Dec 2015 | A1 |
20150375324 | Becker | Dec 2015 | A1 |
20150375327 | Becker | Dec 2015 | A1 |
20150379894 | Becker | Dec 2015 | A1 |
20160012750 | Wallace | Jan 2016 | A1 |
20160027215 | Burns | Jan 2016 | A1 |
20160039034 | Becker | Feb 2016 | A1 |
20160039053 | Becker | Feb 2016 | A1 |
20160049085 | Beeson | Feb 2016 | A1 |
20160093233 | Boulware | Mar 2016 | A1 |
20160114418 | Jones | Apr 2016 | A1 |
20160125592 | Becker et al. | May 2016 | A1 |
20160125593 | Becker | May 2016 | A1 |
20160125594 | Becker | May 2016 | A1 |
20160125761 | Becker | May 2016 | A1 |
20160125762 | Becker | May 2016 | A1 |
20160125763 | Becker | May 2016 | A1 |
20160125764 | Becker | May 2016 | A1 |
20160155358 | Zboray | Jun 2016 | A1 |
20160155359 | Zboray | Jun 2016 | A1 |
20160155360 | Zboray et al. | Jun 2016 | A1 |
20160155361 | Peters | Jun 2016 | A1 |
20160158884 | Hagenlocher | Jun 2016 | A1 |
20160171906 | Matthews | Jun 2016 | A1 |
20160183677 | Achillopoulos | Jun 2016 | A1 |
20160189559 | Peters | Jun 2016 | A1 |
20160203732 | Wallace | Jul 2016 | A1 |
20160203733 | Wallace | Jul 2016 | A1 |
20160203734 | Boulware | Jul 2016 | A1 |
20160203735 | Boulware | Jul 2016 | A1 |
20160236303 | Matthews | Aug 2016 | A1 |
20160260261 | Hsu | Sep 2016 | A1 |
20160267806 | Hsu | Sep 2016 | A1 |
20160284311 | Patel | Sep 2016 | A1 |
20160288236 | Becker | Oct 2016 | A1 |
20160307460 | Peters | Oct 2016 | A1 |
20160321954 | Peters | Nov 2016 | A1 |
20160343268 | Postlethwaite | Nov 2016 | A1 |
20160358503 | Batzler | Dec 2016 | A1 |
20160361774 | Daniel et al. | Dec 2016 | A9 |
20160365004 | Matthews | Dec 2016 | A1 |
20170036288 | Albrecht | Feb 2017 | A1 |
20170046974 | Becker | Feb 2017 | A1 |
20170046977 | Becker | Feb 2017 | A1 |
20170046982 | Wallace | Feb 2017 | A1 |
20170053557 | Daniel | Feb 2017 | A1 |
20170060398 | Rastogi | Mar 2017 | A1 |
20170200395 | Albrecht | Jul 2017 | A1 |
20170249858 | Boettcher | Aug 2017 | A1 |
20190340954 | Schneider | Nov 2019 | A1 |
20210012679 | Torrecilla et al. | Jan 2021 | A1 |
Number | Date | Country |
---|---|---|
2725719 | Jun 2012 | CA |
2778699 | Nov 2012 | CA |
1749940 | Mar 2006 | CN |
101067905 | Nov 2007 | CN |
101248659 | Aug 2008 | CN |
101965576 | Feb 2011 | CN |
102165504 | Aug 2011 | CN |
102625739 | Aug 2012 | CN |
202741926 | Feb 2013 | CN |
103170767 | Jun 2013 | CN |
103687687 | Mar 2014 | CN |
103996322 | Aug 2014 | CN |
204013703 | Dec 2014 | CN |
104384765 | Mar 2015 | CN |
104471629 | Mar 2015 | CN |
104599314 | May 2015 | CN |
104603860 | May 2015 | CN |
104708174 | Jun 2015 | CN |
105160645 | Dec 2015 | CN |
4313508 | Oct 1994 | DE |
0165501 | Dec 1985 | EP |
2082656 | Jul 2009 | EP |
2801966 | Nov 2014 | EP |
2863376 | Apr 2015 | EP |
3537410 | Sep 2019 | EP |
3550432 | Oct 2019 | EP |
S52126656 | Oct 1977 | JP |
2002178148 | Jun 2002 | JP |
2016203205 | Dec 2016 | JP |
2005102230 | Nov 2005 | WO |
2008101379 | Aug 2008 | WO |
2009137379 | Nov 2009 | WO |
2009146359 | Dec 2009 | WO |
2010062481 | Jun 2010 | WO |
2013122805 | Aug 2013 | WO |
2014188244 | Nov 2014 | WO |
2015121742 | Aug 2015 | WO |
2016022452 | Feb 2016 | WO |
2016044680 | Mar 2016 | WO |
2016144744 | Sep 2016 | WO |
2017120488 | Jul 2017 | WO |
2017120491 | Jul 2017 | WO |
2018080994 | May 2018 | WO |
2018147868 | Aug 2018 | WO |
Entry |
---|
Anonymous: “AugmentedArc(TM) Augmented Reality Welding System”, May 30, 2017 (May 30, 2017), pp. 1-4, XP055501429, Retrieved from the Internet: URL:https://patongroup.com/wp-content/uploads/2017/05/AugmentedArc-Brochure.pdf. |
International Search Report and Written Opinion, International Patent Application No. PCT/US2020/062301, dated Feb. 16, 2021, 15 pages. |
International Search Report and Written Opinion, International Patent Application No. PCT/US2020/062267, dated Feb. 15, 2021, 15 pages. |
www.boxford.co.us: “Augmented Reality Welding Training”, Commercial video from Boxford, Aug. 7, 2014 (Aug. 7, 2014); Retrieved from the Internet: URL:https://www.youtube.com/watch?v=mjJcebhlo_g [retrieved Dec. 23, 2020], 1 page. |
European Patent Office, Brief Communication with Oral Proceedings in Application No. 16713176.2, dated Nov. 3, 2020, 18 pages. |
International Search Report and Written Opinion, International Patent Application No. PCT/US2020/062277, dated Feb. 16, 2021, 12 pages. |
NAMeS Users Guide, N A Tech Neural Applications, Copyright 1997, 1998, 1999, 2000 Golden, CO (123 pages). |
Klinker, Gudrun, Intelligent Welding Gun, 2002. |
Native American Technologies, “ArcSentry Weld Quality Monitoring System” web page, http://web.archive.org/web/20020608124903/http://www.natech-inc.com/arcsentry1/index.html, published Jun. 8, 2002. |
Native American Technologies, “P/NA.3 Process Modelling and Optimization” web pages, http://web.archive.org/web/20020608125619/http://www.natech-inc.com/pna3/index.html, published Jun. 8, 2002. |
Fite-Georgel, Pierre; “Is there a Reality in Industrial Augmented Reality?” 10th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2011. |
ARVIKA Forum Vorstellung Projeckt PAARA, BMW Group Virtual Reality Center, Nuernberg, 2003. |
Li, Larry, Time-of-Flight Camera—An Introduction, Technical White Paper, SLOA190B—Jan. 2014, revised May 2014 (10 pages). |
Heston, Tim, Lights, camera, lean-recording manufacturing efficiency, The Fabricator, Aug. 2010 (4 pages). |
Wavelength Selective Switching, http://en.wikipedia.org/wiki/wavelength_selective_switching, Mar. 4, 2015 (5 pages). |
Cavilux HF, Laser Light for High-Speed Imaging, See What You Have Missed (2 pages). |
Cavilux Smart, Laser Light for Monitoring and High Speed Imaging, Welcome to the Invisible World (2 pages). |
Windows 10 to Get ‘Holographic’ Headset and Cortana, BBC News, www.bbc.com/news/technology-30924022, Feb. 26, 2015 (4 pages). |
Daqri Smart Helmet, The World's First Wearable Human Machine Interface, Brochure (9 pages). |
Intelligenter Schweißbrenner, Intelligent Welding Torch, IP Bewertungs AG (IPB) (12 pages). |
Intelligent Robotic Arc Sensing, Lincoln Electric, Oct. 20, 2014, http://www.lincolnelectric.com/en-us/support/process-and-theory/pages/intelligent-robotic-detail.aspx (3 pages). |
LiveArc Welding Performance Management System, A reality-based recruiting, screening and training solution, MillerWelds.com 2014 (4 pages). |
Frank Shaopeng Cheng (2008). Calibration of Robot Reference Frames for Enhanced Robot Positioning Accuracy, Robot Manipulators, Marco Ceccarelli (Ed.), ISBN: 978-953-7619-06-0, InTech, Available from: http://www.intechopen.com/books/robot_manipulators/calibration_of_robot_reference_frames_for_enhanced_r obot_positioning_accuracy (19 pages). |
Lutwak, Dr. Robert, Micro-Technology for Positioning, Navigation, and Timing Towards PNT Everywhere and Always Stanford PNT Symposium, Stanford, CA Oct. 29, 2014 (26 pages). |
Lutwak, Dr. Robert, DARPA, Microsystems Tech. Office, Micro-Technology for Positioning, Navigation, and Timing Towards PNT Everywhere and Always, Feb. 2014 (4 pages). |
Parnian, Neda et al., Integration of a Multi-Camera Vision System and Strapdown Inertial Navigation System (SDINS) with a Modified Kalman Filter, Sensors 2010, 10, 5378-5394; doi: 10.3390/s100605378 (17 pages). |
Pipe-Bug, Motorized & Manual Chain Driven Pipe Cutting Machines From Bug-0 Systems (4 pages). |
Electronic speckle pattern interferometry Wikipedia, the free encyclopedia (4 pages), [retrieved Feb. 10, 2015]. |
Rivers, et al., Position-Correcting Tools for 2D Digital Fabrication (7 pages). |
Int'l Search Report and Written Opinion, Appln No. PCT/US2016/016107, dated May 17, 2016 (11 pages). |
Handheld Welding Torch with Position Detection technology description, Sep. 21, 2011 (11 pages). |
Telops, Innovative Infrared Imaging, HDR-IR High Dynamic Range IR Camera, http://www.telops.com/en/infrared-Cameras/hdr-ir-high-dynamic-range-ir-camera, 2015 (2 pages). |
OV10642:1.3-Megapixel OmniHDRTM, http://www.ovt.com/applications/application.php?id=7 (2 pages). |
Altasens—Wide Dynamic Range (WDR), http://www.altasens.com/index.php/technology/wdr (1 page), [retrieved Jan. 5, 2016]. |
HDR Camera for Industrial and Commercial Use, Invisual E Inc., http://www.invisuale.com/hardware/hdr-camera.html (2 pages), [retrieved Jan. 5, 2016]. |
NIT, Magic Technology—White Paper, Scene Contrast Indexed Image Sensing with WDR (14 pages). |
Ni, Yang, et al., A CMOS Log Image Sensor with On-Chip FPN Compensation (4 pages). |
NIT, Application Note: Native WDRTM for your Industrial Welding Applications, www.new-imaging-technologies.com (2 pages). |
Reverchon, J.L., et al. New InGaAs SWIR Imaging Solutions from III-VLab, New Imaging Technologies (10 pages). |
Ni, Y et al. A 768x576 Logarithmic Image Sensor with Photodiode in Solar Cell Mode, New Imaging Technologies (4 pages). |
NIT, 8Care12004-02-B1 Datasheet, New Imaging Technologies (9 pages). |
NIT, NSC1005, Datasheet, Revised Nov. 2012, NSC1005 HD ready Logarithmic CMOS Sensor (28 pages). |
NIT Image Processing Pipeline for Lattice HDR-60, NIP, Pipeline, IP_NIT_NSC1005C_HDR60_V1_0 (23 pages). |
NIT, WiDySwire, New Imaging Technologies (7 pages). |
NIT Image Processing Pipeline, R&D Report N RD1220-Rev B, May 14, 2012 (10 pages). |
NIT Color Management, R&D Report N RD1113-Rev B, Apr. 11, 2011 (31 pages). |
Int'l Search Report and Written Opinion for PCT/US2015/067931 dated Jul. 26, 2016 (19 pages). |
Cameron Series: “Why Weld Cameras Need High Dynamic Range Imaging”, Apr. 10, 2013 (Apr. 10, 2013), XP055269605, Retrieved from the Internet: URL:http://blog.xiris.com/blog/bid/258666/Why-Weld-Cameras-Need-High-Dynamic-Range-Imaging [retrieved on Apr. 29, 2016] the whole document (5 pages). |
AD-081CL Digital 2CCD Progressive Scan HDR/High Frame Rate Camera User's Manual, Jul. 1, 2012 (Jul. 1, 2012) p. 27, XP055269758, Retrieved from the Internet: URL:http://www.stemmer-imaging.de/media/uploads/docmanager/53730_JAI_AD-081_CL_Manual.pdf [retrieved on Apr. 29, 2016] the whole document (55 pages). |
Anonymous: “JAI introduces unique high-dynamic-range camera”, Nov. 5, 2009 (Nov. 5, 2009), XP055269759, Retrieved from the Internet: URL:http://www.jai.com/en/newsevents/news/ad-081c1 [retrieved on Apr. 29, 2016] Typical HDR applications for the AD-081CL include inspection tasks where incident light or bright reflections are present, such as . . . welding (2 pages). |
International Search Report and Written Opinion corresponding to International Patent Application No. PCT/US2016/012164, dated May 12, 2016. |
NIT Image Processing Pipeline for Lattice HDR-60, NIP IP Pipeline, NIT_HDR60_V1_0_Pipeline_Sample (48 pages). |
Patent Cooperation Treaty, Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, in PCT/US2016/020865, dated May 11, 2016, 12 pages. |
Choi et al., Simulation of Dynamic Behavior in a GMAW System, Welding Research Supplement, Oct. 2001, 239-s thru 245-s (7 pages). |
Aiteanu et al., Generation and Rendering of a Virtual Welding Seam in an Augmented Reality Training Environment, Proceedings of the Sixth IASTED International Conference Visualization, Imaging, and Image Processing, Aug. 28-30, 2006, Palma de Mallorca, Spain ISBN Hardcopy: 0-88986-598-1 /CD: 0-88986-600-7 (8 pages). |
High Dynamic Range (HDR) Video Image Processing For Digital Glass, Augmented Reality in Quantigraphic Lightspace and Mediated Reality with Remote Expert, Raymond Lo, Sep. 12, 2012, https://www.youtube.com/Watch?v=ygcm0AQXX9k, YouTube screenshot submitted in lieu of the video itself. |
ASH VR1-DIY Homebrew PC Virtual Reality Head Mounted Display HMD,' alrons1972, https://www.youtube.com/Watch?v=VOQboDZqguU, Mar. 3, 2013, YouTube screenshot submitted in lieu of the video itself. |
Soldamatic Augmented Training, Augmented Reality World, May 30, 2013, https://www.youtube.com/watch? V=Mn0O52Ow_qY, YouTube screenshot submitted in lieu of the video itself. |
Optical Head-Mounted Display, Wikipedia, Jun. 2, 2016, https://en.wikipedia.org/wiki/Optical_head-mounted_display 14 pages. |
About Us.' Weldobot.com. <http://weldobot.com/?page_id=6> Accessed Jun. 2, 2016. 1 page. |
International Search Report and Written Opinion corresponding to International Patent Application No. PCT/US2016/020861, dated May 23, 2016. |
Hillers, Bernd, IAT Institut für Automatisierungstechnik, doctoral thesis Selective Darkening Filter and Welding Arc Observation for the Manual Welding Process, Mar. 15, 2012, 152 pgs. |
“High Dynamic Range (HDR) Video Image Processing For Digital Glass, Wearable Cybernetic Eye Tap Helmet Prototype,” Raymond Lo, https://www.youtube.com/watch?v=gtTdiqDqHc8, Sep. 12, 2012, YouTube screenshot Submitted in lieu of the video itself. |
Int'l Search Report and Written Opinion for PCT/US2016/035473 dated Aug. 17, 2016 (15 pages). |
G. Melton et al: “Laser diode based vision system for viewing arc welding (May 2009)”, EUROJOIN 7, May 21, 2009 (May 21, 2009), XP055293872, Venice Lido, Italy, May 21-22, 2009. |
Sergi Foix et al: “Exploitation of Time-of-Flight (ToF) Cameras IRI Technical Report”, Oct. 1, 2007 (Oct. 1, 2007), pp. 1-22, XP055294087, Retrieved from the Internet: URL:http://digital.csic.es/bitstream/10261/30066/1 Itime-of-flight.pdf [retrieved on Aug. 8, 2016]. |
Int'l Search Report and Written Opinion Application No. PCT/US2017/012558 dated Mar. 23, 2017 (12 pages). |
Klinker, Gudrun, Augmented Reality im praktischen Einsatz, Oct. 10, 2012 (40 pages). |
Sandor, C., Klinker, G., A rapid prototyping software infrastructure for user interfaces in ubiquitous augmented reality, Pers Ubiquit Comput (2005) 9: 169-185. |
Invertig.Pro Digital. Sep. 16, 2013. |
Rehm Welding Technology, Invertig.Pro Digital, Sep. 16, 2013. |
Rehm Welding Technology, Product Range, Aug. 2013. |
Tig Welder How to Play, www.tradesgamer.com, Nov. 17, 2011. |
Hillers, Bernd & Aiteanu, D & Tschirner, P & Park, M & Graeser, Axel & Balazs, B & Schmidt, L. (2004). TEREBES: Welding helmet with AR capabilities. |
Aiteanu, Dorin, “Virtual and Augmented Reality Supervisor for a New Welding Helmet” Nov. 15, 2005, pp. 1-150. |
Mnich, Chris, et al., “In situ weld pool measurement using stereovision,” Japan-USA Symposium on Flexible Automation, Denver, CO 2004. |
Communication from European Patent Office Appln No. 18 150 120.6 dated Jul. 4, 2018 (9 pgs). |
Int'l Search Report and Written Opinion for PCT/US2018/028261 dated Aug. 6, 2018 (17 pgs). |
Larkin et al., “3D Mapping using a ToF Camera for Self Programming an Industrial Robot”, Jul. 2013, IEEE, 2013 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), pp. 494,499. |
Bombardier et al: “Dual Digimig/Pulse Feeder and SVI-450i Power Supply”, Feb. 1999 (Feb. 1999), XP055480578, Retrieved from the Internet: URL:http://www.esabna.com/eu/literature/arc%20equipment/accessories/dual%20digimig_pulse_fdr%20&%20svi-450i_15-565.pdf [retrieved on Jun. 1, 2018]. |
European Office Action Appln No. 16713176.2 dated Oct. 17, 2018 (7 pgs). |
Wang et al. “Stereo vision-based depth of field rendering on a mobile device”. Journal of Electronic Imaging 23(2), 023009 (Mar.-Apr. 2014) (Year: 2014). |
Mrovlje, et al. “Distance measuring based on stereoscopic pictures”. 9th International PhD Workshop on Systems and Control: Young Generation Viewpoint 1.-3. Oct. 2008, Izola, Slovenia (Year: 2008). |
Petrovai et al., “A stereovision based approach for detecting and tracking lane and forward obstacles on mobile devices” 2015 IEEE Intelligent Vehicles Symposium (IV) Jun. 28-Jul. 1, 2015 COEX, Seoul, Korea (Year: 2015). |
International Search Report and Written Opinion, International Patent Application No. PCT/US2020/018869, dated May 4, 2020, 16 pages. |
Wikipedia, “Google Cardboard,” Jan. 30, 2019, retrieved on Apr. 17, 2020, 8 pages. |
International Search Report and Written Opinion, International Patent Application No. PCT/US2020/018866, dated Apr. 24, 2020, 15 pages. |
Anonymous: “Showcasing latest international developments in welding training systems”, Australasian Welding Journal, vol. 59, Third Quarter, 2014, Jan. 1, 2014 (Jan. 1, 2014), pp. 1-5, XP055742728, Retrieved from the internet: URL:https://www.slv-halle.de/fileadmin/user_upload/Halle/Pressemitteilungen/Welding-training-IIW-C-XIV.pdf [retrieved on Oct. 22, 2020]. |
Number | Date | Country | |
---|---|---|---|
20210158717 A1 | May 2021 | US |
Number | Date | Country | |
---|---|---|---|
62940111 | Nov 2019 | US |