The present disclosure generally relates to material testing systems and, more particularly, to material testing systems with test method verifications.
Material testing machines are used to test the properties (e.g., tensile/compressive strength) of various material specimens. The particular method of testing (a.k.a. test method) may vary from material specimen to material specimen. A computing device in communication with the material testing machine may guide a user through a workflow to setup, execute, and analyze the results of each test method.
Limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present disclosure as set forth in the remainder of the present application with reference to the drawings.
The present disclosure is directed to material testing systems with test method verifications, substantially as illustrated by and/or described in connection with at least one of the figures, and as set forth more completely in the claims.
These and other advantages, aspects and novel features of the present disclosure, as well as details of an illustrated example thereof, will be more fully understood from the following description and drawings.
The figures are not necessarily to scale. Where appropriate, the same or similar reference numerals are used in the figures to refer to similar or identical elements. For example, reference numerals utilizing lettering (e.g., grip 124a, grip 124b) refer to instances of the same reference numeral that does not have the lettering (e.g., grips 124).
The present disclosure relates to material testing systems that verify that two or more test methods are similar or dissimilar, and/or that a software application enabling setup, execution, and/or analysis of the test method(s) is installed and/or operating correctly. The systems are further configured to provide supporting documentation of the similarity/dissimilarity and/or correct installation/operation. Moreover, the systems are configured to reference data recorded during setup, execution, and/or result analysis of the test methods as evidence of the correct installation/operation and/or similarity/dissimilarity.
Such similarity/dissimilarity verification and/or documentation may be helpful to an entity that wants to understand the impact that recent (e.g., intentional and/or unintentional) changes and/or updates (e.g., to their systems, processes, specimens, testing machines, test methods, etc.) may have had on their testing methods. The similarity/dissimilarity verification and/or documentation may further be helpful to an entity wishing to confirm that their testing methods remain the same (or at least relatively similar in important areas), despite any recent changes that may have occurred. Finally, the correct installation and/or operation verification may be helpful for entities seeking to perform federally and/or statutorily required system validations, such as may be required in the medical device industry.
Some examples of the present disclosure relate to a material testing system, comprising: a material testing machine, comprising a test sensor, a test actuator, and a test controller configured to control the test actuator; a user interface; and a computing device configured for communication with the user interface and the material testing machine, the computing device comprising processing circuitry configured to: capture a recording of an output of the user interface during setup, execution, or result analysis of a test method that uses the material testing machine, and generate a document discussing one or more aspects of the setup, execution, or result analysis of the test method using one or more portions of the recording.
In some examples, the user interface comprises a display screen and a speaker, the output of the user interface comprising one or more graphical outputs of the display screen or one or more audio outputs of the speaker. In some examples, the test method comprises a sensor test method for testing a sensor of the material testing machine. In some examples, the document comprises an installation qualification (IQ) document or an operational qualification (OQ) document.
In some examples, the document includes or references the one or more portions of the recording. In some examples, the recording is captured via a camera or a screen recorder. In some examples, the recording comprises a first recording, the document comprises a first document, and the processing circuitry is further configured to: perform a comparison of the first recording and a second recording, identify one or more differences between the first recording and the second recording, and generate a second document discussing the one or more differences between the first recording and the second recording.
In some examples, the output comprises a first output, the user interface comprises a first user interface, the computing device comprises a first computing device, the material testing machine comprises a first material testing machine, and the test method comprises a first test method, the second recording being of a second output of the first user interface or of a second user interface of a second computing device that is in communication with a second material testing machine, the second output being output during setup, execution, or analysis of the first test method, or of a second test method that uses the second material testing machine. In some examples, the one or more differences between the first recording and the second recording comprise one or more differences between setup, execution, or analysis of the first test method and setup, execution, or analysis of the second test method. In some examples, the one or more differences between the first recording and the second recording comprise one or more differences between one or more first parameters of the first test method evident in the first recording and one or more second parameters of the second test method evident in the second recording.
Some examples of the present disclosure relate to a material testing system, comprising: a first material testing machine, comprising a test sensor, a test actuator, and a test controller configured to control the test actuator; and a computing device, the computing device comprising processing circuitry configured to: perform a comparison between a first test method and a second test method, the first test method being conducted on a first specimen using the first material testing machine, and the second test method being conducted on the first specimen or a second specimen using the first material testing machine or a second material testing machine, identify one or more differences between the first test method and the second test method, and generate a document discussing the one or more differences between the first test method and the second test method.
In some examples, the first test method is defined by one or more first parameters, the second test method is defined by one or more second parameters, and the one or more differences between the first test method and the second test method comprise one or more differences between the one or more first parameters and the one or more second parameters. In some examples, the computing device is in communication with the first material testing machine, and the processing circuitry is further configured to setup the first test method or the second test method, execute the first test method or the second test method, or analyze test results of the first test method or the second test method. In some examples, the system further comprises a first user interface, the processing circuitry being further configured to: capture a first recording of a first output of the first user interface or a second user interface during setup of the first test method, execution of the first test method, or analysis of first test results of the first test method, and capture a second recording of a second output of the first user interface or the second user interface during setup of the second test method, execution of the second test method, or analysis of second test results of the second test method, wherein performing the comparison between the first test method and the second test method comprises performing the comparison between the first recording and the second recording.
In some examples, the one or more differences between the first recording and the second recording comprise one or more differences between one or more first parameters of the first test method evident in the first recording and one or more second parameters of the second test method evident in the second recording. In some examples, the first recording or the second recording is captured via a screen recorder or a camera. In some examples, the first user interface or the second user interface comprises a display screen or a speaker, and the first output or the second output of the first user interface or the second user interface comprises one or more graphical outputs of the display screen or one or more audio outputs of the speaker.
In some examples, the one or more differences between the first recording and the second recording comprise one or more differences between setup, execution, or analysis of the first test method and setup, execution, or analysis of the second test method. In some examples, the document comprises a first document, and the processing circuitry is further configured to generate a second document discussing one or more aspects of the setup, execution, or result analysis of the first test method or the second test method using one or more portions of the first recording or the second recording. In some examples, the second document comprises an installation qualification (IQ) document or an operational qualification (OQ) document.
In the example of
In the example of
In the example of
In the example of
In the example of
In some examples, the material testing machine 102 may be configured for static mechanical testing. For example, the material testing machine 102 may be configured for compression strength testing, tension strength testing, shear strength testing, bend strength testing, deflection strength testing, tearing strength testing, peel strength testing (e.g., strength of an adhesive bond), torsional strength testing, and/or any other compressive and/or tensile testing. Additionally or alternatively, the material testing machine 102 may be configured to perform dynamic testing.
In some examples, the material testing machine 102 is configured to interface with the computing system 200 to conduct a test method. For example, the computing system 200 may communicate with a controller 214 (see, e.g.,
In the example of
In some examples, the camera(s) 150 may capture one or more recordings of one or more outputs of the computing system 200. In some examples, each camera 150 may include one or more communication devices configured for (e.g., wired and/or wireless) communication with the computing system 200, such that the camera(s) 150 may transmit the recording(s) to the computing system 200 via the communication device(s).
The drive shafts 212 are further shown connected to the movable crosshead 120, such that movement of the drive shaft(s) 212 via the actuator(s) 210 will result in movement of the movable crosshead 120. While termed drive shafts 212 in the example of
The example material testing machine 102 further includes a controller 214 in electrical communication with the actuator(s) 210. In some examples, the controller 214 may include processing circuitry and/or memory circuitry. In some examples, the controller 214 may be configured to control the material testing machine 102 based on one or more commands, control inputs, and/or test parameters. In some examples, the controller 214 may be configured to translate commands, control inputs, and/or test parameters (e.g., received from the computing system 200) to appropriate (e.g., electrical) signals that may be delivered to the actuator(s) 210, thereby controlling operation of the material testing machine 102 (e.g., via the actuator(s) 210). For example, the controller 214 may provide one or more signal(s) commanding that more or less electrical power be provided to the actuator(s) 210, to thereby increase or decrease applied force.
In the example of
The example controller 214 is further in electrical communication with a control panel 216 of the material testing machine 102. In some examples, the control panel 216 may include one or more input devices (e.g., buttons, switches, slides, knobs, microphones, dials, and/or other electromechanical input devices). In some examples, the control panel 216 may be used by an operator to directly control the material testing machine 102. In some examples, the controller 214 may be configured to translate commands, control inputs, and/or test parameters received via the control panel 216 to appropriate (e.g., electrical) signals that may be delivered to the actuator(s) 210 and/or grip(s) 124 to control the material testing machine 102.
The controller 214 is also shown in electrical communication with a network interface 218b of the material testing machine 102. In some examples, the network interface 218b includes hardware, firmware, and/or software to connect the material testing machine 102 to a complementary network interface 218a of the computing system 200. In some examples, the controller 214 may receive information (e.g., commands) from the computing system 200 through the network interfaces 218, and/or send information (e.g., measurement data from sensor(s) 126) to the computing system 200 through the network interfaces 218.
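Purely by way of a non-limiting illustration, a command/measurement exchange over the network interfaces 218 might be sketched as follows in Python. The JSON-over-TCP wire format, host/port values, and command names are assumptions made for illustration only; the disclosure does not specify a particular protocol.

```python
# Hypothetical illustration of a command/measurement exchange between the
# computing system 200 and the controller 214 over network interfaces 218.
# JSON over TCP is assumed purely for illustration.
import json
import socket

def send_command(host, port, command):
    """Send one command to the test controller and return its reply as a dict."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall((json.dumps(command) + "\n").encode("utf-8"))
        # Read a single newline-terminated JSON reply (e.g., sensor data).
        reply = b""
        while not reply.endswith(b"\n"):
            chunk = sock.recv(4096)
            if not chunk:
                break
            reply += chunk
    return json.loads(reply.decode("utf-8"))

# Example usage (hypothetical address and commands):
# send_command("192.168.0.50", 5025, {"cmd": "move_crosshead", "mm": 5.0})
# send_command("192.168.0.50", 5025, {"cmd": "read_sensor", "id": "126a"})
```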
In the example of
In some examples, the one or more input devices 206 may comprise one or more touch screens, mice, keyboards, buttons, switches, slides, knobs, microphones, dials, and/or other input devices 206. In some examples, the one or more output devices 208 may comprise one or more display/touch screens, speakers, lights, haptic devices, and/or other output devices 208. In some examples, the output device(s) 208 may comprise one or more printers. In some examples, the output device(s) 208 (e.g., a display screen) of the UI 204 may output one or more representations of a material testing workflow 300 configured to guide a user through setup, execution, and/or analysis of a test method conducted by the material testing machine 102. In some examples, the output device(s) 208 (e.g., a display screen) of the UI 204 may output one or more representations of a test method creation process 400 configured to assist a user in easily and/or quickly generating a test method and/or material testing workflow 300.
In the example of
As shown, one network interface 218a is in communication with the network interface 218b of the material testing machine 102 through cable 106. As shown, the computing device 202 further includes a network interface 218a in communication with the camera(s) 150, and another network interface 218a in communication with a network 220 (e.g., the Internet). In the example of
In the example of
Though shown as being connected to the remote computing system 250 via the network 220 in the example of
In the example of
The example computing device 202 further includes memory circuitry 226 connected to the common electrical bus 220. As shown, the memory circuitry 226 includes several parameters 232 (and/or parameter values), one or more data repositories 234, a screen recorder 299, a material testing workflow 300, a test method creation process 400, a sensor testing process 700, and a test method verification process 900.
In some examples, the data repositories 234 may comprise several different data structures (e.g., databases, look-up tables, etc.). The data repositories 234 may store both historical data (e.g., prior workflows 300, test methods, parameters 232, test results, reports, prompts 602, user inputs, etc.) and current data (e.g., mappings between user inputs and parameters 232). In some examples, the historical data may be associated with timestamps and/or other data (e.g., with similar timestamps). Though shown as part of the memory circuitry 226 of the computing device 202 in the example of
While shown as part of the memory circuitry 226 in the example of
In some examples, the processing circuitry 224 is configured to execute the machine readable instructions of the screen recorder 299 to capture and/or record the output(s) of the output device(s) 208 (e.g., display screen(s)) of the UI 204. In some examples, this output may be used to assist with the test method verification process 900 and/or generate one or more documents via the test method verification process 900. While termed a screen recorder 299, a person of ordinary skill will understand that the data captured and/or recorded by the screen recorder 299 may extend beyond visual data shown by a (e.g., display screen) output device 208.
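As a non-limiting sketch, a screen recorder such as the screen recorder 299 might capture timestamped frames of a display screen as shown below in Python. Pillow's ImageGrab module is assumed to be available on the platform; the actual capture mechanism is not specified by the disclosure.

```python
# Minimal sketch of a screen recorder capturing timestamped frames of the
# UI 204 output for later reference as evidence.
import time
from datetime import datetime
from pathlib import Path

from PIL import ImageGrab  # assumes Pillow is installed

def record_screen(out_dir, duration_s, interval_s=1.0):
    """Grab one screenshot per interval and return the saved frame paths."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    frames = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S_%f")
        path = out / f"frame_{stamp}.png"
        ImageGrab.grab().save(path)  # capture the full display as an image
        frames.append(path)
        time.sleep(interval_s)
    return frames

# frames = record_screen("recordings/test_method_A", duration_s=30.0)
```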
In some examples, the processing circuitry 224 is configured to execute the machine readable instructions of the material testing workflow 300 to guide a user through setup, execution, and/or analysis of a test method of the material testing machine 102. In some examples, the setup (and/or creation) of the test method may involve setting values for several (e.g., test, sample, analysis, etc.) parameters 232 that define the test method and/or analysis of the tests results. In some examples, the UI 204 is configured to show (and/or otherwise output) one or more workflow screens 500 that show and/or allow a user to manually set parameters 232 during execution of the material testing workflow 300.
In some examples, during the sensor testing process 700, one or more sensor test methods for testing the test sensor(s) 126 of the material testing machine 102 are setup and/or executed. In some examples, the test method(s) may be executed manually (e.g., via the material testing workflow 300). In some examples, the test method(s) may be executed automatically, such as, for example, in response to expiration of a particular time limit and/or an indication that one or more of the sensors 126 are malfunctioning. In some examples, further test methods (e.g., for testing specimens 128) may be prohibited from executing if the test results of the sensor test method(s) indicate that one or more of the sensors 126 are malfunctioning. In some examples, the prohibition may continue until the test results of the sensor test method(s) indicate that all of the test sensors 126 are functioning correctly.
In some examples, during the test method creation process 400, the computing device 202 is configured to receive input from a user (e.g., via the UI 204), and use the input to generate and/or set the (e.g., test, sample, analysis, etc.) parameters 232 that define, setup, and/or create the test method. In some examples, the number of prompted user inputs may be far fewer than the number of parameters 232, and/or a single user input may serve as the basis for setting several different parameters 232. Various analyses may also be used to optimize the parameters 232 to comply with certain standards, save time, and/or reduce the potential for error. As the number of user input prompts is far fewer than the number of parameters 232, the process of configuring the test method (and/or test result analysis) may be greatly shortened and/or simplified as compared to manual configuration using the material testing workflow 300.
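For illustration only, the expansion of a small number of prompted answers into a larger set of parameters 232 might be sketched as follows; the parameter names and default values shown are assumptions, not values taken from the disclosure.

```python
# Hypothetical sketch: a few prompted answers drive many parameters 232.
def derive_parameters(answers):
    """Map a small set of user answers onto many test/sample parameters."""
    params = {
        "specimen_material": answers["material"],
        "test_mode": answers["mode"],  # "bent", "stretched", or "crushed"
    }
    if answers["mode"] == "stretched":
        # One answer drives several tension-test defaults at once.
        params.update({
            "crosshead_direction": "up",
            "crosshead_speed_mm_per_min": 5.0,
            "grip_type": "wedge",
            "end_condition": "specimen_break",
        })
    elif answers["mode"] == "crushed":
        params.update({
            "crosshead_direction": "down",
            "crosshead_speed_mm_per_min": 1.0,
            "grip_type": "platen",
            "end_condition": "max_force_reached",
        })
    # ("bent" handling omitted for brevity in this sketch)
    return params

# Two answers yield half a dozen parameters:
# derive_parameters({"material": "polyethylene", "mode": "stretched"})
```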
In some examples, during the test method verification process 900, the computing device 202 may verify correct operation of one or more test methods created, setup, executed, and/or analyzed via the material testing workflow 300, test method creation process 400, and/or sensor testing process 700. In some examples, during the test method verification process 900, the computing device 202 may verify correct installation of a software application comprising the material testing workflow 300, test method creation process 400, and/or sensor testing process 700. In some examples, during the test method verification process 900, the computing device 202 may seek to verify that two test methods are identical, and/or identify similarities and/or differences between the test methods. In some examples, the test method verification process 900 may use one or more recordings of the output device(s) 208 of the UI 204 during setup, execution, and/or analysis of the test method to assist with the verifications.
In some examples, before using and/or progressing through the material testing workflow 300, a user may be required to login and/or be authenticated. In some examples, user authentication may occur via biometric (e.g., fingerprint, retinal, face, etc.), close proximity communication (e.g., RFID, NFC, Bluetooth, barcode, etc.), cloud based (e.g., OAuth, Google, single sign-on), passkey, and/or multi-factor authentication.
While the material testing workflow 300 is sometimes described below as conducting certain actions for the sake of understanding, it should be understood that one or more of the above described components of the material testing system 100 (e.g., the processing circuitry 224, UI 204, etc.) may undertake the actions on behalf (and/or according to instructions) of the material testing workflow 300.
In some examples, the material testing workflow 300 progresses through the workflow states to guide a user through setup, execution, and analysis of a test method of the material testing machine 102. In some examples, a particular workflow state may be associated with an output of the UI 204 (e.g., a workflow screen 500, see
In the example of
In the example of
For example, the test parameters 232 may include a date the test will be run, identification information of the test (e.g., number, name, type, description, etc.), target start/end positions of grip(s) 124, target start/end positions of the crosshead 120, target distance/direction moved by crosshead 120, target speed of movement of crosshead 120, expected result(s) of test (e.g., position/type of break, distance moved before break, force applied before break, post-test characteristics of sample, etc.), time(s) when sensor(s) 126 should take measurement(s), and/or other information relevant to a particular test method. As another example, specimen parameters 232 may include a date the specimen 128 was manufactured/shipped/packaged, identification information of the specimen 128 (e.g., number, name, description, etc.), pre-test characteristics of the specimen 128 (e.g., measurements/dimensions, material type, weight, color, shape, modulus, ultimate tensile strength, etc.), and/or other information relevant to a particular specimen.
The workflow screens 500a-c have several input fields 502 where (e.g., specimen/test parameter 232) information may be shown and/or entered. Indeed, several of the input fields 502 already have information entered therein. In some examples, each input field 502 may correspond to a sample parameter 232 and/or test parameter 232, with the information present therein being used to set the sample parameter 232 and/or test parameter 232.
In the example of
In the example of
For example, the information prompted for (and/or collected) during the post-test specimen analysis setup state(s) 308 may include post-test characteristics of the specimen 128 (e.g., specimen parameters 232), actual parameters of the test (e.g., test parameters 232), actual results of the test (e.g., test parameters 232), test result report format(s) (e.g., report parameters 232), and/or other information relevant to an analysis of the test method and/or test sample. As another example, the information prompted for (and/or collected) during the post-test specimen analysis setup state(s) 308 may include calculation analysis parameters 232, such as, for example, one or more algorithms that may be used to evaluate results of the test method (and/or produce additional test results), and/or one or more thresholds and/or threshold ranges (e.g., by which test results may be adjudged to determine whether the specimen 128 passed or failed the test). While shown as a separate state in
In the example of
For example, the processing circuitry 224 may estimate a strength, reliability, quality, grade, resiliency, and/or other characteristic of the specimen 128 using the test results and/or the sample, specimen, test, and/or calculation analysis parameters 232. As another example, the processing circuitry 224 may hypothesize about the structure and/or composition of the specimen 128 using the test results and/or the sample, specimen, test, and/or calculation parameters 232. As another example, the processing circuitry may hypothesize about the future performance of the specimen 128 using the test results and/or the sample, specimen, test, and/or calculation analysis parameters 232.
In the example of
In the example of
The example workflow screen 600b in
In the example of
In some examples, the identification of suggested inputs may involve an analysis of prior inputs (e.g., stored in the data repositories 234) to identify the most recent and/or most used input(s) for a particular (e.g., empty) input field 502. In some examples, the test method creation process 400 may additionally (or alternatively) consider other factors, such as, for example, the logged in user, and/or the material testing machine 102 being used (e.g., to determine the most recent/used input for a particular user and/or machine 102). For example, a particular material testing machine 102 (and/or fixture 122, grip 124, sensor 126, etc.) may always be used for compression, in which case the test method creation process 400 may suggest “crushed” in answer to the question “will the specimen be bent, stretched, or crushed?” In some examples, the test method creation process 400 may additionally consider other prior inputs (e.g., to determine the most recent/used input when a particular pattern of inputs is present).
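By way of a non-limiting example, a most-used/most-recent suggestion for an empty input field 502 might be computed from prior entries as sketched below. The flat list-of-dicts history format is an assumption for illustration; the data repositories 234 could store prior inputs in any suitable structure.

```python
# Sketch: suggest a value for an empty field from prior entries, preferring
# the most-used value and breaking ties in favor of the most recent entry.
from collections import Counter

def suggest_input(history, field, user=None, machine=None):
    """Return the most frequently used prior value for `field`, or None."""
    relevant = [
        h for h in history
        if h["field"] == field
        and (user is None or h["user"] == user)
        and (machine is None or h["machine"] == machine)
    ]
    if not relevant:
        return None
    # Sort newest-first so that, for equal counts, the most recent value wins
    # (Counter.most_common preserves first-seen order for ties).
    relevant.sort(key=lambda h: h["timestamp"], reverse=True)
    counts = Counter(h["value"] for h in relevant)
    return counts.most_common(1)[0][0]

# suggest_input(history, field="test_mode", machine="machine_102a")  # e.g., "crushed"
```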
In some examples, a more complex analysis may be used, where the test method creation process 400 attempts to identify, using prior input data (e.g., stored in the repositories 234), one or more input patterns that match the current pattern of inputs. In some examples, one or more machine learning algorithms (e.g., cluster analysis, K-nearest neighbor, etc.) may be used for the pattern matching. This pattern matching may then be used to identify an input to suggest.
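As one possible, non-limiting realization of this pattern matching, the sketch below scores prior input patterns by how many field/value pairs they share with the current inputs and returns the closest match. The overlap metric is an assumption; a K-nearest-neighbor or cluster-analysis implementation could be substituted.

```python
# Sketch: simple nearest-neighbor lookup over prior input patterns.
def most_similar_pattern(current, prior_patterns):
    """Return the prior input pattern sharing the most field/value pairs with `current`."""
    def overlap(prior):
        return sum(1 for k, v in current.items() if prior.get(k) == v)
    if not prior_patterns:
        return None
    return max(prior_patterns, key=overlap)

# best = most_similar_pattern(
#     {"material": "steel", "test_mode": "stretched"},
#     prior_patterns=repository_of_prior_inputs,  # hypothetical list of dicts
# )
# if best is not None:
#     suggestion = best.get("grip_type")  # suggest the value used in the closest match
```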
In some examples, the suggested input identified at block 406 may be output to the user via the UI 204. In some examples, the test method creation process 400 may give an explanation as to why the suggestion is being made. In some examples, the suggested input may be implemented and/or used as the actual input (e.g., in response to an input received via the UI 204 accepting the suggestion). While block 406 is described as occurring after user input is completed at block 402, in some examples, the suggestion(s) of block 406 may occur while a user is entering input, so that the user can see suggested inputs prior to actually choosing an input themselves.
In the example of
If appropriate, after block 408, the test method creation process 400 identifies one or more parameters 232 (and/or parameter values) to be suggested to the user at block 410. In some examples, the parameter(s) 232 (and/or parameter value(s)) may be suggested if one or more parameters 232 (and/or parameter values) remain unidentified and/or unset after block 408. In some examples, a parameter 232 (and/or parameter value) may be suggested even if a parameter 232 (and/or parameter value) was set at block 408, if a suggested change in the parameter 232 (and/or parameter value) would save time, reduce the potential for error, and/or move the parameter 232 (and/or test method) into compliance with a particular standard.
In some examples, the test method creation process 400 identifies the suggested parameter(s) 232 based on one or more inputs received at blocks 402 and/or 406. For example, one of the user inputs at block 402 may identify a particular standard, and the test method creation process 400 may evaluate the parameters 232 in view of the standard and identify one or more parameters 232 that should be adjusted to meet the standard (e.g., in view of the material testing machine 102, other parameters 232, and/or other inputs).
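As a non-limiting sketch, a check of the current parameters 232 against the limits of an identified standard might look like the following; the limit values shown are placeholders only and do not reproduce the requirements of any actual standard.

```python
# Illustrative check of parameters 232 against a table of standard limits,
# suggesting clamped values for parameters outside the allowed range.
HYPOTHETICAL_STANDARD_LIMITS = {
    "crosshead_speed_mm_per_min": (0.5, 500.0),  # (min, max) — placeholder values
    "sampling_rate_hz": (10.0, 1000.0),
}

def suggest_standard_adjustments(params, limits=HYPOTHETICAL_STANDARD_LIMITS):
    """Return {parameter: clamped_value} for parameters outside the allowed range."""
    suggestions = {}
    for name, (low, high) in limits.items():
        if name in params and not (low <= params[name] <= high):
            suggestions[name] = min(max(params[name], low), high)
    return suggestions

# suggest_standard_adjustments({"crosshead_speed_mm_per_min": 900.0})
# -> {"crosshead_speed_mm_per_min": 500.0}
```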
In some examples, the test method creation process 400 identifies the suggested parameter(s) 232 based on one or more other present and/or prior parameters 232. For example, the test method creation process 400 may evaluate prior parameters 232 in view of a current specimen parameter 232 (e.g., indicating that the specimen 128 is a particular material), and identify one or more particular test parameters 232 as being most often/recently used in the past in conjunction with that specimen parameter 232.
In some examples, the test method creation process 400 may use a more complex analysis, where the test method creation process 400 attempts to identify, using prior parameters 232 and/or inputs (e.g., stored in the repositories 234), one or more patterns that match the current pattern of parameters 232 and/or inputs. In some examples, one or more machine learning algorithms (e.g., cluster analysis, K-nearest neighbor, etc.) may be used for the pattern matching. In some examples, the test method creation process 400 may additionally (or alternatively) identify the suggested parameter(s) 232 based on prior test results (e.g., stored in the repositories 234).
In some examples, the suggested parameter(s) 232 identified at block 410 may be output to the user via the UI 204. In some examples, the test method creation process 400 may give an explanation as to why the suggestion is being made. In some examples, the suggested parameter(s) 232 may be used as the actual parameter(s) 232 (e.g., in response to an input received via the UI 204 accepting the suggestion).
In the example of
After block 412, the test method creation process 400 generates one or more documents based on the received inputs, the current parameters 232 (and/or parameter values), and/or the generated material testing workflow 300 (and/or test method) at block 414. For example, the document(s) may provide instructions on how to use the material testing workflow 300, and/or how to setup, execute, and/or analyze results of the test method. Such a document may be useful for an operator tasked with using the generated material testing workflow 300, and/or setting up, executing, and/or analyzing results of the test method.
After block 414, the test method creation process 400 generates and/or provides a tutorial (e.g., via the UI 204) at block 416. In some examples, the tutorial may instruct a user how to setup the material testing machine 102, operate the material testing machine 102, progress through the material testing workflow 300, and/or setup, execute, and/or analyze the results of the test method (e.g., via the material testing workflow 300). In some examples, the tutorial may provide access to a human or automated helper to provide guidance on how to setup the material testing machine 102, operate the material testing machine 102, progress through the material testing workflow 300, and/or setup, execute, and/or analyze the results of the test method.
In the example of
In some examples, block 418 may include executing the test method defined by the identified parameters 232 of the workflow 300 using (and/or via communication with) the material testing machine 102. For example, a user may use the test method execution state(s) 306 of the workflow 300 to execute the test method using the set parameters 232. Thereafter, at block 420, the results of the test method are compiled and/or output via one or more reports (e.g., via post-test states 308, 310, and/or 316 of the workflow 300).
In some examples, the report(s) may be provided over the network 220 (e.g., internet) to a remote user using the remote interface 230. In such examples, the report may take the form of a website, web platform, or other online interface.
While blocks 418 and 420 are shown as being part of the test method creation process 400 for the purposes of completeness and understanding, persons of ordinary skill will understand that blocks 418 and 420 may be undertaken outside of the test method creation process 400. As shown, the test method creation process 400 ends after block 420.
In some examples, before using and/or progressing through the sensor testing process 700, a user may be required to login and/or be authenticated, as discussed above. While the sensor testing process 700 is sometimes described below as conducting certain actions for the sake of understanding, it should be understood that one or more of the above described components of the material testing system 100 (e.g., the processing circuitry 224, UI 204, etc.) may undertake the actions on behalf (and/or according to instructions) of the sensor testing process 700.
In the example of
In some examples, each test sensor 126 of the material testing machine 102 is associated with its own unique test method. In some examples, a particular test sensor 126 may be associated with a particular test method by specifying (e.g., via the test parameter(s) 232) that the particular test sensor 126 is the one being used for the testing.
In some examples, the specimen 128 in a test method for a test sensor 126 may be an item of a known weight (e.g., specified as part of the sample/specimen parameter(s) 232). Thereby, the sensor testing process 700 may determine whether or not a test sensor 126 is functioning correctly by comparing the weight (and/or force) measurements of the test sensor 126 during the test method to the known weight (and/or force). In some examples, the test sensor 126 may be determined to have passed the test if the measured weight is the same as, or within some tolerance of, the known weight. In some examples, the tolerance may be specified as part of the analysis parameter(s) 232.
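Purely by way of example, the pass/fail comparison for a sensor test method might be sketched as follows. A relative tolerance is assumed here; the tolerance could equally be expressed in absolute terms via the analysis parameter(s) 232.

```python
# Sketch of the pass/fail check for a sensor test method: compare the sensor's
# measurements of a known-weight specimen against the known value.
def sensor_passes(measurements, known_weight, rel_tolerance=0.01):
    """True if every measurement is within rel_tolerance of the known weight."""
    allowed = abs(known_weight) * rel_tolerance
    return all(abs(m - known_weight) <= allowed for m in measurements)

# sensor_passes([9.99, 10.02, 10.01], known_weight=10.0, rel_tolerance=0.01)  # True
```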
In the example of
In the example of
In some examples, one or more other parameters 232 not traditionally part of the material testing workflow 300 may also be provided at block 702. For example, the sensor testing process 700 may prompt for and/or receive input specifying one or more time parameters 232 representative of a time period, frequency, number of intervening test methods/samples/specimens 128, and/or other variable indicating how often the test method(s) should execute (e.g., for all test sensors 126 or for a particular test sensor 126).
As another example, the sensor testing process 700 may prompt for and/or receive input specifying one or more log parameters 232 representative of one or more conditions that might trigger the need for the test sensor(s) 126 to be tested via the appropriate test method(s). In some examples, the material testing system 100 may keep a data log (e.g., via one or more of the data repositories 234) recording measurements of the test sensor(s) 126 (as well as other data) from prior sensor measurements, prior test results, prior test methods, and/or prior material testing workflows 300. In some examples, the sensor testing process 700 may analyze the logged data to determine whether any of the conditions represented by the log parameters 232 have been met, in order to determine whether there is need for the sensor(s) 126 to be tested via the appropriate test method(s).
Examples of conditions represented by the log parameters 232 may include certain data trends, such as, for example, sequences of test method measurements (e.g., made by test sensors 126) showing potential sensor drift, sensor overload, and/or poor calibration (e.g., in view of other measurements by the same test sensor 126 and/or by other test sensors 126 during prior test methods). Another example might include one or more thresholds that a sensor measurement should never (or very rarely) reach.
In the example of
At block 706, the sensor testing process 700 analyzes the data log (e.g., stored via the data repositories 234) to see if there is evidence of any of the conditions represented by the log parameters 232 being met. If no such evidence is uncovered, the sensor testing process 700 returns to the test time determination block 704. However, if the sensor testing process 700 does find evidence of one or more of the conditions represented by the log parameters 232 being met, the sensor testing process 700 proceeds to block 708, where the sensor testing process 700 executes the sensor test method(s).
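As a non-limiting illustration, the log analysis of block 706 might check logged measurements against conditions such as those described above, as sketched below. The overload limit and the monotonic-run drift heuristic (and their parameter values) are assumptions made for illustration only.

```python
# Sketch of analyzing the data log for conditions that would trigger a sensor
# test: a measurement beyond a hard limit, or a run of steadily increasing
# readings suggesting drift.
def needs_sensor_test(readings, hard_limit, drift_run=5):
    """Return True if the log shows an overload or a suspicious monotonic run."""
    if any(abs(r) > hard_limit for r in readings):
        return True  # overload: a value a healthy sensor should never report
    # Drift heuristic: `drift_run` consecutive strictly increasing readings.
    run = 1
    for prev, curr in zip(readings, readings[1:]):
        run = run + 1 if curr > prev else 1
        if run >= drift_run:
            return True
    return False

# needs_sensor_test([10.0, 10.1, 10.2, 10.3, 10.4], hard_limit=50.0)  # True (drift run of 5)
```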
While shown in
In some examples, the computing device 202 communicates with the material testing machine 102 (e.g., via network interfaces 218) to execute the sensor test method(s), as discussed above. In some examples, execution of the sensor test method(s) may involve using the material testing machine 102 according to the (e.g., test/specimen/sample) parameters 232 set by the creation process 400 and/or during the material testing workflow 300, as discussed above. While shown as being automatically executed by the sensor testing process 700, in some examples, the sensor test method(s) may be executed manually at any time by a user (e.g., via the material testing workflow 300).
In the example of
As shown, the sensor testing process 700 proceeds to block 712 after block 710 if the sensor testing process 700 determines the test sensor 126 passes the test method. At block 712, the sensor testing process 700 determines whether there is another test sensor 126 of the material testing machine 102 that needs to be (or has yet to be) tested (and/or whether there is another associated test method that needs to be or has yet to be executed). If so, the sensor testing process 700 returns to block 708, where the next test method is executed.
If all the test sensors 126 are determined to have passed the test method(s) at block 710, and no further test sensors 126 are found to need testing at block 712, then the sensor testing process 700 proceeds to block 714 where the sensor testing process 700 enables further test methods to be conducted (e.g., on specimens 128 that need to be tested) using the material testing machine 102. In some examples, a user may also manually enable further test methods to be conducted. After the enablement at block 714, the sensor testing process 700 returns to block 704 (though, in some examples, the sensor testing process 700 may instead return to block 702, or end).
However, if a test sensor 126 is found not to have passed the test method at block 710, the sensor testing process 700 proceeds to block 716, where further test methods are prohibited from executing using the material testing machine 102. In some examples, this prohibition may ensure that the test results of other test methods are reliable and not thrown off by malfunctioning test sensor(s) 126. In some examples, the sensor testing process 700 may only proceed to block 716 if a certain number (e.g., stored in memory circuitry 226) of test sensors 126 fail their test methods. While shown as returning to block 708 after block 716, in some examples, the sensor testing process 700 may instead end after block 716.
In some examples, before using the test method verification process 900, a user may be required to login and/or be authenticated, as discussed above. While the test method verification process 900 is sometimes described below as conducting certain actions for the sake of understanding, it should be understood that one or more of the above described components of the material testing system 100 (e.g., the processing circuitry 224, UI 204, camera(s) 150, screen recorder 299, etc.) may undertake the actions on behalf (and/or according to instructions) of the test method verification process 900.
In the example of
In some examples, the test method verification process 900 may capture and/or record output of the UI 204 during several test methods. For example, the test method verification process 900 may record a first output of the UI 204 during setup, execution, and/or analysis of a first test method conducted using the computing system 200 and material testing machine 102, and then record a second output of a remote UI (e.g., of the remote computing system 250) during the setup, execution, and/or analysis of a second test method conducted using the remote computing system 250 and the remote material testing machine 252.
In some examples, the test methods may differ with respect to one or more (e.g., sample, specimen, test, calculation analysis, and/or report) parameters 232. In some examples, the test methods may be the same test method (e.g., with identical parameters 232) but executed at different times or using different computing devices 202, UIs 204, output devices 208, and/or material testing machines 102. In some examples, the test methods may execute at the same time and/or use the same computing devices 202, UIs 204, output devices 208, and/or material testing machines 102. In some examples, the recordings may be of output from the same output devices 208 or different output devices 208.
After block 902, the test method verification process 900 proceeds to block 904 where the test method verification process 900 compares two or more test methods and/or recordings. In some examples, comparison of the two or more test methods may comprise comparison of the parameters 232 and/or test results of the test methods. In some examples, the test method verification process 900 may perform the comparison using parameters 232 and/or other data stored in memory circuitry 226 (and/or the data repositories 234).
In some examples, the test method verification process 900 may compare the parameters 232 and/or test results of the test methods based on the parameters 232 and/or test results captured and/or evident in the recordings. In some examples, the portions of the recordings where the relevant parameters 232 and/or test results are captured may be bookmarked, linked to, segregated, and/or otherwise noted (e.g., via an entry in memory circuitry 226 and/or the data repositories 234). In some examples, the recordings (and/or cameras 150) may further capture setup and/or operations of the material testing machine 102 (and/or remote material testing machine 252), and the comparison(s) may include comparison of these setups and/or operations.
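By way of a non-limiting illustration, the comparison of block 904 might diff the parameters 232 of two test methods and attach, where available, bookmarks into the recordings where each differing parameter is visible, as sketched below. The bookmark format (an offset in seconds into a recording) is an assumption for illustration.

```python
# Sketch of the comparison step: diff the parameters of two test methods and
# attach recording bookmarks for each differing parameter, where available.
def compare_test_methods(params_a, params_b, bookmarks_a=None, bookmarks_b=None):
    """Return one entry per parameter whose value differs between the methods."""
    bookmarks_a = bookmarks_a or {}
    bookmarks_b = bookmarks_b or {}
    differences = []
    for name in sorted(set(params_a) | set(params_b)):
        a, b = params_a.get(name), params_b.get(name)
        if a != b:
            differences.append({
                "parameter": name,
                "method_a": a,
                "method_b": b,
                "recording_a_s": bookmarks_a.get(name),  # where it appears in recording A
                "recording_b_s": bookmarks_b.get(name),
            })
    return differences

# diffs = compare_test_methods(
#     {"crosshead_speed_mm_per_min": 5.0, "grip_type": "wedge"},
#     {"crosshead_speed_mm_per_min": 2.0, "grip_type": "wedge"},
#     bookmarks_a={"crosshead_speed_mm_per_min": 42.0},
# )
```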
In the example of
After block 906, the test method verification process 900 proceeds to block 908, where the test method verification process 900 generates one or more documents 1000a (see, e.g.,
In some examples, such documents 1000a may be helpful to an entity that wants to understand the impact that recent (e.g., intentional and/or unintentional) changes and/or updates (e.g., to their systems, processes, specimens, testing machines, test methods, etc.) may have had on their testing methods. Such documents 1000a may further be helpful to an entity wishing to confirm that their testing methods remain the same (or at least relatively similar in important areas), despite any recent changes that may have occurred.
In the example of
In some examples, the document(s) 1000b may be generated as evidence that an entity has a validation process that includes installation qualification (I.Q.) and/or operational qualification (O.Q.) verification. In certain industries, such a validation process may be required by federal regulation (e.g., 21 C.F.R. § 820.75), and thus documentation of such a process may be valuable. Whereas such documentation previously may have needed to be manually drafted using substantial manpower and/or man hours, the test method verification process 900 may collect the necessary data for the documentation during normal test method setup, execution, and/or result analysis, and generate the document(s) 1000b automatically using the collected data, thereby saving substantial time and/or effort.
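For illustration only, an O.Q.-style summary document might be assembled from evidence entries collected during normal setup, execution, and result analysis as sketched below. The evidence fields and wording are illustrative assumptions and are not intended as a compliance-ready template.

```python
# Sketch of assembling a plain-text OQ-style summary from collected evidence
# entries, each of which references the recording that backs it.
from datetime import datetime

def generate_oq_summary(system_id, evidence):
    """Build a summary listing each verified step and its supporting recording."""
    lines = [
        f"Operational qualification summary for {system_id}",
        f"Generated: {datetime.now().isoformat(timespec='seconds')}",
        "",
    ]
    for item in evidence:
        lines.append(
            f"- Step '{item['step']}': {item['result']} "
            f"(evidence: {item['recording']} at {item['offset_s']} s)"
        )
    return "\n".join(lines)

# print(generate_oq_summary("material_testing_system_100", [
#     {"step": "sensor test method (sensor 126a)", "result": "pass",
#      "recording": "oq_run.mp4", "offset_s": 73},
# ]))
```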
The disclosed test method verification process 900 allows a user to quickly and easily verify that two test methods are similar or dissimilar, with recorded (e.g., visual/audio) data providing evidence of the similarity and/or dissimilarity. The test method verification process 900 additionally allows for fast and efficient documentation of the similarity and/or dissimilarity, as well as fast and/or efficient documentation of an I.Q./O.Q. validation process that may be statutorily required.
The present methods and/or systems may be realized in hardware, software, or a combination of hardware and software. The present methods and/or systems may be realized in a centralized fashion in at least one computing system, or in a distributed fashion where different elements are spread across several interconnected computing or cloud systems. Any kind of computing system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software may be a general-purpose computing system with a program or other code that, when being loaded and executed, controls the computing system such that it carries out the methods described herein. Another typical implementation may comprise an application specific integrated circuit or chip. Some implementations may comprise a non-transitory machine-readable (e.g., computer readable) medium (e.g., FLASH drive, optical disk, magnetic storage disk, or the like) having stored thereon one or more lines of code executable by a machine, thereby causing the machine to perform processes as described herein.
While the present method and/or system has been described with reference to certain implementations, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present method and/or system. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from its scope. Therefore, it is intended that the present method and/or system not be limited to the particular implementations disclosed, but that the present method and/or system will include all implementations falling within the scope of the appended claims.
As used herein, “and/or” means any one or more of the items in the list joined by “and/or”. As an example, “x and/or y” means any element of the three-element set {(x), (y), (x, y)}. In other words, “x and/or y” means “one or both of x and y”. As another example, “x, y, and/or z” means any element of the seven-element set {(x), (y), (z), (x, y), (x, z), (y, z), (x, y, z)}. In other words, “x, y and/or z” means “one or more of x, y and z”.
As utilized herein, the terms “e.g.,” and “for example” set off lists of one or more non-limiting examples, instances, or illustrations.
As used herein, the terms “coupled,” “coupled to,” and “coupled with,” each mean a structural and/or electrical connection, whether attached, affixed, connected, joined, fastened, linked, and/or otherwise secured. As used herein, the term “attach” means to affix, couple, connect, join, fasten, link, and/or otherwise secure. As used herein, the term “connect” means to attach, affix, couple, join, fasten, link, and/or otherwise secure.
As used herein, the terms “circuits” and “circuitry” refer to physical electronic components (i.e., hardware) and any software and/or firmware (“code”) which may configure the hardware, be executed by the hardware, and/or otherwise be associated with the hardware. As used herein, for example, a particular processor and memory may comprise a first “circuit” when executing a first one or more lines of code and may comprise a second “circuit” when executing a second one or more lines of code. As utilized herein, circuitry is “operable” and/or “configured” to perform a function whenever the circuitry comprises the necessary hardware and/or code (if any is necessary) to perform the function, regardless of whether performance of the function is disabled or enabled (e.g., by a user-configurable setting, factory trim, etc.).
As used herein, a control circuit may include digital and/or analog circuitry, discrete and/or integrated circuitry, microprocessors, DSPs, etc., software, hardware and/or firmware, located on one or more boards, that form part or all of a controller, and/or are used to control a testing process and/or a device such as a material testing machine.
As used herein, the term “processor” means processing devices, apparatus, programs, circuits, components, systems, and subsystems, whether implemented in hardware, tangibly embodied software, or both, and whether or not it is programmable. The term “processor” as used herein includes, but is not limited to, one or more computing devices, hardwired circuits, signal-modifying devices and systems, devices and machines for controlling systems, central processing units, programmable devices and systems, field-programmable gate arrays, application-specific integrated circuits, systems on a chip, systems comprising discrete elements and/or circuits, state machines, virtual machines, data processors, processing facilities, and combinations of any of the foregoing. The processor may be, for example, any type of general purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, an application-specific integrated circuit (ASIC), a graphic processing unit (GPU), a reduced instruction set computer (RISC) processor with an advanced RISC machine (ARM) core, etc. The processor may be coupled to, and/or integrated with a memory device.
As used herein, the term “memory” and/or “memory device” means computer hardware or circuitry to store information for use by a processor and/or other digital device. The memory and/or memory device can be any suitable type of computer memory or any other type of electronic storage medium, such as, for example, read-only memory (ROM), random access memory (RAM), cache memory, compact disc read-only memory (CDROM), electro-optical memory, magneto-optical memory, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically-erasable programmable read-only memory (EEPROM), a computer-readable medium, or the like. Memory can include, for example, a non-transitory memory, a non-transitory processor readable medium, a non-transitory computer readable medium, non-volatile memory, dynamic RAM (DRAM), volatile memory, ferroelectric RAM (FRAM), first-in-first-out (FIFO) memory, last-in-first-out (LIFO) memory, stack memory, non-volatile RAM (NVRAM), static RAM (SRAM), a cache, a buffer, a semiconductor memory, a magnetic memory, an optical memory, a flash memory, a flash card, a compact flash card, memory cards, secure digital memory cards, a microcard, a minicard, an expansion card, a smart card, a memory stick, a multimedia card, a picture card, flash storage, a subscriber identity module (SIM) card, a hard drive (HDD), a solid state drive (SSD), etc. The memory can be configured to store code, instructions, applications, software, firmware and/or data, and may be external, internal, or both with respect to the processor.
This application claims priority to, and the benefit of, U.S. Provisional Application No. 63/529,691 entitled “Material Testing Systems with Test Method Verifications,” filed Jul. 29, 2023, the entire contents of which are hereby incorporated by reference.