Systems, methods, and media for manufacturing processes

Information

  • Patent Grant
  • 12117812
  • Patent Number
    12,117,812
  • Date Filed
    Friday, June 18, 2021
  • Date Issued
    Tuesday, October 15, 2024
Abstract
A manufacturing system is disclosed herein. The manufacturing system includes one or more stations, a monitoring platform, and a control module. Each station of the one or more stations is configured to perform at least one step in a multi-step manufacturing process for a component. The monitoring platform is configured to monitor progression of the component throughout the multi-step manufacturing process. The control module is configured to dynamically adjust processing parameters of each step of the multi-step manufacturing process to achieve a desired final quality metric for the component.
Description
FIELD OF DISCLOSURE

The present disclosure generally relates to a system, method, and media for manufacturing processes.


BACKGROUND

Since the dawn of the industrial revolution in the 18th century, automation has governed the production of goods. Although today's factories have fully embraced automation as a core principle—with robots performing many repeatable tasks in high-production environments—many assembly tasks continue to be performed by humans. These tasks are difficult to automate due to cost, risk of critical failure, or the logistics of deploying a robotic system for a low-quantity production run. Such production lines are overseen by standard process control and people management: an assembler is taught to achieve a certain quality metric over time or is replaced by another operator. This process has remained largely unchanged since the advent of the assembly line.


SUMMARY

In some embodiments, a manufacturing system is disclosed herein. The manufacturing system includes one or more stations, a monitoring platform, and a control module. Each station is configured to perform at least one step in a multi-step manufacturing process for a component. The monitoring platform is configured to monitor progression of the component throughout the multi-step manufacturing process. The control module is configured to dynamically adjust processing parameters of a step of the multi-step manufacturing process to achieve a desired final quality metric for the component, and to perform operations. The operations include receiving image data of tooling of a first station of the one or more stations. The operations further include identifying a set of keypoints from the image data. The keypoints correspond to position information of the tooling during processing at the first station. The operations further include determining, by a machine learning model, a final quality metric for the component based on the keypoints. The operations further include, based on the determining, assigning the component to a class of components based on a comparison between the final quality metric generated by the machine learning model and a canonical final quality metric for the component.


In some embodiments, a computer-implemented method for controlling a multi-step manufacturing process is disclosed herein. The multi-step manufacturing process involves one or more stations of a manufacturing system. Each station is configured to perform at least one step in a multi-step manufacturing process for a component. A computing system associated with the manufacturing system receives image data of tooling of a first station of the one or more stations. The computing system identifies a set of keypoints from the image data, the set of keypoints corresponding to position information of the tooling during processing at the first station. A machine learning model associated with the computing system determines a final quality metric for the component, based on the set of keypoints. Based on the determining, the computing system assigns the component to a class of components based on a comparison between the final quality metric generated by the machine learning model and a canonical final quality metric for the component.


In some embodiments, a manufacturing system is disclosed herein. The manufacturing system includes one or more stations, a monitoring platform, and a control module. Each station is configured to perform at least one step in a multi-step manufacturing process for a component. The monitoring platform is configured to monitor progression of the component throughout the multi-step manufacturing process. The control module is configured to dynamically adjust processing parameters of a step of the multi-step manufacturing process to achieve a desired final quality metric for the component, and to perform operations. The operations include receiving image data of tooling of a first station of the one or more stations. The operations further include identifying a set of keypoints from the image data. The keypoints correspond to position information of the tooling during processing at the first station. The operations further include determining, by a machine learning model, a final quality metric for the component based on the keypoints. The operations further include determining that the final quality metric is not within a threshold tolerance of the desired final quality metric. The operations further include, based on the determining, assigning the component to a class of components based on a comparison between the final quality metric generated by the machine learning model and a canonical final quality metric for the component. The operations further include, based on the assigning, determining that the class assigned to the component is not an acceptable class. The operations further include, based on the determining, inferring positional information corresponding to the component at the first processing station. The operations further include, based on the determining, generating an updated instruction set to be performed by at least one of the first processing station or a downstream station. The operations further include predicting, by a machine learning model, a final quality metric for the component based on the updated instruction set. The operations further include, based on the predicted final quality metric, providing the updated instruction set to at least one of the first processing station or the downstream station.





BRIEF DESCRIPTION OF THE DRAWINGS

So that the manner in which the above recited features of the present disclosure can be understood in detail, a more particular description of the disclosure, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this disclosure and are therefore not to be considered limiting of its scope, for the disclosure may admit to other equally effective embodiments.



FIG. 1 is a block diagram illustrating a manufacturing environment, according to example embodiments.



FIG. 2 is a block diagram illustrating a control module, according to exemplary embodiments.



FIG. 3 is a block diagram illustrating an exemplary architecture of an LSTM model, according to example embodiments.



FIG. 4 is a block diagram visually illustrating the overall process flow of a feedback segment for a tooling module, according to example embodiments.



FIG. 5 is a block diagram illustrating the architecture of a GRU model, according to example embodiments.



FIG. 6A is a flow diagram illustrating a method of correcting a multi-step manufacturing process, according to example embodiments.



FIG. 6B is a flow diagram illustrating a method of correcting a multi-step manufacturing process, according to example embodiments.



FIG. 7 is a flow diagram illustrating a method of correcting a multi-step manufacturing process, according to example embodiments.



FIG. 8A illustrates a system bus computing system architecture, according to example embodiments.



FIG. 8B illustrates a computer system having a chipset architecture, according to example embodiments.





To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements disclosed in one embodiment may be beneficially utilized on other embodiments without specific recitation.


DETAILED DESCRIPTION

Manufacturing processes may be complex and include raw materials being processed by different process stations (or “stations”) until a final component is produced. In some embodiments, each process station receives an input for processing and may output an intermediate output that may be passed along to a subsequent (downstream) process station for additional processing. In some embodiments, a final process station may receive an input for processing and may output the final component or, more generally, the final output.


In some embodiments, each station may include one or more tools/equipment that may perform a set of process steps. Exemplary process stations may include, but are not limited to, conveyor belts, injection molding presses, cutting machines, die stamping machines, extruders, computer numerical control (CNC) mills, grinders, assembly stations, three-dimensional printers, quality control stations, validation stations, and the like.


In some embodiments, operations of each process station may be governed by one or more process controllers. In some embodiments, each process station may include one or more process controllers that may be programmed to control the operation of the process station. In some embodiments, an operator, or control algorithms, may provide the station controller with station controller setpoints that may represent the desired value, or range of values, for each control value. In some embodiments, values used for feedback or feed forward in a manufacturing process may be referred to as control values. Exemplary control values may include, but are not limited to: speed, temperature, pressure, vacuum, rotation, current, voltage, power, viscosity, materials/resources used at the station, throughput rate, outage time, noxious fumes, and the like.


One or more techniques provided herein improve upon conventional processes by providing a system through which image and/or video data may be used to predict or forecast a final quality metric of a component. In some embodiments, the system may include a monitoring system configured to capture or record video and/or image data of the tooling of each processing node or station of the manufacturing system. Based on the positioning of the tooling during a process step, the present system may be configured to predict or forecast a final quality metric of the component. If the predicted final quality metric falls outside of a range of acceptable values, the present system may generate and provide updated processing instructions to at least one of the current processing node or station (k) or to downstream processing nodes or stations (k+1), in an attempt to correct for any errors in processing, such that a desired final quality metric may be achieved.


In some embodiments, the system may include a monitoring system configured to capture or record video and/or image data of a component at each processing node or station of the manufacturing system. Based on visual information about the component at the end of a processing node, the present system may be configured to predict or forecast a final quality metric of the component. If the predicted final quality metric falls outside of a range of acceptable values, the present system may generate and provide updated processing instructions to at least one of the current processing node or station (k) or to downstream processing nodes or stations (k+1), in an attempt to correct for any errors in processing, such that a desired final quality metric may be achieved.


In this manner, the present system is able to predict or forecast a final quality metric of a component, at any stage of the manufacturing process, without having to actually test the component. Such a system is particularly useful for final quality metrics that would otherwise require destructive testing, or for which the final quality metric cannot be evaluated until processing is complete.



FIG. 1 is a block diagram illustrating a manufacturing environment 100, according to example embodiments. Manufacturing environment 100 may include a manufacturing system 102, a monitoring platform 104, and a control module 106. Manufacturing system 102 may be broadly representative of a multi-step manufacturing system. In some embodiments, manufacturing system 102 may be representative of an assembly line system, where each processing station may be representative of a human worker. In some embodiments, manufacturing system 102 may be representative of a manufacturing system for use in additive manufacturing (e.g., a 3D printing system). In some embodiments, manufacturing system 102 may be representative of a manufacturing system for use in subtractive manufacturing (e.g., CNC machining). In some embodiments, manufacturing system 102 may be representative of a manufacturing system for use in a combination of additive manufacturing and subtractive manufacturing. More generally, in some embodiments, manufacturing system 102 may be representative of a manufacturing system for use in a general manufacturing process.


Manufacturing system 102 may include one or more stations 108_1-108_n (generally, "station 108"). Each station 108 may be representative of a step and/or station in a multi-step manufacturing process. For example, each station 108 may be representative of a layer deposition operation in a 3D printing process (e.g., station 108_1 may correspond to layer 1, station 108_2 may correspond to layer 2, etc.). In another example, each station 108 may correspond to a specific processing station. In another example, each station 108 may correspond to a specific human operator performing a specific task in an assembly line manufacturing process.


Each station 108 may include a process controller 114 and control logic 116. Each process controller 114_1-114_n may be programmed to control the operation of its respective station 108. In some embodiments, control module 106 may provide each process controller 114 with station controller setpoints that may represent the desired value, or range of values, for each control value. Control logic 116 may refer to the attributes/parameters associated with the process steps of a station 108. In operation, control logic 116 for each station 108 may be dynamically updated throughout the manufacturing process by control module 106, depending on a current trajectory of a final quality metric.
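To make the relationship between control module 106, process controllers 114, and control logic 116 concrete, the following Python sketch models a station whose setpoints can be updated mid-run; the class and field names are hypothetical illustrations, not terms from this disclosure.

from dataclasses import dataclass, field

@dataclass
class StationController:
    """Hypothetical stand-in for a process controller 114 and its control logic 116."""
    station_id: int
    setpoints: dict = field(default_factory=dict)  # desired value (or range) per control value

    def update_setpoints(self, new_setpoints: dict) -> None:
        # Control module 106 may push revised setpoints when the projected
        # final quality metric drifts outside the acceptable range.
        self.setpoints.update(new_setpoints)

# Example: tighten the temperature setpoint on a single station.
station = StationController(station_id=2, setpoints={"temperature_c": 210.0, "speed_mm_s": 40.0})
station.update_setpoints({"temperature_c": 205.0})
print(station.setpoints)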


Monitoring platform 104 may be configured to monitor each station 108 of manufacturing system 102. In some embodiments, monitoring platform 104 may be a component of manufacturing system 102. For example, monitoring platform 104 may be a component of a 3D printing system. In some embodiments, monitoring platform 104 may be independent of manufacturing system 102. For example, monitoring platform 104 may be retrofit onto an existing manufacturing system 102. In some embodiments, monitoring platform 104 may be representative of an imaging device configured to capture an image of a component or tooling (e.g., a worker or a process tool) at each step of a multi-step process. For example, monitoring platform 104 may be configured to capture an image of the component at each station 108 and/or an image of the tooling developing the component at each station 108 (e.g., a tool, a human, etc.). Generally, monitoring platform 104 may be configured to capture information associated with production of a component (e.g., an image, a voltage reading, a speed reading, etc.) and/or tooling (e.g., hand position, tooling position, etc.), and provide that information, as input, to control module 106 for evaluation.


Control module 106 may be in communication with manufacturing system 102 and monitoring platform 104 via one or more communication channels. In some embodiments, the one or more communication channels may be representative of individual connections via the Internet, such as cellular or Wi-Fi networks. In some embodiments, the one or more communication channels may connect terminals, services, and mobile devices using direct connections, such as radio frequency identification (RFID), near-field communication (NFC), Bluetooth™, low-energy Bluetooth™ (BLE), Wi-Fi™, ZigBee™, ambient backscatter communication (ABC) protocols, USB, WAN, or LAN.


Control module 106 may be configured to control each process controller of manufacturing system 102. For example, based on information captured by monitoring platform 104, control module 106 may be configured to adjust process controls associated with a specific station 108. In some embodiments, control module 106 may be configured to adjust process controls of a specific station 108 based on a projected final quality metric.



FIG. 2 is a block diagram illustrating control module 106, according to exemplary embodiments. Control module 106 may include a tooling module 202 and a component module 204.


Tooling module 202 may be configured to project a final quality metric of a specimen at a given stage of production based on image data obtained by monitoring platform 104. In operation, control module 106 may receive input from monitoring platform 104. In some embodiments, such input may take the form of an image or video of the tooling performing a subprocess at a given step of the multi-step manufacturing process. For example, the image or video data may include image or video data of a human's hands while performing a specific subprocess of the multi-step manufacturing process. In another example, the image or video data may include image or video data of a three-dimensional printer depositing a specific layer of a multi-layer manufacturing process. Based on the input, control module 106 may project a final quality metric of the component. Depending on the projected final quality metric of the component, control module 106 may determine one or more actions to take in subsequent manufacturing steps in order to reach a desired or threshold final quality metric. For example, if the projected final quality metric falls outside of a range of acceptable values, control module 106 may take one or more actions to rectify the manufacturing process. In some embodiments, control module 106 may interface with station controllers in at least one of a current station 108 (k) or subsequent stations 108 (k+1) to adjust their respective control and/or station parameters. In some embodiments, control module 106 may provide human manufacturers with updated instructions to be performed at each processing station (k+1) of a production line. These adjustments may aid in correcting the manufacturing process, such that the final quality metric may be within the range of acceptable quality metrics.


Component module 204 may be configured to project a final quality metric of a specimen at a given stage of production based on image data obtained by monitoring platform 104. In some embodiments, component module 204 may receive input from monitoring platform 104. In some embodiments, such input may take the form of an image or video of the component at a given step of the multi-step manufacturing process. In some embodiments, component module 204 may receive inferred component data at a given step of the multi-step manufacturing process from tooling module 202. For example, tooling module 202 may infer information about the component at a given step of the multi-step manufacturing process based on the tooling image or video data and provide that inferred information to component module 204 as input. Based on the input, component module 204 may project a final quality metric of the component. Depending on the projected final quality metric of the component, component module 204 may determine one or more actions to take in subsequent manufacturing steps in order to reach a desired or threshold final quality metric. For example, if the projected final quality metric falls outside of a range of acceptable values, component module 204 may identify one or more actions to rectify the manufacturing process. In some embodiments, control module 106 may interface with station controllers in at least one of the current station 108k or subsequent stations 108(k+1) to adjust their respective control and/or station parameters. In some embodiments, control module 106 may provide human manufacturers with updated instructions to be performed in at least one of a current processing station k or other processing stations (k+1) of a production line. These adjustments may aid in correcting the manufacturing process, such that the final quality metric may be within the range of acceptable quality metrics.


Each of tooling module 202 and component module 204 may include one or more software modules. The one or more software modules may be collections of code or instructions stored on a media (e.g., memory of computing systems associated with control module 106) that represent a series of machine instructions (e.g., program code) that implements one or more algorithmic steps. Such machine instructions may be the actual computer code the processor interprets to implement the instructions or, alternatively, may be a higher level of coding of the instructions that is interpreted to obtain the actual computer code. The one or more software modules may also include one or more hardware components. One or more aspects of an example algorithm may be performed by the hardware components (e.g., circuitry) themselves, rather than as a result of the instructions. Further, in some embodiments, each of tooling module 202 and component module 204 may be configured to transmit one or more signals among the components. In such embodiments, such signals may not be limited to machine instructions executed by a computing device.


In some embodiments, tooling module 202 and component module 204 may communicate via one or more local networks. The network may be of any suitable type, including individual connections via the Internet, such as cellular or Wi-Fi networks. In some embodiments, the network may connect terminals, services, and mobile devices using direct connections, such as radio frequency identification (RFID), near-field communication (NFC), Bluetooth™, low-energy Bluetooth™ (BLE), Wi-Fi™, ZigBee™, ambient backscatter communication (ABC) protocols, USB, WAN, or LAN. Because the information transmitted may be personal or confidential, security concerns may dictate that one or more of these types of connections be encrypted or otherwise secured. In some embodiments, however, the information being transmitted may be less personal, and therefore, the network connections may be selected for convenience over security.


Tooling module 202 may include acquisition system 206, extractor module 208, and prediction module 210. Generally, a multi-node or multi-station assembly environment, such as manufacturing system 102, may be represented, broadly, as G(s, a), where the vector s may represent the states of the component at all i nodes and the vector a may represent the set of actions to be performed on the component by the tooling at all i nodes. Given a norm or canonical quality measurement, V_c, tooling module 202 may be configured to optimize the error in the assembly process with an estimated quality metric, V̂_c, such that V̂_c may be within a threshold distance of V_c. In some embodiments, tooling module 202 may estimate the state, s = [s_0, s_1, . . . , s_(N−1)], which may be a numerical representation of the state of the component at all N nodes, and the actions, a = [a_0, a_1, . . . , a_(N−1)], which may represent the instructions or control values at each node.
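As a minimal sketch of these quantities, the arrays below hold per-node states s_i and actions a_i alongside the canonical and estimated quality metrics; the dimensionalities and threshold are assumptions made purely for illustration.

import numpy as np

N = 10                      # number of processing nodes (assumed)
s = np.zeros((N, 4))        # s[i]: numerical state of the component at node i (assumed 4 features)
a = np.zeros((N, 3))        # a[i]: instructions or control values at node i (assumed 3 values)

V_c = 1.0                   # canonical quality measurement
V_c_hat = 0.93              # estimated quality metric produced by tooling module 202

threshold = 0.1             # assumed threshold distance
print(abs(V_c - V_c_hat) <= threshold)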


Acquisition system 206 may be configured to receive image data of the assembly process at each node, N. In some embodiments, acquisition system 206 may receive the image data of the assembly process from monitoring platform 104. In some embodiments, for each node N, acquisition system 206 may receive V number of images, where V may represent the number of cameras of monitoring platform 104 that may record the assembly procedure at each node. Accordingly, each image of the V number of images may capture a different perspective of the component during processing. Following receipt of the image data, acquisition system 206 may be configured to extract a subset of images, or frames. For example, acquisition system 206 may be configured to extract L number of images from the received image data. The extracted images may be referred to as landmark frames. Landmark frames may be those image frames that are in high motion. The extracted images may include those images or frames that include certain landmarks, I = [I_0, I_1, . . . , I_(L−1)], of the component, and may represent the entire manufacturing process for the component.


Both minimal-motion (e.g., “landmark”) frames and maximal-motion (e.g., “high motion”) frames for a given operator may contain useful information for a classifier that is trying to correlate finger-hand-arm data and flight performance data in a robust way across a number of operators. In some embodiments, an optical flow algorithm may be used to measure an amount of motion in any given frame. Acquisition system 206 may select those frames that contain the most motion.
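A minimal sketch of selecting the L highest-motion landmark frames with a dense optical-flow measure; the use of OpenCV's Farnebäck flow and the function name are assumptions, since the disclosure does not name a specific optical flow algorithm.

import cv2
import numpy as np

def select_landmark_frames(frames, L):
    """Return the indices of the L frames with the most motion, measured by dense optical flow."""
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    motion = [0.0]  # the first frame has no predecessor, so it is assigned zero motion
    for prev, curr in zip(grays[:-1], grays[1:]):
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        motion.append(float(np.linalg.norm(flow, axis=2).mean()))
    return sorted(np.argsort(motion)[-L:])  # keep the selected frames in temporal order

# Usage (hypothetical): frames = [cv2.imread(p) for p in frame_paths]; idx = select_landmark_frames(frames, L=8)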


Extractor module 208 may be configured to extract keypoints from the L number of images. For example, extractor module 208 may be configured to extract K number of keypoints, i.e., (x, y) pairs, per landmark frame I_l. In other words, extractor module 208 may be configured to output K number of keypoints for a given input, I_l ∈ ℝ^(w×h), where l ∈ [0, L−1]. As output, extractor module 208 may generate a single vector

P = [p_0, p_1, . . . , p_(L−1)].

This vector may include landmark representations of K number of (x, y) pairs, which may be represented by

p_i = [(x_0, y_0), (x_1, y_1), . . . , (x_(K−1), y_(K−1))],

where i ∈ [0, L−1].


In some embodiments, to generate the vector P, extractor module 208 may implement two separate algorithms: (1) bounding box estimation; and (2) keypoint detection.


With respect to the bounding box estimation, given the landmark frames, each landmark frame may be processed with threshold image segmentation to generate a mask image for each tooling component. For example, in an embodiment in which the tooling is a human, extractor module 208 may generate a mask image for each of the user's hands. In some embodiments, extractor module 208 may implement blob detection to locate components of the tooling. Using a human as an example, extractor module 208 may assume that the image always contains both the left and right hands of the user. When a frame fails to include both hands, extractor module 208 may assign the corresponding values an arbitrary constant value, c.


With respect to keypoint detection, extractor module 208 may identify keypoints of the tooling based on the estimated bounding boxes. For example, with a given input, extractor module 208 may estimate K number of points, p_i, along with their confidence values, c_i. In some embodiments, extractor module 208 may estimate not only the points that are visible in the frame, but also points that may be occluded from the frame due to one or more of articulation, viewpoints, objects, or tool interactions. Because the objective may be to predict the quality measurement using the tracked keypoints, the non-estimated occluded points may be unique and important features representing the assembly process. Therefore, an occlusion threshold value, t_o, may be derived from observation of the c_i values on occluded points in a small, randomly chosen subset of the landmark frames. Using t_o, extractor module 208 may filter out the estimations for which c_i < t_o. For those filtered points, extractor module 208 may assign an arbitrary constant value, c. Regardless of the visibility of the tooling or keypoints in the frame, the output of extractor module 208 may include L*K (x, y) pairs for each component.
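A short sketch, under assumed array shapes, of the confidence filtering described above: keypoints whose confidence falls below the occlusion threshold t_o are replaced with an arbitrary constant value c, so every frame still contributes exactly K (x, y) pairs.

import numpy as np

def filter_keypoints(points, confidences, t_o, c=-1.0):
    """points: (K, 2) array of (x, y) keypoints; confidences: (K,) confidence values c_i."""
    points = np.asarray(points, dtype=float).copy()
    confidences = np.asarray(confidences, dtype=float)
    points[confidences < t_o] = c   # occluded or low-confidence points get the constant value c
    return points

# Example: two of four keypoints fall below the occlusion threshold.
print(filter_keypoints([(10, 12), (40, 44), (7, 9), (55, 60)], [0.9, 0.2, 0.85, 0.1], t_o=0.5))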


In other words, extractor module 208 may be configured to assign a default basal level of confidence for occluded components and, therefore, estimate those keypoints so that a full set of keypoints may be available to measure discrepancies between observed trial tooling component positions and the canonical tooling component positions. In some embodiments, tooling module 202 may draw inferences about the state of the component at a given point in time (e.g., station i) and then output recommended modified-from-canonical subsequent actions to take to correct for the measured discrepancies.


Prediction module 210 may be configured to predict a final quality metric, V̂_c. For example, prediction module 210 may be configured to predict a final quality metric, V̂_c, where V̂_c ∈ ℝ^1, given the tooling tracking information for the L number of points-in-time, P, where P ∈ ℝ^(K*L), which may be gathered from each processing station 108. In some embodiments, prediction module 210 may implement a long short-term memory (LSTM) model to output the final quality metric. The LSTM model may allow prediction module 210 to overcome a vanishing gradient problem that is common in conventional recurrent neural networks. The vanishing gradient problem is the case where a model fails to relate the early weights of the neural network to the output when the magnitudes of the gradients are small for later layers. The LSTM model eliminates this issue.


In some embodiments, tooling module 202 may further include a classification module 215. Classification module 215 may be configured to classify a specimen into one or more classes based on the predicted final quality metric, V̂_c. For example, in operation, classification module 215 may receive a canonical or desired final quality metric, V_c, for the specimen. Classification module 215 may be configured to compare the predicted final quality metric, V̂_c, to the canonical or desired final quality metric, V_c, to generate a delta, Δ_v. Depending on the value of Δ_v, classification module 215 may sort the specimen into one or more classes. For example, assume that classification module 215 may sort the specimen into one of four classes: fail (class 1), fair (class 2), excellent (class 3), and over-qualify (class 4). Classification module 215 may sort the specimen based on associated class definitions. For example, the associated definitions may be:










Class 1: Fail: V̂_c < V_c/3
Class 2: Fair: V_c/3 ≤ V̂_c < 2V_c/3
Class 3: Excellent: 2V_c/3 ≤ V̂_c ≤ V_c
Class 4: Over-Qualify: V_c < V̂_c








As those skilled in the art recognize, the associated definitions may change based on operator or client preference.
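The class boundaries above translate directly into a small Python function; the thresholds V_c/3 and 2V_c/3 come from the definitions listed, while the function name is only illustrative.

def classify_component(v_hat, v_c):
    """Assign a class from the predicted metric v_hat and the canonical metric v_c."""
    if v_hat < v_c / 3:
        return 1  # Fail
    if v_hat < 2 * v_c / 3:
        return 2  # Fair
    if v_hat <= v_c:
        return 3  # Excellent
    return 4      # Over-qualify

print(classify_component(v_hat=0.8, v_c=1.0))  # prints 3 (Excellent)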


In some embodiments, control module 106 may use the classification assigned to the component to determine whether the final quality metric is acceptable. For example, an operator or client may specify that components classified in Class 1 are not acceptable, while components classified in any of Classes 2-4 are acceptable. In another example, an operator or client may specify that components classified in Class 1 or Class 2 are not acceptable, while components classified in Class 3 or Class 4 are acceptable.



FIG. 3 is a block diagram illustrating an exemplary architecture of an LSTM model, according to example embodiments. As shown, LSTM model 300 may include three layers 302_1-302_3 (generally, layer 302). As shown, each layer 302 may include one or more cells 304. In some embodiments, each cell's 304 input may be p_i, where i ∈ [0, L−1]. Each cell in the LSTM may be defined as:

f_i = σ(W_f·[h_(i−1), p_i] + b_f)  (eq. 1)
I_i = σ(W_I·[h_(i−1), p_i] + b_I)  (eq. 2)
c_i = f_i·c_(i−1) + I_i·tanh(W_c·[h_(i−1), p_i] + b_c)  (eq. 3)
o_i = σ(W_o·[h_(i−1), p_i] + b_o)  (eq. 4)
h_i = o_i·tanh(c_i)  (eq. 5)

where equation (1) decides whether to keep information from the previous cell; equation (2) decides which values to update; equation (3) updates the cell state; and equation (4) decides which parts to output. Equation (5) may filter the output so that LSTM model 300 only outputs what it is programmed to output.


In some embodiments, LSTM model 300 may include three layers 302, each with a hidden size of 30. In some embodiments, LSTM model 300 may be a sequence-to-one LSTM model. For training, the L1 (mean absolute error (MAE)) loss function:









ℒ(y, ŷ) = (1/B) Σ_(b=1)^(B) |y_b − ŷ_b|






may be minimized using an Adam optimizer. In some embodiments, MAE may be used because the goal may be to minimize or reduce the magnitude of errors, regardless of the direction of the error.
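A minimal PyTorch sketch of a sequence-to-one LSTM of the kind described (three layers, hidden size 30) trained with an L1/MAE loss and the Adam optimizer; the input dimensions, batch size, and learning rate are assumptions, and the random tensors stand in for real keypoint and quality data.

import torch
import torch.nn as nn

K, L = 21, 8                    # keypoints per frame and landmark frames per component (assumed)
INPUT_DIM = 2 * K               # each p_i flattens K (x, y) pairs

class QualityLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(INPUT_DIM, hidden_size=30, num_layers=3, batch_first=True)
        self.head = nn.Linear(30, 1)        # sequence-to-one: a single quality metric

    def forward(self, x):                   # x: (batch, L, INPUT_DIM)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)

model = QualityLSTM()
criterion = nn.L1Loss()                     # mean absolute error
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on stand-in data.
x, y = torch.randn(16, L, INPUT_DIM), torch.rand(16)
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()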


Referring back to FIG. 2, prediction module 210 may be trained using a canonical set of instructions and M number of components. For example, using 10 node videos for each of the M components, the input for prediction module 210 may be structured by first being pre-processed through acquisition system 206 and extractor module 208. In some embodiments, each component involved in training may be validated through a validation algorithm that checks its composition (e.g., shape). In some embodiments, the validation algorithm may compute a similarity index on the composition at the end of each node by comparing the composition with canonical compositions. As a result, the components used for training are roughly similar to the canonical composition.


In some embodiments, for output, the corresponding physical components may be tested in a controlled environment to measure their quality metric. Using the prepared input data with corresponding output data, prediction module 210 may be trained, for example, with a portion of M being training data and another portion of M being validation data. Once trained, prediction module 210 may be able to predict a quality measurement of a component at a given processing step, based on image data of the tooling.


Now referring to component module 204, component module 204 may include stochastic gradient descent (SGD) module 212, gated recurrent unit (GRU) model 214, and simulation module 216. For purposes of this discussion, a partial construct of a component may be defined as â_k, where step k introduces an irreversible error in the manufacturing process, and steps k+1, . . . , N are not yet defined. Component module 204 may be configured to identify an optimal corrective sequence of remaining actions [{r_(k+1), t_(k+1)}, . . . , {r_N, t_N}], where r and t may correspond to specific operations to be performed on the component at the current processing station or at each subsequent processing station (k+1 to N) of manufacturing system 102. More generally, any component a may be defined as the sequence of all operations performed at each processing station 1 . . . N of manufacturing system 102. Mathematically, a = [{r_1, t_1}, . . . , {r_N, t_N}]. At each manufacturing step i = 1, . . . , N, a virtual representation system may represent the component in Euclidean space (e.g., ℝ³) as a set of connected surfaces and a set of connected points uniformly distributed along the outer contour of each surface. In some embodiments, the virtual representation functions generating these representations may be referred to as S(â_i) and P(â_i), respectively. In some embodiments, component module 204 may be configured to correct â toward a particular, canonical component, a*.


Simulation module 216 may be configured to simulate or generate a surface model for a given component â_i. For example, simulation module 216 may receive tooling information from tooling module 202. Based on the keypoints generated by tooling module 202, simulation module 216 may be configured to generate a surface model representing a state of the component â_i at a specific process step i. In some embodiments, the surface model may be represented as S(â_i). In some embodiments, simulation module 216 may further be configured to generate or estimate a quality metric of the component â_i. From the surface model, simulation module 216 may be configured to generate a points model, P(â_i), representing specific coordinates of the component â_i. For example, from the surface model S(â_i), simulation module 216 may create the points model, P(â_i), by placing a number of points uniformly spaced around the bounding contour of each surface in S(â_i). In some embodiments, S(â_i) may be used to simulate performance of the artifact â_i.
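One plausible way to derive the points model P(â_i) from a surface's bounding contour is to interpolate a fixed number of uniformly spaced points along its perimeter; the sketch below assumes the contour is available as an ordered list of 2-D vertices, which is an assumption about the representation rather than a detail from the disclosure.

import numpy as np

def contour_points(contour, n_points):
    """contour: (M, 2) ordered vertices of a closed outline; returns (n_points, 2) uniform samples."""
    contour = np.asarray(contour, dtype=float)
    closed = np.vstack([contour, contour[:1]])             # close the loop
    seg_len = np.linalg.norm(np.diff(closed, axis=0), axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg_len)])      # cumulative arc length at each vertex
    targets = np.linspace(0.0, arc[-1], n_points, endpoint=False)
    return np.stack([np.interp(targets, arc, closed[:, 0]),
                     np.interp(targets, arc, closed[:, 1])], axis=1)

# Example: 8 uniformly spaced points around a unit square.
print(contour_points([(0, 0), (1, 0), (1, 1), (0, 1)], 8))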


SGD module 212 may receive the points model, P(â_i), from simulation module 216. SGD module 212 may determine whether an irreversible error, k, has occurred by comparing the points model, P(â_i), at step i to a canonical points model, P(a*_i), of a canonical component, a*. An irreversible error may be defined as a measurably significant structural deviation from the canonical component at step k. SGD module 212 may be configured to detect an irreversible error by taking a Hausdorff distance. For example, SGD module 212 may match a processing step of the current component â to the canonical component a* based on their respective Euclidean point sets. Mathematically, SGD module 212 may be configured to compute the Hausdorff distance between P(â_i) and P(a*_i) for all i = 1, . . . , N. For example,







h(X, Y) = max_(x∈X) { min_(y∈Y) { d(x, y) } }






where d(x, y) may be the Euclidean distance between x and y, and the undirected Hausdorff distance may be:

H(X, Y) = max{h(X, Y), h(Y, X)}


An irreversible error may be present when the Hausdorff distance between the current component â_i and the canonical component a* exceeds some threshold tolerance. For example, SGD module 212 may determine that an error occurs at step k when:

H(P(â_k), P(a*_k)) > τ_H

where τ_H is some suitably defined tolerance threshold.
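A compact sketch of the error check: compute the undirected Hausdorff distance between the current and canonical point sets and compare it against τ_H. SciPy's directed_hausdorff is used here for brevity; that library choice is an assumption, not a detail from the disclosure.

import numpy as np
from scipy.spatial.distance import directed_hausdorff

def irreversible_error(points, canonical_points, tau_h):
    """Return True when H(P(a_k), P(a*_k)) exceeds the tolerance threshold tau_h."""
    h_xy = directed_hausdorff(points, canonical_points)[0]
    h_yx = directed_hausdorff(canonical_points, points)[0]
    return max(h_xy, h_yx) > tau_h

p = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
p_star = np.array([[0.0, 0.1], [1.0, 0.1], [1.0, 1.1]])
print(irreversible_error(p, p_star, tau_h=0.5))   # False: within tolerance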


Assuming an error is present, SGD module 212 may be configured to construct a set of updated actions [{r_(k+1), t_(k+1)}, . . . , {r_N, t_N}] given the set of actions performed up to the point of the irreversible error, k. In some embodiments, this set of updated actions may be referred to as x_tail. The sequence of steps or actions that preceded and included the error step k may be referred to as x_head. Together, x_tail and x_head may define a component â_i. Based on x_head, SGD module 212 may solve for x_tail using a stochastic gradient descent method.


GRU model 214 may be configured to predict a final quality metric for the component, â_i, based on x_tail ⊕ x_head, where ⊕ may represent a vector concatenation operator. The final quality metric generated by GRU model 214 may be compared against a canonical final quality metric to determine whether x_tail is proper. For example, assuming that the combination of x_tail and x_head yields a final quality metric that falls outside of a range of acceptable values, GRU model 214 may instruct SGD module 212 to generate an updated sequence of actions for further evaluation.
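The interaction between SGD module 212 and GRU model 214 can be pictured as a propose-and-evaluate loop: candidate corrective tails are scored by the quality predictor and revised until the predicted metric is close enough to the canonical one. The sketch below is only a schematic stand-in; it substitutes a toy scoring function for GRU model 214 and a random-perturbation search for the stochastic gradient descent step.

import numpy as np

def predict_quality(x_head, x_tail):
    """Stand-in for GRU model 214: scores the concatenated sequence x_head ⊕ x_tail."""
    x = np.concatenate([x_head, x_tail])
    return float(1.0 - np.mean((x - 0.5) ** 2))        # toy scoring function, illustration only

def search_corrective_tail(x_head, tail_len, v_c, tol=0.05, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x_tail = rng.random(tail_len)                      # initial candidate corrective actions
    best = predict_quality(x_head, x_tail)
    for _ in range(iters):
        candidate = np.clip(x_tail + rng.normal(0.0, 0.05, tail_len), 0.0, 1.0)
        score = predict_quality(x_head, candidate)
        if abs(v_c - score) < abs(v_c - best):         # keep candidates closer to the canonical metric
            x_tail, best = candidate, score
        if abs(v_c - best) <= tol:
            break
    return x_tail, best

x_head = np.array([0.2, 0.9, 0.4])                     # actions up to and including the error step k
tail, predicted = search_corrective_tail(x_head, tail_len=4, v_c=1.0)
print(round(predicted, 3))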



FIG. 5 is a block diagram illustrating the architecture of GRU model 214, according to example embodiments. As shown, GRU model 214 may include N GRU cells 502_1-502_N (generally, GRU cell 502), with each GRU cell 502 corresponding to a respective processing station 108. Each GRU cell 502 may include an (r_i, t_i) input pair and a hidden state output h_i of a predetermined size. Together, these input pairs (r_i, t_i) may define a given component â_i. In some embodiments, each GRU cell 502 may be defined by:

r_t = σ(W_ir·x_t + b_ir + W_hr·h_(t−1) + b_hr)
z_t = σ(W_iz·x_t + b_iz + W_hz·h_(t−1) + b_hz)
n_t = tanh(W_in·x_t + b_in + r_t⊙(W_hn·h_(t−1) + b_hn))
h_t = (1 − z_t)⊙n_t + z_t⊙h_(t−1)

where h_t may be the hidden state at time t, x_t may be the input at time t, and r_t, z_t, and n_t may represent the reset, update, and new gates at time t, respectively.
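The GRU cell equations above can be transcribed almost line-for-line into NumPy; the weight shapes, hidden size, and random initialization below are assumptions made only so the sketch runs.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, W, b):
    """One GRU step following the reset (r), update (z), and new (n) gate equations above."""
    r_t = sigmoid(W["ir"] @ x_t + b["ir"] + W["hr"] @ h_prev + b["hr"])
    z_t = sigmoid(W["iz"] @ x_t + b["iz"] + W["hz"] @ h_prev + b["hz"])
    n_t = np.tanh(W["in"] @ x_t + b["in"] + r_t * (W["hn"] @ h_prev + b["hn"]))
    return (1.0 - z_t) * n_t + z_t * h_prev

rng = np.random.default_rng(0)
d_in, d_h = 2, 8                                       # (r_i, t_i) input pair, assumed hidden size
W = {k: rng.normal(size=(d_h, d_in if k[0] == "i" else d_h))
     for k in ("ir", "iz", "in", "hr", "hz", "hn")}
b = {k: np.zeros(d_h) for k in W}
h = gru_cell(rng.normal(size=d_in), np.zeros(d_h), W, b)
print(h.shape)                                         # (8,)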


GRU model 214 may be trained to generate the weights corresponding thereto. For example, GRU model 214 may be trained iteratively to bias GRU model 214 toward solving a specific subproblem. In some embodiments, during a first iteration, GRU model 214 may generate a plurality (e.g., several hundred to several thousand) of best predictions (one for each possible error at step k) to complete a given component â_k at steps k+1, . . . , N, along with the corresponding predicted quality metric for each prediction.


In some embodiments, all of the predicted completions may be rendered in a virtual representation system, their stepwise Hausdorff distances computed, and their rendered surfaces simulated, to obtain distance measures between the generated predictions and the canonical. In some embodiments, the loss values between the canonical and predicted quality metric measures may be computed and fed back into the GRU model 214, whose weights may be adjusted via backpropagation, producing a second iteration. This process may continue until a desired error threshold is obtained.


In some embodiments, GRU model 214 may be trained on a prepared or synthetic data set. For example, simulation module 216 or another simulation system (e.g., a CAD model) may be used to generate a training set for GRU model 214. For example, a simulation system may generate a set of prepared components (e.g., 100,000 prepared components) to represent many unbiased randomized trials of actions that may cover the full admissible range of all parameter values. In some embodiments, the set of prepared components may also include biased randomized trials of actions, in addition to the unbiased randomized trials. This set may be referred to as P(U(0,1), 100000). Each p ∈ P(U(0,1), 100000) may obtain its defining parameter values by sampling from the uniform distribution U(0,1).


In some embodiments, for example, a set of 50,000 prepared components may be generated to represent many trials of processing a component to replicate the canonical component a*. This set may be referred to as P(N(a*), 50000). Each p ∈ P(N(a*), 50000) may obtain its defining parameter values by a statistical sampling process. In some embodiments, the ith action may centrally tend to the corresponding canonically-defined fold and may vary according to an appropriately parameterized Gaussian distribution. The prepared component data set may be defined as P = P(U(0,1), 100000) ∪ P(N(a*), 50000).
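A sketch of assembling the prepared component data set P = P(U(0,1), 100000) ∪ P(N(a*), 50000): uniform random trials plus Gaussian perturbations around the canonical parameters. The counts are reduced, and the parameter dimensionality and noise scale are assumptions made for illustration only.

import numpy as np

rng = np.random.default_rng(0)
n_params = 12                                   # defining parameters per component (assumed)
a_star = rng.random(n_params)                   # stand-in for the canonical parameter values

# Unbiased randomized trials: parameters sampled from U(0, 1) (100,000 in the text; fewer here).
p_uniform = rng.random((1_000, n_params))

# Trials that centrally tend to the canonical values and vary per a Gaussian distribution.
p_canonical = np.clip(a_star + rng.normal(0.0, 0.05, size=(500, n_params)), 0.0, 1.0)

P = np.vstack([p_uniform, p_canonical])         # P = P(U(0,1), ·) ∪ P(N(a*), ·)
print(P.shape)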



FIG. 4 is a block diagram visually illustrating the overall process flow of a feedback segment 400 for tooling module 202, according to example embodiments.


As shown, feedback segment 400 may include a first portion 402, a second portion 404, and a third portion 406. During first portion 402, tooling module 202 may perform an acquisition process, such as that carried out, at least partially, by acquisition system 206.


As shown, at block 408, manufacturing system 102 may receive processing instructions for processing a component. In some embodiments, manufacturing system 102 may receive processing instructions on a per station 108 basis. For example, each station 108 of manufacturing system 102 may receive independent processing instructions. In some embodiments, processing instructions may include control values that define attributes (e.g., temperature, pressure, etc.) of a station 108 for manufacturing. In some embodiments, processing instructions may include videos or images that visually illustrate to a human operator how to perform a specific processing step at a processing node of the manufacturing process.


At block 410, acquisition system 206 may receive image data from monitoring platform 104. In some embodiments, for each node N, acquisition system 206 may receive V number of images, where V may represent the number of cameras of monitoring platform 104 that may record the assembly procedure at each node. Following receipt of the image data, at block 412, acquisition system 206 may be configured to extract a subset of images, or frames. For example, acquisition system 206 may be configured to extract L number of images from the received image data. The extracted images may be referred to as landmark frames. The extracted images may include those images or frames that include certain landmarks and may represent the entire manufacturing process for the component.


Second portion 404 may correspond to operations performed by extractor module 208. As shown, extractor module 208 may receive at least the extracted images from acquisition system 206. Extractor module 208 may be configured to extract keypoints from the L number of images. For example, extractor module 208 may be configured to extract K number of keypoints, i.e., (x, y) pairs, per landmark frame I_l.


At block 414, extractor module 208 may perform bounding box estimation. For example, given the landmark frames, each landmark frame may be processed with threshold image segmentation to generate a mask image for each tooling component. For example, as shown, in an embodiment in which the tooling is a human, extractor module 208 may generate a mask image for each of the user's hands.


At block 416, extractor module 208 may perform keypoint detection, given the bounding box estimation. For example, with a given input, extractor module 208 may estimate K number of points, p_i, along with their confidence values, c_i. In some embodiments, extractor module 208 may estimate not only the points that are visible in the frame, but also points that may be occluded from the frame due to one or more of articulation, viewpoints, objects, or tool interactions.


Third portion 406 may correspond to operations performed by prediction module 210. As shown, at block 418, prediction module 210 may receive keypoint information from extractor module 208 and may be configured to predict a final quality metric, V̂_c. In some embodiments, prediction module 210 may implement a long short-term memory (LSTM) model to output the final quality metric.



FIG. 6A is a flow diagram illustrating a method 600 of correcting a multi-step manufacturing process, according to example embodiments. Method 600 may begin at step 602.


At step 602, an instruction set may be provided to manufacturing system 102. The instruction set may be representative of a set of instructions for a manufacturing process to be carried out by manufacturing system 102. In some embodiments, the instruction set may be provided to each station 108. For example, each canonical instruction set provided to each respective station 108 may define the processing parameters for a specific manufacturing step. In another example, each canonical instruction set may be a video of discrete steps to be performed by a human actor at a specific processing node or station 108.


At step 604, control module 106 may receive image data of the tooling (e.g., station 108) from monitoring platform 104. For example, acquisition system 206 may receive image data of the assembly process at a respective processing node. In some embodiments, acquisition system 206 may receive V number of images, where V may represent the number of cameras of monitoring platform 104 that may record the assembly procedure at a specific processing station 108. Accordingly, each image of the V number of images may capture a different perspective of the tooling during processing.


At step 606, control module 106 may extract a subset of images from the obtained image data. For example, following receipt of the image data, acquisition system 206 may be configured to extract a subset of images, or frames. For example, acquisition system 206 may be configured to extract L number of images, i.e., landmark frames, from the received image data. Landmark frames may be those image frames that are in high motion. The extracted images may include those images or frames that include certain landmarks, I = [I_0, I_1, . . . , I_(L−1)], of the component, and may represent the entire manufacturing process for the component.


At step 608, control module 106 may extract one or more keypoints of the tooling from the landmark frames. For example, extractor module 208 may extract keypoints from the L number of images. Extractor module 208 may identify or extract K number of keypoints for a given input, I_l ∈ ℝ^(w×h), where l ∈ [0, L−1]. As output, extractor module 208 may generate a single vector P = [p_0, p_1, . . . , p_(L−1)]. This vector may include landmark representations of K number of (x, y) pairs, which may be represented by p_i = [(x_0, y_0), (x_1, y_1), . . . , (x_(K−1), y_(K−1))], where i ∈ [0, L−1].


At step 610, control module 106 may predict a final quality metric for the component based on at least the identified keypoints. In some embodiments, prediction module 210 may implement a long short-term memory (LSTM) model to output the final quality metric.


At step 612, control module 106 may compare the final quality metric to a desired quality metric. If, at step 612, control module 106 determines that the final quality metric is within a threshold tolerance of the desired quality metric, the manufacturing process may proceed to the next processing station or node (e.g., step 616), in accordance with the original instruction set. If, however, at step 612, control module 106 determines that the final quality metric is not within a threshold tolerance of the desired quality metric, then, at step 614, control module 106 may adjust the processing parameters of at least one of the processing station k or one or more processing stations k+1, where k may represent the current processing station and k+1 may represent other processing stations downstream of the current processing station. In some embodiments, control module 106 may interface with station controllers in a current station 108 (k) and/or subsequent stations 108 (k+1) to adjust their respective control and/or station parameters. In some embodiments, control module 106 may provide human manufacturers with updated instructions to be performed at processing station k and/or one or more processing stations (k+1) of a production line. These adjustments may aid in correcting the manufacturing process, such that the final quality metric may be within the range of acceptable quality metrics.



FIG. 6B is a flow diagram illustrating a method 650 of correcting a multi-step manufacturing process, according to example embodiments. Method 650 may begin at step 652.


At step 652, an instruction set may be provided to manufacturing system 102. The instruction set may be representative of a set of instructions for a manufacturing process to be carried out by manufacturing system 102. In some embodiments, the instruction set may be provided to each station 108. For example, each canonical instruction set provided to each respective station 108 may define the processing parameters for a specific manufacturing step. In another example, each canonical instruction set may be a video of discrete steps to be performed by a human actor at a specific processing node or station 108.


At step 654, control module 106 may receive image data of the tooling (e.g., station 108) from monitoring platform 104. For example, acquisition system 206 may receive image data of the assembly process at a respective processing node. In some embodiments, acquisition system 206 may receive V number of images, where V may represent the number of cameras of monitoring platform 104 that may record the assembly procedure at a specific processing station 108. Accordingly, each image of the V number of images may capture a different perspective of the tooling during processing.


At step 656, control module 106 may extract a subset of images from the obtained image data. For example, following receipt of the image data, acquisition system 206 may be configured to extract a subset of images, or frames. For example, acquisition system 206 may be configured to extract L number of images, i.e., landmark frames, from the received image data. Landmark frames may be those image frames that are in high motion and may contain the most meaningful features that clearly capture the progress of the manufacturing process. The extracted images may include those images or frames that include certain landmarks, I = [I_0, I_1, . . . , I_(L−1)], of the component, and may represent the entire manufacturing process for the component.


At step 658, control module 106 may extract one or more keypoints of the tooling from the landmark frames. For example, extractor module 208 may extract keypoints from the L number of images. Extractor module 208 may identify or extract K number of keypoints for a given input, I_l ∈ ℝ^(w×h), where l ∈ [0, L−1]. As output, extractor module 208 may generate a single vector P = [p_0, p_1, . . . , p_(L−1)]. This vector may include landmark representations of K number of (x, y) pairs, which may be represented by p_i = [(x_0, y_0), (x_1, y_1), . . . , (x_(K−1), y_(K−1))], where i ∈ [0, L−1].


At step 660, control module 106 may predict a final quality metric for the component based on at least the identified keypoints. In some embodiments, prediction module 210 may implement a long short-term memory (LSTM) model to output the estimate of the final quality metric.


At step 662, control module 106 may classify the component based on the final quality metric. For example, classification module 215 may classify the component into one or more classes based on the predicted final quality metric, V̂_c. To classify the component, tooling module 202 may receive a canonical or desired final quality metric, V_c, for the component. Tooling module 202 may be configured to compare the predicted final quality metric, V̂_c, to the canonical or desired final quality metric, V_c, to generate a delta, Δ_v. Depending on the value of Δ_v, classification module 215 may sort the component into one or more classes.


In some embodiments, method 650 may further include operations 664-668.


At step 664, control module 106 may determine if the class assigned to the component is an acceptable class. If, at step 664, control module 106 determines that the component has been assigned to an acceptable class, the manufacturing process may proceed to the next processing station or node (e.g., step 666), in accordance with the original instruction set. If, however, at step 664, control module 106 determines that the component is assigned to an unacceptable class, then, at step 668, control module 106 may adjust the processing parameters of at least one of a processing station k or one or more processing stations (k+1). In some embodiments, control module 106 may interface with station controllers in at least one of the current station 108 or subsequent stations 108 to adjust their respective control and/or station parameters. In some embodiments, control module 106 may provide human manufacturers with updated instructions to be performed at processing station k and/or one or more processing stations (k+1) of a production line. These adjustments may aid in correcting the manufacturing process, such that the final quality metric may be within the range of acceptable quality metrics.



FIG. 7 is a flow diagram illustrating a method 700 of correcting a multi-step manufacturing process, according to example embodiments. Method 700 may begin at step 702.


At step 702, an instruction set may be provided to manufacturing system 102. The instruction set may be representative of a set of instructions for a manufacturing process to be carried out by manufacturing system 102. In some embodiments, the instruction set may be provided to each station 108. For example, each canonical instruction set provided to each respective station 108 may define the processing parameters for a specific manufacturing step. In another example, each canonical instruction set may be a video of discrete steps to be performed by a human actor at a specific processing node or station 108.


At step 704, control module 106 may identify information corresponding to a component at a respective processing node. In some embodiments, simulation module 216 may receive tooling information from tooling module 202. Based on the keypoints generated by tooling module 202, simulation module 216 may be configured to generate a surface model representing a state of the component, âi, at a specific process step i. In some embodiments, the surface model may be represented as S(âi). In some embodiments, simulation module 216 may further be configured to generate or estimate a quality metric of the component, âi. From the surface model, simulation module 216 may be configured to generate a points model, P(âi), representing specific coordinates of the component, âi.
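Neither the surface-model construction nor the points-model derivation is prescribed at this level of detail. If the surface model S(âi) is available as a triangulated mesh, one common way to obtain a points model P(âi) is area-weighted sampling over the mesh faces; the sketch below assumes that representation and is illustrative only:

```python
import numpy as np

def sample_points_from_surface(vertices, faces, n_points=1024, rng=None):
    """Turn a triangulated surface model into a points model by sampling
    points uniformly over the mesh area.

    vertices: (V, 3) array of vertex coordinates.
    faces: (F, 3) integer array of vertex indices per triangle.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    tris = vertices[faces]                               # (F, 3, 3)
    # Triangle areas, used as sampling weights.
    areas = 0.5 * np.linalg.norm(
        np.cross(tris[:, 1] - tris[:, 0], tris[:, 2] - tris[:, 0]), axis=1)
    chosen = rng.choice(len(faces), size=n_points, p=areas / areas.sum())
    # Uniform barycentric coordinates inside each chosen triangle.
    u, v = rng.random((2, n_points))
    flip = u + v > 1.0
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    a, b, c = tris[chosen, 0], tris[chosen, 1], tris[chosen, 2]
    return a + u[:, None] * (b - a) + v[:, None] * (c - a)   # (n_points, 3)
```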


At step 706, control module 106 may determine whether an irreversible error has occurred. For example, SGD module 212 may receive the points model, P(âi), from simulation module 216. SGD module 212 may determine whether an irreversible error, k, has occurred by comparing the points model, P(âi), at step i to a canonical points model, P(â*i), of a canonical component, â*. SGD module 212 may be configured to detect an irreversible error by taking a Hausdorff distance between the points model and the corresponding canonical points model. An irreversible error may be present when the Hausdorff distance between the current component, âi, and the canonical component, â*, at the respective processing station or node, exceeds some threshold tolerance.
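A minimal sketch of this check using SciPy, assuming both points models are arrays of 3-D coordinates and that the threshold tolerance is supplied by the caller:

```python
from scipy.spatial.distance import directed_hausdorff

def irreversible_error(points_current, points_canonical, tolerance):
    """Flag an irreversible error when the symmetric Hausdorff distance
    between the component's points model and the canonical points model
    exceeds the threshold tolerance."""
    d_forward, _, _ = directed_hausdorff(points_current, points_canonical)
    d_backward, _, _ = directed_hausdorff(points_canonical, points_current)
    hausdorff = max(d_forward, d_backward)
    return hausdorff > tolerance, hausdorff
```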


If, at step 706, control module 106 determines that an irreversible error has not occurred, then the manufacturing process may proceed to the next processing station or node (step 705), in accordance with the original instruction set. If, however, at step 706, control module 106 determines that an irreversible error has occurred, then method 700 proceeds to step 708.


At step 708, control module 106 may generate an updated set of actions to correct the irreversible error. For example, SGD module 212 may construct a set of updated actions, [{rk+1, tk+1}, . . . , {rN, tN}], given the set of actions up to the point of the irreversible error k. In some embodiments, this set of updated actions may be referred to as xtail. The sequence of steps or actions that preceded and included the error step k may be referred to as xhead. Together, xtail and xhead may define the component, âi. Based on xhead, SGD module 212 may solve for xtail using a stochastic gradient descent method.
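The objective and parameterization of the SGD search are not spelled out at this level of detail. One hedged sketch, assuming the remaining actions are continuous vectors and that a differentiable sequence model (standing in for GRU model 214) maps a full action sequence to a predicted quality metric:

```python
import torch

def solve_x_tail(predictor, x_head, n_tail, action_dim, target_quality,
                 steps=200, lr=1e-2):
    """Search for corrective tail actions by stochastic gradient descent.

    predictor: differentiable callable mapping a (1, N, action_dim) action
        sequence to a predicted final quality metric (stand-in for a trained
        sequence model).
    x_head: (k, action_dim) tensor of actions up to and including the error.
    """
    x_tail = torch.zeros(n_tail, action_dim, requires_grad=True)
    opt = torch.optim.SGD([x_tail], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Concatenate the fixed head actions with the learnable tail actions.
        full = torch.cat([x_head, x_tail], dim=0).unsqueeze(0)
        loss = ((predictor(full) - target_quality) ** 2).sum()
        loss.backward()
        opt.step()
    return x_tail.detach()
```

Here `predictor`, `n_tail`, and `action_dim` are illustrative parameters; a real implementation would also enforce any physical constraints on the candidate actions.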


At step 710, control module 106 may generate a predicted final quality metric for the component, based on the set of actions generated by SGD module 212. For example, GRU model 214 may be configured to predict a final quality metric for the component, âi, based on xtail ⊕ xhead, where ⊕ may represent a vector concatenation operator.
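A compact PyTorch sketch of such a sequence regressor, again with illustrative dimensions; an instance of this module could serve as the `predictor` in the preceding sketch:

```python
import torch
import torch.nn as nn

class ActionGRU(nn.Module):
    """Toy GRU regressor: a full action sequence -> predicted final quality metric."""

    def __init__(self, action_dim, hidden_size=32):
        super().__init__()
        self.gru = nn.GRU(input_size=action_dim, hidden_size=hidden_size,
                          batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, actions):                  # actions: (batch, N, action_dim)
        _, h_n = self.gru(actions)
        return self.head(h_n[-1]).squeeze(-1)    # (batch,)
```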


At step 712, control module 106 may determine if the predicted final quality metric is within a threshold tolerance of the canonical final quality metric. For example, the final quality metric generated by GRU model 214 may be compared against a canonical final quality metric to determine if xtail is proper. If, at step 712, control module 106 determines that the predicted quality metric is within the threshold tolerance, at step 714, control module 106 may adjust the processing parameters of at least one of processing station k or one or more processing stations (k+1). In some embodiments, control module 106 may interface with station controllers in a station 108(k) or subsequent stations 108(k+1) to adjust their respective control and/or station parameters. In some embodiments, control module 106 may provide human manufacturers with updated instructions to be performed in at least one of a processing station k or one or more processing stations (k+1) of a production line. These adjustments may aid in correcting the manufacturing process, such that the final quality metric may be within the range of acceptable quality metrics.


If, however, at step 712, control module 106 determines that the predicted quality metric is not within the threshold tolerance, GRU model 214 may prompt SGD module 212 to generate a new xtail. For example, method 700 may revert to step 708 for SGD module 212 to construct a new set of updated actions.



FIG. 8A illustrates a system bus computing system architecture 800, according to example embodiments. One or more components of system 800 may be in electrical communication with each other using a bus 805. System 800 may include a processor (e.g., one or more CPUs, GPUs, or other types of processors) 810 and a system bus 805 that couples various system components, including the system memory 815, such as read only memory (ROM) 820 and random access memory (RAM) 825, to processor 810. System 800 can include a cache 812 of high-speed memory connected directly with, in close proximity to, or integrated as part of processor 810. System 800 can copy data from memory 815 and/or storage device 830 to cache 812 for quick access by processor 810. In this way, cache 812 may provide a performance boost that avoids processor 810 delays while waiting for data. These and other modules can control or be configured to control processor 810 to perform various actions. Other system memory 815 may be available for use as well. Memory 815 may include multiple different types of memory with different performance characteristics. Processor 810 may be representative of a single processor or multiple processors. Processor 810 can include one or more of a general purpose processor or a hardware module or software module, such as service 1 832, service 2 834, and service 3 836 stored in storage device 830, configured to control processor 810, as well as a special-purpose processor where software instructions are incorporated into the actual processor design. Processor 810 may essentially be a completely self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc. A multi-core processor may be symmetric or asymmetric.


To enable user interaction with computing device 800, an input device 845 can represent any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, a keyboard, a mouse, motion input, speech, and so forth. An output device 835 can also be one or more of a number of output mechanisms known to those of skill in the art. In some instances, multimodal systems can enable a user to provide multiple types of input to communicate with computing device 800. Communications interface 840 can generally govern and manage the user input and system output. There is no restriction on operating on any particular hardware arrangement, and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.


Storage device 830 may be a non-volatile memory and can be a hard disk or other types of computer readable media that can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, solid state memory devices, digital versatile disks, cartridges, random access memories (RAMs) 825, read only memory (ROM) 820, and hybrids thereof.


Storage device 830 can include services 832, 834, and 836 for controlling the processor 810. Other hardware or software modules are contemplated. Storage device 830 can be connected to system bus 805. In one aspect, a hardware module that performs a particular function can include the software component stored in a computer-readable medium in connection with the necessary hardware components, such as processor 810, bus 805, display 835, and so forth, to carry out the function.



FIG. 8B illustrates a computer system 850 having a chipset architecture, according to example embodiments. Computer system 850 may be an example of computer hardware, software, and firmware that can be used to implement the disclosed technology. System 850 can include one or more processors 855, representative of any number of physically and/or logically distinct resources capable of executing software, firmware, and hardware configured to perform identified computations. One or more processors 855 can communicate with a chipset 860 that can control input to and output from one or more processors 855. In this example, chipset 860 outputs information to output 865, such as a display, and can read and write information to storage device 870, which can include magnetic media and solid state media, for example. Chipset 860 can also read data from and write data to RAM 875. A bridge 880 can be provided for interfacing a variety of user interface components 885 with chipset 860. Such user interface components 885 can include a keyboard, a microphone, touch detection and processing circuitry, a pointing device, such as a mouse, and so on. In general, inputs to system 850 can come from any of a variety of sources, machine generated and/or human generated.


Chipset 860 can also interface with one or more communication interfaces 890 that can have different physical interfaces. Such communication interfaces can include interfaces for wired and wireless local area networks, for broadband wireless networks, as well as for personal area networks. Some applications of the methods for generating, displaying, and using the GUI disclosed herein can include receiving ordered datasets over the physical interface, or the datasets can be generated by the machine itself by one or more processors 855 analyzing data stored in storage 870 or 875. Further, the machine can receive inputs from a user through user interface components 885 and execute appropriate functions, such as browsing functions, by interpreting these inputs using one or more processors 855.


It can be appreciated that example systems 800 and 850 can have more than one processor 810 or be part of a group or cluster of computing devices networked together to provide greater processing capability.


While the foregoing is directed to embodiments described herein, other and further embodiments may be devised without departing from the basic scope thereof. For example, aspects of the present disclosure may be implemented in hardware or software or a combination of hardware and software. One embodiment described herein may be implemented as a program product for use with a computer system. The program(s) of the program product define functions of the embodiments (including the methods described herein) and can be contained on a variety of computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory (ROM) devices within a computer, such as CD-ROM disks readable by a CD-ROM drive, flash memory, ROM chips, or any type of solid-state non-volatile memory) on which information is permanently stored; and (ii) writable storage media (e.g., floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access memory) on which alterable information is stored. Such computer-readable storage media, when carrying computer-readable instructions that direct the functions of the disclosed embodiments, are embodiments of the present disclosure.


It will be appreciated by those skilled in the art that the preceding examples are exemplary and not limiting. It is intended that all permutations, enhancements, equivalents, and improvements thereto that are apparent to those skilled in the art upon a reading of the specification and a study of the drawings are included within the true spirit and scope of the present disclosure. It is therefore intended that the following appended claims include all such modifications, permutations, and equivalents as fall within the true spirit and scope of these teachings.

Claims
  • 1. A manufacturing system, comprising: one or more stations, each station configured to perform at least one step in a multi-step manufacturing process for a component; a monitoring platform configured to monitor progression of the component throughout the multi-step manufacturing process; and a control module configured to dynamically adjust processing parameters of a step of the multi-step manufacturing process to achieve a desired final quality metric for the component, the control module configured to perform operations, comprising: receiving image data of tooling of a first station of the one or more stations; determining, by a first machine learning model, a final quality metric for the component, based on the image data; determining that the final quality metric is outside an acceptable range of the desired final quality metric; based on the determining, generating an updated instruction set to be performed by at least one of the first station or a downstream station to achieve the desired final quality metric; predicting, by a second machine learning model, an updated final quality metric for the component based on the updated instruction set; and based on the updated final quality metric, providing the updated instruction set to at least one of a first station controller associated with the first station or a second station controller associated with the downstream station.
  • 2. The manufacturing system of claim 1, wherein the second machine learning model is trained with a training set comprising a synthetic set of data.
  • 3. The manufacturing system of claim 1, wherein the final quality metric cannot be measured until processing of the component is complete.
  • 4. The manufacturing system of claim 1, further comprising: determining, based on positional information of the component, that an irreversible error is not present.
  • 5. The manufacturing system of claim 1, further comprising: identifying a set of keypoints from the image data, the set of keypoints corresponding to position information of the tooling during processing at the first station.
  • 6. The manufacturing system of claim 5, wherein the set of keypoints are used by the first machine learning model to determine the final quality metric.
  • 7. The manufacturing system of claim 5, further comprising: comparing the set of keypoints corresponding to coordinates of the component to a canonical set of keypoints corresponding to a canonical component.
  • 8. A method for dynamically adjusting processing parameters of a step of a multi-step manufacturing process to achieve a desired final quality metric for a component undergoing the multi-step manufacturing process, comprising: receiving, by a computing system, image data of tooling of a first station of a plurality of stations involved in the multi-step manufacturing process; determining, by a first machine learning model of the computing system, a final quality metric for the component, based on the image data, the final quality metric representing a metric associated with the component that cannot be measured until the multi-step manufacturing process is complete; determining, by the computing system, that the final quality metric is outside an acceptable range of the desired final quality metric; based on the determining, generating an updated instruction set to be performed by at least one of the first station or a downstream station to achieve the desired final quality metric; predicting, by a second machine learning model of the computing system, an updated final quality metric for the component based on the updated instruction set; and based on the updated final quality metric, providing, by the computing system, the updated instruction set to at least one of a first station controller associated with the first station or a second station controller associated with the downstream station.
  • 9. The method of claim 8, wherein the second machine learning model is trained with a training set comprising a synthetic set of data.
  • 10. The method of claim 8, wherein the final quality metric cannot be measured until processing of the component is complete.
  • 11. The method of claim 8, further comprising: determining, based on positional information of the component, that an irreversible error is not present.
  • 12. The method of claim 8, further comprising: identifying a set of keypoints from the image data, the set of keypoints corresponding to position information of the tooling during processing at the first station.
  • 13. The method of claim 12, wherein the set of keypoints are used by the first machine learning model to determine the final quality metric.
  • 14. The method of claim 12, further comprising: comparing the set of keypoints corresponding to coordinates of the component to a canonical set of keypoints corresponding to a canonical component.
  • 15. A non-transitory computer readable medium comprising one or more sequences of instructions, which, when executed by a computing system, causes a processor to perform operations for dynamically adjusting processing parameters of a step of a multi-step manufacturing process to achieve a desired final quality metric for a component undergoing the multi-step manufacturing process, comprising: receiving, by the computing system, image data of tooling of a first station of one or more stations involved in the multi-step manufacturing process; determining, by a first machine learning model of the computing system, a final quality metric for the component, based on the image data; determining, by the computing system, that the final quality metric is outside an acceptable range of the desired final quality metric; based on the determining, generating an updated instruction set to be performed by at least one of the first station or a downstream station to achieve the desired final quality metric; predicting, by a second machine learning model of the computing system, an updated final quality metric for the component based on the updated instruction set; and based on the updated final quality metric, providing, by the computing system, the updated instruction set to at least one of a first station controller associated with the first station or a second station controller associated with the downstream station.
  • 16. The non-transitory computer readable medium of claim 15, wherein the second machine learning model is trained with a training set comprising a synthetic set of data.
  • 17. The non-transitory computer readable medium of claim 15, wherein the final quality metric cannot be measured until processing of the component is complete.
  • 18. The non-transitory computer readable medium of claim 15, further comprising: determining, based on positional information of the component, that an irreversible error is not present.
  • 19. The non-transitory computer readable medium of claim 15, further comprising: identifying a set of keypoints from the image data, the set of keypoints corresponding to position information of the tooling during processing at the first station.
  • 20. The non-transitory computer readable medium of claim 19, wherein the set of keypoints are used by the first machine learning model to determine the final quality metric.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority to U.S. Provisional Application Ser. No. 63/040,792, filed Jun. 18, 2020, which is hereby incorporated by reference in its entirety. This application is a continuation-in-part of U.S. application Ser. No. 17/195,746, filed Mar. 9, 2021, which claims priority to U.S. Provisional Application Ser. No. 62/986,987 filed Mar. 9, 2020, which is hereby incorporated by reference in its entirety, and is a continuation-in-part of U.S. application Ser. No. 17/091,393, filed Nov. 6, 2020, which claims priority to U.S. Provisional Application Ser. No. 62/931,448, filed Nov. 6, 2019, U.S. Provisional Application Ser. No. 62/932,063, filed Nov. 7, 2019, and U.S. Provisional Application Ser. No. 62/931,453, filed Nov. 6, 2019, which are hereby incorporated by reference in their entireties.
