At least one embodiment of the present invention pertains to manufacturing techniques, and more particularly, to a technique for monitoring and inspection of a manufacturing process based on coregistration of diverse sensor data.
Manufacturing processes are inherently subject to variance, and consequently, the quality of manufactured parts or assemblies is inherently variable. Therefore, part inspection is routinely performed in manufacturing processes to assess whether a manufactured part meets all specifications within tolerance. However, the specific inspection protocol can vary from process to process and/or part to part.
For instance, a high-throughput assembly line that produces identical versions of the same part may require inspection of only a small, but statistically significant/representative, fraction of the final product to forecast part quality and production efficiency for an entire lot. Conversely, inspection may occur for every part in high-value boutique production or in additive manufacturing (AM) processes, where the part design can be varied at every fabrication step. Also, quality inspection can occur throughout a build process and/or after part fabrication is complete. Irrespective of how it is implemented, inspection remains critical to—and often is a major bottleneck in—manufacturing processes in terms of part assessment, process qualification, error and root cause analysis, etc.
In this context, there are many forms of inspection. Inspection can be characterized in terms of, for example, the sensing probe used, e.g., contactless (confocal scans, photography, hyperspectral imagery, x-ray scanning, etc.), contact (kinematic or electrical resistance probes), or both. Such sensing probes can collect the necessary measurements for parameters such as geometric dimensioning and tolerancing, mass, density, minimum thickness, friction coefficient, color, chemical composition, electrical properties, smell, or whatever comprehensive set of specifications must be met. Depending on the quality metric(s), the inspection process can be non-destructive or destructive (e.g., measuring the yield stress of a support beam). Digital cameras represent a ubiquitous sensing modality and likely will remain so, especially with the advent of machine learning and computer vision based classification and regression algorithms that readily extract quality metrics from images or video, i.e., sequences of images.
A variety of hardware and sensors may be needed to fully qualify a part that undergoes inspection. For each inspection task, there is often complementary analysis that extracts signal(s) from the relevant sensor(s) and ultimately compares the measurement(s) to some specification(s). In conventional manufacturing inspection processes, however, data streams are treated separately, for example, using images to identify surface defects, using infrared (IR) probes for inspecting temperature throughout a build, using machine motion profiles to examine tool paths, etc.
One or more embodiments of the present invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
As noted above, in conventional manufacturing inspection processes, data streams are treated separately. Consequently, conventional manufacturing inspection methods do not leverage the salient information from each respective data stream for a given quality metric. Introduced here, therefore, is a technique for coregistering, or “fusing,” data streams from various sensing modalities in any given manufacturing process. The technique systematically and intuitively marries layer-wise images with other in situ sensor data. The result is a rich image-like tensor with information beyond what is captured by a standard two-dimensional image comprising grayscale or color (e.g., RGB) pixels. The technique can add readings from one or more sensors collected at each location in an image to every pixel in the image. Merging all data that is collected from a sensor suite with image pixel data yields a synergistic data structure that offers all the benefits of spatial depiction alongside (possibly higher-frequency) information gathered by sensors.
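By way of a non-limiting illustration, the following Python fragment sketches one possible form of such an image-like tensor, in which each pixel carries its color values plus coregistered sensor channels. The channel names (temperature, pressure) and array shapes are hypothetical and chosen only for illustration.

    import numpy as np

    # Hypothetical 480x640 color image captured during a build.
    height, width = 480, 640
    rgb = np.zeros((height, width, 3), dtype=np.float32)       # grayscale/color channels

    # Per-pixel sensor channels coregistered onto the same pixel grid
    # (illustrative names only).
    temperature = np.zeros((height, width), dtype=np.float32)
    pressure = np.zeros((height, width), dtype=np.float32)

    # The fused, image-like tensor: each pixel now carries color plus sensor readings.
    fused = np.dstack([rgb, temperature, pressure])             # shape (480, 640, 5)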
The technique introduced here can operate at any scale and thus can be performed: (a) with sensing suites that can contain any number of sensors and in which a camera can be considered a sensor; (b) on each frame of a movie (which is just a temporal series of images); (c) using imaging set-ups that include multiple and/or different types of cameras, e.g., grayscale, color, multispectral, hyperspectral, etc.; and/or (d) using sensing data collected at any time, irrespective of when image(s) are taken.
Since the generated tensor has a higher density of higher-quality information than standard images, actionable insights into manufacturing and inspection processes are more readily extractable via advanced analytical approaches (e.g., computer vision, machine learning, process modeling, etc.). For example, combined image and sensor data can be used to generate alerts of manufacturing defects, to trigger corrective actions, and/or to produce rich, multi-parameter graphics, tables, etc.
The technique introduced here can be applied in many different fields and contexts. One example application is part and process quality control for additive manufacturing (AM) (also called “3D printing”), particularly in direct ink write (DIW) AM (DIW-AM), as further described below. Note, however, that the technique introduced here is not limited to AM. In general, the technique introduced here can be used to provide improved inspection capabilities and routines, process monitoring, improved/accelerated quality detection, traceability of additive and traditional manufacturing processes, rapid closed-loop control, scientific and physics-guided machine learning, on-machine metrology, and in-situ data and process monitoring.
In at least some implementations, the technique introduced here includes a method of manufacturing a physical object. The method can comprise, during a process of manufacturing the physical object by a machine, concurrently capturing a) image data of at least a portion of the physical object from an imaging sensor (e.g., a conventional visible-spectrum digital camera), and b) other sensor data related to the machine or to the at least a portion of the physical object. The “other sensor data” can include data from one or more non-imaging sensors (e.g., pressure sensor, temperature sensor, stage positioning and motion, etc.), as well as data from one or more other imaging sensors (e.g., an IR camera). The method further comprises, during the process of manufacturing the physical object, at each of a plurality of points in time and for each of a plurality of pixels of the image data, coregistering (spatially associating) the image data with the other sensor data on a pixel-by-pixel basis, storing the coregistered image data and other sensor data in association with each other in a data structure, and using at least a portion of the coregistered image data and other sensor data to detect an anomaly in the physical object or in the process of manufacturing the physical object. In at least some implementations, the method can further comprise triggering an action in response to detection of an anomaly in the physical object or in the process of manufacturing the physical object.
Coregistering the image data with the other sensor data can comprise, at each of a plurality of points in time, and for each of the plurality of pixels of the image data, associating other sensor data with that pixel. More specifically, the coregistering further can comprise, for each of a plurality of pixels of the image data, identifying a particular pixel to be associated with a particular sensor value of the other sensor data, by: a) for each of a plurality of orthogonal coordinate axes, computing a number of pixels occupied by the physical object in the image data along the coordinate axis, and determining a number of pixels per unit length along the coordinate axis; and b) identifying the particular pixel to be associated with the particular sensor value based on a set of reference position coordinates, current position coordinates of a part of the machine, and the number of pixels per unit length along the coordinate axes.
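A minimal sketch of this bookkeeping is given below, assuming that a boolean mask marks the pixels occupied by the physical object and that the machine position is reported in the same length units as the part dimensions; all function and variable names are hypothetical.

    import numpy as np

    def pixels_per_length(part_mask, part_size_x, part_size_y):
        # Estimate pixels per unit length along each image axis from a boolean
        # mask that is True where the physical object appears in the image.
        ys, xs = np.nonzero(part_mask)
        pixels_x = xs.max() - xs.min() + 1      # pixels occupied along the x axis
        pixels_y = ys.max() - ys.min() + 1      # pixels occupied along the y axis
        return pixels_x / part_size_x, pixels_y / part_size_y

    def pixel_for_sensor_value(x, y, x_known, y_known, ppl_x, ppl_y):
        # Identify the pixel to associate with a sensor value, given the current
        # machine position (x, y), the reference position (x_known, y_known),
        # and the pixels-per-unit-length values along each axis.
        px = int(round((x - x_known) * ppl_x))
        py = int(round((y - y_known) * ppl_y))
        return px, py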
At least a portion of the coregistered image data and other sensor data can be used to detect an anomaly in the physical object or in the process of manufacturing the physical object, by identifying, in the image data of the physical object, a particular pixel indicative of the anomaly, and ascertaining a position coordinate associated with the anomaly based on a position coordinate of the particular pixel, the number of pixels per unit length along the first coordinate axis, and the first reference position coordinate.
Example Implementation
The technique introduced here includes a methodology to combine different classes of sensor data. Consequently, in the technique introduced here, sensor readings and images are registered into the same spatial and temporal frame of reference (“coregistered”), so that each pixel carries, in addition to its lighting and/or color intensity values, the sensor reading(s) recorded at that pixel's location. This approach goes beyond having different sensors merely record their respective signals versus time, e.g., a digital thermometer, pH probe, humidity sensor, microphone, etc. In the technique introduced here, these and/or other types of sensors can be registered to a common temporal frame of reference, e.g., the beginning or end of a build (manufacturing process), even if their data are collected at different rates. Multiple images (from the same or different cameras) can then be spatially coregistered by use of spatial references within each image, e.g., the edge of a part being manufactured, a fiducial marker, etc., to place them in perspective of each other.
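As a simple, hypothetical sketch of the temporal registration, the fragment below re-expresses each sensor stream's timestamps relative to a common reference time (e.g., the start of a build), regardless of each sensor's native sampling rate; the stream layout and sensor names are assumptions made only for illustration.

    def rebase_timestamps(streams, t_known):
        # streams maps a sensor name to a list of (timestamp, value) pairs.
        # Each timestamp is re-expressed relative to the common reference t_known.
        return {
            name: [(t - t_known, value) for (t, value) in samples]
            for name, samples in streams.items()
        }

    # Illustrative usage with two sensors sampled at different rates.
    streams = {
        "thermometer": [(100.0, 22.1), (101.0, 22.4)],   # 1 Hz
        "microphone": [(100.0, 0.02), (100.1, 0.03)],    # 10 Hz
    }
    aligned = rebase_timestamps(streams, t_known=100.0)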
Coregistering an image with data from other sensors involves implementing a routine that starts at a known time point, tknown, in the manufacturing process. Although tknown can be selected at the discretion of the user or designer of the manufacturing process, choosing to perform the following sequence at tknown enables the resulting data tensor to be used throughout the process. All images and sensors are referenced against tknown, irrespective of when data is collected by any of the sensors. In addition to tknown, a spatial reference point is also used, as described below.
From (xknown, yknown, zknown), tknown, and PixelsPerLength, any pixel in an image (or portion of an image) can be assigned one or more sensor output values. For example, continuing with the DIW-AM example, the pixel location (Px, Py) corresponding to a spatial location (x+xknown, y+yknown) can be determined by equation (1):
(Px, Py) = ((x + xknown, y + yknown) − (xknown, yknown)) × PixelsPerLength  (1)
Since this process works reversibly, if a pixel is known in an image (e.g., a pixel indicative of an anomaly), its spatial coordinates (x+xknown, y+yknown) can be determined by equation (2):

(x + xknown, y + yknown) = (Px, Py) / PixelsPerLength + (xknown, yknown)  (2)
Equations (1) and (2) assume that an image is taken from a plan (or “bird's eye”) view, but they can be adapted easily for any camera orientation, e.g., an elevation view of the side of a part, or for use with several images for which (xknown, yknown, zknown), tknown, and PixelsPerLength have been determined. In any case, a sensor output value Si(t+tknown) can then be associated with each pixel location along with the pixel's existing intensity and/or color values.
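For illustration only, equations (1) and (2) might be implemented as in the following sketch, which assumes a plan-view image and a single, isotropic PixelsPerLength value; the variable names mirror the equations and are not intended to limit the technique.

    def position_to_pixel(x, y, pixels_per_length):
        # Equation (1): map an (x, y) offset from the reference point
        # (x_known, y_known) to pixel indices (Px, Py).
        return round(x * pixels_per_length), round(y * pixels_per_length)

    def pixel_to_position(px, py, pixels_per_length):
        # Equation (2): recover the spatial offset from the reference point for a
        # pixel of interest (e.g., a pixel indicative of an anomaly).
        return px / pixels_per_length, py / pixels_per_length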
The following are some examples of other sensor-based information that can be acquired and coregistered with image data in a tensor in a manufacturing process: temperature, humidity, accelerometer values, machine encoder information (detailing positional, velocity, acceleration, etc. of all motion platforms), local conductivity, vibration, fluid properties, microphone/audio, pressure, tilt, etc. Separate images whose pixel locations can be referenced to (xknown, yknown, zknown) can be combined wherever there is an overlap in their spatial coordinates. Since hyperspectral, multispectral, and IR cameras can capture a broader and/or higher resolution range of the electromagnetic spectrum than conventional visible-spectrum digital cameras, information from these other types of cameras can be used to augment the output of conventional (typically cheaper) cameras where common location(s) can be identified.
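As one hypothetical sketch of such augmentation, the fragment below appends an IR channel to a visible-spectrum image wherever the two fields of view overlap; it assumes the IR image has already been resampled onto the visible camera's pixel grid, and all array shapes and names are illustrative.

    import numpy as np

    def augment_with_ir(visible, ir, overlap):
        # visible: (H, W, 3) color image; ir: (H, W) IR image already resampled
        # onto the same pixel grid; overlap: (H, W) boolean mask of pixels seen
        # by both cameras. Pixels outside the overlap receive NaN ("no reading").
        ir_channel = np.where(overlap, ir, np.nan).astype(np.float32)
        return np.dstack([visible.astype(np.float32), ir_channel])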
Sensors often have differing data collection rates. In the case of probes that scan the surface of what they measure, the scanning rate for each probe is variable and each may have its own data collection frequency. But since pixels in the image have a known length, i.e., 1/PixelsPerLength, and/or there is a known dwell time associated with each pixel, there are several ways to attribute data to each pixel, and the optimal approach will be application and sensor dependent. One method would be to append all non-imaging sensor output values to each pixel, resulting in the highest possible data density per pixel. Additionally or alternatively, if multiple readings are taken from a given sensor at different pixels of a given image, the sensor output values can be interpolated to create additional, artificial sensor output values for other pixels that are not spatially aligned with the actual sensor readings. Additionally or alternatively, various statistical metrics can be ascribed to the pixels, such as mean, mode, standard deviation, kurtosis, or any other salient features that can be derived from the distribution of data collected across each pixel. An image-like data structure such as described above (e.g., a tensor) can be stored in a database for later retrieval and subsequent processing. Such processing can involve reformatting the composition of a pixel, retroactively incorporating collected sensor data, and/or adding new information from offline measurements.
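As an illustrative sketch of the statistical option, the following fragment reduces all readings attributed to a single pixel to a few summary values; the particular statistics and data layout are assumptions, not requirements.

    import numpy as np

    def summarize_per_pixel(readings):
        # Reduce multiple sensor readings collected while a pixel was dwelled on
        # to summary statistics that can be stored with that pixel.
        values = np.asarray(readings, dtype=np.float64)
        return {
            "mean": values.mean(),
            "std": values.std(),
            "min": values.min(),
            "max": values.max(),
        }

    # Example: four pressure readings recorded while the extruder crossed one pixel.
    print(summarize_per_pixel([101.2, 101.4, 100.9, 101.3]))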
Initially, at step 701 the process 700 captures an initial image of the part to be fabricated. The initial image will be used to ascertain the reference coordinates and dimensions described above. Hence, at step 702 the process determines reference coordinates (xknown, yknown, zknown). These coordinates can be any known coordinates. For example, in the case where the object has a simple square outline (as shown in
The data acquisition and coregistration portions of the process include capturing image data, capturing other sensor data (i.e., other than image data), and coregistering the image data and the other sensor data. These steps can occur in parallel, and may occur asynchronously to each other. More specifically, the process waits at step 705 until it is time to acquire image data. The timing of image data capture can be based on any of various criteria, such as at set times, at a particular frequency, based on current tool position, based on current stage or layer of the build process, etc. At the appropriate time, t, the image data is then captured along with its associated spatial (x, y, z) coordinates and timestamp (t value) at step 708. The spatial coordinates may correspond to the position of the object or of a portion of the fabrication tool (e.g., the position of an extruder of an AM machine) at the time t at which the data is captured. Similarly to, and in parallel with, steps 705 and 708, the process waits at step 706 until it is time to acquire other (non-image) sensor data. Timing criteria similar to those described in relation to step 705 can be used in step 706. At the appropriate time, the other sensor data is then captured along with its associated spatial (x, y, z) coordinates and timestamp (t value) at step 709. Note that a given system can include multiple imaging sensors and/or multiple non-imaging sensors, at least some of which can capture data asynchronously to the other sensors. Hence, the process flow can include a separate waiting/data-capture branch like steps 706 and 709 for each individual sensor, or for certain groups of sensors.
In parallel with steps 705, 706, 708 and 709, the process also waits at step 707 until it is time to coregister image and other sensor data. At the appropriate time, the image data and other sensor data are coregistered and stored in a data structure, such as a tensor, at step 710. Coregistration assigns non-imaging sensor data to each of one or more pixel values, and can be done, for example, in the manner described above in relation to equation (1) in at least one implementation. The timing of the coregistration can be based on any of various criteria, such as those mentioned above in relation to steps 705, 706, 708 and 709. Other criteria that might be used to trigger coregistration might include the amount of newly acquired data that has not yet been coregistered, the amount of available memory space for buffering not-yet-coregistered data, etc. Following coregistration and storage, if the build process is not yet complete at step 711, the process loops back to steps 705, 706 and 707. Otherwise, the process ends.
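A highly simplified, single-threaded sketch of this portion of the process is given below; in practice the branches can run concurrently, and the trigger, capture, coregistration, and storage functions are hypothetical placeholders standing in for steps 705 through 711.

    def build_loop(capture_image, read_sensors, coregister, store, build_done,
                   time_for_image, time_for_sensors, time_to_coregister):
        # Placeholder loop mirroring steps 705-711: capture image data and other
        # sensor data (each with position and timestamp), periodically coregister
        # them into an image-like tensor, and repeat until the build completes.
        image_buffer, sensor_buffer = [], []
        while not build_done():                                 # step 711
            if time_for_image():                                # step 705
                image_buffer.append(capture_image())            # step 708: image + (x, y, z, t)
            if time_for_sensors():                              # step 706
                sensor_buffer.append(read_sensors())            # step 709: value + (x, y, z, t)
            if time_to_coregister():                            # step 707
                store(coregister(image_buffer, sensor_buffer))  # step 710
                image_buffer, sensor_buffer = [], []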
Note that in other implementations, the order of the above-described steps may be different. Additionally, in other implementations there may be additional steps and/or some of the above-described steps may be omitted.
The timing of evaluating sensor data can be based on any of various criteria, such as a set schedule, a particular frequency, the current position, stage or layer of the fabrication process, etc. At the appropriate time, the process accesses an appropriate subset of the stored coregistered image data and other sensor data, at step 802. The process then determines whether the accessed coregistered data is indicative of an anomaly at step 803. Any of various methods can be used to make this determination, the details of which are not germane to the technique introduced here. If the accessed coregistered data is indicative of an anomaly, the process outputs an alert, triggers a corrective action, and/or generates or updates a report, at step 804. Next, if the build process is not yet complete at step 805, the process loops back to step 801. Otherwise, the process ends.
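A corresponding hypothetical sketch of the evaluation portion of the process (steps 801 through 805) follows; the retrieval, anomaly-detection, and response functions are placeholders whose details, as noted above, are not germane to the technique introduced here.

    def evaluation_loop(fetch_coregistered, is_anomalous, respond,
                        build_done, time_to_evaluate):
        # Placeholder loop mirroring steps 801-805: periodically retrieve a subset
        # of the coregistered data, test it for anomalies, and trigger an alert,
        # corrective action, and/or report when one is found.
        while not build_done():                 # step 805
            if not time_to_evaluate():          # step 801
                continue
            subset = fetch_coregistered()       # step 802
            if is_anomalous(subset):            # step 803
                respond(subset)                 # step 804: alert / correction / report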
Unless contrary to physical possibility, it is envisioned that (i) the methods/steps described herein may be performed in any sequence and/or in any combination, and that (ii) the components of respective embodiments may be combined in any manner.
The machine-implemented operations described above can be implemented by programmable circuitry programmed/configured by software and/or firmware, or entirely by special-purpose circuitry, or by a combination of such forms. Such special-purpose circuitry (if any) can be in the form of, for example, one or more application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), system-on-a-chip systems (SOCs), etc.
Software or firmware to implement the techniques introduced here may be stored on a machine-readable storage medium and may be executed by one or more general-purpose or special-purpose programmable microprocessors. A “machine-readable medium”, as the term is used herein, includes any mechanism that can store information in a form accessible by a machine (a machine may be, for example, a computer, network device, cellular phone, personal digital assistant (PDA), manufacturing tool, any device with one or more processors, etc.). For example, a machine-accessible medium includes recordable/non-recordable media (e.g., read-only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; etc.), etc.
Any or all of the features and functions described above can be combined with each other, except to the extent it may be otherwise stated above or to the extent that any such embodiments may be incompatible by virtue of their function or structure, as will be apparent to persons of ordinary skill in the art.
Although the subject matter has been described in language specific to structural features and/or acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as examples of implementing the claims and other equivalent features and acts are intended to be within the scope of the claims.
This invention was made with Government support under Contract No. DE-AC52-07NA27344 awarded by the United States Department of Energy. The Government has certain rights in the invention.