Measurement of thickness of thermal barrier coatings using 3D imaging and surface subtraction methods for objects with complex geometries

Information

  • Patent Grant
  • Patent Number
    11,009,339
  • Date Filed
    Tuesday, February 5, 2019
  • Date Issued
    Tuesday, May 18, 2021
Abstract
Embodiments described herein relate to a non-destructive measurement device and a non-destructive measurement method for determining coating thickness of a three-dimensional (3D) object. In one embodiment, at least one first 3D image of an uncoated surface of the object and at least one second 3D image of a coated surface of the object are collected and analyzed to determine the coating thickness of the object.
Description
BACKGROUND
Field

Embodiments of the present disclosure generally relate to determining thickness of three-dimensional (3D) object coatings. More particularly, embodiments of the present disclosure relate to determining thickness of protective coatings for turbine blades and other components exposed to corrosive environments.


Description of the Related Art

Aerospace components, including turbine vanes and blades, are fabricated from nickel- and cobalt-based superalloys. Superalloy protection during engine operation employs a plurality of layers, including a stable oxide scale that is dense, adheres to the surface or surfaces of the component, and is stable at high temperatures up to about 1900° C. Various barrier coatings, including thermal barrier coatings (TBCs), can be used to inhibit oxidation and corrosion of the aerospace components. Various materials are employed to form these corrosion-resistant coatings, such as native-grown oxides, including Cr2O3 for hot corrosion protection and Al2O3 for oxidation resistance. TBCs and other barrier coatings can be deposited using either electron-beam physical vapor deposition (e-beam PVD) or thermal spray. Deposited TBCs include yttria-stabilized zirconia, gadolinium zirconate, tantalum-yttrium zirconium oxides, and other mixed zirconate, hafnate, silicate, and aluminate compounds. However, measuring TBC thickness on three-dimensional (3D) objects may be destructive, inaccurate, costly, and time-consuming.


Thus, there remains a need in the art for measuring coating thicknesses of 3D objects with non-destructive imaging methods.


SUMMARY

In one embodiment, a method of determining a thickness of an object coating is provided. The method includes, in a non-destructive measurement device having at least one image sensor system, positioning an uncoated surface of an object in a field of view of the at least one image sensor system. The object has one or more surfaces. A first 3D image of the uncoated surface is collected without chemically or physically changing the one or more surfaces of the object. The first 3D image corresponds to a first surface profile of the uncoated surface. A coated surface of the object is positioned in the field of view of the at least one image sensor system. A second 3D image of the coated surface is collected without chemically or physically changing the one or more surfaces of the object. The second 3D image corresponds to a second surface profile of the coated surface. The first 3D image and the second 3D image are analyzed.


In another embodiment, a method of determining a thickness of an object coating is provided. The method includes, in a non-destructive measurement device having at least one image sensor system, positioning a surface of an object in a field of view of the at least one image sensor system. The object has one or more surfaces, and the surface has an uncoated portion and a coated portion. A 3D image of the surface is collected without chemically or physically changing the one or more surfaces of the object. The 3D image corresponds to a first surface profile of the uncoated portion and a second surface profile of the coated portion. The 3D image is analyzed.


In yet another embodiment, a non-destructive measurement device is provided. The non-destructive measurement device includes a body, a stage assembly disposed in the body having a stage configured to retain an object and a coordinate grid, an image sensor assembly disposed in the body having one or more image sensor systems, an alignment mechanism disposed in the body, and a controller. Each of the one or more image sensor systems has an illumination unit, one or more image sensors, and a Quick Response (QR) code reader. The alignment mechanism is operable to position the object at an alignment position on the coordinate grid. The alignment position corresponds to the QR code of the object. The controller is coupled to the stage assembly, the image sensor assembly, and the alignment mechanism. The controller is interfaced with a coating system and controls automation integration with the coating system via a system controller of the coating system. The controller is configured to instruct the one or more image sensor systems to collect one or more 3D images of one or more surfaces of the object and to analyze the one or more 3D images to obtain a thickness of a coating of the object.





BRIEF DESCRIPTION OF THE DRAWINGS

So that the manner in which the above recited features of the present disclosure can be understood in detail, a more particular description of the disclosure, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only exemplary embodiments and are therefore not to be considered limiting of its scope, and may admit to other equally effective embodiments.



FIG. 1 is a schematic view of a coating system having at least one integrated non-destructive measurement device according to an embodiment.



FIG. 2A is a schematic cross-sectional view of a non-destructive measurement device according to an embodiment.



FIG. 2B is a schematic top view of a non-destructive measurement device according to an embodiment.



FIG. 2C is a schematic view of an image sensor system according to an embodiment.



FIG. 3 is a flow diagram of a sub-method of a non-destructive measurement method for determining coating thickness of a 3D object according to an embodiment.



FIG. 4 is a flow diagram of a sub-method of a non-destructive measurement method for determining coating thickness of a 3D object according to an embodiment.



FIG. 5 is a flow diagram of a method for determining coating thickness of a 3D object.





To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements and features of one embodiment may be beneficially incorporated in other embodiments without further recitation.


DETAILED DESCRIPTION

Embodiments described herein relate to a non-destructive measurement device and a non-destructive measurement method for determining coating thickness of a three-dimensional (3D) object.



FIG. 1 is a schematic view of a coating system 100 having at least one integrated non-destructive measurement device 102. Additionally, the non-destructive measurement device 102 may be provided as a standalone device unattached to and remote from the system 100. The system 100 and the integrated non-destructive measurement device 102 are utilized for coating 3D objects and measuring coating thicknesses on 3D objects with non-destructive imaging methods. It is to be understood that the system described below is an exemplary system and other systems, including systems from other manufacturers, may be used with or modified to accomplish aspects of the present disclosure. The system 100 includes one or more non-destructive measurement devices 102 and one or more coating modules 104 coupled to a transfer chamber 106. In one embodiment, which can be combined with other embodiments described herein, the system 100 includes one or more processing modules 108 coupled to the transfer chamber 106. The one or more coating modules 104 are adapted for coating 3D objects. In one embodiment, which can be combined with other embodiments described herein, the 3D objects include aerospace components, such as turbine vanes and blades. In another embodiment, which can be combined with other embodiments described herein, the coatings are barrier coatings, such as thermal barrier coatings (TBCs). The one or more processing modules 108 are adapted for processing the 3D objects. The transfer chamber 106 houses a transfer mechanism 110 used to transfer 3D objects between the measurement devices 102, coating modules 104, and processing modules 108.


A system controller 112 is coupled to and controls each module and measurement device 102 of the system 100. Generally, the system controller 112 may control all aspects of operation of the system 100 using direct control of the modules and measurement devices 102 of the system 100 or, alternatively, by controlling the computers associated with these modules and the measurement devices 102. Furthermore, the system controller 112 is interfaced with a controller 208 (shown in FIG. 2A) associated with the measurement device 102. The controller 208 controls automation integration with the system 100 via the system controller 112. For example, movements of the transfer mechanism 110, transferring 3D objects to and from the measurement devices 102 and coating modules 104, performing process sequences, coordinating operations of the measurement devices 102, and so on, may be controlled by the system controller 112.


In operation, the system controller 112 enables feedback from each module and measurement device 102 to optimize 3D object throughput. The system controller 112 comprises a central processing unit (CPU) 114, a memory 116, and a support circuit 118. The CPU 114 may be any form of general purpose computer processor that can be used in an industrial setting. The support circuit 118 is conventionally coupled to the CPU 114 and may comprise cache, clock circuits, input/output subsystems, power supplies, and the like. Software routines, stored in the memory 116, when executed by the CPU 114, transform the CPU 114 into a specific purpose computer (controller). The software routines may also be stored and/or executed by a second controller (not shown) that is located remotely from the measurement device 102.



FIG. 2A is a schematic cross-sectional view of a measurement device 102. FIG. 2B is a schematic top view of the measurement device 102. FIG. 2C is a schematic view of an image sensor system 212. In one embodiment, which can be combined with other embodiments described herein, the measurement device 102 is utilized for non-destructive methods of measuring coating thicknesses of 3D objects. The measurement device 102 includes a body 200, a stage assembly 202, an image sensor assembly 204, an alignment mechanism 206, and a controller 208. The stage assembly 202 includes a stage 210 configured to retain an object 101. In one embodiment, which can be combined with other embodiments described herein, the object 101 is a three-dimensional (3D) object. In another embodiment, which can be combined with other embodiments described herein, the stage 210 is a goniometric stage and/or optical stage. The stage 210 is configured to rotate the object 101 about at least one of an x-axis, y-axis, and z-axis of the stage 210 to position one or more of the surfaces 103 of the object 101 in relation to one or more image sensor systems 212 of the image sensor assembly 204. The stage 210 includes an actuator 222 that moves the stage 210 between a measurement position (as shown) and a transfer position. The actuator 222 facilitates object transfer to and from the measurement device 102 through an opening 224 formed through the body 200 and sealable by a door 226. In one embodiment, which can be combined with other embodiments described herein, the actuator 222 is operable to rotate the object 101 about at least one of the x-axis, y-axis, and z-axis of the stage 210. The stage 210 includes a coordinate grid 214 disposed thereon. The coordinate grid 214 is utilized for the non-destructive methods of measuring coating thicknesses of 3D objects described herein.


The controller 208 is coupled to the stage assembly 202, the one or more image sensor systems 212, and the alignment mechanism 206. The controller 208 includes a central processing unit (CPU) 216, a memory 218, and support circuits (or I/O) 220. The CPU 216 is any form of computer processor used in industrial settings for controlling various processes and hardware (e.g., goniometers, motors, and other hardware) and/or monitoring the processes (e.g., processing time and object 101 position). The memory 218 is connected to the CPU 216. The memory 218 is one or more of a readily available memory, such as random access memory (RAM), read only memory (ROM), floppy disk, hard disk, or any other form of digital storage, local or remote. Software instructions and data are coded and stored within the memory 218 for instructing the CPU 216. The support circuits 220 are also connected to the CPU 216 for supporting the processor in a conventional manner. The support circuits 220 include conventional cache, power supplies, clock circuits, input/output circuitry, subsystems, and the like. A program (or computer instructions), which may be referred to as an imaging program, is readable by the controller 208 and determines which tasks are performable on the object 101. The program is software readable by the controller 208 and includes code to monitor and control, for example, the processing time and object 101 position. In one embodiment, which can be combined with other embodiments described herein, the memory 218 includes image acquisition software, image analysis software, image numbering software to determine position and orientation of the object 101, a database with part numbers and image information, each part number corresponding to an object 101 and each set of image information corresponding to the object 101 of the part number, and/or software that outputs the thickness of the object 101 and/or one or more surfaces 103 of the object 101.
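As a way to picture the database described above, the following is a minimal sketch assuming a simple in-memory mapping keyed by the QR code read from the object; the record fields, keys, and values are illustrative placeholders, not structures defined by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class PartRecord:
    """Hypothetical record tying a part number to its measurement metadata."""
    part_number: str
    alignment_position: tuple  # (x, y) alignment position on the coordinate grid of the stage
    part_coordinates: list     # (x, y) part coordinates at which z-axis positions are measured

# Illustrative database keyed by the QR code disposed on the object.
part_database = {
    "QR-0001": PartRecord("blade-A", (10.0, 25.0), [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]),
}

def lookup_measurement_plan(qr_code: str) -> PartRecord:
    """Return the alignment position and part coordinates associated with a QR code."""
    return part_database[qr_code]
```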


The image sensor assembly 204 includes the one or more image sensor systems 212. In one embodiment, which can be combined with other embodiments described herein, each of the one or more image sensor systems 212 of the image sensor assembly 204 is in a fixed position oriented toward, i.e., having within its field of view 211, a surface of the one or more surfaces 103 of the object 101. Therefore, rotation of the stage 210 and movement of the one or more image sensor systems 212 are not required for the non-destructive imaging methods described herein. In another embodiment, which can be combined with other embodiments described herein, the image sensor assembly 204 includes only one image sensor system 212 and rotation of the stage 210 is utilized to position each of the one or more surfaces 103 in an orientation toward the image sensor system 212. In yet another embodiment, as shown, which can be combined with other embodiments described herein, the one or more image sensor systems 212 are operable to be oriented toward each of the one or more surfaces 103. For example, the one or more image sensor systems 212 are coupled to a track 228 disposed in the body 200 and around a circumference 231 of the stage 210. In one embodiment, which can be combined with other embodiments described herein, the track 228 is a rail or cable. Each of the one or more image sensor systems 212 includes an actuator 230 that moves the image sensor system 212 along the track 228 around the circumference 231 of the stage 210.


In embodiments described herein, which can be combined with other embodiments described herein, multiple 3D objects are coated utilizing the system 100 and the coating thickness of each of the multiple 3D objects is measured utilizing at least one non-destructive measurement device 102. The multiple 3D objects include different pluralities of 3D objects, and the alignment mechanism 206 is operable to position each 3D object of each plurality of the different pluralities of 3D objects in substantially the same position on the stage 210. Measuring the coating thickness of each 3D object of each plurality of the different pluralities of 3D objects from substantially the same position allows the part coordinates of each 3D object of each plurality of the different pluralities of 3D objects to be substantially the same. In one embodiment, which can be combined with other embodiments described herein, the alignment mechanism 206 includes an actuated arm mechanism 232 and a gripper 234. The actuated arm mechanism 232 and the gripper 234 allow the alignment mechanism 206 to capture the object 101 on the stage 210 and align the object 101 by positioning the object 101 at an alignment position 236 of the coordinate grid 214 of the stage 210.


Referring to FIG. 1, the system 100 having the one or more integrated non-destructive measurement devices 102 may be utilized concurrently with a non-destructive measurement method 500 for determining coating thickness of a 3D object. Multiple 3D objects are coated utilizing the system 100 and the coating thickness of each of the multiple 3D objects is measured utilizing at least one non-destructive measurement device 102. For example, an uncoated 3D object of one of the pluralities of 3D objects is measured in a first measurement device and a coated 3D object of one of the pluralities of 3D objects is measured in a second measurement device of the one or more integrated non-destructive measurement devices 102. Concurrently, 3D objects are coated in the one or more coating modules 104. The system controller 112 controls movements of the transfer mechanism 110, transferring 3D objects to and from the measurement devices 102 and coating modules 104, and is interfaced with the controller 208 for performing the method 500.


The image sensor system 212 includes an illumination unit 201 and one or more image sensors 203. In one embodiment, which can be combined with other embodiments described herein, the one or more image sensors 203 are cameras. In a first configuration, also known as a time-of-flight configuration, the illumination unit 201 is configured to project one or more pulses of light, such as infrared light, onto the one or more surfaces 103 of the object 101 without chemically or physically changing the one or more surfaces 103 of the object 101. Each of the one or more image sensors 203, coupled to the controller 208 having a timing mechanism, determines the time-of-flight of the one or more pulses of light from the one or more surfaces 103 of the object 101 to each of the one or more image sensors 203. Determining the time-of-flight of the one or more pulses of light allows z-axis positions (z1, z2, . . . , zn) on a z-axis 207 of the one or more surfaces 103 to be determined to generate a 3D image of the one or more surfaces 103.
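To make the time-of-flight relation concrete, a minimal sketch is given below, assuming the sensor reports the round-trip travel time of each pulse per part coordinate; the array names and example values are assumptions for illustration, not values from the disclosure.

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def z_positions_from_time_of_flight(round_trip_times_s: np.ndarray) -> np.ndarray:
    """Convert round-trip pulse travel times (seconds) into z-axis distances (meters).

    The pulse travels from the illumination unit to the surface and back to the
    image sensor, so the one-way distance is z = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_times_s / 2.0

# Example: round-trip times (seconds) measured at a 2x2 grid of part coordinates.
times = np.array([[3.34e-9, 3.36e-9],
                  [3.35e-9, 3.37e-9]])
z_map = z_positions_from_time_of_flight(times)  # per-coordinate z-axis positions
```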


In a second configuration, also known as a structured light configuration, the illumination unit 201 is configured to project a fringe pattern onto the one or more surfaces 103 of the object 101 without chemically or physically changing the one or more surfaces 103 of the object 101. Beams of light of the fringe pattern are reflected off the one or more surfaces 103 and one or more images are collected by the one or more image sensors 203. Each of the one or more image sensors 203 has a field of view 211. The distances between the beams of light captured in the one or more images collected by the one or more image sensors 203 allow the z-axis positions (z1, z2, . . . , zn) on the z-axis 207 of the one or more surfaces 103 to be determined to generate a 3D image of the one or more surfaces 103.
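One common way to turn a measured fringe displacement into height is a small-angle triangulation approximation; the sketch below is an illustration under stated assumptions (known projector-to-camera baseline and working distance, fringe shift already extracted and expressed on the reference plane), not the specific reconstruction used by the device.

```python
import numpy as np

def z_from_fringe_shift(fringe_shift_m: np.ndarray,
                        baseline_m: float,
                        working_distance_m: float) -> np.ndarray:
    """Estimate surface height from the lateral shift of a projected fringe pattern.

    Under a small-angle triangulation approximation, a surface raised by z displaces
    a fringe on the reference plane by roughly shift ≈ z * baseline / working_distance,
    so z ≈ shift * working_distance / baseline.  All parameters here are illustrative.
    """
    return fringe_shift_m * working_distance_m / baseline_m

# Example: fringe shifts (meters, on the reference plane) at a 2x2 grid of part coordinates.
shifts = np.array([[1.5e-4, 1.7e-4],
                   [1.6e-4, 1.8e-4]])
z_map = z_from_fringe_shift(shifts, baseline_m=0.1, working_distance_m=0.5)
```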


As shown in FIG. 2C, the surface 103 of the object 101 has at least one part coordinate 213. Each part coordinate 213 includes an x-position (x1, x2, . . . , xn) on an x-axis 205 and a y-position (y1, y2, . . . , yn) on a y-axis 209. The x-axis 205, y-axis 209, and z-axis 207 are relative to the image sensor system 212 facing the surface 103. Thus, each of the one or more surfaces 103 has at least one part coordinate on the x-axis 205 and the y-axis 209 of the coordinate grid 214. The image sensor system 212 includes a Quick Response (QR) code reader 215 operable to read a QR code 217 disposed on the object 101. Each plurality of 3D objects of the pluralities of 3D objects has a different QR code 217. The image sensor system 212 is coupled to the controller 208 such that the QR code 217 is provided to the controller 208. The controller 208 instructs the image sensor system 212 to determine the z-axis positions of each part coordinate 213 of the object 101 corresponding to the QR code 217.



FIG. 3 is a flow diagram of a sub-method 300 of the non-destructive measurement method 500 for determining coating thickness of a 3D object. In one embodiment, which can be combined with other embodiments described herein, the measurement device 102 is utilized for the sub-method 300. At operation 301, the QR code 217 of an uncoated object 101 is read by the QR code reader 215 and the object 101 is aligned on the stage 210 at the alignment position 236 corresponding to the QR code 217. At optional operation 302, the image sensor system 212 determines the z-axis position of each part coordinate 213 of a first uncoated surface of the object 101. At operation 303, a 3D image of the first uncoated surface of the object 101 is collected by the image sensor system 212. At operation 304, optional operation 302 and operation 303 are repeated for a predetermined number of uncoated surfaces. In one embodiment, which can be combined with other embodiments described herein, the image sensor assembly 204 includes each of the one or more image sensor systems 212 in fixed positions facing a surface of the one or more surfaces 103 of the object 101. Therefore, the z-axis positions are determined and 3D images are collected without rotating the object 101. In other embodiments, which can be combined with other embodiments described herein, the object 101 is rotated by the stage 210 and/or the one or more image sensor systems 212 are moved around the circumference 231 of the stage 210. At operation 305, the uncoated object 101 is removed from the stage 210 and coated. At operation 306, operations 301-304 are repeated for the coated object 101.
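Sub-method 300 can be pictured as a short control loop. The sketch below is purely illustrative: every device method name (read_qr, align_to, measure_z, capture_3d_image) is a hypothetical placeholder standing in for hardware behavior the disclosure describes, not an API it defines.

```python
def collect_surface_images(device, surfaces):
    """Hypothetical driver loop for operations 301-304 of sub-method 300."""
    qr_code = device.read_qr()            # operation 301: read the QR code on the object
    device.align_to(qr_code)              # align the object at the alignment position tied to the QR code
    images = []
    for surface in surfaces:              # operation 304: repeat for a predetermined number of surfaces
        z_positions = device.measure_z(surface)                       # optional operation 302
        images.append(device.capture_3d_image(surface, z_positions))  # operation 303
    return images

# For method 500, this loop would run once for the uncoated object (operations 301-304),
# the object would then be coated (operation 305), and the loop repeated (operation 306).
```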



FIG. 4 is a flow diagram of a sub-method 400 of the non-destructive measurement method 500 for determining coating thickness of a 3D object. In one embodiment, which can be combined with other embodiments described herein, the measurement device 102 is utilized for the sub-method 400. At operation 401, the QR code 217 of an object 101 is read by the QR code reader 215 and the object 101 is aligned on the stage 210 at the alignment position 236 corresponding to the QR code 217. The object 101 includes one or more surfaces 103. At optional operation 402, the image sensor system 212 determines the z-axis position of each part coordinate 213 of the one or more surfaces 103. At operation 403, a 3D image of each of the one or more surfaces 103 is collected. In one embodiment, which can be combined with other embodiments described herein, at least one surface of the one or more surfaces 103 is uncoated and at least one surface of the one or more surfaces 103 is coated. In another embodiment, which can be combined with other embodiments described herein, at least one surface of the one or more surfaces 103 has a coated portion and an uncoated portion. In one embodiment, which can be combined with other embodiments described herein, the image sensor assembly 204 includes each of the one or more image sensor systems 212 in fixed positions facing the one or more surfaces 103 of the object 101. Therefore, the z-axis positions are determined and 3D images are collected without rotating the object 101. In other embodiments, which can be combined with other embodiments described herein, the object 101 is rotated by the stage 210 and/or the one or more image sensor systems 212 are moved around the circumference 231 of the stage 210.



FIG. 5 is a flow diagram of a method 500 for determining coating thickness of a 3D object. In one embodiment, which can be combined with other embodiments described herein, the measurement device 102 is utilized for the method 500. At operation 501, one of the sub-method 300 and the sub-method 400 is performed to collect one or more 3D images. In one embodiment, which can be combined with other embodiments described herein, one of the sub-method 300 and the sub-method 400 is performed to collect at least one first 3D image of an uncoated surface of the object 101 and at least one second 3D image of a coated surface of the object 101. In that embodiment, the uncoated surface and the coated surface correspond to the same surface of the one or more surfaces 103 of the object 101. In another embodiment, which can be combined with other embodiments described herein, the sub-method 400 collects at least one first 3D image of a surface having an uncoated portion and a coated portion. The first 3D image includes z-axis positions (z1, z2, . . . , zn) on the z-axis 207 of the surface of the first 3D image. The second 3D image includes z-axis positions (z1, z2, . . . , zn) on the z-axis 207 of the surface of the second 3D image.


At optional operation 502, the one or more 3D images are mirrored in the z-axis 207. The mirrored one or more 3D images correspond to surface profiles. In one embodiment, which can be combined with other embodiments described herein, the first 3D image is mirrored in the z-axis 207 such that a mirrored first 3D image corresponds to a first surface profile of the uncoated surface. In another embodiment, which can be combined with other embodiments described herein, the first 3D image is mirrored in the z-axis 207 such that a mirrored first 3D image corresponds to the first surface profile of the uncoated portion and a second surface profile of the coated portion of the surface. A mirrored second 3D image corresponds to the second surface profile of the coated surface. At operation 503, outliers of the one or more 3D images are removed. In one embodiment, which can be combined with other embodiments described herein, outliers of the first 3D image and the second 3D image are removed. At optional operation 504, areas of the one or more 3D images are selected for operations 505-507. In one embodiment, which can be combined with other embodiments described herein, areas of the first 3D image and the second 3D image are selected for operations 505-507. At operation 505, the one or more 3D images are filtered. In one embodiment, which can be combined with other embodiments described herein, the first 3D image and the second 3D image are filtered. In one embodiment, which can be combined with other embodiments described herein, the first 3D image and the second 3D image are filtered utilizing a Gaussian filter. Filtering the first 3D image and the second 3D image removes at least one of image noise and surface roughness.
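A minimal sketch of operations 502-505 for a single profile is given below, assuming each 3D image is stored as a 2D array of z-axis positions; the mirroring sign convention, outlier threshold, and filter width are assumptions for the example, not parameters given by the disclosure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def preprocess_profile(z_map: np.ndarray,
                       outlier_sigma: float = 3.0,
                       smoothing_sigma: float = 2.0) -> np.ndarray:
    """Illustrative mirroring, outlier removal, and Gaussian filtering of one surface profile."""
    profile = -z_map                                # operation 502: mirror in the z-axis
    median = np.nanmedian(profile)
    spread = np.nanstd(profile)
    outliers = np.abs(profile - median) > outlier_sigma * spread
    profile = np.where(outliers, median, profile)   # operation 503: replace statistical outliers
    # Operation 504 (area selection) would simply slice the array to a region of interest here.
    return gaussian_filter(profile, sigma=smoothing_sigma)  # operation 505: Gaussian filter
```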


At optional operation 506, the one or more 3D images are overlapped. In one embodiment, which can be combined with other embodiments described herein, the first 3D image and the second 3D image are overlapped. When the first 3D image includes the uncoated portion and the coated portion, it is not necessary to overlap the first 3D image and the second 3D image because the second 3D image is not collected. At operation 507, the surface profiles of the one or more 3D images are subtracted. In one embodiment, which can be combined with other embodiments described herein, the second surface profile is subtracted from the first surface profile to obtain a thickness of the coating. At optional operation 508, operations 502-507 are repeated. In one embodiment, which can be combined with other embodiments described herein, operations 502-507 are repeated for at least one subsequent first 3D image and at least one subsequent second 3D image collected from one of the sub-method 300 and the sub-method 400. In another embodiment, which can be combined with other embodiments described herein, operations 501-508 are repeated to collect and analyze one or more 3D images of additional surfaces.
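Once the two profiles share a common part-coordinate grid, the subtraction of operation 507 is a per-coordinate difference. The sketch below assumes the profiles are already overlapped and sampled on the same grid; taking the magnitude of the difference is an assumption made here to stay agnostic about whether the profiles store mirrored heights or raw sensor distances.

```python
import numpy as np

def coating_thickness_map(first_profile: np.ndarray,
                          second_profile: np.ndarray) -> np.ndarray:
    """Per-part-coordinate coating thickness from two overlapped surface profiles (operation 507)."""
    return np.abs(second_profile - first_profile)

# Example usage with the preprocessing sketch above (hypothetical data):
# uncoated = preprocess_profile(z_map_uncoated)
# coated = preprocess_profile(z_map_coated)
# thickness = coating_thickness_map(uncoated, coated)
# mean_thickness = float(np.nanmean(thickness))
```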


In summation, a non-destructive measurement device and a non-destructive measurement method for determining coating thickness of a three-dimensional (3D) object are provided. The utilization of non-destructive image collection methods, including the time-of-flight configuration and the structured light configuration of the image sensor system, allows the coating thickness of a 3D object to be determined without chemically or physically changing the one or more surfaces of the object.


While the foregoing is directed to examples of the present disclosure, other and further examples of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims
  • 1. A method of determining a thickness of an object coating, comprising: positioning an uncoated surface of an object in a field of view of at least one image sensor system in a non-destructive measurement device, the object having one or more surfaces; reading a QR code corresponding to at least one part coordinate of the uncoated surface or corresponding to an alignment position of the uncoated surface in the non-destructive measurement device, the QR code disposed on the object; determining a z-axis position of the at least one part coordinate of the uncoated surface; collecting a first 3D image of the uncoated surface without chemically or physically changing the one or more surfaces of the object, the first 3D image corresponding to a first surface profile of the uncoated surface; repeating the determining the z-axis position of the at least one part coordinate of the uncoated surface and the collecting the first 3D image for a predetermined number of uncoated surfaces; positioning a coated surface of the object in the field of view of the at least one image sensor system; collecting a second 3D image of the coated surface without chemically or physically changing the one or more surfaces of the object, the second 3D image corresponding to a second surface profile of the coated surface; and analyzing the first 3D image and the second 3D image.
  • 2. The method of claim 1, wherein the analyzing the second 3D image and the first 3D image comprises: removing outliers of the first surface profile and the second surface profile; filtering the first surface profile and the second surface profile; overlapping the first surface profile and the second surface profile; and subtracting the second surface profile from the first surface profile to obtain a thickness of a coating of the object.
  • 3. The method of claim 2, wherein the analyzing the first 3D image and the second 3D image further comprises: mirroring the first 3D image and the second 3D image; and selecting a first area of the first 3D image and a second area of the second 3D image.
  • 4. The method of claim 1, wherein the positioning a surface of the object in the field of view of the at least one image sensor system includes aligning the object on a stage of the non-destructive measurement device with an alignment mechanism of the non-destructive measurement device to align the object at the alignment position corresponding to the QR code.
  • 5. The method of claim 1, wherein the at least one image sensor system comprises an illumination unit and one or more image sensors, and collecting a 3D image of a surface of the one or more surfaces comprises: projecting one or more pulses of light to the surface of the object with the illumination unit; and determining a time-of-flight of the one or more pulses of light with the one or more image sensors coupled to a controller having a timing mechanism.
  • 6. The method of claim 1, wherein the image sensor system comprises an illumination unit and one or more image sensors, and collecting a 3D image of a surface of the one or more surfaces comprises: projecting a fringe pattern onto the surface of the object with the illumination unit; and collecting beams of light of the fringe pattern reflected off the surface of the object.
  • 7. The method of claim 1, further comprising: positioning a subsequent uncoated surface of the object in the field of view of the at least one image sensor system; collecting a third 3D image of the subsequent uncoated surface without chemically or physically changing the one or more surfaces of the object, the third 3D image corresponding to a third surface profile of the subsequent uncoated surface; positioning a subsequent coated surface of the object in the field of view of the at least one image sensor system; collecting a fourth 3D image of the subsequent coated surface without chemically or physically changing the one or more surfaces of the object, the fourth 3D image corresponding to a fourth surface profile of the subsequent coated surface; and analyzing the third 3D image and the fourth 3D image.
  • 8. A method of determining a thickness of an object coating, comprising: positioning a surface of an object in a field of view of at least one image sensor system in a non-destructive measurement device, the object having one or more surfaces and at least one surface having an uncoated portion and a coated portion; reading a QR code corresponding to at least one part coordinate or an alignment position of the surface having the uncoated portion and the coated portion in the non-destructive measurement device, the QR code disposed on the object; collecting a 3D image of the at least one surface having the uncoated portion and the coated portion without chemically or physically changing the one or more surfaces of the object, the 3D image corresponding to a first surface profile of the uncoated portion and a second surface profile of the coated portion; and analyzing the 3D image.
  • 9. The method of claim 8, wherein the analyzing the 3D image comprises: removing outliers of the first surface profile and the second surface profile; filtering the first surface profile and the second surface profile; and subtracting the second surface profile from the first surface profile to obtain a thickness of a coating of the object.
  • 10. The method of claim 9, wherein the analyzing the 3D image further comprises: mirroring the 3D image; and selecting an area of the 3D image.
  • 11. The method of claim 8, wherein the image sensor system includes a QR code reader operable to read the QR code, and wherein positioning the surface of the object in the field of view of the at least one image sensor system includes aligning the object on a stage of the non-destructive measurement device with an alignment mechanism of the non-destructive measurement device to align the object at the alignment position.
  • 12. The method of claim 11, wherein the collecting the 3D image of the surface includes determining a z-axis position of the at least one part coordinate.
  • 13. The method of claim 8, wherein the at least one image sensor system comprises an illumination unit and one or more image sensors, and the collecting the 3D image of the surface of the one or more surfaces comprises: projecting one or more pulses of light to the surface of the object with the illumination unit; and determining a time-of-flight of the one or more pulses of light with the one or more image sensors coupled to a controller having a timing mechanism.
  • 14. The method of claim 8, wherein the at least one image sensor system comprises an illumination unit and one or more image sensors, and the collecting the 3D image of the surface of the one or more surfaces comprises: projecting a fringe pattern onto the surface of the object with the illumination unit; and collecting beams of light of the fringe pattern reflected off the surface of the object.
  • 15. The method of claim 8, wherein the analyzing the 3D image further comprises: mirroring the 3D image; and selecting an area of the 3D image.
  • 16. The method of claim 8, further comprising: positioning a second surface of the object in the field of view of the at least one image sensor system, the second surface having a second uncoated portion and a second coated portion; collecting a second 3D image of the second surface without chemically or physically changing the one or more surfaces of the object, the second 3D image corresponding to a third surface profile of the second uncoated portion and a fourth surface profile of the second coated portion; and analyzing the second 3D image.
  • 17. A non-destructive measurement device, comprising: a body; a stage assembly disposed in the body, the stage assembly comprising: a stage configured to retain an object; and a coordinate grid; an image sensor assembly disposed in the body, the image sensor assembly comprising: one or more image sensor systems, each of the one or more image sensor systems having an illumination unit, one or more image sensors, and a Quick Response (QR) code reader; an alignment mechanism disposed in the body, the alignment mechanism operable to align the object at an alignment position on the coordinate grid, the alignment position of an uncoated surface corresponding to a QR code of the object; and a controller coupled to the stage assembly, the image sensor assembly, and the alignment mechanism, the controller interfaced with a coating system and controlling automation integration with the coating system via a system controller of the coating system, the controller configured to: repeatedly instruct the one or more image sensor systems to determine a z-axis position of at least one part coordinate of the uncoated surface corresponding to the QR code of the object and collect one or more 3D images for a predetermined number of surfaces of the object; and analyze the one or more 3D images to obtain a thickness of a coating of the object.
  • 18. The device of claim 17, wherein each of the one or more image sensor systems comprises: an illumination unit to project one or more pulses of light to a surface of the object; and one or more image sensors to determine a time-of-flight of the one or more pulses of light with a timing mechanism of the controller.
  • 19. The device of claim 17, wherein each of the one or more image sensor systems comprises: an illumination unit to project a fringe pattern onto a surface of the object; and one or more image sensors to collect beams of light of the fringe pattern reflected off the surface of the object.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims benefit of U.S. Provisional Patent Application Ser. No. 62/722,008, filed Aug. 23, 2018, and U.S. Provisional Patent Application Ser. No. 62/770,129, filed Nov. 20, 2018, which are herein incorporated by reference.

Related Publications (1)
Number Date Country
20200064121 A1 Feb 2020 US
Provisional Applications (2)
Number Date Country
62770129 Nov 2018 US
62722008 Aug 2018 US