The invention relates to inspection of mechanical systems for detecting damage.
It is well known that mechanical systems undergo wear and degradation during use. Where such systems are repairable, it is common to inspect them periodically or aperiodically, either to determine whether any damage is sufficient to warrant repair at the time of inspection or to predict the future time when repair will be needed. Most of these inspections are performed by human inspectors, possibly using inspection equipment. The inspection equipment may provide metrology, but often does not, and it may or may not record data for subsequent use. Saving data for additional analysis is especially important, as it enables diagnosis of degradation rates and Condition-Based Maintenance (CBM).
In the non-limiting example of turbines, and even more particularly gas turbine engines, the inspection equipment is typically a borescope. A borescope is typically able to save images or videos of internal components and structures. However, a borescope records an image or video from a limited area, gradually broadening or altering its field of view as it “snakes” or is otherwise positioned through a component or chamber. To know where the borescope is looking, the human inspector must study the image/video data and apply his/her knowledge of the internal structure. Such an inspection is generally illustrated in
What is needed is an efficient and effective method of automatically or semi-automatically analyzing inspection data to determine the imaged location and perspective, to detect the presence of damage, to categorize, parameterize, or otherwise measure any damage, and to archive the inspection results for future use.
It is well known in the field of multi-view geometry in computer vision that images of a planar scene (such as a large wall) or of a remote scene (a scene at infinity), or images captured with the camera rotating about its center of projection, can be stitched together to form a single large image of the scene. This process is called image mosaicking. However, when the scene is not planar or the camera does not rotate about its center of projection, the process suffers from the parallax effect, causing inaccuracy in the mosaicked image.
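By way of illustration only, the following is a minimal sketch of homography-based mosaicking using OpenCV; the file names, feature counts, and thresholds are placeholder assumptions, and the approach is only valid under the planar-scene or rotation-about-center conditions noted above.

```python
# Minimal two-frame mosaicking sketch (OpenCV). File names are hypothetical.
import cv2
import numpy as np

img1 = cv2.imread("frame_a.png")  # two overlapping borescope frames (assumed)
img2 = cv2.imread("frame_b.png")

orb = cv2.ORB_create(2000)
k1, d1 = orb.detectAndCompute(img1, None)
k2, d2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]

pts1 = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
pts2 = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# Homography mapping img2 into img1's frame; RANSAC rejects outlier matches.
H, mask = cv2.findHomography(pts2, pts1, cv2.RANSAC, 5.0)

# Warp img2 onto a wider canvas and paste img1 over it: a crude mosaic.
h, w = img1.shape[:2]
canvas = cv2.warpPerspective(img2, H, (2 * w, h))
canvas[0:h, 0:w] = img1
cv2.imwrite("mosaic.png", canvas)
```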
It is also well known in the field of machine perception and robotic navigation that Structure from Motion (SFM) and Simultaneous Localization and Mapping (SLAM) techniques can estimate three-dimensional structures from two-dimensional image sequences.
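As a non-authoritative illustration of the SFM principle, the sketch below uses OpenCV to estimate the essential matrix between two views, recover the relative camera pose, and triangulate sparse 3D points. The intrinsic matrix K and the matched pixel arrays pts1/pts2 are assumed to come from a prior calibration step and a feature-matching step such as the one sketched above.

```python
# Two-view Structure-from-Motion sketch (OpenCV).
import cv2
import numpy as np

def two_view_sfm(pts1, pts2, K):
    """pts1, pts2: Nx2 float64 matched pixel coordinates; K: 3x3 intrinsics."""
    # Essential matrix with RANSAC outlier rejection.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    # Relative rotation R and translation t (up to scale) of the second view.
    _, R, t, pose_mask = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # camera 1 at the origin
    P2 = K @ np.hstack([R, t])                         # camera 2

    # Triangulate homogeneous 4D points, then dehomogenize to Nx3 structure.
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (pts4d[:3] / pts4d[3]).T
```

Chaining such two-view estimates over a whole image sequence is what accumulates the small pose errors that produce the drift discussed below.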
A variety of processes that include use of borescope video of blades in an engine to determine damage are disclosed in U.S. Pat. Nos. 8,781,210; 8,781,209; 8,792,705; 8,744,166; and 8,761,490. These approaches generally analyze two-dimensional (2D) images for differences between the current image and a model learned from other two-dimensional images in a blade row.
The above methods operate on images/video only, and if an approximate 3D model is constructed, it is subject to a “drifting” error as more images are used. This drift arises from the accumulation of small errors and eventually results in large location or orientation errors.
In one non-limiting embodiment, a method for inspection of a mechanical system comprises the steps of: obtaining a two-dimensional image sequence of the mechanical system; generating a three-dimensional structure model from the two-dimensional image sequence; and refining the three-dimensional structure model with an existing three-dimensional model of the mechanical system to produce a refined model having intensity and/or color information from the two-dimensional image sequence and the structural accuracy of the existing three-dimensional model.
In a further non-limiting embodiment, the two-dimensional image sequence is obtained with a borescope.
In a further non-limiting embodiment, the mechanical system is a turbine.
In a further non-limiting embodiment, the mechanical system is a gas turbine engine.
In a further non-limiting embodiment, the refining step comprises matching the three-dimensional structure model with the existing three-dimensional model.
In a further non-limiting embodiment, the refining step comprises regression between the three-dimensional structure model and the existing three-dimensional model.
In a further non-limiting embodiment, the existing three-dimensional model comprises at least one of an as-designed CAD model, an as-built model, and a previous condition model.
In a further non-limiting embodiment, the method further comprises, before the refining step, mapping the two-dimensional image sequence to the existing three-dimensional model to obtain an augmented existing three-dimensional model, and wherein the refining step comprises refining the three-dimensional structure model with the augmented existing three-dimensional model.
In a further non-limiting embodiment, a system for inspection of a mechanical system, comprises: a camera positionable through the mechanical system to obtain a two-dimensional image sequence of the mechanical system; a processor system in communication with the camera to receive the two-dimensional image sequence and configured to generate a three-dimensional structure model from the two-dimensional image sequence, and configured to refine the three-dimensional structure model with an existing three-dimensional model of the mechanical system to produce a refined model having intensity and/or color information from the two-dimensional image sequence and structural accuracy of the existing three-dimensional model.
In a further non-limiting embodiment, the processor system is in communication with a storage containing the two-dimensional image sequence and the existing three-dimensional model.
In a further non-limiting embodiment, the camera is a borescope.
In a further non-limiting embodiment, the two-dimensional image sequence is an image sequence of a turbine.
In a further non-limiting embodiment, the two-dimensional image sequence is an image sequence of a gas turbine engine.
In a further non-limiting embodiment, the processor system is configured to refine by matching the three-dimensional structure model with the existing three-dimensional model.
In a further non-limiting embodiment, the processor system is configured to refine by regression between the three-dimensional structure model and the existing three-dimensional model.
In a further non-limiting embodiment, the existing three-dimensional model comprises at least one of an as-designed CAD model, an as-built model, and a previous condition model.
In a further non-limiting embodiment, the processor system is configured to map the two-dimensional image sequence to the existing three-dimensional model to obtain an augmented existing three-dimensional model, and to refine the three-dimensional structure model with the augmented existing three-dimensional model.
The details of one or more embodiments of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
Like reference numbers and designations in the various drawings indicate like elements.
The invention relates to a system and method whereby improved inspection of a mechanical system is provided by combining a three-dimensional structure model, generated from a two-dimensional sequence of images, with an existing three-dimensional model to produce a refined three-dimensional model in which the issues of parallax effect and drifting are minimized.
In many instances, there will be available an existing three-dimensional model, for example an as-designed CAD model, an as-built model, and/or a previous condition model (as might result from the system and method disclosed herein). These existing models do not have the same issues of parallax effect or drifting and are therefore structurally more accurate than the SFM or SLAM generated three-dimensional structure model. By refining the generated three-dimensional structure model with the existing three-dimensional model, a refined model can be produced, for example having the intensity, color and/or detail of the two-dimensional sequence of images and having the structural accuracy of the existing three-dimensional model.
According to the disclosed method, an existing three-dimensional model (an as-designed CAD model, an as-built model, a previous condition model, etc.) of a component interior (high pressure turbine chamber, combustor chamber, etc.) can be used to assist the mosaicking of multiple images. As a result, the existing three-dimensional model is augmented with image details such as pixel intensity/color on its vertices and surfaces. Potential damage, with accurate metrology, can then be viewed and detected using the augmented three-dimensional model. A “fly-through” of the 3D model is also possible to get a global view of the entire structure interior. Inspection of a local area and detection of potential damage can then be performed with a clear understanding of location with respect to the global 3D model of the entire structure interior, accurate metrology, and, optionally, with a clear understanding of damage progression.
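For illustration, a minimal sketch (under assumed names and a known camera registration R, t, K for one frame) of how an existing model's vertices could be "painted" with pixel values from a registered image; a complete implementation would also handle occlusion and blend multiple frames.

```python
# Sketch: augment existing-model vertices with image color from one frame.
import numpy as np

def paint_vertices(vertices, image, K, R, t):
    """vertices: Nx3 model points; image: HxWx3 array; returns Nx3 colors."""
    cam = R @ vertices.T + t.reshape(3, 1)   # model frame -> camera frame
    uv = K @ cam                             # perspective projection
    uv = (uv[:2] / uv[2]).T                  # Nx2 pixel coordinates
    h, w = image.shape[:2]
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
    visible = cam[2] > 0                     # crude test: in front of camera
    colors = np.zeros((len(vertices), 3))
    colors[visible] = image[v[visible], u[visible]]
    return colors
```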
In
Next, the accuracy of the three-dimensional structure model 108 is improved using an existing three-dimensional model, such as a three-dimensional CAD model 112, through a model refinement step 114. Refinement step 114 can be configured to run on processing system 110 and generates a refined model 115 having, for example, intensity and/or color information from the two-dimensional image sequence and structural accuracy from the existing three-dimensional model.
The refining step can be an interpolation or averaging of features from the three-dimensional structure model 108 and the existing three-dimensional model 112.
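A minimal sketch of the averaging variant, assuming vertex correspondence between the two models has already been established (the function name and weighting are illustrative only):

```python
import numpy as np

def blend_models(sfm_vertices, cad_vertices, alpha=0.9):
    """Weighted average of corresponding Nx3 vertex arrays; alpha weights the
    structurally more accurate existing model (alpha=1 keeps it exactly)."""
    return alpha * cad_vertices + (1.0 - alpha) * sfm_vertices
```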
Another example of model refinement carried out in the refining step is regression between the two models to estimate a transformation from the first model to the second using vertex position information alone. The refined model then obtains the corresponding intensity and/or color information from the SFM-/SLAM-generated 3D structure model. The regression may be linear or non-linear (e.g., optical flow).
In one embodiment, the transformation can be formulated as a 4×4 matrix containing 15 independent parameters when the vertices of the two models are represented in homogeneous coordinates. These parameters define the relative translation, rotation, stretching, squeezing, and shearing between the two models. A minimum of five pairs of corresponding 3D vertices are identified between the two models; each pair contributes three equations, so five pairs suffice to determine the 15 parameters. The identification can be manual, semi-automatic, or fully automatic. The coordinates of these vertex pairs are used first to solve a linear regression problem for an initial estimate of the 15 parameters. This initial estimate then serves as the starting point for a non-linear regression solved with an algorithm such as Gauss-Newton or Levenberg-Marquardt. The refined values of the 15 parameters can then be used to transform one model to match the other closely.
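The following sketch illustrates one possible realization of this two-stage regression in Python with NumPy and SciPy (function names are illustrative; SciPy's least_squares with method="lm" provides Levenberg-Marquardt). The last entry T[3,3] is fixed at 1, leaving the 15 free parameters.

```python
import numpy as np
from scipy.optimize import least_squares

def apply_T(T, X):
    """Apply a 4x4 homogeneous transform T to Nx3 points X."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    Yh = Xh @ T.T
    return Yh[:, :3] / Yh[:, 3:4]

def linear_init(X, Y):
    """DLT-style linear solve for the 15 parameters (needs >= 5 pairs).
    For each pair, row k of T satisfies t_k.xh = y_k * (t_3.xh), t_3 = [p,q,r,1]."""
    A, b = [], []
    for x, y in zip(X, Y):
        xh = np.append(x, 1.0)
        for k in range(3):                  # one equation per output coordinate
            row = np.zeros(15)
            row[4 * k: 4 * k + 4] = xh      # entries of T rows 0..2
            row[12:15] = -y[k] * x          # first 3 entries of T row 3
            A.append(row)
            b.append(y[k])                  # constant from fixed T[3,3] = 1
    theta, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return theta

def refine_transform(X, Y):
    """X, Y: Nx3 corresponding vertices (N >= 5, so 3N >= 15 residuals)."""
    theta0 = linear_init(X, Y)
    def residuals(theta):
        T = np.vstack([theta[:12].reshape(3, 4), np.append(theta[12:], 1.0)])
        return (apply_T(T, X) - Y).ravel()
    theta = least_squares(residuals, theta0, method="lm").x  # Levenberg-Marquardt
    return np.vstack([theta[:12].reshape(3, 4), np.append(theta[12:], 1.0)])
```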
When the two three-dimensional models are combined and refined, a refined three-dimensional model results, shown at 122, and this model combines the color and intensity of the image sequence obtained with the borescope with the structural accuracy of the existing three-dimensional model.
It should be noted that the image, sequence of images, or video can be of various different types. One non-limiting example of an alternative image type is a thermal intensity (or, equivalently, temperature) image obtained from a hot high-pressure turbine blade, as shown in
The temperature or thermal image 124 obtained in this manner can be transformed (mapped) to the surface of the existing three-dimensional model 128 as shown in
The mapping of the two-dimensional temperature image to the three-dimensional model may be performed, in another embodiment, using backward projection. First, the three-dimensional vertices are triangulated to form three-dimensional planar triangular surfaces. Then the camera center coordinates are calculated from the transformation matrix. Next, every image pixel is back-projected by finding the intersection between the line connecting the pixel and the camera center and a three-dimensional planar triangular surface patch. In this way, not only do the three-dimensional model vertices obtain temperature values, but so do the triangular surface patches, increasing the resolution of the three-dimensional model in terms of temperature mapping.
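An illustrative sketch of this backward projection, using the standard Möller-Trumbore ray-triangle intersection test (the names are assumptions, and the brute-force loop over triangles would in practice be replaced by an acceleration structure such as a BVH):

```python
import numpy as np

def ray_triangle(orig, d, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore: distance along unit ray d from orig, or None."""
    e1, e2 = v1 - v0, v2 - v0
    pvec = np.cross(d, e2)
    det = e1 @ pvec
    if abs(det) < eps:
        return None                       # ray parallel to triangle plane
    inv_det = 1.0 / det
    tvec = orig - v0
    u = (tvec @ pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = np.cross(tvec, e1)
    v = (d @ qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = (e2 @ qvec) * inv_det
    return t if t > eps else None

def backproject_pixel(u, v, K, R, t, triangles):
    """Map pixel (u, v) to its nearest hit on the triangulated surface.
    Camera model: x_cam = R @ X + t; triangles: (v0, v1, v2) vertex triples."""
    center = -R.T @ t                     # camera center in model coordinates
    d = R.T @ np.linalg.inv(K) @ np.array([u, v, 1.0])
    d /= np.linalg.norm(d)                # ray direction in model coordinates
    hits = [h for tri in triangles
            if (h := ray_triangle(center, d, *tri)) is not None]
    return center + min(hits) * d if hits else None
```

The temperature of the pixel is then assigned to the returned surface point, so patches between vertices also receive values.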
The mapping of the two-dimensional temperature image to the three-dimensional model may be performed, in yet another embodiment, by combining the above two methods.
In yet another non-limiting embodiment, when the exact location and pose of the image/video sensor (e.g., the camera or borescope) are known, the images or video frames may be mapped directly to the existing three-dimensional model with local interpolation.
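A brief sketch of the local interpolation referred to here, using bilinear interpolation of the image value at the subpixel location where a model vertex projects (the projected coordinates could come from a projection step like the painting sketch above; the caller is assumed to keep them inside the image interior):

```python
import numpy as np

def bilinear_sample(image, u, v):
    """Bilinearly interpolated image value at subpixel (u, v);
    assumes 0 <= u < width - 1 and 0 <= v < height - 1."""
    u0, v0 = int(np.floor(u)), int(np.floor(v))
    du, dv = u - u0, v - v0
    top = (1.0 - du) * image[v0, u0] + du * image[v0, u0 + 1]
    bot = (1.0 - du) * image[v0 + 1, u0] + du * image[v0 + 1, u0 + 1]
    return (1.0 - dv) * top + dv * bot
```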
Once a refined three-dimensional model is obtained, either through the process of
One or more embodiments of the present invention have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the invention. For example, different sources of images and/or types of images can be utilized. Accordingly, other embodiments are within the scope of the following claims.
Number | Name | Date | Kind |
---|---|---|---|
3804397 | Neumann | Apr 1974 | A |
4402053 | Kelley et al. | Aug 1983 | A |
4403294 | Hamada et al. | Sep 1983 | A |
4873651 | Raviv | Oct 1989 | A |
5064291 | Reiser | Nov 1991 | A |
5119678 | Bashyam et al. | Jun 1992 | A |
5345514 | Mahdavieh et al. | Sep 1994 | A |
5345515 | Nishi et al. | Sep 1994 | A |
5351078 | Lemelson | Sep 1994 | A |
5963328 | Yoshida et al. | Oct 1999 | A |
6023637 | Liu et al. | Feb 2000 | A |
6153889 | Jones | Nov 2000 | A |
6177682 | Bartulovic et al. | Jan 2001 | B1 |
6271520 | Tao et al. | Aug 2001 | B1 |
6399948 | Thomas | Jun 2002 | B1 |
6434267 | Smith | Aug 2002 | B1 |
6462813 | Haven et al. | Oct 2002 | B1 |
6690016 | Watkins et al. | Feb 2004 | B1 |
6737648 | Fedder et al. | May 2004 | B2 |
6759659 | Thomas et al. | Jul 2004 | B2 |
6804622 | Bunker et al. | Oct 2004 | B2 |
6907358 | Suh et al. | Jun 2005 | B2 |
6965120 | Beyerer et al. | Nov 2005 | B1 |
7026811 | Roney, Jr. et al. | Apr 2006 | B2 |
7064330 | Raulerson et al. | Jun 2006 | B2 |
7119338 | Thompson et al. | Oct 2006 | B2 |
7122801 | Favro et al. | Oct 2006 | B2 |
7129492 | Saito et al. | Oct 2006 | B2 |
7164146 | Weir et al. | Jan 2007 | B2 |
7190162 | Tenley et al. | Mar 2007 | B2 |
7220966 | Saito et al. | May 2007 | B2 |
7233867 | Pisupati et al. | Jun 2007 | B2 |
7240556 | Georgeson et al. | Jul 2007 | B2 |
7272529 | Hogan et al. | Sep 2007 | B2 |
7313961 | Tenley et al. | Jan 2008 | B2 |
7415882 | Fetzer et al. | Aug 2008 | B2 |
7446886 | Aufmuth et al. | Nov 2008 | B2 |
7489811 | Brummel et al. | Feb 2009 | B2 |
7602963 | Nightingale et al. | Oct 2009 | B2 |
7689030 | Suh et al. | Mar 2010 | B2 |
7724925 | Shepard | May 2010 | B2 |
7738725 | Raskar et al. | Jun 2010 | B2 |
7823451 | Sarr | Nov 2010 | B2 |
7966883 | Lorraine et al. | Jun 2011 | B2 |
8050491 | Vaidyanathan | Nov 2011 | B2 |
8204294 | Alloo et al. | Jun 2012 | B2 |
8208711 | Venkatachalam et al. | Jun 2012 | B2 |
8221825 | Reitz et al. | Jul 2012 | B2 |
8239424 | Haigh et al. | Aug 2012 | B2 |
8431917 | Wang et al. | Apr 2013 | B2 |
8449176 | Shepard | May 2013 | B2 |
8520931 | Tateno | Aug 2013 | B2 |
8528317 | Gerez et al. | Sep 2013 | B2 |
8692887 | Ringermacher et al. | Apr 2014 | B2 |
8744166 | Scheid et al. | Jun 2014 | B2 |
8761490 | Scheid et al. | Jun 2014 | B2 |
8781209 | Scheid et al. | Jul 2014 | B2 |
8781210 | Scheid et al. | Jul 2014 | B2 |
8792705 | Scheid et al. | Jul 2014 | B2 |
8913825 | Taguchi et al. | Dec 2014 | B2 |
8983794 | Motzer et al. | Mar 2015 | B1 |
9037381 | Care | May 2015 | B2 |
9046497 | Kush et al. | Jun 2015 | B2 |
9066028 | Koshti | Jun 2015 | B1 |
9080453 | Shepard et al. | Jul 2015 | B2 |
9116071 | Hatcher, Jr. et al. | Aug 2015 | B2 |
9134280 | Cataldo et al. | Sep 2015 | B2 |
9146205 | Renshaw et al. | Sep 2015 | B2 |
9151698 | Jahnke et al. | Oct 2015 | B2 |
9154743 | Hatcher, Jr. et al. | Oct 2015 | B2 |
9240049 | Ciurea et al. | Jan 2016 | B2 |
9251582 | Lim et al. | Feb 2016 | B2 |
9300865 | Wang et al. | Mar 2016 | B2 |
9305345 | Lim et al. | Apr 2016 | B2 |
9458735 | Diwinsky et al. | Oct 2016 | B1 |
9465385 | Kamioka et al. | Oct 2016 | B2 |
9467628 | Geng et al. | Oct 2016 | B2 |
9471057 | Scheid et al. | Oct 2016 | B2 |
9476798 | Pandey et al. | Oct 2016 | B2 |
9476842 | Drescher et al. | Oct 2016 | B2 |
9483820 | Lim et al. | Nov 2016 | B2 |
9488592 | Maresca et al. | Nov 2016 | B1 |
9519844 | Thompson et al. | Dec 2016 | B1 |
9594059 | Brady et al. | Mar 2017 | B1 |
9734568 | Vajaria et al. | May 2017 | B2 |
9785919 | Diwinsky et al. | Oct 2017 | B2 |
9804997 | Sharp et al. | Oct 2017 | B2 |
9808933 | Lin et al. | Nov 2017 | B2 |
9981382 | Strauss et al. | May 2018 | B1 |
10438036 | Reome et al. | Oct 2019 | B1 |
20020121602 | Thomas et al. | Sep 2002 | A1 |
20020167660 | Zaslavsky | Nov 2002 | A1 |
20030117395 | Yoon | Jun 2003 | A1 |
20030205671 | Thomas et al. | Nov 2003 | A1 |
20040089811 | Lewis et al. | May 2004 | A1 |
20040089812 | Favro et al. | May 2004 | A1 |
20040139805 | Antonelli et al. | Jul 2004 | A1 |
20040201672 | Varadarajan et al. | Oct 2004 | A1 |
20040240600 | Freyer et al. | Dec 2004 | A1 |
20040245469 | Favro et al. | Dec 2004 | A1 |
20040247170 | Furze et al. | Dec 2004 | A1 |
20050008215 | Shepard | Jan 2005 | A1 |
20050113060 | Lowery | May 2005 | A1 |
20050151083 | Favro et al. | Jul 2005 | A1 |
20050167596 | Rothenfusser et al. | Aug 2005 | A1 |
20050276907 | Harris et al. | Dec 2005 | A1 |
20060012790 | Furze et al. | Jan 2006 | A1 |
20060078193 | Brummel et al. | Apr 2006 | A1 |
20060086912 | Weir et al. | Apr 2006 | A1 |
20070007733 | Hogarth et al. | Jan 2007 | A1 |
20070017297 | Georgeson et al. | Jan 2007 | A1 |
20070045544 | Favro et al. | Mar 2007 | A1 |
20080022775 | Sathish et al. | Jan 2008 | A1 |
20080053234 | Staroselsky et al. | Mar 2008 | A1 |
20080075484 | Komiya | Mar 2008 | A1 |
20080111074 | Weir et al. | May 2008 | A1 |
20080183402 | Malkin et al. | Jul 2008 | A1 |
20080229834 | Bossi et al. | Sep 2008 | A1 |
20080247635 | Davis et al. | Oct 2008 | A1 |
20080247636 | Davis et al. | Oct 2008 | A1 |
20090000382 | Sathish et al. | Jan 2009 | A1 |
20090010507 | Geng | Jan 2009 | A1 |
20090066939 | Venkatachalam et al. | Mar 2009 | A1 |
20090128643 | Kondo et al. | May 2009 | A1 |
20090252987 | Greene, Jr. | Oct 2009 | A1 |
20090279772 | Sun et al. | Nov 2009 | A1 |
20090312956 | Zombo et al. | Dec 2009 | A1 |
20100124369 | Wu et al. | May 2010 | A1 |
20100212430 | Murai et al. | Aug 2010 | A1 |
20100220910 | Kaucic et al. | Sep 2010 | A1 |
20110062339 | Ruhge et al. | Mar 2011 | A1 |
20110083705 | Stone et al. | Apr 2011 | A1 |
20110119020 | Key | May 2011 | A1 |
20110123093 | Alloo et al. | May 2011 | A1 |
20110299752 | Sun | Dec 2011 | A1 |
20110302694 | Wang et al. | Dec 2011 | A1 |
20120154599 | Huang | Jun 2012 | A1 |
20120188380 | Drescher et al. | Jul 2012 | A1 |
20120249959 | You et al. | Oct 2012 | A1 |
20120275667 | Lu | Nov 2012 | A1 |
20120293647 | Singh | Nov 2012 | A1 |
20130028478 | St-Pierre et al. | Jan 2013 | A1 |
20130041614 | Shepard et al. | Feb 2013 | A1 |
20130070897 | Jacotin | Mar 2013 | A1 |
20130113914 | Scheid et al. | May 2013 | A1 |
20130113916 | Scheid et al. | May 2013 | A1 |
20130163849 | Jahnke et al. | Jun 2013 | A1 |
20130235897 | Bouteyre et al. | Sep 2013 | A1 |
20130250067 | Laxhuber et al. | Sep 2013 | A1 |
20140022357 | Yu et al. | Jan 2014 | A1 |
20140056507 | Doyle et al. | Feb 2014 | A1 |
20140098836 | Bird | Apr 2014 | A1 |
20140184786 | Georgeson et al. | Jul 2014 | A1 |
20140185912 | Lim et al. | Jul 2014 | A1 |
20140198185 | Haugen et al. | Jul 2014 | A1 |
20140200832 | Troy et al. | Jul 2014 | A1 |
20140350338 | Tanaka et al. | Nov 2014 | A1 |
20150041654 | Barychev et al. | Feb 2015 | A1 |
20150046098 | Jack et al. | Feb 2015 | A1 |
20150086083 | Chaudhry et al. | Mar 2015 | A1 |
20150128709 | Stewart et al. | May 2015 | A1 |
20150138342 | Brdar et al. | May 2015 | A1 |
20150185128 | Chang et al. | Jul 2015 | A1 |
20150233714 | Kim | Aug 2015 | A1 |
20150253266 | Lucon et al. | Sep 2015 | A1 |
20150314901 | Murray et al. | Nov 2015 | A1 |
20160012588 | Taguchi et al. | Jan 2016 | A1 |
20160043008 | Murray et al. | Feb 2016 | A1 |
20160109283 | Broussais-Colella et al. | Apr 2016 | A1 |
20160178532 | Lim et al. | Jun 2016 | A1 |
20160241793 | Ravirala et al. | Aug 2016 | A1 |
20160284098 | Okumura et al. | Sep 2016 | A1 |
20160314571 | Finn et al. | Oct 2016 | A1 |
20160328835 | Maresca, Jr. et al. | Nov 2016 | A1 |
20160334284 | Kaplun Mucharrafille et al. | Nov 2016 | A1 |
20170011503 | Newman | Jan 2017 | A1 |
20170023505 | Maione et al. | Jan 2017 | A1 |
20170052152 | Tat et al. | Feb 2017 | A1 |
20170085760 | Ernst et al. | Mar 2017 | A1 |
20170090458 | Lim et al. | Mar 2017 | A1 |
20170122123 | Kell et al. | May 2017 | A1 |
20170142302 | Shaw et al. | May 2017 | A1 |
20170167289 | Diwinsky et al. | Jun 2017 | A1 |
20170184469 | Chang et al. | Jun 2017 | A1 |
20170184549 | Reed et al. | Jun 2017 | A1 |
20170184650 | Chang et al. | Jun 2017 | A1 |
20170211408 | Ahmadian et al. | Jul 2017 | A1 |
20170219815 | Letter et al. | Aug 2017 | A1 |
20170221274 | Chen | Aug 2017 | A1 |
20170234837 | Hall et al. | Aug 2017 | A1 |
20170241286 | Roberts et al. | Aug 2017 | A1 |
20170258391 | Finn et al. | Sep 2017 | A1 |
20170262965 | Xiong et al. | Sep 2017 | A1 |
20170262977 | Finn et al. | Sep 2017 | A1 |
20170262979 | Xiong et al. | Sep 2017 | A1 |
20170262985 | Finn et al. | Sep 2017 | A1 |
20170262986 | Xiong et al. | Sep 2017 | A1 |
20170270651 | Bailey et al. | Sep 2017 | A1 |
20170297095 | Zalameda et al. | Oct 2017 | A1 |
20170284971 | Hall | Nov 2017 | A1 |
20180002039 | Finn et al. | Jan 2018 | A1 |
20180005362 | Wang et al. | Jan 2018 | A1 |
20180013959 | Slavens et al. | Jan 2018 | A1 |
20180019097 | Harada et al. | Jan 2018 | A1 |
20180098000 | Park et al. | Apr 2018 | A1 |
20180111239 | Zak et al. | Apr 2018 | A1 |
20190299542 | Webb | Oct 2019 | A1 |
20190338666 | Finn et al. | Nov 2019 | A1 |
20190339131 | Finn et al. | Nov 2019 | A1 |
20190339165 | Finn et al. | Nov 2019 | A1 |
20190339206 | Xiong et al. | Nov 2019 | A1 |
20190339207 | Finn et al. | Nov 2019 | A1 |
20190339234 | Finn et al. | Nov 2019 | A1 |
20190339235 | Finn et al. | Nov 2019 | A1 |
20190340721 | Finn et al. | Nov 2019 | A1 |
20190340742 | Finn et al. | Nov 2019 | A1 |
20190342499 | Xiong et al. | Nov 2019 | A1 |
Number | Date | Country |
---|---|---|
2820732 | Dec 2014 | CA |
19710743 | Sep 1998 | DE |
1961919 | Aug 2008 | EP |
2545271 | Jun 2017 | GB |
06235700 | Aug 1994 | JP |
2015161247 | Sep 2015 | JP |
191452 | Jul 2013 | SG |
2013088709 | Jun 2013 | WO |
2016112018 | Jul 2016 | WO |
2016123508 | Aug 2016 | WO |
2016176524 | Nov 2016 | WO |
Entry |
---|
Gao et al., ‘A Statistical Method for Crack Detection from Vibrothermography Inspection Data’,(2010) Statistics Preprints. Paper 68. http://lib.dr.iastate.edu/stat_las_preprints/68. |
Li, Ming; Holland, Stephen D.; and Meeker, William Q., “Statistical Methods for Automatic Crack Detection Based on Vibrothermography Sequence-of-Images Data” (2010). Statistics Preprints. 69. |
Henneke et al. ‘Detection of Damage in Composite Materials by Vibrothermography’, ASTM special technical publication (696), American Society for Testing and Materials, 1979, pp. 83-95. |
http://www.npl.co.uk/commercial-services/sector-case-studies/thermal-imaging-reveals-the-invisible; Apr. 17, 2012. |
Tian et al., ‘A Statistical Framework for Improved Automatic Flaw Detection in Nondestructive Evaluation Images’, Technometrics, 59, 247-261. Feb. 1, 2017. |
Emmanuel J. Candès, Xiaodong Li, Yi Ma, and John Wright, “Robust Principal Component Analysis”, Dec. 17, 2009. http://www-stat.stanford.edu/˜candes/papers/RobustPCA.pdf. |
Sebastien Parent; “From Human to Machine: How to Be Prepared for Integration of Automated Visual Inspection”Quality Magazine, https://www.qualitymag.com/articles/91976. Jul. 2, 2014. |
http://www.yxlon.com/products/x-ray-and-ct-inspection-systems/yxlon-mu56-tb, 2016. |
U.S. Office action dated Jul. 23, 2018 issued in corresponding U.S. Appl. No. 15/971,254. |
Blachnio et al., “Assessment of Technical Condition Demonstrated by Gas Turbine Blades by Processing of Images of Their Surfaces”, Journal of KONBiN, 1(21), 2012, pp. 41-50. |
Raskar et al., ‘A Non-photorealistic Camera: Depth Edge Detection and Stylized Rendering using Multi-flash Imaging’ ACM Transactions on Graphics, 2004 http://www.merl.com/publications/docs/TR2006-107.pdf. |
Feris et al., ‘Specular Reflection Reduction with Multi-Flash Imaging’, 17th Brazilian Symposium on Computer Graphics and Image Processing, 2004. http://rogerioferis.com/publications/FerisSIB04.pdf. |
Holland, “First Measurements from a New Broadband Vibrothermography Measurement System”, AIP Conference Proceedings, 894 (2007), pp. 478-483. http://link.aip.org/link/doi/10.1063/1.2718010. |
Gao et al., ‘Detecting Cracks in Aircraft Engine Fan Blades Using Vibrothermography Nondestructive Evaluation’, RESS Special Issue on Accelerated Testing, 2014, http://dx.doi.org/10.1016/j.ress.2014.05.009. |
Holland, ‘Thermographic Signal Reconstruction for Vibrothermography’, Infrared Physics & Technology 54 (2011) 503-511. |
M. Sznaier, O. Camps, N. Ozay, T. Ding, G. Tadmor and D. Brooks, “The Role of Dynamics in Extracting Information Sparsely Encoded in High Dimensional Data Streams”, in Dynamics of Information Systems, Hirsch, M.J.; Pardalos, P.M.; Murphey, R. (Eds.), pp. 1-28, Springer Verlag, 2010. |
M. Fazel, H. Hindi, and S. Boyd, “A Rank Minimization Heuristic with Application to Minimum Order System Approximation”, American Control Conference, Arlington, Virginia, pp. 4734-4739, Jun. 2001. |
Meola et al., ‘An Excursus on Infrared Thermography Imaging’, J. Imaging 2016, 2, 36 http://www.mdpi.com/2313-433X/2/4/36/pdf. |
Yu et al., ‘ASIFT: An Algorithm for Fully Affine Invariant Comparison’, Image Processing on Line on Feb. 24, 2011. http://www.ipol.im/pub/art/2011/my-asift/article.pdf. |
Schemmel et al., ‘Measurement of Direct Strain Optic Coefficient of YSZ Thermal Barrier Coatings at GHz Frequencies’, Optics Express, v. 25, n. 17, Aug. 21, 2017, https://doi.org/10.1364/OE.25.019968. |
Jean-Yves Bouguet, “Camera Calibration Toolbox for Matlab”, http://www.vision.caltech.edu/bouguetj/calib_doc/, accessed on Nov. 10, 2017. |
Yu et al. ‘Shadow Graphs and 3D Texture Reconstruction’, IJCV, vol. 62, No. 1-2, 2005, pp. 35-60. |
U.S. Final Office Action dated Jan. 3, 2019 for corresponding U.S. Appl. No. 15/971,254. |
U.S. Non-Final Office Action dated Mar. 5, 2019 for corresponding U.S. Appl. No. 15/971,227. |
U.S. Non-Final Office Action dated May 28, 2019 for corresponding U.S. Appl. No. 15/971,214. |
U.S. Non-Final Office Action dated Nov. 29, 2019 for corresponding U.S. Appl. No. 15/971,242. |
U.S. Non-Final Office Action dated Nov. 26, 2019 for corresponding U.S. Appl. No. 15/971,194. |
U.S. Non-Final Office Action dated Feb. 25, 2020 for corresponding U.S. Appl. No. 15/971,214. |
U.S. Non-Final Office Action dated Apr. 30, 2020 issued for corresponding U.S. Appl. No. 15/970,944. |
U.S. Final Office Action dated Aug. 27, 2020 issued for corresponding U.S. Appl. No. 15/970,944. |
U.S. Non-Final Office Action dated May 21, 2020 issued for corresponding U.S. Appl. No. 15/971,236. |
U.S. Non-Final Office Action dated Aug. 28, 2020 issued for corresponding U.S. Appl. No. 15/971,194. |
U.S. Final Office Action dated Jun. 23, 2020 issued for corresponding U.S. Appl. No. 15/971,205. |
U.S. Final Office Action dated Jul. 28, 2020 issued for corresponding U.S. Appl. No. 15/971,214. |
U.S. Final Office Action dated Mar. 12, 2020 for corresponding U.S. Appl. No. 15/971,194. |
U.S. Notice of Allowance dated Oct. 19, 2020 issued for corresponding U.S. Appl. No. 15/971,270. |
U.S. Office Action dated Dec. 8, 2020 issued for corresponding U.S. Appl. No. 15/971,205. |
Number | Date | Country | |
---|---|---|---|
20190340805 A1 | Nov 2019 | US |