The present disclosure relates generally to topographical imaging and, more specifically, to topographical imaging using optical coherence tomography (OCT) and an image sensor.
Scheimpflug corneal topography is an imaging technique typically used in a medical setting to image corneal shape. A camera is positioned to observe a profile of the cornea from one perspective and is rotated about the cornea until it has observed the profile from approximately the full 360° view. Due to this mechanical movement of the camera, Scheimpflug corneal topography can suffer from drawbacks including, for instance, distortion as a result of eye movement.
Optical coherence tomography (OCT) is another imaging technique often used in a medical setting. OCT is capable of producing three-dimensional images from within optical scattering samples, such as biological tissue: light scattered by a sample is detected in order to form an image of the sample, including parts of the sample below its surface. Examples of biological tissue that may be imaged using OCT include coronary arteries, skin, and the eye. OCT may also be used outside medicine, for example in art conservation to analyze the layers of a painting.
The present technology provides improved topographical imaging.
In one implementation of the present technology, an optical system is disclosed. The optical system includes an optical coherence tomography (OCT) system arranged to project a beam scan towards a target and configured to generate data corresponding to the beam scan. The optical system also includes an image sensor arranged to capture an image of a trace of the beam scan on a surface of the target. The optical system also includes a computing system including a processing circuit including a processor and memory. The memory is structured to store instructions that, when executed by the processor, cause the processor to process the image captured by the image sensor to determine one or more characteristics of the target based, at least in part, on the trace of the beam scan.
In another implementation of the present disclosure, an optical system is disclosed. The optical system includes an optical coherence tomography (OCT) system arranged to project a beam scan towards a target and configured to generate data corresponding to the beam scan. The optical system also includes an image sensor arranged to capture an image of a trace of the beam scan on a surface of the target. The optical system also includes a computing system including a processing circuit including a processor and memory. The memory is structured to store instructions that, when executed by the processor, cause the processor to process beam scan data received from the OCT system to identify a coordinate map for the beam scan data including a first set of coordinates for the surface of the target. The memory is also structured to store instructions that, when executed by the processor, cause the processor to process the image captured by the image sensor to identify a second set of coordinates for the surface of the target. The memory is also structured to store instructions that, when executed by the processor, cause the processor to modify the coordinate map for the beam scan data based on the second set of coordinates.
In another implementation of the present technology, a method of topographical imaging is disclosed. The method includes receiving beam scan data from an optical coherence tomography (OCT) system arranged to project a beam scan towards a target and configured to generate the beam scan data. The method also includes processing the beam scan data to identify a coordinate map of the target including a first set of coordinates for a surface of the target. The method also includes receiving an image of a trace of the beam scan on the surface of the target from an image sensor. The method also includes processing the image to identify a second set of coordinates for the surface of the target. The method also includes modifying the coordinate map according to the second set of coordinates.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the following drawings and the detailed description.
The foregoing and other features of the present disclosure will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. Understanding that these drawings depict only several embodiments in accordance with the disclosure and are, therefore, not to be considered limiting of its scope, the disclosure will be described with additional specificity and detail through use of the accompanying drawings.
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, and designed in a wide variety of different configurations, all of which are explicitly contemplated and make part of this disclosure.
Described herein are systems and methods for topographical imaging. The aspects described herein leverage the benefits of OCT systems and the accuracy of video-Placido-keratometry.
As will be discussed in further detail below, an OCT system is arranged to capture a three-dimensional (3D) OCT scan of a target. The OCT system generally projects a beam scan towards the target, and the OCT system generates data corresponding to the beam scan. Additionally, an image sensor is arranged to capture an image of a trace of the beam scan on a surface of the target. Using the image of the trace, various characteristics of the target can be determined.
Referring now to FIG. 1, an example OCT system 100 is shown. The OCT system 100 includes a light source 102, a beam splitter 104, a reference mirror 106, and a detector 108, and is arranged to image a target 110.
In one or more implementations, the OCT system 100 may be a time-domain OCT system. In these implementations, the light source 102 may be a low coherence light source. Light from the light source 102 may be directed to the target 110 through an interferometer. The beam splitter 104 splits the light from the light source 102 into a sample arm and a reference arm. The sample arm is projected onto the target 110, and the reference arm is projected onto the reference mirror 106. The sample arm penetrates the target 110, and back-scattered light from the sample arm is detected by the interferometer. Additionally, the reference arm (which is reflected off the reference mirror 106) is detected by the interferometer. The combination of the reflected light from the reference mirror 106 and the reflected light from the target 110 is used to form an interference pattern (e.g., a difference in optical distance traveled by reflected photons from both the reference arm and the sample arm). Based on the interference, an amplitude and position of reflection (e.g., a depth) from inside the target 110 can be determined. Additionally, the reference mirror 106 may be moved, thereby changing the optical path length of the reference arm. The amplitude and position of reflection may be recorded along the depth direction, which forms an OCT signal (e.g., an A-scan). In this regard, the OCT signal is a two-dimensional (2D) scan of the target 110 including depths to various surfaces. The 2D scan of the target 110 may have coordinates of various surfaces of the target 110 associated therewith. To form a 3D scan, the light projected onto the target 110 is geometrically moved about the target 110. The 2D scans following movement collectively form a 3D OCT scan (e.g., a B-scan). Accordingly, the 3D scan may have a plurality of coordinates associated with the various surfaces and layers of the target 110, each of which corresponds to an individual 2D scan.
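As a minimal illustration of the time-domain principle described above, the following sketch simulates a scanning reference mirror and recovers a reflector's depth from the position where interference fringes appear. It is not the disclosed system; the wavelength, coherence length, and depth values are illustrative assumptions.

```python
# Minimal time-domain OCT sketch: fringes appear only where the reference path
# matches the reflector's optical path within the coherence length.
# All numeric values are illustrative assumptions.
import numpy as np

wavelength = 0.8e-6      # source center wavelength, m (near infrared)
coherence_len = 10e-6    # low-coherence source -> short coherence length, m
true_depth = 150e-6      # single reflector inside the target, m

# Scan the reference mirror and record the detector intensity at each position.
mirror_positions = np.linspace(0.0, 300e-6, 6001)
path_mismatch = mirror_positions - true_depth
envelope = np.exp(-(path_mismatch / coherence_len) ** 2)   # coherence gate
fringes = np.cos(4 * np.pi * path_mismatch / wavelength)   # double-pass phase
intensity = 1.0 + envelope * fringes

# The fringe envelope peaks at the reflector depth (the position of reflection).
estimated_depth = mirror_positions[np.argmax(np.abs(intensity - 1.0))]
print(f"estimated depth: {estimated_depth * 1e6:.1f} um")  # ~150.0 um
```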
In one or more implementations, the OCT system 100 may be a swept-source OCT system. In these implementations, the light source 102 may be a high coherence light source. Light from the light source 102 may be directed to the target 110 through an interferometer. The beam splitter 104 splits the light from the light source 102 into a sample arm and a reference arm. The sample arm is projected onto the target 110, and the reference arm is projected onto the reference mirror 106. The sample arm penetrates the target 110, and back-scattered light from the sample arm is detected by the interferometer. Additionally, the reference arm (which is reflected off the reference mirror 106) is detected by the interferometer. The combination of the reflected light from the reference mirror 106 and the reflected light from the target 110 is used to form an interference pattern (e.g., a difference in optical distance traveled by reflected photons from both the reference arm and the sample arm). Based on the interference, an amplitude and position of reflection (e.g., a depth) from inside the target 110 can be determined. In the swept-source OCT system, rather than the reference mirror 106 being moved, the light from the light source 102 is tuned over a wavelength range. The detector 108 receives a temporal signal having different frequency components. A Fourier transform is applied to the temporal signal to form the OCT signal (e.g., the 2D scan), with the frequency component corresponding to the depth position of various surfaces and the amplitude corresponding to reflectivity. Similar to the time-domain OCT system, the 2D scan of the target 110 may have coordinates for the various surfaces associated therewith. Additionally, to form a 3D scan, the light projected onto the target 110 is geometrically moved about the target 110. The 2D scans following movement collectively form a 3D OCT scan (e.g., a B-scan). Accordingly, the 3D scan may have a plurality of coordinates associated with the various surfaces and layers of the target 110, each of which corresponds to an individual 2D scan.
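Similarly, the sketch below illustrates the Fourier relationship that underlies the swept-source case: the oscillation frequency of the interferogram across the sweep encodes depth, and the peak amplitude encodes reflectivity. The sweep range and reflector values are assumptions chosen only for illustration.

```python
# Swept-source OCT sketch: a Fourier transform of the wavenumber-domain
# interferogram yields depth (frequency) and reflectivity (amplitude).
import numpy as np

n_samples = 2048
wavenumbers = np.linspace(7.5e6, 8.5e6, n_samples)  # sweep in rad/m (~740-840 nm)
true_depth = 1.2e-3                                 # reflector at 1.2 mm
reflectivity = 0.3

# Interference term for a single reflector: cos(2*k*z) on a DC background.
interferogram = 1.0 + reflectivity * np.cos(2 * wavenumbers * true_depth)

# FFT over wavenumber: the peak position encodes depth.
a_scan = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
dk = wavenumbers[1] - wavenumbers[0]
depth_axis = np.fft.rfftfreq(n_samples, d=dk) * np.pi  # z = pi * (cycles per rad/m)

estimated_depth = depth_axis[np.argmax(a_scan)]
print(f"estimated depth: {estimated_depth * 1e3:.2f} mm")  # ~1.20 mm
```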
Referring now to FIG. 2, a first B-scan 200 and a second B-scan 202 of the target 110 are shown.
In some embodiments, the first B-scan 200 and the second B-scan 202 may be merged to form B-scan 204. In one implementation, global coordinates from the first B-scan 200 may be used to adjust global coordinates and distortions of the second B-scan 202. In so doing, the benefits of both the first and second B-scans 200, 202 may be leveraged to form B-scan 204.
In some embodiments, the first B-scan 200 and/or second B-scan 202 may be in a pattern and/or have a particular size. For instance, the first B-scan 200 and/or second B-scan 202 may have a pattern similar to a pattern from a Placido cone (e.g., concentric circles). As another example, the first B-scan 200 and/or second B-scan 202 may have a spiral pattern, a raster pattern, or a radial pattern, to name a few possibilities.
In some instances, the first B-scan 200 and/or second B-scan 202 may scan a predetermined area of the target 110. As one example, the predetermined area may be a circle defined by a predetermined diameter or radius. This example is of particular relevance where the target 110 is a cornea. For instance, the predetermined radius may be 2.00 mm, 2.25 mm, 2.50 mm, 2.75 mm, 3.00 mm, or 3.25 mm, to name a few possibilities. In instances such as these, the predetermined radius may correspond to an average corneal size or keratometrical zone size. Accordingly, the B-scans 200, 202 may have increased accuracy by leveraging more data samples across a smaller area.
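To make the scan patterns and predetermined areas above concrete, the following sketch generates beam-scan coordinates for a Placido-like concentric-circle pattern and a spiral pattern over a 3.00 mm radius zone. The function names, point counts, and radii are illustrative assumptions rather than parameters of the disclosed system.

```python
# Hypothetical generator for two of the scan patterns named above; all sizes
# and sample counts are assumptions.
import numpy as np

def concentric_circles(n_rings=8, pts_per_ring=256, max_radius_mm=3.0):
    """(x, y) beam targets tracing Placido-like concentric rings."""
    theta = np.linspace(0.0, 2.0 * np.pi, pts_per_ring, endpoint=False)
    rings = [np.column_stack([r * np.cos(theta), r * np.sin(theta)])
             for r in np.linspace(max_radius_mm / n_rings, max_radius_mm, n_rings)]
    return np.vstack(rings)

def spiral(n_points=2048, turns=8, max_radius_mm=3.0):
    """Archimedean spiral covering the same keratometrical zone."""
    t = np.linspace(0.0, 1.0, n_points)
    theta = 2.0 * np.pi * turns * t
    return np.column_stack([max_radius_mm * t * np.cos(theta),
                            max_radius_mm * t * np.sin(theta)])

print(concentric_circles().shape)  # (2048, 2) coordinates within a 3.00 mm radius
```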
In some embodiments, the OCT system 100 may process and/or refine B-scan 200 and/or B-scan 202. For instance, the OCT system 100 may apply one or more filters to the data from B-scan 200 and/or B-scan 202. The OCT system 100 may filter the data to remove outliers. As one example, the OCT system 100 may filter the data according to average curvature for the target 110. For instance, where the target 110 is a cornea, the OCT system 100 may filter the data based on average corneal curvature. The filter may be a range of average corneal curvatures (for instance, a radius of curvature between 5.00 mm and 10.00 mm, to name one example). The OCT system 100 may remove data according to the filter (e.g., data outside the range).
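One plausible reading of that curvature filter is sketched below: each surface sample is kept only if the radius of the circle through it and its two neighbors falls within the 5.00-10.00 mm range. The three-point radius estimate and the one-sample window are assumptions made for illustration, not the disclosed filtering method.

```python
# Sketch of a curvature-range outlier filter over one B-scan surface profile;
# the local three-point estimate is an assumed simplification.
import numpy as np

def circumradius(p1, p2, p3):
    """Radius of the circle through three 2D points (inf if collinear)."""
    a = np.linalg.norm(p2 - p3)
    b = np.linalg.norm(p1 - p3)
    c = np.linalg.norm(p1 - p2)
    cross = abs((p2[0] - p1[0]) * (p3[1] - p1[1])
                - (p3[0] - p1[0]) * (p2[1] - p1[1]))  # twice the triangle area
    return np.inf if cross == 0 else (a * b * c) / (2.0 * cross)

def filter_by_curvature(profile, r_min=5.0, r_max=10.0):
    """Keep (x, z) surface samples whose local radius is in [r_min, r_max] mm."""
    keep = np.zeros(len(profile), dtype=bool)
    for i in range(1, len(profile) - 1):
        r = circumradius(profile[i - 1], profile[i], profile[i + 1])
        keep[i] = r_min <= r <= r_max
    return profile[keep]
```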
In some embodiments, the light projected from the light source 102 onto the target 110 may be telecentric. Accordingly, in some embodiments, the light projected onto the target 110 may be perpendicular to one or more surfaces within the OCT system 100. In some instances, the light may be perpendicular to a lens (e.g., lens 334 of FIG. 3).
While these examples are provided, a number of different and/or modified OCT systems and scan patterns may be implemented, substituted, and/or incorporated into the disclosed systems and methods. Accordingly, the present disclosure is not limited to a particular OCT system. Additionally, while the light source 102 projects light onto the target 110 indirectly, for purposes of readability, any reference hereinafter to light projected onto the target 110 refers generally to the configurations and variations of light projection in the OCT system 100.
In one or more embodiments, the light source 102 may project light having a specific wavelength. For instance, the light source 102 may project light in the infrared spectrum, to name one example.
Referring to FIG. 3, an example optical system 300 is shown. The optical system 300 includes the OCT system 100 and an image sensor 304 arranged to capture an image of the target 110.
As stated above, the light from light source 102 of the OCT system 100 may project light having a wavelength. In some embodiments, the image sensor 304 may be sensitive to light in a spectrum of wavelengths including, at least, the wavelength of light projected by the light source 102. Accordingly, the image captured by image sensor 304 may include light projected by the light source 102 onto the target 110. In arrangements where the first and/or second B-scan 200, 202 have a pattern, the pattern may be reflected in the image of the target 110. For instance, where the pattern includes concentric circles, the image of the target 110 may include features that correspond to the concentric circles.
As stated above, the pattern from the light source 102 may be projected at a rate Δt. Additionally, the image sensor 304 may have an exposure rate. In some implementations, the exposure rate for the image sensor 304 may be less than rate Δt. In implementations such as these, the image sensor 304 may acquire a series of images. The series of images may be superimposed upon one another to form a composite image of the target 110 that spans, at least, Δt. In other implementations, the rate Δt may be less than or equal to the exposure rate for the image sensor 304. In implementations such as these, the image sensor 304 may acquire an image of the target 110 for a duration greater than or equal to the rate Δt.
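A minimal sketch of that superposition step follows, assuming a per-pixel maximum is used to combine the burst of short exposures into a composite spanning at least Δt; per-pixel summation would be an equally plausible choice for dim traces.

```python
# Superimpose short-exposure frames so the composite spans a full scan period.
import numpy as np

def composite_trace(frames):
    """Combine a list of HxW frames into one image via per-pixel maximum."""
    return np.stack(frames, axis=0).max(axis=0)

# e.g., exposure 5 ms, scan period dt = 40 ms -> at least 8 frames are needed
frames = [np.random.rand(480, 640) for _ in range(8)]  # stand-in frames
trace_image = composite_trace(frames)
```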
Referring briefly to FIG. 4, an example image 400 captured by the image sensor 304 is shown, including the trace of the beam scan (e.g., concentric circles) on the surface of the target 110.
Referring now to FIG. 5, example images 500, 502, 504 captured by the image sensor 304 are shown.
Referring back to FIG. 3, the optical system 300 can include a computing system.
In some embodiments, the optical system 300 can include an optical analysis processing circuit. The optical analysis processing circuit can include a processor 316 and memory 318. The processor 316 may include any component or group of components that are configured to execute, implement, and/or perform any of the processes or functions described herein or any form of instructions to carry out such processes or cause such processes to be performed. In one or more arrangements, the processor 316 can be a main processor of the optical system 300. Examples of suitable processors include microprocessors, microcontrollers, DSP processors, and other circuitry that can execute software. Further examples of suitable processors include, but are not limited to, a central processing unit (CPU), an array processor, a vector processor, a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic array (PLA), an application specific integrated circuit (ASIC), programmable logic circuitry, and a controller. The processor 316 can include at least one hardware circuit (e.g., an integrated circuit) configured to carry out instructions contained in program code. In arrangements in which there is a plurality of processors, such processors can work independently from each other or one or more processors can work in combination with each other.
The memory 318 can be structured for storing one or more types of data. The memory 318 can include volatile and/or non-volatile memory. Examples of suitable memory 318 include RAM (Random Access Memory), flash memory, ROM (Read Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), registers, magnetic disks, optical disks, hard drives, or any other suitable storage medium, or any combination thereof. The memory 318 can be a component of the processor 316, or the memory 318 can be operatively connected to the processor 316 for use thereby. In some arrangements, the memory 318 can be located remotely from and accessible by the processor 316, such as via a suitable communications device.
In some embodiments, the optical analysis processing circuit can include an image processor 320. The image processor 320 can be or include various devices, components, and/or software configured to identify features within an image. For instance, the image processor 320 can include a feature extraction module 322, a coordinate generation module 324, and an auto-refractometer module 330. Each of these modules 322, 324, 330 can be or include a set of instructions stored on memory (e.g., memory 318, for instance) that, when executed by the image processor 320, cause the image processor 320 to execute various functions. In some implementations, the image processor 320 and memory may be located locally (e.g., within the optical analysis processing circuit). Additionally or alternatively, the image processor 320 and memory may be located remotely. Additionally or alternatively, the image processor 320 may be located locally (e.g., within the optical analysis processing circuit) and the memory may be located remotely (and vice versa).
In one or more embodiments, the feature extraction module 322 can include instructions that, when executed by the image processor 320, cause the image processor 320 to extract features that are represented in the image. The feature extraction module 322 can include instructions for identifying edges of various features contained in the image, instructions for segmentation and block matching, and/or other techniques for extracting features within an image. The image processor 320 may receive an image (e.g., image 400, for instance) captured by the image sensor 304. The image processor 320 can process the image to extract features within the image received from the image sensor 304. Continuing the example, the image processor 320 can process image 400 to extract concentric circles contained in the image 400.
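By way of illustration only, the sketch below extracts circular features from a grayscale trace image using OpenCV's Hough circle transform. The feature extraction module 322 is not limited to this technique, and the use of OpenCV and every parameter shown are assumptions that would require tuning to the actual sensor.

```python
# Illustrative circle extraction (one technique among many the module could use).
import cv2
import numpy as np

def extract_circles(image_gray):
    """Return detected (cx, cy, r) circles from an 8-bit grayscale image."""
    blurred = cv2.GaussianBlur(image_gray, (5, 5), 1.5)
    # A small minDist permits concentric rings, which share one center.
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=2,
                               param1=100, param2=40, minRadius=10, maxRadius=300)
    return np.empty((0, 3)) if circles is None else circles[0]
```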
Referring now to FIG. 5, the image captured by the image sensor 304 may include a pattern B projected onto the target 110, appearing at a location P1 and at a location P2.
As stated above, image sensor 304 may capture an image of the target 110. The image may include light from light source 102 that is reflected off target 110. Continuing the example of FIG. 5, the image may include the pattern B reflected at location P1 and at location P2.
In one or more embodiments, the feature extraction module 322 can include instructions for identifying the size, location, and/or orientation of pattern B at location P2. For instance, the feature extraction module 322 can include instructions to identify the location of pattern B at location P1 using techniques similar to those described above with respect to identifying the concentric circles. The feature extraction module 322 can include instructions to identify the pattern B at location P2 based on its expected location with respect to location P1. The feature extraction module 322 can include instructions to identify the location of pattern B at location P2. Once pattern B is located at location P2, various features of pattern B can be extracted. For instance, the feature extraction module 322 can include instructions for determining a size of the pattern B. The feature extraction module 322 can include instructions for determining an orientation of the pattern B (for instance, with respect to axes X and Y). The feature extraction module 322 can include instructions for determining a location of a portion (e.g., a center, a particular space between two segments, etc.) of pattern B at location P2 with respect to a location of a portion of pattern B at location P1 (e.g., the center, the particular space between the same two segments, etc.).
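A hedged sketch of one way such measurements could be made is shown below: threshold a region of interest around the expected location of pattern B, then derive its centroid, size, and orientation from image moments. The ROI logic, helper name, and parameter values are hypothetical.

```python
# Hypothetical measurement of pattern B's location, size, and orientation.
import cv2
import numpy as np

def measure_pattern(image_gray, expected_xy, roi_half=60):
    """Measure the brightest blob near expected_xy in an 8-bit grayscale image."""
    x, y = expected_xy
    roi = image_gray[y - roi_half:y + roi_half, x - roi_half:x + roi_half]
    _, binary = cv2.threshold(roi, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    m = cv2.moments(binary, binaryImage=True)
    if m["m00"] == 0:
        return None  # nothing found in the ROI
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    # Orientation of the second-moment ellipse relative to the X axis.
    angle = 0.5 * np.arctan2(2.0 * m["mu11"], m["mu20"] - m["mu02"])
    return {"center": (x - roi_half + cx, y - roi_half + cy),
            "area_px": m["m00"], "angle_rad": angle}
```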
Referring now to FIG. 3, the memory 318 may store one or more modules for processing the beam scan data received from the OCT system 100.
In some embodiments, the memory 318 may store an OCT analysis module 326. The OCT analysis module 326 can include instructions that, when executed by processor 316, cause the processor 316 to process data received from the OCT system 100. For instance, the OCT analysis module 326 can include instructions to identify various surfaces and/or thicknesses associated with various surfaces in a B-scan (e.g., B-scan 200, 202, 204). The OCT analysis module 326 can include instructions to construct a coordinate map (e.g., similar to the map shown in B-scan 200, 202, and/or 204) for the B-scan. Each point in the coordinate map can have coordinates associated therewith. Accordingly, the coordinate map can include the coordinates received from the OCT system 100 for surfaces mapped on and within the target 110.
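As an illustration of constructing such a coordinate map, the sketch below takes the first strong reflection in each column of a B-scan intensity array as the surface depth at that lateral position. This per-column thresholding is an assumed simplification of the module's surface identification, not the disclosed algorithm.

```python
# Sketch: map lateral positions to the depth of the first bright reflection.
import numpy as np

def coordinate_map(bscan, x_mm, pixel_depth_mm, threshold=0.5):
    """Return (N, 2) (x, z) surface points from a (depth, lateral) B-scan array.

    Assumes a non-empty scan with at least one nonzero intensity value.
    """
    norm = bscan / bscan.max()
    coords = []
    for j, x in enumerate(x_mm):
        hits = np.flatnonzero(norm[:, j] >= threshold)
        if hits.size:
            coords.append((x, hits[0] * pixel_depth_mm))
    return np.asarray(coords)
```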
In some embodiments, the memory 318 may store a coordinate modification module 328. The coordinate modification module 328 can include instructions that, when executed by processor 316, cause the processor 316 to modify and/or replace coordinates within the coordinate map identified via OCT analysis module 326. For instance, the coordinate modification module 328 can include instructions for modifying and/or replacing coordinates within the coordinate map with coordinates identified via the coordinate generation module 324 of the image processor 320. As one example, the coordinate modification module 328 can include instructions for replacing coordinates of the coordinate map with coordinates for the surface of the target 110 identified within the image processed via image processor 320. The coordinate modification module 328 can include instructions for modifying the remaining coordinates for other surfaces contained in the coordinate map according to the coordinates within the image processed via the image processor 320. In this and other examples, the disclosed system leverages the accuracy of the coordinates identified within the image and the accuracy of the measurements and resulting coordinates obtained via the OCT system 100.
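The replacement-and-modification step might look like the sketch below, which substitutes the image-derived anterior coordinates and shifts the deeper OCT surfaces by the same correction so that the OCT-measured thicknesses between surfaces are preserved. Treating the correction as a pure per-column depth offset is an assumption.

```python
# Sketch of coordinate modification: image-derived anterior surface replaces
# the OCT anterior surface; deeper surfaces shift by the same correction.
import numpy as np

def modify_map(oct_surfaces, image_surface):
    """Correct a dict of surface depth arrays defined on a shared lateral grid.

    `oct_surfaces` maps surface names to (N,) depth arrays; `image_surface`
    is the (N,) anterior surface measured from the trace image.
    """
    correction = image_surface - oct_surfaces["anterior"]
    corrected = {"anterior": image_surface.copy()}
    for name, depths in oct_surfaces.items():
        if name != "anterior":
            corrected[name] = depths + correction  # preserve OCT thicknesses
    return corrected
```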
In some embodiments, the coordinate modification module 328 can include instructions that, when executed by the processor 316, cause the processor 316 to calculate one or more properties of the target 110 based on the coordinate map. For instance, the coordinate modification module 328 can include instructions to calculate a curvature of one or more surfaces of the target based on data from the coordinate map. In some examples, the coordinate modification module 328 can include instructions to calculate the curvature of one or more surfaces using at least three points for a surface of the target 110 as represented within the coordinate map. In examples where the target 110 is an eye, the curvature may be the anterior and/or posterior corneal curvature. Accordingly, where the coordinate map is modified according to the coordinates obtained via the coordinate generation module 324, any of the anterior or posterior corneal curvature may be calculated by identifying the curvature according to coordinates obtained via the coordinate generation module 324, and adding (or subtracting) depth for the other corneal curvature. In this example, the corneal curvature (e.g., anterior and posterior) may be calculated more accurately than traditional calculations by leveraging accuracy from the image obtained via image sensor 304 and accuracy from the OCT system 100.
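A minimal sketch of the curvature calculation follows, fitting a least-squares circle through three or more coordinate-map points along a meridian. The Kasa fit shown and the conversion to keratometric diopters with the conventional 1.3375 index are standard techniques assumed for illustration.

```python
# Least-squares circle fit over coordinate-map points (needs >= 3 points).
import numpy as np

def fit_radius(points_mm):
    """Fit x^2 + z^2 = A*x + B*z + C; return the radius of curvature in mm."""
    x, z = points_mm[:, 0], points_mm[:, 1]
    M = np.column_stack([x, z, np.ones_like(x)])
    A, B, C = np.linalg.lstsq(M, x**2 + z**2, rcond=None)[0]
    return np.sqrt(C + (A**2 + B**2) / 4.0)

anterior = np.array([[-3.0, 0.58], [0.0, 0.0], [3.0, 0.58]])  # (x, z) in mm
r_mm = fit_radius(anterior)
print(f"radius {r_mm:.2f} mm, K = {(1.3375 - 1.0) / (r_mm * 1e-3):.2f} D")
```

With these three sample points the fit returns a radius of about 8.05 mm, roughly 41.9 D, a plausible anterior corneal curvature.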
Referring back to FIG. 5, the auto-refractometer module 330 can include instructions to compare the image captured by image sensor 304 with a calibrated image 502 to identify one or more conditions of an eye.
In some embodiments, the auto-refractometer module 330 can include instructions to determine a refractive power associated with each condition based on the comparison of the patterns B at location P2 within the image captured by image sensor 304 and the calibrated image 502. For instance, the auto-refractometer module 330 can include instructions to determine a refractive power based on the degree to which the sizes of the patterns B at location P2 differ between the image and the calibrated image 502.
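Purely for illustration, the sketch below maps the measured size ratio to a refractive power through an assumed linear calibration constant; a real instrument would rely on its own calibration curve.

```python
# Hedged sketch: size ratio of pattern B vs. the calibrated image -> power.
def refractive_power(measured_size_px, calibrated_size_px,
                     diopters_per_ratio=20.0):
    """Estimate refractive error from pattern magnification.

    `diopters_per_ratio` is a hypothetical device calibration constant.
    """
    ratio = measured_size_px / calibrated_size_px
    return (ratio - 1.0) * diopters_per_ratio

print(refractive_power(110.0, 100.0))  # 2.0 (a 10% larger pattern -> +2.0 D)
```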
In some embodiments, the auto-refractometer module 330 can include instructions to identify astigmatism of the eye based on a comparison of the image captured by image sensor 304 and the calibrated image 502. For instance, the auto-refractometer module 330 can include instructions to compare the location of the pattern B at location P2, with respect to quadrants defined by axes X and Y, with the corresponding location within the calibrated image 502. Where the location of pattern B at location P2 within the image captured by image sensor 304 is different from the location of pattern B within the calibrated image 502, the auto-refractometer module 330 can include instructions to generate data indicating astigmatism of the eye.
In some embodiments, the optical system 300 may include a display 332. The display 332 can be any device and/or component configured to display an image to a user. For instance, the display 332 can be a cathode ray tube (CRT) display, a light-emitting diode (LED) display, a liquid crystal display (LCD), or a plasma display panel (PDP), to name a few possibilities. The display 332 may be operatively connected to the processor 316 and controlled thereby. The processor 316 may control the display 332 to display an image of the target 110. The image may be or include the coordinate map generated and/or modified via the coordinate modification module 328. Additionally or alternatively, the image may be or include the beam scan data. Additionally or alternatively, the image may be or include the image captured by the image sensor 304. In some arrangements, the processor 316 can control the display 332 to display additional data. For instance, the processor 316 can control the display 332 to display data corresponding to the various conditions indicated via the auto-refractometer module 330. As another example, the processor 316 can control the display 332 to display other calculated characteristics of the target (e.g., anterior and/or posterior curvature, for example).
Now that various aspects of the disclosed systems and components have been described, a method of topographical imaging of a target will be described with reference to FIG. 6.
Referring now to FIG. 6, a flow diagram of an example method 600 of topographical imaging is shown.
The method 600 can include operation 602. At operation 602, the method 600 may begin. The method 600 may begin when an initialization signal (e.g., from a user) is received by the various components/systems described herein. The method 600 can proceed to operation 604.
At operation 604, the method 600 can include receiving beam scan data from OCT system 100. The beam scan may be of a target 110. The beam scan data can include one or more B-scans 200, 202. The B-scans 200, 202 may have a pattern (e.g., pattern B of FIG. 5). The method 600 can proceed to operation 606.
At operation 606, the method 600 can include processing the beam scan data from the OCT system 100 to identify a first set of coordinates. In some implementations, the first set of coordinates may be the global coordinates and/or the local coordinates contained in B-scan 200, 202. In one or more arrangements, the processor 316 can process the beam scan data using instructions from the OCT analysis module 326. The processor 316 can generate a coordinate map based on the beam scan data received from the OCT system 100. The method 600 can proceed to operation 608.
At operation 608, the method 600 can include receiving an image of a trace of the beam scan. The image may be received from image sensor 304. The image may be of the target 110. The trace of the beam scan may be on a surface of the target 110. In some examples, the image may be similar to images 500, 502, 504. In other examples, the image may be similar to image 400. The method 600 can proceed to operation 610.
At operation 610, the method 600 can include processing the image to identify another set of coordinates. In some implementations, the coordinates may be identified by image processor 320 using instructions from the coordinate generation module 324. The coordinates identified via coordinate generation module 324 may be of a surface of target 110. The method 600 can proceed to operation 612.
At operation 612, the method 600 can include modifying the first set of coordinates identified at operation 606 based on the coordinates identified at operation 610. As stated above, the coordinates identified at operation 606 for the surface of target 110 can be replaced and/or modified based on the coordinates identified via coordinate generation module 324. Additionally, the remaining coordinates identified at operation 606 can be modified in accordance with the modified coordinates. Accordingly, the coordinates for target 110 can include some coordinates identified within the image received at operation 608 and others identified at operation 606. In so doing, the coordinate map for target 110 leverages the accuracy of the coordinates identified at operation 610 together with the measurements and coordinates identified at operation 606. These coordinates (for instance, the coordinate map) can be displayed to a user via display 332.
The foregoing description of illustrative embodiments has been presented for purposes of illustration and of description. It is not intended to be exhaustive or limiting with respect to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the disclosed embodiments.
While certain embodiments have been illustrated and described, it should be understood that changes and modifications can be made therein in accordance with ordinary skill in the art without departing from the technology in its broader aspects as defined in the following claims.
The embodiments illustratively described herein may suitably be practiced in the absence of any element or elements, limitation or limitations, not specifically disclosed herein. Thus, for example, the terms “comprising,” “including,” “containing,” etc. shall be read expansively and without limitation. Additionally, the terms and expressions employed herein have been used as terms of description and not of limitation, and there is no intention in the use of such terms and expressions of excluding any equivalents of the features shown and described or portions thereof, but it is recognized that various modifications are possible within the scope of the claimed technology. Additionally, the phrase “consisting essentially of” will be understood to include those elements specifically recited and those additional elements that do not materially affect the basic and novel characteristics of the claimed technology. The phrase “consisting of” excludes any element not specified.
The present disclosure is not to be limited in terms of the particular embodiments described in this application. Many modifications and variations can be made without departing from its spirit and scope, as will be apparent to those skilled in the art. Functionally equivalent methods and compositions within the scope of the disclosure, in addition to those enumerated herein, will be apparent to those skilled in the art from the foregoing descriptions.
Such modifications and variations are intended to fall within the scope of the appended claims. The present disclosure is to be limited only by the terms of the appended claims, along with the full scope of equivalents to which such claims are entitled. It is to be understood that this disclosure is not limited to particular methods, reagents, compounds, compositions, or biological systems, which can of course vary.
It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.
As will be understood by one skilled in the art, for any and all purposes, particularly in terms of providing a written description, all ranges disclosed herein also encompass any and all possible subranges and combinations of subranges thereof. Any listed range can be easily recognized as sufficiently describing and enabling the same range being broken down into at least equal halves, thirds, quarters, fifths, tenths, etc. As a non-limiting example, each range discussed herein can be readily broken down into a lower third, middle third and upper third, etc. As will also be understood by one skilled in the art all language such as “up to,” “at least,” “greater than,” “less than,” and the like, include the number recited and refer to ranges which can be subsequently broken down into subranges as discussed above. Finally, as will be understood by one skilled in the art, a range includes each individual member.
All publications, patent applications, issued patents, and other documents referred to in this specification are herein incorporated by reference as if each individual publication, patent application, issued patent, or other document was specifically and individually indicated to be incorporated by reference in its entirety.
Definitions that are contained in text incorporated by reference are excluded to the extent that they contradict definitions in this disclosure.
Other embodiments are set forth in the following claims.