The present disclosure relates to techniques for imaging and/or measuring a subject's eye, including the subject's retina fundus.
Techniques for imaging and/or measuring a subject's eye would benefit from improvement.
Some aspects of the present disclosure relate to a method comprising generating a graph from an image and/or measurement of a subject's retina fundus, wherein generating the graph comprises generating, by at least one processor, a plurality of nodes corresponding to a plurality of pixels of the image and/or measurement and a plurality of edges connecting the plurality of nodes, at least one auxiliary node, and an auxiliary edge connecting the auxiliary node to a first node of the plurality of nodes.
In some embodiments, the auxiliary edge is a first auxiliary edge, generating the graph further comprises, by the at least one processor, generating a second auxiliary edge connecting the at least one auxiliary node to a second node of the plurality of nodes, and the first and second nodes correspond to respective first and second pixels in a first column of the image and/or measurement.
In some embodiments, the at least one auxiliary node comprises a first auxiliary node, which is a start node of the graph, and a second auxiliary node, which is an end node of the graph.
In some embodiments, the auxiliary edge is a first auxiliary edge and the first node corresponds to a first pixel of the plurality of pixels in a first column of the image and/or measurement, generating the graph further comprises, by the at least one processor, generating a second auxiliary edge connecting the first auxiliary node to a second node of the plurality of nodes corresponding to a second pixel of the plurality of pixels in the first column, a third auxiliary edge connecting the second auxiliary node to a third node of the plurality of nodes corresponding to a third pixel of the plurality of pixels in a second column of the image and/or measurement, and a fourth auxiliary edge connecting the second auxiliary node to a fourth node of the plurality of nodes corresponding to a fourth pixel of the plurality of pixels in the second column.
In some embodiments, the method may further comprise locating, by the at least one processor, a boundary between first and second layers of the subject's retina fundus in the image and/or measurement using the graph.
In some embodiments, the at least one auxiliary node comprises a start node and/or an end node of the graph and locating the boundary comprises determining a plurality of paths from the start node to the at least one auxiliary node and/or from the at least one auxiliary node to the end node and selecting a path from among the plurality of paths.
In some embodiments, generating the graph further comprises assigning, to at least some of the plurality of nodes and/or edges, weighted values; generating the auxiliary edge comprises assigning, to the auxiliary node and/or edge, a preset weighted value; and selecting the path from among the plurality of paths comprises executing a cost function using the weighted values and the preset weighted value and determining that the path has and/or shares a lowest cost among the plurality of paths.
In some embodiments, the weighted values are assigned to the plurality of nodes based on pixel intensity of pixels corresponding to the plurality of nodes and/or assigned to the plurality of edges based on pixel intensity of pixels that correspond to pairs of the plurality of nodes connected by the plurality of edges.
In some embodiments, the weighted values are assigned to the plurality of nodes based on frequency and/or phase of pixels corresponding to the plurality of nodes and/or assigned to the plurality of edges based on frequency and/or phase of pixels that correspond to pairs of the plurality of nodes connected by the plurality of edges.
In some embodiments, executing the cost function comprises determining a cost for each of the plurality of paths, the cost of each path being based at least in part on the weighted values and/or preset weighted value assigned to nodes and/or edges in each path.
In some embodiments, the preset weighted value has a minimum cost.
Some aspects of the present disclosure relate to a method comprising generating a graph from an image and/or measurement of a subject's retina fundus, wherein generating the graph comprises, by at least one processor, generating a plurality of nodes corresponding to a plurality of pixels of the image and/or measurement and a plurality of edges connecting the plurality of nodes, selecting a start node and/or an end node of the graph from the plurality of nodes, and generating at least one auxiliary edge connecting the start and/or end node to a first node of the plurality of nodes.
In some embodiments, the method may further comprise, by the at least one processor, assigning weighted values to at least some of the plurality of nodes and/or plurality of edges and assigning a preset weighted value to the at least one auxiliary edge and/or start node and/or end node.
In some embodiments, the weighted values are assigned to the plurality of nodes based on pixel intensity of pixels corresponding to the plurality of nodes and/or assigned to the plurality of edges based on pixel intensity of pixels that correspond to pairs of the plurality of nodes connected by the plurality of edges.
In some embodiments, the weighted values are assigned to the plurality of nodes based on derivatives corresponding to the plurality of nodes and/or assigned to the plurality of edges based on derivatives of pixels that correspond to pairs of the plurality of nodes connected by the plurality of edges.
In some embodiments, the start node is selected to correspond to a first corner pixel of the image and/or measurement and the end node is selected to correspond to a second corner pixel of the image and/or measurement, and wherein the first and second corner pixels are in different columns of the image and/or measurement.
In some embodiments, generating the at least one auxiliary edge comprises generating a first plurality of auxiliary edges connecting the start node to respective ones of a first plurality of perimeter nodes of the plurality of nodes that correspond to pixels of a first column of pixels of the image and/or measurement, and generating a second plurality of auxiliary edges connecting the end node to respective ones of a second plurality of perimeter nodes of the plurality of nodes that correspond to pixels of a second column of pixels of the image and/or measurement.
In some embodiments, the method may further comprise locating, by the at least one processor, a boundary between first and second layers of the subject's retina fundus in the image and/or measurement using the graph.
In some embodiments, locating the boundary comprises determining a plurality of paths from the start node to the end node via the auxiliary edge and selecting a path from among the plurality of paths.
In some embodiments, selecting the path comprises executing a cost function based on the weighted values and the preset weighted value and determining that the path has and/or shares a lowest cost among the plurality of paths.
In some embodiments, the preset weighted value has a minimum cost.
In some embodiments, executing a cost function comprises minimizing the cost function.
The foregoing summary is not intended to be limiting. Moreover, in accordance with various embodiments, aspects of the present disclosure may be implemented alone or in combination with other aspects.
The accompanying drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures is represented by a like numeral. For purposes of clarity, not every component may be labeled in every drawing. In the drawings:
I. Introduction
The inventors have recognized and appreciated that a subject's (e.g., person's) eyes provide a window into the body that may be used not only to determine whether the subject has an ocular disease, but also to determine the general health of the subject. The retina fundus in particular can provide valuable information via imaging for use in various health determinations. However, conventional systems for imaging, measuring, and/or processing images and/or measurements of the fundus are limited in multiple respects.
The inventors recognized that conventional imaging and/or measurement systems do not accurately locate certain features of a subject's retina fundus in an image and/or measurement. For example, such systems do not accurately locate boundaries between retinal layers, let alone do so in a computationally efficient manner. In a clinical setting, when an image and/or measurement is captured, a clinician may have to inspect each image and/or measurement to locate features such as boundaries between retinal layers by segmenting each image. In addition to being time consuming, this process is imperfect due to the practical limits of human eyesight, which can result in inaccurate measurements. Because these measurements may subsequently be used to determine a health status of the subject, the resulting health determination may likewise be inaccurate and/or incorrect. Similarly, existing automated systems for locating features of a subject's retina fundus in an image and/or measurement are neither accurate nor computationally efficient.
To solve the above problems, the inventors developed improved techniques and methods for generating, by one or more processors, a graph from an image and/or measurement (e.g., an optical coherence tomography image and/or measurement), which can be useful for locating features in the image and/or measurement. For example, the image and/or measurement can include a subject's retina fundus and the features may include one or more layers and/or boundaries between layers of the subject's retina fundus in the image and/or measurement.
In some embodiments, generating the graph may include generating nodes corresponding to pixels of the image and/or measurement and edges connecting the nodes. For example, nodes can be generated for some or all pixels of the image and/or measurement. In some embodiments, generating the graph may also include generating at least one auxiliary node. For example, the auxiliary node(s) can be generated in addition to the nodes of the graph that correspond to pixels of the image and/or measurement, and can be a start node and/or end node of the graph. In some embodiments, generating the graph may also include generating an auxiliary edge connecting a first auxiliary node to a first node of the graph. For example, the auxiliary edge can be generated in addition to any edges generated that connect nodes of the graph corresponding to pixels.
The inventors recognized that generating the auxiliary node(s) and/or auxiliary edge(s) can increase the computational efficiency of locating features using the graph. For example, feature location techniques described herein can include determining one or more paths traversing nodes and edges of the graph, and using the auxiliary node and/or edge can make selecting an appropriate path for feature location (e.g., using a cost function) more computationally efficient. In this example, using the auxiliary node and/or edge can make it more efficient to determine which node(s), corresponding to one or more pixels in the image and/or measurement, should be the second and/or next-to-last node(s) in the selected path.
In some embodiments, a first auxiliary node can be generated as a start node and a second auxiliary node can be generated as an end node. The inventors further recognized that path determination is more computationally efficient when the auxiliary node is a start or end node. In some embodiments, auxiliary edges can be generated connecting the auxiliary node(s) to some or all nodes corresponding to pixels in a same column of the graph, such as a perimeter column. For example, one of the nodes corresponding to pixels in a perimeter column may be the second or next to last node in a path that starts or ends at the perimeter column side of the image and/or measurement.
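By way of a non-limiting illustration, the following Python sketch shows one way such a graph could be assembled, with auxiliary start and end nodes wired to the perimeter columns by preset-weight auxiliary edges. The function name, the left-to-right 8-neighbor connectivity, and the default value of w_min are assumptions for illustration; the edge weighting follows the cost function 2 − (g_a + g_b) + w_min described later herein.

```python
import numpy as np

def build_graph(image, w_min=1e-5, gradient=None):
    """Sketch: nodes are (row, col) pixel tuples plus auxiliary 'start'
    and 'end' nodes; edges run left-to-right between adjacent columns.
    `gradient` lets a precomputed map (e.g., a derivative image) be used."""
    rows, cols = image.shape
    if gradient is None:
        gradient = np.gradient(image.astype(float), axis=0)
    rng = gradient.max() - gradient.min()
    g = (gradient - gradient.min()) / (rng if rng else 1.0)  # normalize to [0, 1]

    adj = {}  # node -> list of (neighbor, weight) pairs

    def add_edge(a, b, w):
        adj.setdefault(a, []).append((b, w))
        adj.setdefault(b, []).append((a, w))

    # Edges between each pixel and its three neighbors in the next column.
    for r in range(rows):
        for c in range(cols - 1):
            for dr in (-1, 0, 1):
                if 0 <= r + dr < rows:
                    w = 2.0 - (g[r, c] + g[r + dr, c + 1]) + w_min
                    add_edge((r, c), (r + dr, c + 1), w)

    # Auxiliary edges: connect 'start'/'end' to every node of the first
    # and last pixel columns with the same preset minimum weight.
    for r in range(rows):
        add_edge('start', (r, 0), w_min)
        add_edge('end', (r, cols - 1), w_min)
    return adj
```

Because every auxiliary edge carries the same preset weight, the choice of which perimeter node a selected path enters or leaves through is driven entirely by the pixel-derived weights, which is one way the auxiliary construction avoids biasing the endpoint selection.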
In some embodiments, weighted values can be assigned to nodes and/or edges of the graph and a preset weighted value, such as a minimum value, can be assigned to the auxiliary node(s) and/or edge(s). For example, locating a retina fundus feature using the graph can include executing a cost function based on the weighted values and/or preset weighted value. The inventors recognized that using preset weighted values, such as minimum values (e.g., local and/or global minima), can make selection of a path that indicates the location of the feature more computationally efficient.
In some examples, executing the cost function may include minimizing the cost function, and selecting a path may include selecting a path of connected nodes and/or edges with a minimum cost. In some examples (e.g., when inverted or negated cost functions are used), executing the cost function may include maximizing the cost function, such that finding a path may include finding a path of connected nodes and edges with the maximum cost.
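As a brief, hedged illustration of the preceding paragraph, and under the same assumptions as the build_graph sketch above, a path's cost could be evaluated by summing the weights it traverses; negating the weights turns the same minimum selection into a maximization:

```python
def path_cost(adj, path):
    """Sum the edge weights along `path` (a node sequence); assumes each
    consecutive pair of nodes in `path` is connected in the `adj` map."""
    lookup = {node: dict(nbrs) for node, nbrs in adj.items()}
    return sum(lookup[a][b] for a, b in zip(path, path[1:]))

# Minimizing the cost function over candidate paths; with negated weights,
# the same selection would instead maximize it.
# best = min(candidate_paths, key=lambda p: path_cost(adj, p))
```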
The inventors also recognized that generating one or more auxiliary edges can also make feature location using the generated graph more computationally efficient when the start and/or end node of the graph corresponds to a pixel in the image and/or measurement. According to other techniques described herein, in some embodiments, generating a graph can include generating nodes corresponding to pixels of an image and/or measurement and edges connecting the nodes, selecting a start and/or end node of the graph from among the nodes, and generating at least one auxiliary edge connecting the start and/or end node(s) to another node of the graph. For example, the start node and/or end node can be selected as a node corresponding to a corner pixel of the image and/or measurement. In this example, the auxiliary edge(s) can connect the corner pixel(s) to nodes corresponding to other pixels in the same column of the image and/or measurement, such as a perimeter column that includes the corner pixel. In some embodiments, the start node and end node can correspond to opposing corner pixels of the image and/or measurement.
Techniques described herein for locating retina fundus features in an image and/or measurement are more computationally efficient than previous techniques. For example, techniques described herein may require fewer edges when generating a graph for an image and/or measurement, which enhances efficiency of determining and/or selecting a path traversing the graph that corresponds to a feature.
The inventors have also developed other techniques described further herein that can be used alone or in combination with the above mentioned techniques to further increase the accuracy and computational efficiency of locating one or more features of a subject's retina fundus in an image and/or measurement. Such techniques can include, for example, first locating a first feature of the subject's retina fundus (e.g., a first retinal layer boundary) in an image and/or measurement and then using the location of the first feature to locate a second feature of the subject's retina fundus in the same image and/or measurement, in a derivative of the image and/or measurement, and/or in a subset of pixels of the image and/or measurement. The inventors recognized that, in some cases, the first and second features can have known juxtapositions (e.g., one is expected to be above the other, or vice versa) and/or relative pixel intensity levels in the image and/or measurement that can advantageously make locating the second feature more efficient and/or accurate after locating the first feature.
In some embodiments, techniques described herein can be performed using a system with at least one processor and memory that is configured to receive images and/or measurements over a communication network. Alternatively or additionally, in some embodiments, techniques described herein can be implemented onboard, and/or on images and/or measurements captured by an imaging and/or measuring apparatus. In some embodiments, imaging and/or measuring apparatuses described herein can be suitable for use by a subject with or without assistance from a provider, clinician, or technician. In some embodiments, the imaging and/or measuring apparatus and/or associated systems described herein can be configured to determine the subject's health status based on the captured images and/or measurements.
It should be appreciated that techniques described herein can be implemented alone or in combination with any other techniques described herein. In addition, at times, reference can be made herein only to images, but it should be appreciated that aspects described herein for images also apply to measurements, as embodiments described herein are not so limited.
II. Example Systems for Generating a Graph from an Image of a Retina
As described above, the inventors have developed techniques for generating a graph from an image of a retina. In some embodiments, such techniques may be implemented using example systems described herein. While reference is made below to images, it should be appreciated that aspects described herein for images also apply to measurements, as embodiments described herein are not so limited.
In some embodiments, imaging apparatus 130 may be configured to capture an image of a subject's retina and provide the image to computer 140 over communication network 160. As shown in
In some embodiments, processor 134 may be alternatively or additionally configured to transmit captured images over communication network 160 to computer 140. In some embodiments, the imaging apparatus 130 may include a standalone network controller configured to communicate over communication network 160. Alternatively, the network controller may be integrated with processor 134. In some embodiments, imaging apparatus 130 may include one or more displays to provide information to a user of imaging apparatus 130 via a user interface displayed on the display(s). In some embodiments, imaging apparatus 130 may be portable. For example, imaging apparatus 130 may be configured to perform eye imaging using power stored in a rechargeable battery.
In some embodiments, computer 140 may be configured to obtain an image and/or measurement of a subject's retina fundus from imaging apparatus 130 and generate a graph from the image and/or measurement. For example, the computer 140 may be configured to use the graph to locate one or more features of the subject's retina fundus, such as a boundary between first and second layers of the subject's retina fundus. As shown in
In some embodiments, processor 144 can be configured to generate a graph from an image and/or measurement of a subject's retina fundus. For example, processor 144 can be configured to generate a plurality of nodes corresponding to a respective plurality of pixels of the image and/or measurement. In this example, the processor 144 can be configured to generate nodes for each pixel of the image and/or measurement or for only a subset of the image and/or measurement. In some embodiments, the processor 144 can be configured to generate a plurality of edges connecting the plurality of nodes. For example, once connected by edges, the processor 144 can be configured to traverse the nodes of the graph along the edges. In this example, the processor 144 can be configured to generate edges connecting each node or only a subset of the generated nodes.
In some embodiments, the processor 144 can be configured to generate an auxiliary node, as a start and/or end node of the graph, and a first edge from the auxiliary node to a second node of the graph. For example, the second node can be among the plurality of generated nodes that correspond to the pixels of the image and/or measurement, and the processor 144 can be configured to generate the auxiliary node in addition to the plurality of generated nodes that correspond to the pixels of the image and/or measurement. In some embodiments, the processor 144 can be configured to also generate a second edge from the start and/or end node to a third node of the graph. For example, the second and third nodes of the graph can be perimeter nodes corresponding to pixels along the perimeter of the image and/or measurement, such as in the same column of the image and/or measurement. Alternatively or additionally, in some embodiments, processor 144 can be configured to generate a graph from an image by selecting a start and/or end node from the nodes corresponding to the pixels of the image and/or measurement, with or without generating the auxiliary node. For example, processor 144 can be configured to generate an auxiliary edge connecting a start and/or end node to another node that corresponds to a pixel of the image and/or measurement.
In some embodiments, the processor 144 can be configured to locate at least one feature of the subject's retina fundus in the image and/or measurement using the graph. For example, the processor 144 can be configured to locate a boundary between first and second layers of the subject's retina fundus. In some embodiments, the processor 144 can be configured to determine a plurality of paths from the start node to the end node of the graph. For example, the processor 144 can be configured to traverse the graph from the start node to the end node via different paths that include one or more other nodes of the graph. In some embodiments, the processor 144 can be configured to assign a cost to each path based on a cost function. For example, the processor 144 can be configured to assign a cost based on derivatives of nodes included in the path (e.g., based on the difference of derivatives of the nodes). Alternatively or additionally, the processor 144 can be configured to assign a higher cost to longer paths (e.g., paths traversing more nodes than other paths). In some embodiments, the processor 144 can be configured to select a path from among the plurality of paths. For example, the processor 144 may be configured to select the path of the plurality of paths having and/or sharing a lowest cost.
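One concrete, non-limiting way processor 144 could determine and select such a lowest-cost path is Dijkstra's algorithm. The helper below assumes the adjacency map produced by the hypothetical build_graph sketch above and that the end node is reachable:

```python
import heapq
import itertools

def lowest_cost_path(adj, start='start', end='end'):
    """Dijkstra's algorithm over an adjacency map of (neighbor, weight)
    lists. Returns the minimum-cost node sequence from `start` to `end`;
    its interior (row, col) nodes trace the located boundary."""
    tie = itertools.count()  # tie-breaker so the heap never compares nodes
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, next(tie), start)]
    visited = set()
    while heap:
        d, _, node = heapq.heappop(heap)
        if node == end:
            break
        if node in visited:
            continue
        visited.add(node)
        for nbr, w in adj.get(node, ()):
            nd = d + w
            if nd < dist.get(nbr, float('inf')):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, next(tie), nbr))
    # Walk predecessors back from the end node to reconstruct the path.
    path, node = [end], end
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```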
In some embodiments, computer 140 may be further configured to pre-condition the image and/or measurement for locating the feature(s) of the subject's retina fundus. For example, in some embodiments, the processor 144 can be configured to generate a derivative of the image and/or measurement and generate the graph using the image and/or measurement derivative. For example, processor 144 of computer 140 may be configured to apply a filter to the image and/or measurement to generate the derivative prior to generating the graph. Alternatively or additionally, in some embodiments the processor 144 may be configured to shift pixels within a column of the image and/or measurement prior to generating the graph. For example, the processor 144 may be configured to shift the pixels such that one or more pixels that correspond to a feature of the image and/or measurement are aligned within at least one row of pixels (e.g., with the feature contained in only one or two rows of pixels). Further alternatively or additionally, the processor 144 may be configured to select a subset of pixels of the image and/or measurement in which to locate the feature(s) of the subject's retina fundus. For example, the processor 144 can be configured to apply a pixel characteristic threshold, such as a pixel intensity threshold, to the image and/or measurement and locate the feature(s) only in subsets of pixels that are above (or below) the threshold. Alternatively or additionally, processor 144 can be configured to select a subset of pixels in which to locate the feature(s) based on previously determined locations of one or more other features in the image and/or measurement.
In accordance with various embodiments, communication network 160 may be a local area network (LAN), a cell phone network, a Bluetooth network, the internet, or any other such network. For example, computer 140 may be positioned in a remote location relative to imaging apparatus 130, such as a separate room from imaging apparatus 130, and communication network 160 may be a LAN. In some embodiments, computer 140 may be located in a different geographical region from imaging apparatus 130 and may communicate over the internet.
It should be appreciated that, in accordance with various embodiments, multiple devices may be included in place of or in addition to imaging apparatus 130. For example, an intermediary device may be included in system 100 for communicating between imaging apparatus 130 and computer 140. Alternatively or additionally, multiple computers may be included in place of or in addition to computer 140 to perform various tasks herein attributed to computer 140.
It should also be appreciated that, in some embodiments, systems described herein may not include an imaging and/or measuring apparatus, as at least some techniques described herein may be performed using images and/or measurements obtained from other systems.
III. Example Techniques for Locating Retina Fundus Features in an Image
As described herein, the inventors have developed techniques for generating a graph from an image and/or measurement of a subject's retina fundus and locating one or more features of the subject's retina fundus using the generated graph. In some embodiments, techniques described herein can be implemented using the system of
In some embodiments, pixels of image 200 can have pixel intensity values (e.g., ranging from 0 to 255), which can control the brightness of the pixels. For example, in
In some embodiments, the processor(s) may be configured to store (e.g., in the storage medium) values associated with some or all nodes of graph 300, such as based on pixel intensity values of the pixels to which the nodes correspond. For example, the processor(s) may be configured to store, associated with node 301, the pixel intensity value of pixel 201. Alternatively or additionally, the processor(s) may be configured to store, associated with node 301, the derivative of the pixel intensity of image 200 at pixel 201. In either example, the processor(s) may be configured to use the stored values associated with each node to calculate costs associated with traversing one or more paths through the graph 300. Alternatively or additionally, in some embodiments, the processor(s) may be configured to store values associated with some or all edges of graph 300, such as based on the pixel intensity values of pixels corresponding to the nodes connected by the respective edge. For example, the processor(s) may be configured to store, associated with edge 311, a value that is based on the derivative of the pixel intensity of image 200 at each pixel 201 and 202. In this example, the processor(s) may be configured to use the stored values associated with each edge to calculate costs associated with traversing one or more paths through the graph 300.
In some embodiments, stored values associated with each node and/or edge connecting a pair of nodes may be weighted. In some examples, the stored value associated with each edge may be the calculated value of a cost function based on values of the nodes that form the edge. For example, the cost function may be w_ab = 2 − (g_a + g_b) + w_min, and the processor(s) may be configured to store, associated with edge 311, a weighted value w_ab equal to the value of this cost function, where g_a and g_b are derivatives of pixel intensity at pixels a and b corresponding to the nodes connected by the edge, and w_min may be a preset weight. For instance, the preset value may be predetermined and/or calculated based on pixel intensity values of the image 200 rather than based on the particular pixel intensity values of the pixels corresponding to the two nodes connected by the edge. In this example, the preset value may be equal to or less than the minimum weighted value assigned to any other edge.
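Expressed in code, and assuming gradients normalized to [0, 1] (an assumption; other normalizations are possible), this cost function could be computed as follows:

```python
import numpy as np

def normalized_gradient(image):
    """Vertical derivative of pixel intensity, scaled to [0, 1]."""
    g = np.gradient(image.astype(float), axis=0)
    rng = g.max() - g.min()
    return (g - g.min()) / (rng if rng else 1.0)

def edge_weight(g_a, g_b, w_min=1e-5):
    """w_ab = 2 - (g_a + g_b) + w_min. Gradients near 1 (strong
    dark-to-bright transitions) drive the weight toward w_min, so a
    lowest-cost path preferentially follows the layer boundary."""
    return 2.0 - (g_a + g_b) + w_min
```

Because g_a and g_b are each at most 1, every edge weight is at least w_min, consistent with the preset value being equal to or less than the minimum weighted value of any other edge.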
In some embodiments, auxiliary node 401 and/or 402 may be start and/or end nodes of the graph 400. For example, the processor(s) may be configured to determine one or more paths traversing nodes and edges of graph 400 that start and/or end at auxiliary node 401 and/or 402. In some embodiments, the processor(s) may be configured to calculate a cost for each path, such as based on costs associated with each edge and/or node traversed by the path. For example, the costs associated with each edge and/or node may be based on the pixel intensity and/or derivative of pixel intensity at the respective edge and/or node, which can cause the lowest and/or highest cost paths to indicate the location(s) of one or more retina fundus features in the image 200. In some embodiments, the auxiliary edges connecting the auxiliary nodes 401 and/or 402 to other nodes of the graph 400 may be weighted with the same, preset value, such as the minimum value. For example, the minimum value may provide a minimum cost for traversing each auxiliary edge. According to various embodiments, the minimum cost can be a global minimum and/or a local minimum with respect to local nodes (e.g., corresponding to a particular subset of pixels).
In some embodiments, the processor(s) may be configured to determine a plurality of paths traversing graph 400 and select path 450 from among the plurality of paths. For example, the processor(s) may be configured to calculate the cost of each path based on which nodes and/or edges are traversed by the respective path and determine that path 450 has and/or shares the minimum cost. In the example of
It should be appreciated that any corner nodes of graph 300 may be selected as start and/or end nodes for determining the plurality of paths, according to various embodiments. In some embodiments, the processor(s) may be configured to determine one or more paths traversing graph 500 between nodes 303 and 309 in the manner described herein for graph 400.
In some embodiments, the processor(s) can be configured to divide pixels of image 200 into subsets 600a and 600b after locating the feature of image 200 indicated by path 601. For example, the processor(s) can be configured to sort the pixels traversed by path 601 into subset 600a together with pixels that are contiguous with the traversed pixels on one side of path 601. In this example, the processor(s) can be configured to sort the pixels on the other side of path 601 into subset 600b. In this example, since a first feature may have been located in subset 600a by processing the whole image 200 to obtain path 601, dividing the image between subsets 600a and 600b can focus further processing of image 200 in subset 600b to locate additional features.
Alternatively or additionally, in some embodiments, the processor(s) can be configured to divide pixels of image 200 into subsets 600a and 600b based on characteristics of the pixels such as pixel intensity, frequency, and/or phase. For example, the processor(s) may be configured to sort, into each subset, contiguous pixels having above a threshold pixel intensity level and/or that are within a threshold pixel intensity level of one another. In this example, dividing the image into subsets of pixels based on pixel characteristics can facilitate locating features in expected locations relative to one another (e.g., locating adjacent retinal layer boundaries in a retina fundus image) and/or distinguishing features located in different subsets based on the relative characteristics (e.g., relative pixel intensity) of the subsets. For instance, the processor(s) can be configured to apply one or more vector quantization techniques (e.g., KMeans clustering) to obtain a plurality of clusters and select the cluster having a higher (or lower) cluster mean (e.g., corresponding to pixel intensity values), at which point the processor(s) can be configured to apply a polynomial fit to locate one or more features (e.g., the Retinal Pigment Epithelium and/or Retinal Nerve Fiber Layer) in the selected cluster.
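By way of illustration, such a clustering-then-fitting step could look like the following sketch; the helper name, the cluster count, and the polynomial degree are assumptions made for this example:

```python
import numpy as np
from sklearn.cluster import KMeans

def select_bright_cluster(image, n_clusters=2):
    """Cluster pixel intensities with KMeans, keep the brighter cluster
    (e.g., a bright band such as the RPE), and fit a polynomial through
    the kept pixels' coordinates."""
    h, w = image.shape
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(
        image.reshape(-1, 1).astype(float))
    labels = labels.reshape(h, w)
    # Pick the cluster whose mean intensity is highest.
    means = [image[labels == k].mean() for k in range(n_clusters)]
    keep = int(np.argmax(means))
    rows, cols = np.nonzero(labels == keep)
    # Fit row position as a polynomial in column position (degree 2 assumed).
    coeffs = np.polyfit(cols, rows, deg=2)
    return np.poly1d(coeffs)
```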
In some embodiments, the image 700 can show one or more features of the subject's retina fundus. For example, in
In some embodiments, one or more processors described herein may be configured to locate one or more features of the subject's retina fundus shown in image 700. For example, the processor(s) can be configured to generate a graph from image 700 as described herein for graph 300, graph 400, and/or graph 500 and determine one or more paths traversing the graph (e.g., path 450). In this example, the processor(s) can be configured to select one or more paths and generate a version of image 700 showing the path(s) traversing the image 700, which can indicate the location(s) of the feature(s). In the example of
In some embodiments, one or more processors described herein can be configured to generate a derivative of the image 700 and generate a graph using the derivative of the image 700.
The inventors have recognized that a derivative of an image of a subject's retina fundus may emphasize the location of certain features of the subject's retina fundus in the image. For example, in derivative image 800, portions 801 and 802 of derivative image 800, which can correspond to layers 701 and 708 shown in image 700, have higher pixel intensity values than in the image 700. In some embodiments, the processor(s) may be configured to generate a graph from a positive derivative image such as derivative image 800 and determine one or more paths traversing the graph to locate, in image 700, the boundary between the subject's ILM and the region of vitreous fluid adjacent the ILM, and/or the IS-OS boundary. For example, portions of the retina fundus image between the ILM layer and vitreous fluid region and/or the IS-OS boundary may be more prominent in the derivative image than in the retina fundus image. In some embodiments, the processor(s) can be configured to alternatively or additionally generate a negative derivative image of the image 700 and generate a graph from the negative derivative image and determine one or more paths traversing the graph, such as to locate the BM layer in the image 700. For example, a negative derivative image of a retina fundus image may make the BM layer more prominent.
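A minimal sketch of generating such positive and negative derivative images follows; the clipping and scaling choices are assumptions for illustration:

```python
import numpy as np

def derivative_images(image):
    """Positive and negative vertical-derivative images. Dark-to-bright
    transitions (e.g., vitreous to ILM, or the IS-OS boundary) stand out
    in the positive derivative; bright-to-dark transitions (e.g., toward
    the BM) stand out in the negative derivative."""
    g = np.gradient(image.astype(float), axis=0)
    positive = np.clip(g, 0, None)   # keep only increasing-intensity edges
    negative = np.clip(-g, 0, None)  # keep only decreasing-intensity edges
    return positive, negative
```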
In some embodiments, one or more processors described herein can be configured to shift pixels of image 900 within columns of image 900 after locating at least one feature in image 900. For example, the processor(s) can be configured to locate the feature indicated by curve 901 and shift pixels within columns of image 900 until curve 901 forms a line across one or more rows of pixels of image 900. The inventors have recognized that shifting pixels of an image after locating a retina fundus feature (e.g., the RPE layer) can better position the pixels of the image for subsequent feature location.
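By way of a non-limiting example, such column-wise shifting (often called flattening) could be implemented as below; the wrap-around behavior of np.roll and the default target row are illustrative choices, and padding instead of wrapping is an alternative:

```python
import numpy as np

def flatten_to_row(image, curve_rows, target_row=None):
    """Shift each column so the located feature (one row index per column,
    e.g., a fitted RPE curve) lands on a single row."""
    h, w = image.shape
    if target_row is None:
        target_row = h // 2
    flattened = np.empty_like(image)
    for c in range(w):
        shift = target_row - int(round(curve_rows[c]))
        flattened[:, c] = np.roll(image[:, c], shift)
    return flattened
```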
The inventors have also recognized that the foregoing techniques can be combined advantageously to locate retina fundus features in an image. One example process that incorporates multiple foregoing techniques is described herein in connection with
In some embodiments, the processor(s) described herein can be alternatively or additionally configured to generate a negative derivative of image 1100 to locate retina fundus features within image 1100. For example, the processor(s) can be configured to generate the negative derivative image after locating feature 1122 in image 1100, as feature 1122 can be used to divide the negative derivative image to facilitate locating additional features in image 1100.
Also shown in
According to some embodiments, the subset may be a subset of contiguous pixels having pixel intensity levels above a threshold, and the one or more processors may be configured to determine whether or not a contiguous set of pixels comprises pixels having a pixel intensity level higher than the threshold pixel intensity level.
In
In some embodiments, generating the nodes and/or edges of the graph at step 2401 can include the processor(s) generating nodes for some or all pixels of the image and/or measurement and edges connecting the nodes to one another, such as described herein in connection with
In some embodiments, generating the auxiliary node at step 2402 can include the processor(s) adding the auxiliary node to the nodes corresponding to pixels of the image and/or measurement, such as described herein in connection with
In some embodiments, generating the auxiliary edge at step 2403 can include the processor(s) connecting the auxiliary node to at least one node of the graph generated at step 2401, such as described herein in connection with
In some embodiments, method 2400 may further include locating a boundary between first and second layers of the subject's retina fundus using the graph, such as described herein including in connection with
In some embodiments, method 2400 can further include shifting pixels within columns of the image and/or measurement prior to generating the graph at step 2401. Alternatively or additionally, pixels of the image and/or measurement used to generate the graph may have previously been shifted within columns of the pixels prior to performing method 2400, as embodiments described herein are not so limited.
In some embodiments, method 2400 can further include generating a derivative (e.g., a positive and/or negative derivative) of the image and/or measurement and generating the graph using the derivative(s), as described herein including in connection with
In some embodiments, method 2400 can further include dividing pixels of the image and/or measurement into subsets and selecting a subset of pixels for generating the graph at step 2401 and/or within which to locate a feature (e.g., boundary between layers) of the subject's retina fundus, such as described herein including in connection with
In
In some embodiments, generating a graph from the image and/or measurement at step 2501 may be performed in the manner described herein for step 2401 of method 2400.
In some embodiments, selecting the start and/or end node from the nodes of the graph at step 2502 can include the processor(s) selecting a corner node corresponding to a corner pixel of the image and/or measurement as the start and/or end node. In some embodiments, the processor(s) can select a first node corresponding to a first corner pixel in a first column of the image and/or measurement as the start node and a second node corresponding to a second corner pixel in a second column of the image and/or measurement as the end node.
In some embodiments, generating an auxiliary edge connecting the start or end node to another node of the graph at step 2503 can include the processor(s) generating the auxiliary edge connecting the start node to another node corresponding to a pixel in the same column as the pixel corresponding to the start node, such as described herein in connection with
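A minimal sketch of steps 2502-2503, under the same assumptions as the earlier build_graph sketch: corner pixels serve as start/end nodes, and auxiliary edges with the preset weight connect each corner to the rest of its own perimeter column. The helper name is hypothetical.

```python
def add_corner_auxiliary_edges(adj, image_shape, w_min=1e-5):
    """Variant without auxiliary nodes: the top-left and bottom-right
    corner pixels are selected as start/end nodes (step 2502), and each
    is wired to every other node in its own column with the preset
    minimum weight (step 2503)."""
    rows, cols = image_shape
    start, end = (0, 0), (rows - 1, cols - 1)
    for r in range(1, rows):
        adj.setdefault(start, []).append(((r, 0), w_min))
        adj.setdefault((r, 0), []).append((start, w_min))
    for r in range(rows - 1):
        adj.setdefault(end, []).append(((r, cols - 1), w_min))
        adj.setdefault((r, cols - 1), []).append((end, w_min))
    return start, end
```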
In some embodiments, method 2500 can further include assigning weighted values to some or all edges generated at step 2501 and/or assigning a preset weighted value to the auxiliary edge(s) generated at step 2503, such as described herein in connection with
In some embodiments, method 2500 can further include other steps of method 2400 described herein in connection with
As shown in
In some embodiments, shifting pixels within one or more columns of the image and/or measurement at step 2601 can include the processor(s) shifting the pixels until pixels of the image and/or measurement corresponding to at least one feature of the subject's retina fundus (e.g., the RPE) form a line along one or more rows of the image and/or measurement, as described herein in connection with
In some embodiments, generating a first derivative image and/or measurement from the image and/or measurement for locating the first feature(s) at step 2602 can include the processor(s) generating a positive and/or negative derivative image as described herein in connection with
In some embodiments, generating the second derivative image and/or measurement from the image and/or measurement for locating the second feature(s) at step 2603 can include the processor(s) generating a positive and/or negative derivative image as described herein for step 2602. For example, the processor(s) can generate a negative derivative image that further emphasizes the second feature(s) (e.g., the RPE-BM boundary) in the image and/or measurement. In some embodiments, the processor(s) can determine the location of the second feature(s) using the location(s) of the first feature(s) located at step 2602, as described herein in connection with
In some embodiments, selecting a subset of the image and/or measurement for locating the third feature(s) at step 2604 can include the processor(s) applying a threshold (e.g., pixel intensity threshold) to pixels of the image and/or measurement and selecting one or more subsets of pixels of the image and/or measurement that are above, or below, the threshold, such as described herein in connection with
In some embodiments, step 2604 can alternatively or additionally include generating a derivative image and/or measurement of the same or another selected subset(s), in the manner described herein for steps 2602-2603, for locating the third feature(s) (e.g., the INL-OPL, IPL-INL, and/or OPL-ONL), such as described herein in connection with
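Putting the stages of method 2600 together, a hypothetical end-to-end sketch, reusing the illustrative helpers above and making no claim to be the only valid ordering, might read:

```python
import numpy as np

def locate_layer_boundaries(image):
    """Sketch of the staged pipeline: flatten the image around a rough
    feature estimate (step 2601), locate first/second features in
    positive/negative derivative images (steps 2602-2603), then restrict
    further searches to a thresholded subset (step 2604). All helper
    names and the mean-intensity threshold are illustrative assumptions."""
    rpe_fit = select_bright_cluster(image)              # rough RPE estimate
    cols = np.arange(image.shape[1])
    flat = flatten_to_row(image, rpe_fit(cols))         # step 2601
    pos, neg = derivative_images(flat)                  # steps 2602-2603
    first = lowest_cost_path(build_graph(flat, gradient=pos))   # e.g., ILM
    second = lowest_cost_path(build_graph(flat, gradient=neg))  # e.g., RPE-BM
    subset_mask = flat > flat.mean()                    # step 2604 subset
    return first, second, subset_mask
```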
In some embodiments, method 2600 can further include some or all steps of method 2400 and/or 2500 described in connection with
IV. Applications
The inventors have developed improved imaging and measuring techniques that may be implemented using imaging apparatuses described herein. According to various embodiments, such imaging and measuring techniques may be used for processing an image.
The inventors have recognized that various health conditions may be indicated by the appearance of a person's retina fundus in one or more images captured according to techniques described herein. For example, diabetic retinopathy may be indicated by tiny bulges or micro-aneurysms protruding from the vessel walls of the smaller blood vessels, sometimes leaking fluid and blood into the retina. In addition, larger retinal vessels can begin to dilate and become irregular in diameter. Nerve fibers in the retina may begin to swell. Sometimes, the central part of the retina (macula) begins to swell (macular edema). Damaged blood vessels may close off, causing the growth of new, abnormal blood vessels in the retina. Glaucomatous optic neuropathy, or glaucoma, may be indicated by thinning of the parapapillary retinal nerve fiber layer (RNFL) and optic disc cupping as a result of axonal and secondary retinal ganglion cell loss. The RNFL may be measured, for example, as averages in different eye sectors around the optic nerve head. The inventors have recognized that RNFL defects, for example as indicated by OCT, are one of the earliest signs of glaucoma. In addition, age-related macular degeneration (AMD) may be indicated by the macula peeling and/or lifting, disturbances of macular pigmentation such as yellowish material under the pigment epithelial layer in the central retinal zone, and/or drusen such as macular drusen, peripheral drusen, and/or granular pattern drusen. AMD may also be indicated by geographic atrophy, such as a sharply delineated round area of hyperpigmentation, nummular atrophy, and/or subretinal fluid.
Stargardt's disease may be indicated by death of photoreceptor cells in the central portion of the retina. Macular edema may be indicated by a trench in an area surrounding the fovea. A macular hole may be indicated by a hole in the macula. Diabetic macular edema (DME) may be indicated by fluid accumulation in the retina due to damaged vessel leakage. Eye floaters may be indicated by non-focused optical path obscuring. Retinal detachment may be indicated by severe optic disc disruption and/or separation from the underlying pigment epithelium. Retinal degeneration may be indicated by the deterioration of the retina. Age-related macular degeneration (AMD) may be indicated by a thinning of the retina overall, in particular the RPE layer. Wet AMD may also lead to leakage in the retina. Central serous retinopathy (CSR) may be indicated by an elevation of the sensory retina in the macula and/or localized detachment from the pigment epithelium. Choroidal melanoma may be indicated by a malignant tumor derived from pigment cells initiated in the choroid. Cataracts may be indicated by an opaque lens, and may also blur fluorescence lifetimes and/or 2D retina fundus images. Macular telangiectasia may be indicated by a ring of fluorescence lifetimes increasing dramatically for the macula, and by smaller blood vessels degrading in and around the fovea. Alzheimer's disease and Parkinson's disease may be indicated by thinning of the RNFL. It should be appreciated that diabetic retinopathy, glaucoma, and other such conditions may lead to blindness or severe visual impairment if not properly screened and treated. In another example, optic neuropathy, optic atrophy, and/or choroidal folding can be indicated in images captured using techniques described herein. Optic neuropathy and/or optic atrophy may be caused by damage within the eye, such as glaucoma, optic neuritis, and/or papilledema; damage along the path of the optic nerve to the brain, such as a tumor, neurodegenerative disorder, and/or trauma; and/or congenital conditions such as Leber's hereditary optic atrophy (LHOA) and/or autosomal dominant optic atrophy (ADOA). For example, compressive optic atrophy may be indicated by and/or associated with such extrinsic signs as pituitary adenoma, intracranial meningioma, aneurysms, craniopharyngioma, mucoceles, papilloma, and/or metastasis, and/or such intrinsic signs as optic nerve glioma, optic nerve sheath (ONS) meningioma, and/or lymphoma. Optic atrophy may be indicated by macular thinning with preserved foveal thickness. Vascular and/or ischemic optic atrophy may be indicated by and/or associated with sector disc pallor, non-arteritic anterior ischemic optic neuropathy (NAION), arteritic ischemic optic neuropathy (AION), severe optic atrophy with gliosis, giant cell arteritis, central retinal artery occlusion (CRAO), carotid artery occlusion, and/or diabetes. Neoplastic optic atrophy may be indicated by and/or associated with lymphoma, leukemia, tumor, and/or glioma. Inflammatory optic atrophy may be indicated by sarcoid, systemic lupus erythematosus (SLE), Behcet's disease, demyelination such as multiple sclerosis (MS) and/or neuromyelitis optica spectrum disorder (NMOSD), also known as Devic disease, allergic angiitis (AN), and/or Churg-Strauss syndrome. Infectious optic atrophy may be indicated by the presence of a viral, bacterial, and/or fungal infection. Radiation optic neuropathy may also be indicated.
Toxic optic atrophy and/or nutritional optic atrophy may be indicated in association with ethambutol, amiodarone, methanol, vitamin B12 deficiency, and/or thyroid ophthalmopathy. Metabolic optic atrophy may be indicated by and/or associated with diabetes. Genetic optic atrophy may be indicated by and/or associated with ADOA and/or LHOA. Traumatic optic neuropathy may be indicated by and/or associated with trauma to the optic nerve, ONS hematoma, and/or a fracture. Moreover, in some embodiments, an imaging apparatus may be configured to detect a concussion at least in part by tracking the movement of a person's eye(s) over a sequence of images. For example, iris sensors, white light imaging components, and/or other imaging components described herein may be configured to track the movement of the person's eyes for various indications of a concussion.
Accordingly, in some embodiments, a person's predisposition to various medical conditions may be determined based on one or more images of the person's retina fundus captured according to techniques described herein. For example, if one or more of the above described signs of a particular medical condition (e.g., macula peeling and/or lifting for AMD) is detected in the captured image(s), the person may be predisposed to that medical condition.
The inventors have also recognized that some health conditions may be detected using fluorescence imaging techniques described herein. For example, macular holes may be detected using an excitation light wavelength between 340-500 nm to excite retinal pigment epithelium (RPE) and/or macular pigment in the subject's eye having a fluorescence emission wavelength of 540 nm and/or between 430-460 nm. Fluorescence from RPE may be primarily due to lipofuscin from RPE lysosomes. Retinal artery occlusion may be detected using an excitation light wavelength of 445 nm to excite flavin adenine dinucleotide (FAD), RPE, and/or nicotinamide adenine dinucleotide (NADH) in the subject's eye having a fluorescence emission wavelength between 520-570 nm. AMD in the drusen may be detected using an excitation light wavelength between 340-500 nm to excite RPE in the subject's eye having a fluorescence emission wavelength of 540 nm and/or between 430-460 nm. AMD including geographic atrophy may be detected using an excitation light wavelength of 445 nm to excite RPE and elastin in the subject's eye having a fluorescence emission wavelength between 520-570 nm. AMD of the neovascular variety may be detected by exciting the subject's choroid and/or inner retina layers. Diabetic retinopathy may be detected using an excitation light wavelength of 448 nm to excite FAD in the subject's eye having a fluorescence emission wavelength between 560-590 nm. Central serous chorio-retinopathy (CSCR) may be detected using an excitation light wavelength of 445 nm to excite RPE and elastin in the subject's eye having a fluorescence emission wavelength between 520-570 nm. Stargardt's disease may be detected using an excitation light wavelength between 340-500 nm to excite RPE in the subject's eye having a fluorescence emission wavelength of 540 nm and/or between 430-460 nm. Choroideremia may be detected using an excitation light wavelength between 340-500 nm to excite RPE in the subject's eye having a fluorescence emission wavelength of 540 nm and/or between 430-460 nm.
The inventors have also developed techniques for using a captured image of a person's retina fundus to diagnose various health issues of the person. For example, in some embodiments, any of the health conditions described above may be diagnosed.
In some embodiments, imaging techniques described herein may be used for health status determination, which may include determinations relating to cardiac health, cardiovascular disease and/or cardiovascular risk, anemia, retinal toxicity, body mass index, water weight, hydration status, muscle mass, age, smoking habits, blood oxygen levels, heart rate, white blood cell counts, red blood cell counts, and/or other such health attributes. For example, in some embodiments, a light source having a bandwidth of at least 40 nm may provide sufficient imaging resolution to capture red blood cells having a diameter of 6 μm and white blood cells having diameters of at least 15 μm. Accordingly, imaging techniques described herein may be configured to facilitate sorting and counting of red and white blood cells, estimating the density of each within the blood, and/or other such determinations.
In some embodiments, imaging techniques described herein may facilitate tracking of the movement of blood cells to measure blood flow rates. In some embodiments, imaging techniques described herein may facilitate tracking the width of the blood vessels, which can provide an estimate of blood pressure changes and perfusion. For example, an imaging apparatus as described herein configured to resolve red and white blood cells using a 2-dimensional (2D) spatial scan completed within 1 μs may be configured to capture movement of blood cells at 1 meter per second. In some embodiments, light sources that may be included in apparatuses described herein, such as super-luminescent diodes, LEDs, and/or lasers, may be configured to emit sub-microsecond light pulses such that an image may be captured in less than one microsecond. Using spectral scan techniques described herein, an entire cross section of a scanned line (e.g., in the lateral direction) versus depth can be captured in a sub-microsecond. In some embodiments, a 2-dimensional (2D) sensor described herein may be configured to capture such images for internal or external reading at a slow rate and subsequent analysis. In some embodiments, a 3D sensor may be used. Embodiments described below overcome the challenges of obtaining multiple high-quality scans within a single microsecond.
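As a quick, worked check of the numbers in this paragraph (the cell diameter, flow speed, and sub-microsecond exposure are the figures from the text):

```python
# A red blood cell ~6 um across moving at ~1 m/s traverses its own
# diameter in 6 us, so a sub-microsecond exposure keeps motion blur
# to under ~1 um, a fraction of the cell diameter.
cell_diameter_m = 6e-6   # red blood cell diameter (from the text)
velocity_m_s = 1.0       # blood cell speed (from the text)
exposure_s = 1e-6        # sub-microsecond capture
blur_m = velocity_m_s * exposure_s   # 1e-6 m = 1 um of motion per frame
print(blur_m < cell_diameter_m)      # True: cells remain resolvable
```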
In some embodiments, imaging apparatuses described herein may be configured to scan a line aligned along a blood vessel direction. For example, the scan may be rotated and positioned after identifying a blood vessel configuration of the subject's retina fundus and selecting a larger vessel for observation. In some embodiments, a blood vessel that is small and only allows one cell to transit the vessel in sequence may be selected such that the selected vessel fits within a single scan line. In some embodiments, limiting the target imaging area to a smaller section of the subject's eye may reduce the collection area for the imaging sensor. In some embodiments, using a portion of the imaging sensor facilitates increasing the imaging frame rate to 10 s of KHz. In some embodiments, imaging apparatuses described herein may be configured to perform a fast scan over a small area of the subject's eye while reducing spectral spread interference. For example, each scanned line may use a different section of the imaging sensor array. Accordingly, multiple depth scans may be captured at the same time, where each scan is captured by a respective portion of the imaging sensor array. In some embodiments, each scan may be magnified to result in wider spacing on the imaging sensor array, such as wider than the dispersed spectrum, so that each depth scan may be measured independently.
Having thus described several aspects and embodiments of the technology set forth in the disclosure, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be within the spirit and scope of the technology described herein. For example, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the embodiments described herein. Those skilled in the art will recognize or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described. In addition, any combination of two or more features, systems, articles, materials, kits, and/or methods described herein, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
The above-described embodiments can be implemented in any of numerous ways. One or more aspects and embodiments of the present disclosure involving the performance of processes or methods may utilize program instructions executable by a device (e.g., a computer, a processor, or other device) to perform, or control performance of, the processes or methods. In this respect, various inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement one or more of the various embodiments described above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various ones of the aspects described above. In some embodiments, computer readable media may be non-transitory media.
The terms “image” and “measurement” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean an image and/or measurement, i.e., an image and a measurement, an image, or a measurement. The terms “images” and “measurements” may also be understood to mean images and/or measurements.
The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects as described above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present disclosure need not reside on a single computer or processor, but may be distributed in a modular fashion among a number of different computers or processors to implement various aspects of the present disclosure.
Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.
Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationships between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags, or other mechanisms that establish relationships between data elements.
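As a generic illustration of the two mechanisms mentioned above, consider the following sketch. It is not from the disclosure; all names are hypothetical. It contrasts relating fields by adjacent storage location with relating them through an explicit pointer-like key.

    # A generic illustration (not from the disclosure) of two ways to
    # relate fields of a data structure in storage. Names are hypothetical.
    import struct

    # Relationship by location: pack two fields adjacently, so the byte
    # layout itself conveys that they belong to the same record.
    packed = struct.pack("<if", 7, 0.5)      # (record_id, value) side by side
    record_id, value = struct.unpack("<if", packed)

    # Relationship by pointer/tag: store the fields apart and link them
    # with an explicit key instead of relying on adjacency.
    values_by_id = {7: 0.5}                  # the key acts as the "pointer"
    assert values_by_id[record_id] == value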
When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer, as non-limiting examples. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smartphone or any other suitable portable or fixed electronic device.
Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible formats.
Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet. Such networks may be based on any suitable technology, may operate according to any suitable protocol, and may include wireless networks, wired networks, or fiber optic networks.
The acts performed as part of the methods may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items.
In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively.
This application claims the benefit of priority under 35 U.S.C. § 119(e) to U.S. Provisional Application Ser. No. 63/284,791, titled “FEATURE LOCATION TECHNIQUES FOR RETINA FUNDUS IMAGES AND/OR MEASUREMENTS,” and filed on Dec. 1, 2021, the entire contents of which are incorporated by reference herein.
Number | Date | Country
---|---|---
63284791 | Dec 2021 | US