The present invention relates to an information processing apparatus, an information processing method, and a storage medium.
Various types of ophthalmologic apparatuses employing optical devices have conventionally been used. Such optical devices include anterior eye portion cameras, fundus cameras, and Scanning Laser Ophthalmoscopes (SLOs). There is also known an optical tomographic imaging apparatus based on Optical Coherence Tomography (OCT), which uses multi-wavelength light wave interference. An OCT-based optical tomographic imaging apparatus is capable of obtaining a high-resolution tomographic image of a sample, and is becoming indispensable as an ophthalmologic apparatus in outpatient clinics specializing in retinal care. In recent years, OCT Angiography (hereinafter referred to as OCTA) has been used as a method for visualizing capillary vessels with OCT without using a contrast agent. As discussed in Japanese Unexamined Patent Application Publication No. 2015-515894, OCTA is a method for visualizing thin blood vessels such as capillaries. According to this method, a scan with measurement light is performed at the same position on the retina a plurality of times to detect the motion of scattering particles such as red blood cells.
According to an aspect of the present invention, an information processing apparatus includes a scan control unit configured to control a scan with measurement light for generating a three-dimensional motion contrast image of a fundus, and a display control unit configured to display information indicating a progress of the scan on a display unit.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
In Optical Coherence Tomography Angiography (OCTA), a volume scan is acquired by scanning the same position a plurality of times, and therefore the imaging time tends to be longer than that of a conventional volume scan. A long imaging time may cause fatigue of the subject eye, and accordingly imaging may fail because of blinking, pupil vignetting, or face movement. To prevent this situation, it is beneficial for an operator to talk to the subject while monitoring the progress of imaging. Conventionally, however, the operator has had no way to grasp the progress of imaging.
The technique to be discussed has been devised in view of this issue, to allow an operator to grasp the progress of imaging.
A first exemplary embodiment of the present invention will be described below with reference to the accompanying drawings.
An information processing apparatus 150 performs tomographic image construction and controls the stage unit 120. A display unit 160 displays various information. An input unit 170 is a keyboard, a mouse, or the like, and receives user operations. In the information processing apparatus 150, a central processing unit (CPU) 151 reads a control program stored in a read only memory (ROM) 152 and performs various kinds of processing. A random access memory (RAM) 153 is used as a main memory of the CPU 151 and as a temporary storage area such as a work area. A hard disk drive (HDD) 154 stores various data and various programs. Functions and processing of the information processing apparatus 150 (described below) are implemented when the CPU 151 reads a program stored in the ROM 152 or the HDD 154 and executes the program. These functions and processing may also be implemented by a processor other than a CPU; for example, a graphics processing unit (GPU) may be used instead.
A hardware configuration of the imaging system 100 is not limited to that in the present exemplary embodiment. For example, the display unit 160 and the input unit 170 may be integrally formed with the information processing apparatus 150, as a touch panel apparatus and so on. In addition, the information processing apparatus 150 may be integrally formed with the base unit 130.
Likewise, a third dichroic mirror 223 further branches the optical path 232, according to wavelength band, into an optical path for a charge-coupled device (CCD) 241 for fundus observation and an optical path for a fixation lamp 240. The optical head 110 includes lenses 213 and 214. The lens 213 is driven by a motor (not illustrated) for focusing for the fixation lamp 240 and fundus observation. The CCD 241 has sensitivity to the wavelength of the illumination light for fundus observation (not illustrated), more specifically, around 780 nm. The fixation lamp 240 generates visible light to promote fixation of the subject eye. The optical path 233 includes lenses 210 and 211, a slit prism 219, and an infrared CCD 242 for anterior eye portion observation. The CCD 242 has sensitivity to the wavelength of the illumination light for anterior eye portion observation (not illustrated), more specifically, around 970 nm.
The optical path 231 constitutes an OCT optical system as described above to capture a tomographic image of the fundus of the subject eye 200. More specifically, the optical path 231 is used for acquiring an interference signal for forming a tomographic image. A shutter 234 allows light to irradiate the subject eye only during imaging. An XY scanner 235 scans the fundus with light. Although the XY scanner 235 is illustrated as a single mirror, it scans in two axial (X and Y) directions. The optical head 110 includes lenses 215 and 216. The lens 215 is driven by a motor (not illustrated) to focus, on the fundus of the subject eye 200, the light from a light source 237 emitted from an optical fiber 252 connected to an optical coupler 236. Through this focusing, light returning from the fundus is simultaneously focused into a spot and made incident on the end of the optical fiber 252.
A configuration of the optical path from the light source, a reference optical system, and a spectroscope will be described below. The optical head 110 further includes a light source 237, a mirror 224, and a density filter 238 connected to a motor (not illustrated). The density filter 238 rotates to change the amount of transmitted light. The optical head 110 further includes single mode optical fibers 251 to 254 connected to and integrated with the optical coupler 236. The optical head 110 further includes a lens 217 and a spectroscope 280. These components constitute a Michelson interferometer. Light emitted from the light source 237 passes through the optical fiber 251 and is divided by the optical coupler 236 into measurement light on the optical fiber 252 side and reference light on the optical fiber 253 side.
The measurement light passes through the optical path of the above-described OCT optical system, illuminates the fundus of the subject eye 200 as the observation target, and reaches the optical coupler 236 via the same optical path through reflection and scattering at the retina. On the other hand, the reference light passes through the optical fiber 253, the density filter 238, the lens 217, and a dispersion compensation glass 239 inserted to match the dispersions of the measurement light and reference light, and reaches the mirror 224 to be reflected thereby. Then, the reference light returns along the same optical path and reaches the optical coupler 236.
The measurement light and reference light are coupled by the optical coupler 236 to become interference light. Interference occurs when the light path length of the measurement light becomes almost the same as the light path length of the reference light. The mirror 224 is held to be adjustable in the optical axis direction by a motor and drive mechanism (not illustrated), so that the light path length of the reference light can be adjusted to match the light path length of the measurement light, which depends on the subject eye 200. The interference light is led to the spectroscope 280 via the optical fiber 254.
The optical head 110 includes a polarization adjustment portion 261 on the measurement light side disposed in the optical fiber 252, and a polarization adjustment portion 262 on the reference light side disposed in the optical fiber 253. The polarization adjustment portions 261 and 262 include portions of the optical fibers wound into loops. In the polarization adjustment portions 261 and 262, the loop portions are rotated about the longitudinal direction of the fiber to apply torsion to the respective fibers, enabling adjustment of the polarization states of the measurement light and reference light. In this apparatus, it is assumed that the polarization states of the measurement light and reference light have been adjusted and fixed in advance.
The spectroscope 280 includes lenses 281 and 282, a diffraction grating 283, and a line sensor 284. After the interference light emitted from the optical fiber 254 becomes parallel light via the lens 281, the parallel light is spectrally dispersed by the diffraction grating 283 and focused on the line sensor 284 by the lens 282.
The periphery of the light source 237 will be described below. The light source 237 is a Super Luminescent Diode (SLD), a typical low-coherence light source, with a central wavelength of 855 nm and a wavelength bandwidth of about 100 nm. The bandwidth is a significant parameter because it influences the resolution, in the optical axis direction, of the tomographic image to be obtained. Although an SLD is selected as the type of the light source 237, an Amplified Spontaneous Emission (ASE) light source is also applicable, since the light source only needs to emit low-coherence light. Taking measurement of the eye into consideration, near-infrared light is suitable for the central wavelength. Further, since the central wavelength influences the horizontal resolution of the tomographic image to be obtained, it is desirable that the central wavelength be as short as possible. For both reasons, the central wavelength was set to 855 nm.
Although a Michelson interferometer is used as the interferometer in the present exemplary embodiment, a Mach-Zehnder interferometer is also usable. When there is a large difference in light quantity between the measurement light and the reference light, the use of a Mach-Zehnder interferometer is desirable; when the difference is small, the use of a Michelson interferometer is desirable.
The number of repetitions m may be determined according to the A scan speed and the amount of movement of the subject eye 200. The character p indicates the number of A scan samples in a single B scan. More specifically, the plane image size is determined by p×n. When p×n is large and the measurement pitch (Δx, Δy) is unchanged, a wide range can be scanned. In this case, however, the scanning time increases, causing the above-described issues of motion artifact occurrence and increased burden on the subject.
Δx indicates the interval (x pitch) between adjacent X positions, and Δy indicates the interval (y pitch) between adjacent Y positions. According to the present exemplary embodiment, the x pitch is set to half the beam spot diameter of the irradiation light on the fundus, i.e., 10 μm. Even if the pitch is set smaller than half the beam spot diameter on the fundus, the effect on the definition of the generated image is small. Similar to Δx, Δy is also set to 10 μm. Although Δy may be set larger than 10 μm to reduce the scanning time, it is preferable that Δy not exceed 20 μm, which is the beam spot diameter. As for the x and y pitches, although increasing the beam spot diameter on the fundus degrades the definition, an image of a wide range can be acquired with a small volume of data. The x and y pitches may be freely changed according to clinical demands. Although, in the example illustrated in
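As a rough illustration of the trade-off described above, the relation between p, n, m, the pitches, and the scanning time can be sketched as follows. This is a hypothetical calculation; the 70 kHz A scan rate and the 300 × 300 sampling grid are assumed example values, not taken from the specification.

```python
def scan_field_and_time(p, n, m, dx_um, dy_um, a_scan_rate_hz):
    """p: A scan samples per B scan, n: B scan lines,
    m: repetitions of each line, dx_um/dy_um: pitches in micrometers."""
    width_um = p * dx_um            # scan width  = samples x x pitch
    height_um = n * dy_um           # scan height = lines   x y pitch
    total_a_scans = p * n * m       # each line is scanned m times
    scan_time_s = total_a_scans / a_scan_rate_hz
    return width_um, height_um, scan_time_s

# 300 x 300 sampling grid, 4 repetitions per line, 10 um pitches:
w, h, t = scan_field_and_time(300, 300, 4, 10, 10, 70_000)
# w and h correspond to a 3 mm x 3 mm field on the fundus
```

Doubling p and n at the same pitch quadruples both the scanned area and the scanning time, which is the motion artifact and subject burden trade-off noted above.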
A layer recognition unit 402 extracts retinal layer structures based on a two-dimensional tomographic image, and identifies the shape of each layer boundary. Retinal layers to be identified include the retinal nerve fiber layer (RNFL), ganglion cell layer (GCL), inner nuclear layer (INL), outer nuclear layer + inner segment (ONL+IS), outer segment (OS), retinal pigment epithelium (RPE), and Bruch's membrane (BM). A generation unit 403 selects predetermined pixels from each pixel value sequence (in the depth direction) of the tomographic image obtained by the reconstruction unit 401, and generates a two-dimensional image. Examples of two-dimensional images include projection images, EnFace images, and OCTA images. A projection image is generated by integrating predetermined pixels from each pixel value sequence (in the depth direction) of the tomographic image obtained by the reconstruction unit 401.
Each function of the information processing apparatus 150 (described above with reference to
A method for processing an EnFace image will be described below with reference to
Based on the input layer boundary shapes, the generation unit 403 sets a depth range Zi for generating an EnFace image in each A scan image constituting a tomographic image. The generation unit 403 sets the depth ranges Zi for all A scan images Ai. The generation unit 403 then calculates, for example, the average value, maximum value, or median value of the tomographic image pixels included in the set depth range, and generates an EnFace image from the two-dimensional distribution of these values.
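The EnFace generation described above can be sketched as follows. This is an illustrative implementation, not the apparatus's actual code; the array layout and function names are assumptions for illustration.

```python
import numpy as np

def enface_image(volume, z_top, z_bottom, mode="average"):
    """volume: (n_bscans, depth, n_ascans) tomographic intensities.
    z_top, z_bottom: (n_bscans, n_ascans) integer arrays giving the
    depth range Zi set per A scan from the recognized layer boundaries.
    Collapses each A scan's depth range to one value."""
    n_bscans, _, n_ascans = volume.shape
    out = np.zeros((n_bscans, n_ascans))
    for y in range(n_bscans):
        for x in range(n_ascans):
            column = volume[y, z_top[y, x]:z_bottom[y, x], x]
            if mode == "average":
                out[y, x] = column.mean()
            elif mode == "maximum":
                out[y, x] = column.max()
            else:  # "median"
                out[y, x] = np.median(column)
    return out
```

Because Zi follows the layer boundaries, the depth window can differ per A scan, which is why the reduction is done column by column rather than with a single slice of the volume.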
An OCTA image is generated by using de-correlation values between tomographic images obtained by scanning the same portion a plurality of times. In OCTA image generation, as in the method for processing an EnFace image, the layer recognition unit 402 extracts retinal layer structures from a two-dimensional tomographic image and identifies the shape of each layer boundary. Based on the boundary shapes, the generation unit 403 sets a depth range Zi for calculating OCTA in each A scan image constituting a tomographic image. The generation unit 403 then calculates a two-dimensional OCTA image by averaging, in the depth direction, the de-correlation values between tomographic images included in the depth range Zi.
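As a sketch of this computation: the specification does not fix a particular de-correlation formula, so the amplitude de-correlation below is an assumed, commonly used form, and all names are illustrative.

```python
import numpy as np

def decorrelation(a, b, eps=1e-12):
    # De-correlation between two repeated frames: near 0 for identical
    # signals (static tissue), approaching 1 where the signal changes
    # between repeats (blood flow). eps avoids division by zero.
    return 1.0 - (2.0 * a * b) / (a * a + b * b + eps)

def octa_line(repeats, z_top, z_bottom):
    """repeats: (m, depth, n_ascans) intensity frames acquired at one
    B scan position. Averages de-correlation over the m-1 successive
    frame pairs, then over the depth range Zi of each A scan, yielding
    one line of the two-dimensional OCTA image."""
    m = repeats.shape[0]
    dec = np.mean([decorrelation(repeats[i], repeats[i + 1])
                   for i in range(m - 1)], axis=0)  # (depth, n_ascans)
    return np.array([dec[z_top[x]:z_bottom[x], x].mean()
                     for x in range(dec.shape[1])])
```

Running this line computation for every scanning position y1 to yn assembles the full OCTA image.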
The imaging control unit 404 controls the optical head 110 and the base unit 130. The imaging control unit 404 controls, for example, the scanning with the measurement light. A progress identification unit 405 identifies the scan progress, i.e., the progress of an OCTA image, through control of the imaging control unit 404. A display processing unit 406 controls display on the display unit 160.
When the user presses a Start button 611, the imaging control unit 404 automatically performs focal/alignment adjustment. When performing fine focal/alignment adjustment, the imaging control unit 404 moves the position of the optical head 110 in the z direction relative to the subject eye in response to an operation on a slider 621. The imaging control unit 404 also performs focal adjustment in response to an operation on a slider 622 and performs position adjustment on a coherence gate in response to an operation on a slider 623. Then, when the user presses a Capture button 612, the imaging control unit 404 performs control to start imaging. When a result of image capturing is obtained, a tomographic image 630 is displayed as illustrated in
In step S700, before OCTA image capturing is performed, the imaging control unit 404 acquires an SLO image as a reference image. An SLO image is an example of an image obtained by capturing the imaging range corresponding to an OCTA image (three-dimensional motion contrast image) obtained in imaging control processing. In step S701, the imaging control unit 404 starts OCTA image capturing. The imaging control unit 404 performs B scan image capturing while sequentially changing the scanning line from y1 to yn. The number of scanning lines and the interval between lines are predetermined. This processing is an example of scan control processing for controlling a scan with the measurement light.
In step S702, during imaging, the imaging control unit 404 acquires an SLO image as an observation image. In step S703, the imaging control unit 404 performs image processing to calculate the amount of positional deviation between the reference image and the SLO image acquired in step S702. In step S704, based on the deviation amount, the imaging control unit 404 determines whether the processing target area (scanning line) is to be rescanned. More specifically, the imaging control unit 404 compares the deviation amount calculated in step S703 with a preset threshold value. A deviation amount equal to or larger than the threshold value means that the last acquired B scan position is not correct. In this case, the imaging control unit 404 changes the scanning line back to the last line to correct the scanning position and then performs a rescan. Therefore, when the deviation amount is equal to or larger than the threshold value, the imaging control unit 404 determines that the processing target area is to be rescanned.
When the imaging control unit 404 determines that the processing target area is to be rescanned (YES in step S704), the processing proceeds to step S705. On the other hand, when the imaging control unit 404 determines that the processing target area is not to be rescanned (NO in step S704), the processing proceeds to step S706. In step S705, the imaging control unit 404 changes the scanning line back to the last line. In a case where the current scanning line is Y4, for example, in step S705 the imaging control unit 404 changes the scanning line from Y4 to Y3, and the processing then proceeds to step S706. Although, in the present exemplary embodiment, the imaging control unit 404 changes the scanning line back to the last line when performing a rescan, the line position to be rescanned is not limited to the last line.
The line position to be rescanned may be determined according to the time of one frame of the SLO image used for the rescan determination and the time interval between B scans in OCTA. For example, in a case where a single frame includes 10 B scans, it is desirable, in rescanning, to move the line position back by 10 lines to the corresponding B scan position. In another example, the imaging control unit 404 may set the B scans included in a single frame of the corresponding SLO image as rescan targets, instead of the processing target scanning lines.
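The rescan decision of steps S703 to S705 can be sketched as follows. Function and parameter names are hypothetical; the threshold and the number of lines to move back are the configurable values discussed above.

```python
def next_scan_line(current_line, deviation, threshold,
                   bscans_per_slo_frame=1):
    """Decide the next scanning line. A deviation at or above the
    threshold means the last B scan position was not correct, so the
    scan moves back (here by the number of B scans covered by one SLO
    frame, as the text suggests) and that area is rescanned.
    Returns (line_to_scan_next, rescan_flag)."""
    if deviation >= threshold:
        # eye moved too much: go back and redo the affected line(s)
        return max(0, current_line - bscans_per_slo_frame), True
    # deviation acceptable: keep the acquired line, advance to the next
    return current_line + 1, False
```

With the default of one B scan per frame, a large deviation at line Y4 sends the scan back to Y3, matching the Y4-to-Y3 example above.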
In step S706, the imaging control unit 404 corrects the scanning position based on the deviation amount calculated in step S703. More specifically, the imaging control unit 404 sets, as a new scanning position, a position offset by the deviation amount calculated in step S703. In step S707, the imaging control unit 404 acquires a B scan image. Each time a scanned image (tomographic image) is acquired, the display processing unit 406 updates the display of the tomographic image 630 in the measurement screen 600 to the newly obtained tomographic image. The imaging control unit 404 repeats the processing in steps S702 to S707 while sequentially changing the scanning line from y1 to yn to complete imaging of all scanning areas.
In repeating the processing, the information processing apparatus 150 according to the present exemplary embodiment performs progress display processing for displaying the progress of processing. More specifically, after completion of the processing in step S707, then in step S708, the progress identification unit 405 of the information processing apparatus 150 identifies the scan progress related to OCTA image capturing. More specifically, the progress identification unit 405 identifies, as a scan progress rate, the ratio of the number of B scans completed by the time of the processing in step S708 to the number of B scans to be performed in OCTA image capturing.
The progress identification unit 405 also identifies the remaining time of OCTA image capturing. In a scan, the readout time of the line sensor 284 is the dominant (rate-limiting) factor in the scan processing. According to the present exemplary embodiment, therefore, the progress identification unit 405 calculates the remaining time based on the readout time of the line sensor 284 for a single A scan. More specifically, in the information processing apparatus 150, “(readout time of the line sensor for a single A scan) × (number of A scan repetitions) × (number of B scans)” is preset as the OCTA image capturing time. Based on the number of B scans completed by the time of the processing in step S708, the progress identification unit 405 obtains the elapsed time since the start of imaging and calculates the remaining time by subtracting the elapsed time from the imaging time.
In another example, the progress identification unit 405 may obtain the actual elapsed time since the start of imaging. The progress identification unit 405 may then recalculate the time used for a single B scan based on the actual elapsed time and the number of completed B scans, change the imaging time based on the recalculated time, and recalculate the remaining time according to the changed imaging time.
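The progress-rate and remaining-time calculations described above can be sketched as follows. The function and parameter names are assumptions; the preset total follows the formula quoted in the text, and the optional elapsed-time argument implements the recalculation variant.

```python
def progress_rate(completed_bscans, total_bscans):
    """Scan progress: completed B scans over total B scans."""
    return completed_bscans / total_bscans

def remaining_time_s(a_scan_readout_s, a_scans_per_bscan,
                     total_bscans, completed_bscans, elapsed_s=None):
    """Preset imaging time = (line sensor readout per A scan)
    x (A scan repetitions per B scan) x (number of B scans).
    If the actual elapsed time is supplied, the per-B-scan time is
    instead re-estimated from it (the variant described above)."""
    if elapsed_s is not None and completed_bscans > 0:
        # recalculated imaging time scaled from actual elapsed time
        total_s = (elapsed_s / completed_bscans) * total_bscans
        return total_s - elapsed_s
    per_bscan_s = a_scan_readout_s * a_scans_per_bscan
    return per_bscan_s * (total_bscans - completed_bscans)
```

For instance, at 1 ms readout, 1000 A scans per B scan, and 300 B scans, the preset total is 300 s; after 100 B scans the progress rate is one third.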
In step S709, the display processing unit 406 performs control to display the progress in an imaging screen and update the display. This processing is an example of display control processing for displaying information about the scan progress on the display unit 160. The CPU 151 changes the scanning line yi by incrementing i by one (up to n) to repeat the processing in steps S702 to S709 for the number of scanning lines. When the processing is completed for up to the scanning line yn, imaging control processing ends.
In this way, the imaging system 100 according to the present exemplary embodiment displays the progress of imaging (scanning) at the time of OCTA image capturing. This display allows the operator to know not only the scan progress but also when the scan will end, and therefore to assist and talk to the subject accordingly. As a result, the success rate of imaging can be expected to improve.
As a first modification of the first exemplary embodiment, the display processing unit 406 may display, on the SLO image 640, an image for distinguishing between an area where the scan is completed and an area where it is not. The display contents, however, are not limited to the present exemplary embodiment. For example, as illustrated in
As a second modification, the information indicating the progress is not limited to that of the exemplary embodiment. In another example, the total number of lines, the number of scanned lines, the remaining number of lines, the elapsed processing time, and the total processing time of a scan may be displayed in numerical form.
As a third modification, the timing of updating the progress display is not limited to that of the exemplary embodiment. In another example, the CPU 151 may perform the processing in steps S708 and S709 every other time in the repetition of the processing in steps S702 to S707. In another example, the CPU 151 may periodically perform the processing in steps S708 and S709 asynchronously with the repetition of the processing in steps S702 to S707. In another example, the CPU 151 may perform the processing in steps S708 and S709 when a progress display instruction is received in response to a user operation.
An imaging system 100 according to a second exemplary embodiment will be described below. The imaging system 100 according to the second exemplary embodiment performs a rescan after completion of scanning over the entire scanning range. The description below centers on differences from the imaging system 100 according to the first exemplary embodiment.
When the CPU 151 determines that the processing target area is to be rescanned (YES in step S704), the processing proceeds to step S1000. In step S1000, instead of performing a rescan, the imaging control unit 404 registers the processing target area as a rescan line (rescan area) to be rescanned, and the processing proceeds to step S706. Information about the rescan line is recorded in a storage unit such as the RAM 153.
After completion of the processing in step S707, the processing proceeds to step S1001. In step S1001, the imaging control unit 404 identifies the scan progress related to OCTA image capturing. According to the present exemplary embodiment, in addition to the scan progress, the imaging control unit 404 calculates the processing time needed to perform the rescan processing on the rescan lines registered in step S1000. In step S1002, the display processing unit 406 performs control to display the scan progress on the imaging screen. According to the present exemplary embodiment, in addition to the scan progress for the scanning area, the display processing unit 406 performs control to display, as progress information, the rescan lines and the processing time needed for the rescan processing.
After completion of repeating the processing in steps S702 to S707, S1001, and S1002, the CPU 151 performs a scan on the rescan lines. More specifically, the CPU 151 repetitively performs the processing in steps S1010 to S1015. The processing in steps S1010 to S1013, targeting each of the rescan lines registered in step S1000, is repeated until completion of processing on all of the processing target rescan lines. For convenience of description, hereinafter, the processing in steps S702 to S707, S1001, and S1002 is referred to as scan processing, and the processing in steps S1010 to S1015 is referred to as rescan processing.
The imaging control unit 404 selects one of the rescan lines registered in step S1000 and performs processing in step S1010 and subsequent steps. The processing in steps S1010 to S1013 is scan processing to be performed on the rescan lines, and basic processing is similar to the processing in steps S702 to S707. More specifically, the imaging control unit 404 acquires an SLO image in step S1010, calculates the deviation amount in step S1011, and corrects the scanning position in step S1012. In step S1013, the imaging control unit 404 acquires a B scan image at a new scanning position offset by the deviation amount.
In step S1014, the imaging control unit 404 calculates the processing time in the rescan processing as the scan progress. When the rescan processing has already been started, the remaining processing time is calculated as the processing time. In step S1015, the display processing unit 406 displays the progress and updates the progress display according to the progress identified in step S1014.
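The rescan-time estimate of step S1014 can be sketched with a simple model as follows. The names are assumed, and the per-line time would in practice be derived from the same line sensor readout figures used for the scan-processing estimate above.

```python
def rescan_time_estimate_s(rescan_lines, per_line_s, completed=0):
    """Processing time for the registered rescan lines. Before the
    rescan processing starts, this is the full time for all registered
    lines; once it has started, only the remaining lines are counted."""
    remaining = max(0, len(rescan_lines) - completed)
    return remaining * per_line_s
```

The display processing unit can then show this value alongside the scan progress, so the operator knows how much rescan work will follow the main scan.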
In this way, at the time of OCTA image capturing, the imaging system 100 according to the second exemplary embodiment displays, as the progress of imaging (scanning), both the progress of the scan processing and the progress of the rescan processing. Therefore, together with the progress of the scan processing, the operator can understand the processing time related to the rescan processing performed subsequently.
As a modification of the second exemplary embodiment, the imaging control unit 404 may perform a further rescan on the processing target scanning line when the deviation amount calculated in step S1011 in the rescan processing is equal to or larger than the threshold value.
Modifications of the first and second exemplary embodiments will be described below. Although, in the above-described exemplary embodiments, tracking and rescanning are performed while sequentially changing the scanning line one by one for a general volume scan, tracking and rescanning are not limited thereto. In another example, the above-described exemplary embodiments are also applicable to a case where sampling is performed at the same position a plurality of times, as in a cross scan.
Although, in the above-described examples, the scan progress is displayed on an SLO image, the scan progress display is not limited thereto. In another example, instead of an SLO image, the display processing unit 406 may display the scan progress on an anterior eye portion image acquired for anterior eye portion alignment.
Modifications of progress display will be described below.
In another example, as illustrated in
As a second modification of progress display, the display processing unit 406 may display the progress on the tomographic image 630.
As a third modification of progress display, as illustrated in
In the example illustrated in
In another example, the imaging system 100 may display the elapsed time, the remaining time, the ratio of completed scans, and the ratio of remaining scans related to OCTA image capturing (scanning) in the form of numerical values, a bar graph, or a pie chart. In still another example, the imaging system 100 may perform control to change the color and brightness of the arrow indicating the current scanning position according to the scan progress.
While various exemplary embodiments of the present invention have been described, the present invention is not limited thereto but can be modified in diverse ways without departing from the spirit and scope thereof. Portions of the above-described exemplary embodiments may be combined.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2017-044605, filed Mar. 9, 2017, which is hereby incorporated by reference herein in its entirety.