The present invention relates to an object information acquiring apparatus and to a signal processing method.
In a photoacoustic imaging technique, which is an imaging technique using light, an object is first irradiated with pulsed light generated from a light source. As the irradiating light propagates and diffuses in the object, its energy is absorbed at a plurality of inner portions of the object, generating acoustic waves (hereinafter referred to as photoacoustic waves). By receiving the photoacoustic waves using a probe (transducers) and analytically processing the reception signals in a processing device, information related to the characteristic values of the inner portions of the object is acquired as image data. This allows the distribution of the characteristic values inside the object to be visualized.
In recent years, preclinical studies which acquire images of blood vessels of small animals using photoacoustic imaging and clinical studies which apply photoacoustic imaging to the diagnosis of breast cancer or the like have actively been pursued. U.S. Pat. No. 6,216,025 discloses a technique which uses a probe in which a plurality of receiving elements that detect photoacoustic waves are disposed at different positions along a generally hemispherical surface to visualize the characteristic values of the inner portions of an object. This technique orients the directions in which the plurality of receiving elements disposed over the generally hemispherical surface have high reception directivities toward a predetermined region, allowing the predetermined region to be visualized with high resolution or with low noise and high sensitivity.
A photoacoustic wave generated from an absorber having a cylindrical shape, such as a blood vessel, has a high directivity. Accordingly, the photoacoustic wave propagated from the cylindrical absorber is received only by those of the plurality of receiving elements included in the probe which are located within a limited region. On the other hand, image reconstruction is performed using signals from all the receiving elements included in the probe, so that reception signals containing little of the photoacoustic wave are also used for the reconstruction. As a result, a problem arises in that averaging reduces the intensity of an image of a blood vessel to be visualized.
Particularly when the receiving elements are arranged in a region smaller than the hemisphere, i.e., when the probe has a limited field of vision, a problem arises in that a blood vessel extending in a direction which is more nearly perpendicular to the aperture plane of the probe is more susceptible to the influence of the limited field of vision and the image intensity decreases.
The present invention has been achieved in view of the foregoing problems. An object of the present invention is to allow an absorber which generates a photoacoustic wave having a directivity to be excellently imaged.
The present invention provides an object information acquiring apparatus, comprising:
an irradiating unit which emits light;
a plurality of receiving elements which receive acoustic waves generated when an object is irradiated with the light, convert the acoustic waves to electric signals, and output the electric signals;
a probe which has a supporting member, the plurality of receiving elements being arranged on the supporting member such that respective directivity axes of the receiving elements are converged;
a weighting processing unit which performs weighting on a plurality of electric signals output from the plurality of receiving elements; and
an information acquiring unit which acquires characteristic information of the object using the electric signals, wherein
the plurality of receiving elements include a first receiving element and a second receiving element, and
the weighting processing unit applies weights to the electric signals such that a weight applied to a first electric signal corresponding to the first receiving element receiving a first acoustic wave of first intensity is higher than a weight applied to a second electric signal corresponding to the second receiving element receiving a second acoustic wave of second intensity lower than the first intensity.
The present invention also provides a signal processing method for processing electric signals generated through reception and conversion, by a plurality of receiving elements, of acoustic waves generated upon irradiation of an object with light, the plurality of receiving elements being arranged on a supporting member such that respective directivity axes of the receiving elements are converged, the method comprising the steps of:
performing weighting on a plurality of electric signals output from the plurality of receiving elements; and
acquiring characteristic information of the object using the electric signals, wherein
the plurality of receiving elements include a first receiving element and a second receiving element, and
in the weighting, a weight applied to a first electric signal corresponding to the first receiving element receiving a first acoustic wave of first intensity is higher than a weight applied to a second electric signal corresponding to the second receiving element receiving a second acoustic wave of second intensity lower than the first intensity.
The present invention allows an absorber which generates a photoacoustic wave having a directivity to be excellently imaged.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Referring to the drawings, the following will describe preferred embodiments of the present invention. However, the dimensions, materials, and shapes of components described below, relative positioning thereof, and the like are to be appropriately changed in accordance with a configuration of an apparatus to which the invention is applied and various conditions and are not intended to limit the scope of the invention to the following description.
The present invention relates to a technique which detects photoacoustic waves propagated from an object to generate and acquire the characteristic information of the inner portions of the object. Accordingly, the present invention can be considered an object information acquiring apparatus, a control method therefor, an object information acquiring method, or a signal processing method. The present invention can also be considered a program for causing an information processing device including hardware resources such as a CPU and a memory to implement each of the foregoing methods, or a computer-readable non-transitory storage medium storing the program.
The object information acquiring apparatus of the present invention includes a photoacoustic imaging apparatus using a photoacoustic effect which irradiates an object with light (an electromagnetic wave) to receive acoustic waves generated in the object and acquire the characteristic information of the object as image data. In this case, the characteristic information is information on characteristic values corresponding to a plurality of respective positions in the object which are generated using reception signals derived from the received photoacoustic waves.
The characteristic information acquired by photoacoustic measurement comprises values reflecting the amount of absorbed light energy and the light absorptance. The characteristic information includes, e.g., the source of an acoustic wave generated by irradiation with light at a single wavelength, the initial sound pressure in the object, and the light energy absorption density and absorption coefficient derived from the initial sound pressure. From characteristic information obtained using a plurality of different wavelengths, the concentration of a substance forming a tissue can be acquired. By obtaining an oxygenated hemoglobin concentration and a deoxygenated hemoglobin concentration as the substance concentration, an oxygen saturation distribution can be calculated. As the substance concentration, a glucose concentration, a collagen concentration, a melanin concentration, a volume fraction of fat or water, or the like can also be obtained.
On the basis of characteristic information from each position in the object, a two-dimensional or three-dimensional characteristic information distribution can be obtained. Distribution data can be generated as image data. The characteristic information may also be obtained not as numerical value data, but as distribution information from each position in the object. Specific examples of the distribution information include an initial sound pressure distribution, an energy absorption density distribution, an absorption coefficient distribution, and an oxygen saturation distribution.
The acoustic wave mentioned in the present invention is typically an ultrasound wave and includes an elastic wave referred to as a sound wave or an acoustic wave. An electric signal to which an acoustic wave is converted by a transducer or the like is referred to also as an acoustic signal. However, the ultrasound wave or acoustic wave recited in the present specification is not intended to limit the wavelength of such an elastic wave. The acoustic wave generated by a photoacoustic effect is referred to as a photoacoustic wave or an optical ultrasound wave. An electric signal derived from a photoacoustic wave is referred to also as a photoacoustic signal. Distribution data is referred to also as photoacoustic image data or reconstruction image data.
The following embodiments will describe, as an object information acquiring apparatus, a photoacoustic imaging apparatus which irradiates an object with pulsed light, receives acoustic waves from the object on the basis of a photoacoustic effect, and analyzes the received acoustic waves to acquire the distribution of light absorbers in the object. The photoacoustic imaging apparatus is appropriate for a diagnosis of blood vessel disease, malignancy, or the like for a human or an animal and follow-up observation after chemotherapy. Examples of the object include a part of a living body such as the breast or hand of the object, an animal other than a human such as a mouse, a non-living object, and a phantom.
Referring to
The photoacoustic imaging apparatus further includes an input unit 108 for a user to operate the apparatus, a weighting processing unit 109 which applies weights to the reception signals on the basis of the information received by the input unit, a calculation unit 110 which reconstructs the reception signals into image data, and a display unit 111 which displays generated object information and a user interface (UI) for operating the apparatus.
Note that the calculation unit 110 has a CPU, a main storage device, and an auxiliary storage device. The calculation unit 110 may be included in an independent computer or may be dedicated, exclusively designed hardware. Typically, a workstation or the like is used as the calculation unit 110. A measurement target is an object 112. The weighting processing unit 109 and the calculation unit 110 may also be configured of common hardware. The calculation unit corresponds to the information acquiring unit in the present invention.
Light Source 101
As the light source 101, a laser light source is desirable because it produces a high output. However, instead of a laser, a light emitting diode, a flash lamp, or the like can also be used. Various lasers such as a solid-state laser, a gas laser, a dye laser, and a semiconductor laser can be used. An Nd:YAG-excited Ti:sapphire laser or an alexandrite laser, which has a high output and a continuously variable wavelength, is particularly preferred. Since a substance concentration can be acquired by using reception signals derived from light at a plurality of wavelengths, a configuration having a plurality of single-wavelength lasers with different wavelengths or a configuration including a variable-wavelength laser is preferred.
It is desirable that the wavelength of the pulsed light is a specified wavelength which allows the light to be absorbed in a specified one of the components forming the object and at which the light is propagated into the object. Specifically, when the object is a living body, it is desirable that the wavelength of the pulsed light is at least 700 nm and not more than 1100 nm. To effectively generate photoacoustic waves, the object should be irradiated with the light in a sufficiently short period of time depending on the thermal property of the object. When the object is a living body, the pulse width of the pulsed light (irradiating light) generated from the light source is preferably about 1 nanosecond to 100 nanoseconds.
Irradiating Unit 102
The irradiating unit irradiates the object with the pulsed light emitted from the light source. The irradiating unit includes optical members such as a mirror which reflects the light, a lens which magnifies the light, a diffuser panel which diffuses the light, and a waveguide such as an optical fiber. The irradiating unit leads the irradiating light to the object while shaping the distribution of the irradiating light into an intended shape. Note that, in order to improve the safety of the object and widen the diagnosis region, the irradiating light is preferably spread to have a certain area. The irradiating unit of the present embodiment is located in the vicinity of the pole of the probe and irradiates the object with the light from below. However, the positional relationship between the irradiation direction and the object is not limited thereto.
Probe 103
The probe 103 has an opening which is upwardly open and a configuration in which the plurality of receiving elements 104 are arranged on the surface (inner circumferential surface) of a hemispherical supporting member around a point 113 serving as the center of curvature. The plurality of receiving elements 104 are arranged to be widely and generally equally dispersed over the hemispherical surface. The reception sensitivity of each of the receiving elements 104 to an acoustic wave is highest in the normal direction with respect to its receiving surface and has a distribution around that normal direction. The direction in which the reception sensitivity is high is referred to also as a “directivity axis”, which is typically the normal direction with respect to the receiving surface. By converging the respective directivity axes of the receiving elements to the vicinity of the center of curvature of the probe, a high-sensitivity region 114 which is visualized with high sensitivity and high accuracy is formed around the center of curvature 113. As the shape of the probe, not only a hemispherical shape but also a spherical-crown-like shape, a bowl-like shape, or the like can be used.
The relative positions of the probe 103 and the object vary as the moving mechanism 105 causes the probe 103 to scan the object. As a result of the scanning, the high-sensitivity region 114 is moved relative to the object 112 to allow object information over a wide range to be visualized with high sensitivity and high accuracy. In the present embodiment, the scanning using the probe 103 is performed along an xy-plane. Under the probe 103, the irradiating unit 102 is provided. The light transmitted by an optical transmission system is emitted in a z-direction toward the object 112. The aperture plane of the probe is parallel with the xy-plane.
In the present embodiment, the receiving elements are arranged in a region of the surface of the hemispherical supporting member that is smaller than the full hemisphere. That is, the receiving elements are arranged in a region where the polar angle θp in element coordinates around the center of curvature 113 is limited to less than 90 degrees. This provides a clearance between an aperture plane 115 of the probe and the object. As a result, even in the state where the high-sensitivity region 114 is aligned with a deep portion of the object, the probe 103 is allowed to scan over a wide range in the xy-plane. For example, the receiving elements are arranged in the range of the polar angle θp from −70 degrees to +70 degrees.
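As a rough illustration of such an arrangement, the following sketch (Python with NumPy; the ring-based layout, the function name, and all parameters are illustrative assumptions, not the actual element layout of the apparatus) places receiving elements on a hemispherical supporting member with the polar angle limited to ±70 degrees and orients each directivity axis toward the center of curvature.

```python
import numpy as np

def element_layout(radius, n_rings=8, max_polar_deg=70.0):
    """Sketch: receiving-element positions on a hemispherical supporting member.

    The polar angle theta_p is measured from the pole of the bowl and limited to
    +/- max_polar_deg, so the elements occupy a region smaller than the full
    hemisphere and a clearance remains near the aperture plane. The center of
    curvature (point 113) is taken as the origin.
    """
    d_theta = np.deg2rad(max_polar_deg) / max(n_rings - 1, 1)
    positions, axes = [], []
    for theta in np.deg2rad(np.linspace(0.0, max_polar_deg, n_rings)):
        # keep the spacing along each ring roughly equal to the ring-to-ring spacing
        n_on_ring = max(1, int(round(2.0 * np.pi * np.sin(theta) / d_theta)))
        for phi in np.linspace(0.0, 2.0 * np.pi, n_on_ring, endpoint=False):
            p = radius * np.array([np.sin(theta) * np.cos(phi),
                                   np.sin(theta) * np.sin(phi),
                                   -np.cos(theta)])   # bowl opens upward (+z)
            positions.append(p)
            axes.append(-p / radius)                  # directivity axis toward the center
    return np.array(positions), np.array(axes)
```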
When the object is under test, an acoustic transmission medium such as water, oil, or gel is injected into the clearance between the probe 103 and the object 112. Alternatively, a thin cup-shaped holding member (not shown) which holds the object 112 may also be used. In this case, the acoustic transmission medium is injected into each of the clearance between the probe 103 and the holding member and the clearance between the holding member and the object 112.
Receiving Elements 104
Each of the receiving elements 104 is a means which detects the acoustic wave generated in the object, converts the detected acoustic wave to an electric signal (photoacoustic signal), and outputs the electric signal. The receiving element 104 is referred to also as an acoustic wave detector or a transducer. As the receiving element 104, a receiving element which can detect an acoustic wave in a frequency band of 100 kHz to 10 MHz generated from a living body is used. Specifically, a transducer using the piezoelectric phenomenon of lead zirconate titanate (PZT) or the like, a transducer using the resonance of light, a transducer using a change in electrostatic capacitance such as a CMUT, or the like can be used. As the receiving element 104, a receiving element having a high sensitivity and a wide frequency band is desirable.
As for the receiving elements, the elements can be considered as follows. That is, the plurality of receiving elements include a first receiving element and a second receiving element. The weighting processing unit applies weights to the electric signals such that a weight applied to a first electric signal corresponding to the first receiving element receiving a first acoustic wave of first intensity is higher than a weight applied to a second electric signal corresponding to the second receiving element receiving a second acoustic wave of second intensity lower than the first intensity.
Moving Mechanism 105
The moving mechanism 105 moves the probe 103 to change the relative positions of the plurality of receiving elements 104 and the object 112. As the moving mechanism, an automatic stage using a stepping motor or a servo motor can be used. In the present embodiment, the moving mechanism 105 moves the probe 103 along a two-dimensional spiral path. However, the probe 103 may also be moved along a linear or three-dimensional path.
Signal Receiving Unit 106
The signal receiving unit 106 has a means which amplifies the electric signal output from each of the receiving elements and converts the amplified electric signal to a digital signal. Specifically, the signal receiving unit 106 has an amplifier, an A/D converter, an FPGA chip, and the like. The signal receiving unit 106 collects reception signals from the probe 103 at a predetermined sampling rate with a predetermined number of samples in time series and converts the collected reception signals to digital signal data. Since a plurality of reception signals are obtained, it is desirable that a plurality of signals can be processed simultaneously, which reduces the image formation time. Note that the “reception signal” in the present specification is a concept including the analog signal output from each of the receiving elements as well as the digital signal resulting from the subsequent A/D conversion of the analog signal.
Input Unit 108
The input unit 108 is an input device for performing operations related to imaging, such as setting parameters related to the object information to be generated, giving an instruction to start measurement, and setting observation parameters for the generated object information. In particular, in the present embodiment, the input unit determines a pattern of weights to be applied to the reception signals. The input unit 108 includes a mouse, a keyboard, a touch panel, or the like which receives an input from a user and gives an event notification to software such as an OS in accordance with the operation by the user. In the case of using a touch panel, the display unit 111 serves also as the input unit 108.
Weighting Processing Unit 109
The weighting processing unit 109 has the function of applying, to the reception signals, weights in accordance with the weight pattern specified via the input unit. Weight coefficients in the present embodiment are defined in accordance with the z-coordinates of the receiving elements, and the reception signals are multiplied by the weight coefficients on a per-receiving-element basis. The specific content of the weighting processing will be described later. The reception signals multiplied by the weight coefficients in the weighting processing unit are transmitted to the calculation unit. Note that the weighting processing unit 109 also has the function of transmitting the reception signals to the calculation unit 110 without giving weights thereto (using 1 as the multiplier) to allow a comparison of the weighting effects.
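As a minimal sketch of this step (Python/NumPy; the array shapes, the function name, and the linear “proportional” pattern are assumptions for illustration, not the only pattern the apparatus may use), the weighting can be expressed as a per-element multiplication:

```python
import numpy as np

def apply_z_weights(signals, element_z, pattern="proportional"):
    """Sketch of the weighting step: multiply each element's reception signal
    by a coefficient defined by the element's z-coordinate.

    signals:    array of shape (n_elements, n_samples)
    element_z:  z-coordinate of each receiving element
    """
    z = np.asarray(element_z, dtype=float)
    if pattern == "proportional":
        w = (z - z.min()) / (z.max() - z.min() + 1e-12)   # 0 at the pole, 1 at the edge
    elif pattern == "uniform":
        w = np.ones_like(z)                               # no weighting, for comparison
    else:
        raise ValueError(f"unknown pattern: {pattern}")
    return signals * w[:, None]                           # broadcast over time samples
```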
Calculation Unit 110
The calculation unit 110 is a means which processes the signals resulting from digital conversion and reconstructs an image representing information on the optical property or morphology of the inner portion of the object. For the reconstruction, any method may be used such as a Fourier transformation method, a universal back-projection method (UBP method), a filtered back-projection method, or a phasing addition (Delay and Sum) process. The characteristic information of the inner portion of the object is acquired as a set of voxel data when three-dimensional information is acquired. On the other hand, the characteristic information of the inner portion of the object is acquired as a set of pixel data when two-dimensional information is acquired. The generated image is transmitted as photoacoustic image data to the display unit 111 and presented to a user.
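For concreteness, the following is a simplified sketch of the phasing addition (delay-and-sum) option mentioned above (Python/NumPy; the function signature, the default speed of sound, and the nearest-sample delay are illustrative assumptions, and the solid-angle and filtering terms of universal back-projection are omitted):

```python
import numpy as np

def delay_and_sum(signals, element_pos, voxels, fs, c=1500.0, weights=None):
    """Simplified delay-and-sum (phasing addition) reconstruction sketch.

    signals:     (n_elements, n_samples) reception signals (possibly weighted)
    element_pos: (n_elements, 3) receiving-element positions [m]
    voxels:      (n_voxels, 3) reconstruction points [m]
    fs:          sampling rate [Hz];  c: assumed speed of sound [m/s]
    weights:     optional per-element weights (may also be pre-applied)
    """
    n_elem, n_samp = signals.shape
    if weights is None:
        weights = np.ones(n_elem)
    image = np.zeros(len(voxels))
    for i, r in enumerate(voxels):
        # time of flight from voxel r to each element, converted to a sample index
        idx = np.rint(np.linalg.norm(element_pos - r, axis=1) / c * fs).astype(int)
        valid = idx < n_samp
        image[i] = np.sum(weights[valid] * signals[np.nonzero(valid)[0], idx[valid]])
    return image
```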
Each of the weighting processing unit 109 and the calculation unit 110 can be configured as a program module operated in the same information processing device (such as a PC or work station which has a CPU and a memory and processes information in accordance with a program). However, the weighting processing unit 109 and the calculation unit 110 configured of different information processing devices may also be connected with each other to be used.
Display Unit 111
The display unit 111 displays a photoacoustic image (first image) reconstructed without giving weights to the reception signals from the object, a photoacoustic image (second image) reconstructed by giving weights to the reception signals from the object, and a composite image (third image) of the first and second images. Each of the photoacoustic images is displayed as an arbitrary cross-sectional image or three-dimensional image. The display unit 111 also has a moving image display mode in which photoacoustic images which are sequentially reconstructed every time the probe changes its position are continuously displayed. To ensure a real-time property in a processing flow from the reception of signals to the display of images, it is desirable that image reconstruction processing can be performed at a high speed. As the display unit 111, a fixed-mount-type display such as a liquid crystal display (LCD), a cathode ray tube (CRT), or an organic EL display, a tablet terminal, or the like can be used.
Processing Flow Associated with Reception of Photoacoustic Wave
A description will be given of a processing flow associated with the reception of a photoacoustic wave. In response to an instruction from the system control unit 107, the moving mechanism 105 starts to move such that the probe 103 moves along a predetermined path. The light source 101 generates light at predetermined light emission intervals in response to the instruction from the system control unit 107. The pulsed light generated from the light source 101 at a given time while the probe 103 is moving is propagated by the irradiating unit 102 to irradiate the object 112. A part of the light energy propagated in the object is absorbed by an absorber (e.g., a blood vessel containing a large amount of hemoglobin) which absorbs light at the predetermined wavelength. As a result of thermal expansion of the absorber, a photoacoustic wave is generated.
The probe 103 receives the photoacoustic wave and converts it to time-series reception signals. The reception signals output from the plurality of receiving elements 104 are successively input to the signal receiving unit 106. The signal receiving unit 106 performs amplification and A/D conversion of the reception signals and transmits the digitized reception signals to the weighting processing unit 109.
Directivity of Photoacoustic Wave
The photoacoustic wave generated from an absorber may have a directivity in accordance with the shape of the absorber. For example, the photoacoustic wave generated from a cylindrical absorber has a high directivity in a direction perpendicular to a cylindrical axis. This is because extremely small spherical waves generated from the individual points in the cylindrical absorber are superimposed on each other and consequently an overall wave front is strongly propagated in the direction perpendicular to the cylindrical axis.
A double-headed arrow 203 in each of the drawings shows a simulated range of irradiating light at the height of the center of curvature of the probe. As described above, the photoacoustic wave generated from the cylindrical absorber 201 has a high directivity in the direction perpendicular to the cylindrical axis. As a result, it can be seen that the majority of the photoacoustic wave is incident on the belt-like region having a width in accordance with the size of the range of the irradiating light. For example, in each of
The present invention aims at improving the intensity of a reconstruction image by performing reconstruction in consideration of an intensity distribution of signals from the individual receiving elements resulting from the directivity of the photoacoustic wave as described above. Specifically, by multiplying the signals from the individual elements by weight coefficients in accordance with the signal intensities, the influence of averaging resulting from the use of the signals from all the elements, including the elements on which the photoacoustic wave is incident in small amounts, is reduced to improve the intensity of the reconstruction image of the absorber.
In image reconstruction processing based on a universal back-projection method or the like, information on a characteristic value such as an initial sound pressure is obtained in accordance with Formula (1).
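The drawing for Formula (1) is not reproduced in this text. A form consistent with the symbol definitions in the following paragraph is the standard universal back-projection expression, given here as an assumed reconstruction rather than a quotation of the original figure:

$$p_0(r) = \frac{1}{\Omega_0} \int_{\Omega_0} b\!\left(r_0,\ t = \frac{|r - r_0|}{c}\right) d\Omega_0 \qquad (1)$$

where Ω0 is the total solid angle subtended by the receiving elements and c is the speed of sound in the object.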
In Formula (1), p0(r) represents the initial sound pressure distribution at a position r, b(r0, t) represents projection data, and dΩ0 represents the solid angle of a detector with respect to an arbitrary observation point. By back-projecting the projection data in accordance with the integration in Formula (1), the initial sound pressure distribution p0(r) can be acquired. By integrating the intensity data projected by all the individual elements onto an arbitrary reconstruction voxel and dividing the resulting value by the sum of the solid angles of the individual elements with respect to the arbitrary observation point, the initial sound pressure corresponding to the arbitrary reconstruction voxel, i.e., the intensity of the reconstruction image, is determined.
In the signal processing method of the present invention, such weighting processing as to increase the intensity difference between the receiving element which detects a signal having a relatively high intensity and the receiving element which detects a signal having a relatively low intensity is performed. This can reduce the influence of the averaging and improve the intensity of a reconstruction image of the absorber.
When the cylindrical absorber is inclined relative to the aperture plane of the probe using the center of curvature of the probe as a supporting point, as the inclination angle θ increases, the high-intensity region of the photoacoustic wave moves toward the edge of the probe. When the inclination angle θ exceeds 70 degrees, the number of the elements which receive photoacoustic waves having high intensities decreases, though depending on the shape of the probe or the arrangement of the elements in the probe. This is because, as described above, the receiving elements in the probe are arranged in a range of not larger than the hemisphere and therefore have limited fields of vision. As a result, a phenomenon is observed in which the intensity of a reconstruction image decreases when the inclination angle θ is about 70 degrees or more.
Influence of Directivity on Image and Weighting
This phenomenon also similarly occurs when a blood vessel in a living body is imaged using a photoacoustic imaging apparatus. A blood vessel at a large inclination angle relative to the aperture plane of the probe, i.e., a blood vessel extending in a depth direction is lower in the intensity of a reconstruction image than a blood vessel at a small inclination angle relative to the aperture plane of the probe, i.e., a blood vessel extending in a horizontal direction. Since a plurality of blood vessels extending in various directions are contained in a living body, a phenomenon is observed in which the S/N ratio of a blood vessel lower in intensity decreases due to the artifact noise of a blood vessel higher in intensity, resulting in low contrast.
Accordingly, in the present invention, reconstruction is performed in which weights in accordance with the signal intensity distribution resulting from the directivities of photoacoustic waves are applied to reception signals. Preferably, such weighting is performed as to increase the contribution of the photoacoustic wave generated from a blood vessel extending in the depth direction in a living body to a reconstruction image and reduce the contribution of the photoacoustic wave generated from a blood vessel extending in a relatively horizontal direction to the reconstruction image. As a result, the image intensity of the blood vessel extending in the depth direction is improved.
Processing Flow Associated with Weighting of Reception Signals and Image Display
Using
Step S501 is the step of acquiring reception signals derived from the photoacoustic waves generated from the object. First, the plurality of receiving elements receive the photoacoustic waves propagated from the object irradiated with light. Next, the signal receiving unit performs amplification processing and digital conversion processing on the successively input reception signals. The digital reception signals are transmitted to the weighting processing unit.
Note that weighting and image reconstruction need not immediately be performed on the reception signals. It may also be possible to, e.g., store the reception signals in a memory not shown and perform imaging processing on the reception signals later. The present invention can also be considered to be a signal processing apparatus or a signal processing method which perform weighted image reconstruction on the reception signals already stored in the memory. In that case, the signal processing apparatus selects a weight pattern or performs weighted reconstruction processing in accordance with the positions of the elements in the probe, the positional relationships with a light emitting end, or the like stored as the accompanying information of the reception signals.
Step S502 is the step of specifying a pattern of the weights to be applied to the reception signals. The specification of the weight pattern is performed by the user via the input unit.
Using the weight pattern selecting unit 601, the user selects the weight pattern to be used from among the plurality of weight patterns prepared in advance. Preferably, the weight patterns are categorized according to type, and a large-item selector and a small-item selector are provided to facilitate pattern selection. In accordance with the selected pattern, the receiving elements displayed on the element arrangement display unit 602 are displayed in colors corresponding to the weight coefficients. In accordance with the selected weight pattern, the weight pattern graph display unit 603 displays a graph showing the weight coefficients with respect to the positions of the receiving elements in the z-direction. The user can recognize the selected weight pattern using the element arrangement display unit 602 and the weight pattern graph display unit 603.
The user can also finely and precisely adjust the values in the weight pattern prepared in advance using the element region specifying unit 604 and the weight coefficient specifying unit 605. When the user finely adjusts the weight pattern, the result thereof is reflected on the element arrangement display unit 602 and the weight pattern graph display unit 603. The weight pattern thus specified is transmitted to the weighting processing unit.
An example is shown herein in which the weight pattern is determined through the specification by the user. However, a configuration may also be used in which the predetermined weight pattern is automatically specified in the weighting processing unit 109 to be used. For example, the pattern may be automatically selected such that a larger weight is applied to the element on which a photoacoustic wave is incident in a larger amount on the basis of the relations among the positions of the individual receiving elements, the position and light irradiation direction of the light irradiating unit, and the position of the object. Alternatively, it may also be possible to determine the weights applied to the individual receiving elements or select the weight pattern on the basis of the intensities of the reception signals from the individual receiving elements.
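For the last alternative, a minimal sketch (Python/NumPy; the use of per-element signal energy as the intensity measure and the gamma parameter are assumptions for illustration) of deriving weights directly from the reception signals could look as follows:

```python
import numpy as np

def weights_from_signal_intensity(signals, gamma=1.0):
    """Sketch of automatic weight determination from the reception signals.

    Uses each element's signal energy as its intensity measure and normalizes
    it to [0, 1]; gamma > 1 sharpens the contrast between strongly and weakly
    receiving elements.
    """
    energy = np.sum(np.square(signals), axis=1)   # per-element signal energy
    w = energy / (energy.max() + 1e-12)
    return np.power(w, gamma)
```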
Step S503 is the step in which the weighting processing unit applies the weight pattern to the reception signals.
The weight pattern is not limited to the example described above. The weights may be in an arbitrary pattern intended by the user as long as each of the weights is defined with respect to the position of each element in the z-direction. By giving such weights to the reception signals, it is possible to correct the intensity of a reconstruction image in accordance with the inclination angle of the cylindrical absorber.
As has been described heretofore, in the present step, by multiplying the reception signal from each of the receiving elements by the weight coefficient in accordance with the specified weight pattern, the weight is applied to the reception signal. The reception signal multiplied by the weight coefficient in the weighting processing unit is transmitted to the calculation unit.
Step S504 is the step of reconstructing the weighted reception signals to photoacoustic image data. In this step, the calculation unit converts the digital reception signals resulting from the multiplication by the weight coefficients and received from the weighting processing unit to reconstruction image data showing information on the optical property or morphology of the inner portion of the object and transmits the reconstruction image data to the display unit.
Step S505 is the step of reconstructing the unweighted reception signals to photoacoustic image data. In this step, the digital reception signals received from the signal receiving unit are converted to reconstruction image data showing information on the optical property or morphology of the inner portion of the object.
Step S506 is the step of displaying the photoacoustic image data on the display unit. In this step, the photoacoustic image data as the object information received from the calculation unit is displayed on the display unit.
The composite image display unit 804 has a slide bar 805 which can change the ratio α at which the two images are summed up. The summation ratio α is, e.g., the coefficient used in Formula (2), where I is the image intensity of the composite image at an in-image pixel location (x, y), I_normal is the image intensity of the normal reconstruction image, and I_weight is the image intensity of the weighted reconstruction image.
[Math. 2]
I(x, y) = (1 − α) * I_normal(x, y) + α * I_weight(x, y)   (2)
The user can move the slide bar 805 to any position on the UI. For example, when the slide bar 805 is moved to the left end, the summation ratio of the weighted image is 0 so that the normal reconstruction image is displayed on the composite image display unit 804. When the slide bar 805 is moved to the right end, the summation ratio of the normal reconstruction image is 0 so that the weighted reconstruction image is displayed on the composite image display unit 804.
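A minimal sketch of the compositing in Formula (2) (Python/NumPy; it assumes the two reconstruction images are already co-registered arrays of the same shape, which is not stated explicitly above):

```python
import numpy as np

def composite_image(i_normal, i_weight, alpha):
    """Blend the normal and weighted reconstruction images per Formula (2).

    alpha = 0 shows only the normal reconstruction (slide bar at the left end),
    alpha = 1 shows only the weighted reconstruction (slide bar at the right end).
    """
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return (1.0 - alpha) * i_normal + alpha * i_weight
```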
When the signals having the weights which increase as the locations where the photoacoustic waves corresponding to the signals are generated are closer to the edge of the probe (higher positions along the z-axis), as shown in
In the example shown in
As has been described heretofore, the photoacoustic imaging apparatus according to the present embodiment allows an improvement in the image intensity of the absorber as the object to be visualized. In particular, by giving the reception signals weight coefficients proportional to the z-positions of the receiving elements in the probe, it is possible to improve the image intensity of a blood vessel extending in the depth direction, which is low in image intensity and less likely to be visualized in conventional reconstruction.
The following will describe a configuration of and processing by a photoacoustic imaging apparatus according to a second embodiment with emphasis on portions different from those in the first embodiment.
As shown in
In addition, by giving such a weight pattern as used in the present embodiment to the reception signals, the number of signals used for image reconstruction processing in the calculation unit is reduced, allowing a reduction in calculation load. In general, image reconstruction processing involves a large amount of calculation. Particularly when real-time reconstruction is performed following the reception of photoacoustic waves, the ratio of the time period required for the reconstruction processing to the time period from the reception of the photoacoustic waves to the display of a reconstruction image is high. Therefore, it is desirable to minimize the reconstruction processing time. By giving a weight coefficient of 0 to each of the receiving elements included in the specified region defined by the positions of the receiving elements in the z-direction, it is possible to simultaneously enhance the image in accordance with the angle at which a blood vessel extends and reduce the time required for the image reconstruction processing.
In the example shown in the present embodiment, the weight coefficient is constant in the range z1 ≤ z ≤ za. However, the weight coefficient need not be constant and may vary in accordance with the z-position. The range of z may also be any position or any range between 0 and za. Any weight pattern can be used as long as the weight coefficient in the specified region defined by the z-position is 0 and the data amount of the reception signals used for the image reconstruction processing can thereby be reduced.
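A minimal sketch of the calculation-load reduction described above (Python/NumPy; the helper name and array layout are assumptions): elements whose weight coefficient is 0 contribute nothing to the reconstruction sum and can simply be dropped before reconstruction.

```python
import numpy as np

def drop_zero_weight_elements(signals, element_pos, weights):
    """Remove receiving elements whose weight coefficient is 0 so that the
    reconstruction step processes fewer signals (smaller calculation load)."""
    weights = np.asarray(weights, dtype=float)
    keep = weights > 0.0
    return signals[keep], element_pos[keep], weights[keep]
```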
The following will describe a configuration of and processing by a photoacoustic imaging apparatus according to a third embodiment with emphasis on portions different from those in the embodiments described above.
Methods for performing the reconstruction processing using the reception signals to which the weight pattern described above is applied are roughly divided into two categories. A method in the first category does not use the signals from the receiving elements in region 1 and applies the respective weight coefficients to the reception signals from the receiving elements included in regions 2 and 3 to reconstruct a single image. A method in the second category also does not use the signals from the receiving elements in region 1, but uses only the reception signals from the receiving elements included in each of regions 2 and 3 to generate a separate reconstruction image for each region. When image reconstruction is performed on a per-region basis using the reception signals from the receiving elements included in each of the regions, since each of the regions is defined by its position in the z-direction, an image in which a blood vessel at the angle corresponding to each region is particularly enhanced is reconstructed.
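The second category can be sketched as follows (Python/NumPy; the representation of region boundaries as z-ranges and the delay_and_sum helper from the earlier sketch are assumptions for illustration):

```python
def reconstruct_per_region(signals, element_pos, voxels, fs, z_ranges):
    """Sketch of the second category: one reconstruction per z-defined element
    region; region 1 is simply omitted from z_ranges."""
    images = []
    for z_lo, z_hi in z_ranges:                    # e.g. [(z1, z2), (z2, z3)]
        in_region = (element_pos[:, 2] >= z_lo) & (element_pos[:, 2] < z_hi)
        images.append(delay_and_sum(signals[in_region], element_pos[in_region],
                                    voxels, fs))
    return images
```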
The following will describe a configuration of and processing by a photoacoustic imaging apparatus according to a fourth embodiment with emphasis on portions different from those in the embodiments described above.
As shown in
However, when the position and angle of a blood vessel as an object to be observed are unknown, it is impossible to determine only one weighted element region in advance. Accordingly, in the present embodiment, belt-like regions in a plurality of patterns are specified and image reconstruction corresponding to the individual regions is performed a plurality of times. The reconstruction images generated for the individual regional patterns may be displayed in juxtaposition in the same manner as in the example shown in the third embodiment or a composite image of the individual reconstruction images may also be displayed. Alternatively, it may also be possible that the user specifies an arbitrary composite coefficient or the calculation unit has a program which produces a composite image selectively from target images having relatively high intensities. Still alternatively, the calculation unit may also have the function of estimating the center position and polar angle of the cylindrical absorber from the intensities of the reconstruction images corresponding to the individual regions.
To reduce the number of patterns, it may also be possible to have a weight pattern having a large region, as shown in
As described above, the photoacoustic imaging apparatus according to the present embodiment allows an improvement in the image intensity of the absorber as an object to be visualized.
Note that, when the photoacoustic wave generated from the absorber has a directivity, the present invention applies weights to the respective electric signals derived from the plurality of receiving elements to thus achieve the effect. Accordingly, the present invention is intended not only for a blood vessel as the absorber. For example, a cylindrical absorber other than a blood vessel, such as a lymph vessel, can also be a measurement target. The present invention is applicable to any case where a photoacoustic wave shows any directivity on the basis of the shape of the absorber other than a cylindrical shape.
While the present invention has been described heretofore with reference to the specific embodiments, it is to be understood that the invention is not limited to the specific embodiments described above. The embodiments can be modified within a scope not departing from the technical idea of the present invention.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2016-202182, filed on Oct. 14, 2016, which is hereby incorporated by reference herein in its entirety.