Aspects of the present invention generally relate to a photoacoustic apparatus using a plurality of semiconductor light-emitting elements.
In recent years, photoacoustic apparatuses, which image the inside of a subject using the photoacoustic effect, have been researched and developed as an imaging technology using light. Such an apparatus forms an absorption coefficient distribution image based on ultrasonic waves (photoacoustic waves) generated, via the photoacoustic effect, by an optical absorber that has absorbed the energy of light radiated onto the subject. The apparatus then generates a structure image or a function image of the inside of the subject from the absorption coefficient distribution image.
To acquire information about a region of the subject broader than the irradiation spot of the light, it is possible not only to perform scanning while changing the position of a single light radiation unit, but also to provide a plurality of light radiation units and sequentially switch which unit radiates light.
Japanese Patent Application Laid-Open No. 2005-218684 discusses a configuration which guides light emitted from a light source to radiation units arranged in an array-like manner via a plurality of optical fibers and sequentially radiates light from the radiation units onto a subject. Moreover, the configuration discussed in Japanese Patent Application Laid-Open No. 2005-218684 generates an image of the subject based on photoacoustic waves that are generated by the subject being sequentially irradiated with light.
The configuration discussed in Japanese Patent Application Laid-Open No. 2005-218684 requires as many optical fibers as there are light radiation units, and is therefore difficult to package at high density. Moreover, it offers no findings on obtaining a good-quality image based on photoacoustic waves acquired by sequentially radiating light.
According to an aspect of the present invention, a photoacoustic apparatus includes a plurality of semiconductor light-emitting elements, an ultrasonic wave reception unit, a control unit, and an image generation unit. The semiconductor light-emitting elements radiate light onto a subject. The ultrasonic wave reception unit receives ultrasonic waves generated by the radiation of light onto the subject and outputs an electrical signal based on the received ultrasonic waves. The control unit controls light emission patterns of the plurality of semiconductor light-emitting elements to radiate the light onto the subject in positions different from each other and at times different from each other. The image generation unit generates a plurality of images, each such image being generated by independently reconstructing a respective electrical signal output by the ultrasonic wave reception unit based on the light radiated onto the subject in a respective one of the positions and at a respective one of the times. The image generation unit generates a composite image concerning the subject by combining the plurality of images.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinafter, various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings. In this regard, however, for example, the dimension, material, shape, and relative disposition of each constituent component described below can be changed or altered as appropriate according to the configuration of an apparatus to which the invention is applied and various conditions thereof. Therefore, the scope of the invention is not intended to be limited to the following description.
A photoacoustic apparatus according to an exemplary embodiment of the invention is related to a technique for generating and acquiring characteristic information about the inside of a subject by detecting acoustic waves propagating from the subject. Accordingly, the invention can be regarded as a photoacoustic apparatus or a control method therefor, or as a subject information acquisition method or a signal processing method. The invention can also be regarded as a display method for generating and displaying an image representing characteristic information about the inside of a subject. The invention can further be regarded as a program that causes an information processing apparatus including hardware resources, such as a central processing unit (CPU) and a memory, to perform the above methods, or as a non-transitory computer-readable storage medium storing such a program.
The photoacoustic apparatus according to an exemplary embodiment of the invention includes a photoacoustic imaging apparatus using the photoacoustic effect, which receives photoacoustic waves generated inside a subject by light (electromagnetic waves) being radiated onto the subject and acquires characteristic information about the subject as image data. In this case, the characteristic information is information about characteristic values respectively corresponding to a plurality of positions inside the subject, generated with use of signals derived from the received photoacoustic waves.
In the present exemplary embodiment, photoacoustic image data is a concept including every piece of image data derived from photoacoustic waves generated by light irradiation. For example, the photoacoustic image data is image data representing a spatial distribution of at least one piece of subject information, such as the generated sound pressure (initial sound pressure), absorbed energy density, and absorption coefficient of photoacoustic waves, and the density (for example, oxygen saturation) of a material constituting a subject. Furthermore, photoacoustic image data representing spectral information, such as the density of a material constituting a subject, is obtained based on photoacoustic waves generated by light irradiation with a plurality of wavelengths different from each other. The photoacoustic image data representing spectral information can be an oxygen saturation, a value obtained by weighting the oxygen saturation with the density such as an absorption coefficient, a total hemoglobin concentration, an oxyhemoglobin concentration, or a deoxyhemoglobin concentration. Moreover, the photoacoustic image data representing spectral information can be a glucose concentration, a collagen concentration, a melanin concentration, or a volume fraction of fat or water.
A two-dimensional or three-dimensional characteristic information distribution is obtained based on pieces of characteristic information at various positions inside the subject. The distribution data can be generated as image data. The characteristic information can be obtained not as numerical data but as distribution information at various positions inside the subject. In other words, the characteristic information can be distribution information, such as an initial sound pressure distribution, an energy absorption density distribution, an absorption coefficient distribution, or an oxygen saturation distribution.
In the present exemplary embodiment, the acoustic waves are typically ultrasonic waves, and include elastic waves called sound waves or acoustic waves. An electrical signal obtained by transducing acoustic waves with, for example, a transducer is also referred to as an “acoustic signal”. In this regard, however, the term “ultrasonic waves” or “acoustic waves” in the context of the present specification is not intended to limit wavelengths of such elastic waves. Acoustic waves generated by the photoacoustic effect are referred to as “photoacoustic waves” or “photoultrasonic waves”. An electrical signal derived from photoacoustic waves is also referred to as a “photoacoustic signal”. The distribution data is also referred to as “photoacoustic image data” or “reconstructed image data”.
In the following exemplary embodiments, a photoacoustic apparatus which irradiates a subject with pulsed light, receives photoacoustic waves from the subject, and generates a blood vessel image (structure image) of the inside of the subject is taken as an example of a subject information acquisition apparatus. While, in the following exemplary embodiments, a photoacoustic apparatus including a hand-held type probe is taken as an example, the following exemplary embodiments can also be applied to a photoacoustic apparatus which includes a probe mounted on a stage and performs mechanical scanning.
A photoacoustic apparatus according to an exemplary embodiment of the invention includes a plurality of semiconductor light-emitting elements, and an ultrasonic wave reception unit configured to receive ultrasonic waves generated by radiation of light from the plurality of semiconductor light-emitting elements onto a subject to output an electrical signal. The photoacoustic apparatus further includes an image generation unit configured to reconstruct an image concerning the subject based on the output electrical signal, and a control unit configured to control light emission patterns of the plurality of semiconductor light-emitting elements in such a manner that light is radiated from the plurality of semiconductor light-emitting elements onto the subject in positions different from each other and at times different from each other.
Then, the image generation unit generates a plurality of images by independently reconstructing a plurality of electrical signals, each derived from light radiated onto the subject at a different position and at a different time, and generates a composite image by combining the plurality of images.
The photoacoustic apparatus according to the present exemplary embodiment includes a light radiation unit containing a plurality of semiconductor light-emitting elements, and therefore facilitates high-density packaging. Moreover, since the photoacoustic apparatus according to the present exemplary embodiment generates a plurality of images by independently reconstructing a plurality of electrical signals derived from light radiated onto the subject at positions different from each other and at times different from each other, noise derived from light radiated at the other positions and times is unlikely to enter each image. Therefore, a good-quality image can be obtained.
Furthermore, it is desirable that the image generation unit perform weighting processing during generation of a composite image based on information concerning a light quantity distribution of an irradiated region of light on the subject, which is determined, at least, based on the light emission pattern. In a plurality of light emission patterns, a region in which irradiated regions of light overlap becomes larger in light quantity than a region in which irradiated regions of light do not overlap. Therefore, the image generation unit performs correction processing for performing weighting corresponding to the light quantity distribution, so that an influence of a difference in light quantity distribution on an image can be reduced.
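For illustration only, the weighting described above can be sketched as follows in Python. The function name, the per-emission light-quantity (fluence) maps, and the normalization scheme are assumptions made for this sketch; an actual apparatus would derive the light quantity distribution from the light emission pattern and the optical properties of the subject.

```python
import numpy as np

def composite_with_fluence_weighting(images, fluence_maps, eps=1e-12):
    """Combine per-emission reconstructed images into one composite.

    Each image is weighted by the light-quantity (fluence) map of its
    emission, and the sum is normalized by the total fluence, so that
    regions where irradiated areas overlap (and thus receive more light)
    are not over-represented in the composite.
    Illustrative sketch; all names are hypothetical.
    """
    images = np.asarray(images, dtype=float)          # (n_emissions, H, W)
    fluence_maps = np.asarray(fluence_maps, dtype=float)
    weighted_sum = np.sum(images * fluence_maps, axis=0)
    total_fluence = np.sum(fluence_maps, axis=0)
    # Guard against division by zero outside all irradiated regions.
    return weighted_sum / np.maximum(total_fluence, eps)
```

With this normalization, a pixel covered by two overlapping irradiated regions is averaged over both partial images rather than doubled, which is one way to reduce the influence of the light quantity difference on the composite.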
Moreover, the image generation unit can determine a region in which to reconstruct an image, based on information concerning a light quantity distribution of an irradiated region of light on the subject corresponding to the light emission pattern.
Moreover, it is desirable that the control unit be configured to enable light emission in radiation modes differing from each other in the sequence of light emission patterns. To enable light emission in radiation modes different from each other, the control unit is able to cause the plurality of semiconductor light-emitting elements to perform light emission in a light emission pattern corresponding to each radiation mode. Furthermore, at least one of the radiation modes can be set as a light emission pattern in which all of the plurality of semiconductor light-emitting elements perform light emission. The radiation modes can be switched by a mode control unit. During switching of the radiation modes, a radiation mode (light emission pattern) can be determined based on an instruction from the user or on information concerning the optical absorption coefficient of the surface of the subject (for example, the skin of a living body). For example, in a case where the optical absorption coefficient of the surface of the subject is smaller than a predetermined value, the mode control unit is able to select a radiation mode in which all of the plurality of semiconductor light-emitting elements perform light emission. The predetermined value can be set based on, for example, the color of the skin.
Referring to
Moreover, in a case where the optical absorber 101b is located in front of or behind the isochronous surface 121a of the transducer 120a, the large photoacoustic waves generated by the optical absorber 101b are received before or after the photoacoustic waves generated by the optical absorber 101a. As a result, after reconstruction, a signal which is actually non-existent is generated in front of or behind the optical absorber 101a, and thus also becomes a large cause for noise.
On the other hand, photoacoustic waves generated by the optical absorber 101x, which is located at a place in which the energy of radiation by the semiconductor light-emitting element 200a is large, are not present on the isochronous surface 121a of the transducer 120a with respect to the photoacoustic waves generated by the optical absorber 101a. Therefore, a photoacoustic signal obtained by conversion performed by the transducer 120a is able to be easily separated based on a difference in reception time.
Next, a case where photoacoustic waves generated by the optical absorber 101a are received and converted into an electrical signal by the transducer 120b or 120c is described. Photoacoustic waves generated by the optical absorber 101c, which is located on the isochronous surface 121b of the transducer 120b in a region where the energy of radiation by the semiconductor light-emitting element 200c is large, become a cause for noise with respect to the photoacoustic waves generated by the optical absorber 101a. Similarly, photoacoustic waves generated by the optical absorber 101d, which is located on the isochronous surface 121c of the transducer 120c in a region where the energy of radiation by the semiconductor light-emitting element 200d is large, become a cause for noise with respect to the photoacoustic waves generated by the optical absorber 101a.
As described above, with respect to photoacoustic waves generated by an optical absorber of interest, which are received by a transducer of the ultrasonic wave reception unit 120, photoacoustic waves having a large intensity generated by an optical absorber located on the isochronous surface and close to a semiconductor light-emitting element become a cause for noise.
As mentioned above, photoacoustic waves generated by an optical absorber located near the skin surface close to a semiconductor light-emitting element have a great influence. For example, in a case where the amount of melanin pigment contained in the skin is large, photoacoustic waves generated at the skin surface are large, so that the above-mentioned cause for noise becomes large. Moreover, for example, a mole or body hair also becomes a cause for noise.
From this, it can be understood that, to reduce a cause for noise, it would be good that, as illustrated in
In this way, turning off the semiconductor light-emitting elements other than the one corresponding to the region (irradiated region) in which to reconstruct an image (generate a reconstructed image) reduces the influence of optical absorbers located near the other semiconductor light-emitting elements. Naturally, in this case, photoacoustic waves are seldom or never generated in the irradiated regions corresponding to the turned-off semiconductor light-emitting elements, so that images cannot be formed for those regions. Accordingly, first, only the semiconductor light-emitting element 200a is caused to perform light emission, and a reconstructed image is generated for the irradiated region 201a corresponding to the semiconductor light-emitting element 200a. Next, only the semiconductor light-emitting element 200b is caused to perform light emission, and a reconstructed image is generated for the irradiated region 201b. In the same way, only the semiconductor light-emitting element 200c and then only the semiconductor light-emitting element 200d are caused to perform light emission, and reconstructed images are generated for the irradiated regions 201c and 201d, respectively. Finally, the acquired reconstructed images are combined to obtain a reconstructed image of the entire subject.
This enables acquiring a reconstructed image having good image quality over the entire region of the subject. On the other hand, since radiation is performed four times to obtain a reconstructed image of the entire subject, approximately four times as long is required to obtain the reconstructed image.
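The sequential radiation mode described above can be summarized as a simple control loop. This is a sketch only: fire_and_acquire, reconstruct_region, and combine are hypothetical stand-ins for the element driver, the image generation unit, and the combining processing, not the apparatus's actual interfaces.

```python
def run_sequential_mode(elements, fire_and_acquire, reconstruct_region, combine):
    """Fire each semiconductor light-emitting element alone, reconstruct
    only its corresponding irradiated region, and combine the partial
    images into one image of the entire subject.
    All callables are hypothetical stand-ins (illustrative sketch).
    """
    partial_images = []
    for element in elements:
        # Only this element emits; the others stay off to suppress noise
        # from absorbers near them.
        signal = fire_and_acquire(element)
        partial_images.append(reconstruct_region(element, signal))
    return combine(partial_images)
```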
The driver unit 210 controls light emission of a plurality of semiconductor light-emitting elements of the light radiation unit 200 according to a radiation pattern (light emission pattern) of a radiation mode, which is described below. Details of the method for controlling light emission of a plurality of semiconductor light-emitting elements of the light radiation unit 200 are described below.
The semiconductor light-emitting elements of the light radiation unit 200 perform light emission with a first period (sampling period) according to a radiation pattern, thus irradiating the subject 100. The ultrasonic wave reception unit 120 receives photoacoustic waves generated from the subject 100 with the first period (sampling period), thus outputting an electrical signal (photoacoustic signal) as an analog signal. The signal acquisition unit 140 converts the analog signal output from the ultrasonic wave reception unit 120 into a digital signal, thus outputting the digital signal to the computer 150. The computer 150 calculates, with a second period (the period of image capturing frame rate), an arithmetic mean of the digital signals output from the signal acquisition unit 140 with the first period (sampling period), and stores the arithmetic mean as an electrical signal derived from photoacoustic waves (photoacoustic signal) in a memory. The computer 150 generates photoacoustic image data by performing processing, such as image reconstruction, on the stored digital signal. Then, the photoacoustic image data is displayed by the display unit 160.
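The flow of this paragraph, light emission with the first period, arithmetic averaging, and reconstruction with the second period, can be sketched as follows. acquire_pulse and reconstruct are hypothetical stand-ins for the signal acquisition unit 140 and the reconstruction processing of the computer 150; this is an illustration, not the apparatus's actual software.

```python
import numpy as np

def frame_pipeline(acquire_pulse, reconstruct, pulses_per_frame):
    """One image-capturing frame: the subject is irradiated
    pulses_per_frame times with the first period (sampling period),
    the raw digital signals are averaged sample by sample, and one
    image is reconstructed with the second period (frame rate period).
    """
    # Each acquisition is one digital data string per light emission.
    raw = np.stack([acquire_pulse() for _ in range(pulses_per_frame)])
    averaged = raw.mean(axis=0)   # arithmetic mean over the frame
    return reconstruct(averaged)
```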
Moreover, the computer 150 performs control of the entire photoacoustic apparatus 1.
Although not illustrated, the computer 150 can perform image processing for displaying or processing for synthesizing a graphic for graphical user interface (GUI) on the obtained photoacoustic image data.
While the present exemplary embodiment is described with use of the terms "first period (sampling period)" and "second period (the period of image capturing frame rate)", the "period" used in exemplary embodiments of the invention does not need to be perfectly constant in repetition time. In other words, the term "period" is used even in a case where repetition is performed at time intervals that are not constant. Moreover, the first period (sampling period) may include a break period; in that case, the repetition time excluding the break period is referred to as a "period" in the exemplary embodiments of the invention.
The user (for example, a doctor or technician) can perform a diagnosis by checking a photoacoustic image displayed on the display unit 160. The displayed image can be stored in, for example, a memory included in the computer 150 or a data management system connected to the photoacoustic apparatus 1 via a network, based on a storing instruction from the user or the computer 150. The input unit 170 receives, for example, an instruction from the user.
Subsequently, a desirable configuration of each block is described in detail.
Referring to
The probe 180, which is illustrated in
As illustrated in
The light radiation unit 200 generates pulsed light to be radiated onto the subject 100. To acquire a material density such as the oxygen saturation, the light radiation unit 200 is desirably a light source capable of outputting a plurality of wavelengths.
Moreover, in view of the quantity of light required of the light source and of mounting inside the housing of the probe 180, it is desirable that the light radiation unit 200 include a plurality of semiconductor light-emitting elements, such as semiconductor lasers or light-emitting diodes, as illustrated in
The pulse width of light which the light radiation unit 200 emits is, for example, 10 nanoseconds (ns) or more and 1 microsecond (μs) or less. Moreover, the wavelength of the light is desirably 400 nanometers (nm) or more and 1600 nm or less, and can be determined according to the optical absorption characteristics of the optical absorber intended to be imaged. To image a blood vessel at high resolution, wavelengths at which absorption by a blood vessel is large (400 nm or more and 800 nm or less) can be used. To image the deep portion of a living body, light with wavelengths at which absorption by the background tissues (for example, water and fat) of the living body is small (700 nm or more and 1100 nm or less) can be used. In the present exemplary embodiment, since semiconductor light-emitting elements are used as the light source of the light radiation unit 200, the quantity of light per emission is insufficient; in other words, a photoacoustic signal obtained by performing radiation once does not reach the intended signal-to-noise ratio (S/N). Therefore, for each order of the light emission sequence, light emission is performed with the first period (sampling period), an arithmetic mean of the photoacoustic signal is calculated to improve the S/N, and then a reconstructed image is calculated with the second period (the period of image capturing frame rate) based on the averaged photoacoustic signal.
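The benefit of the arithmetic mean can be checked numerically: averaging N acquisitions of a signal corrupted by independent, zero-mean noise reduces the noise standard deviation by approximately the square root of N. This is a general statistical property, not a value specific to this apparatus; the sketch below assumes white Gaussian noise.

```python
import numpy as np

def snr_gain_from_averaging(n_pulses, n_samples=4096, noise_sigma=1.0, seed=0):
    """Empirically estimate the factor by which averaging n_pulses
    noisy acquisitions reduces the noise standard deviation.
    Ideally the factor is sqrt(n_pulses), assuming independent,
    zero-mean noise (an assumption of this sketch).
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, noise_sigma, size=(n_pulses, n_samples))
    averaged = noise.mean(axis=0)          # arithmetic mean over pulses
    return noise_sigma / averaged.std()    # measured noise reduction
```

For example, averaging 64 acquisitions yields a noise-reduction factor close to 8, i.e., roughly a sqrt(64) improvement in S/N for a fixed signal amplitude.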
In the case of a use application for which outputs of the semiconductor light-emitting elements are sufficient, light emission with a first period (sampling period) and calculation of an arithmetic mean of the photoacoustic signal do not need to be performed. In other words, a reconstructed image can be generated with radiation performed once.
In the present exemplary embodiment, a wavelength of 797 nm is desirable for the light radiation unit 200. This wavelength can reach the deep portion of a subject and is suitable for detection of a blood vessel structure because the absorption coefficients of oxyhemoglobin and deoxyhemoglobin are approximately equal at this wavelength. Moreover, if light with a wavelength of 756 nm is used as the second wavelength, the oxygen saturation can be obtained by using the difference in absorption coefficient between oxyhemoglobin and deoxyhemoglobin.
The ultrasonic wave reception unit 120 includes ultrasonic transducers, each of which receives photoacoustic waves generated by light emission performed with the first period (sampling period) and outputs an electrical signal, and a supporting member, which supports the ultrasonic transducers. In the following description, the ultrasonic transducer is simply referred to as a "transducer". As the transducer, for example, a transducer using a piezoelectric material, a capacitance-type transducer, or a Fabry-Perot interferometer can be used. The piezoelectric material includes, for example, a piezo-ceramic material, such as lead zirconate titanate (PZT), and a high-molecular piezoelectric membrane material, such as polyvinylidene fluoride (PVDF). The capacitance-type transducer is referred to as a "capacitive micro-machined ultrasonic transducer (CMUT)".
An electrical signal obtained by a transducer with the first period (sampling period) is a time-resolved signal. Therefore, the amplitude of the electrical signal represents a value that is based on a sound pressure received by a transducer at each time (for example, a value proportional to the sound pressure).
Furthermore, the transducer is desirably one capable of detecting the frequency components constituting the photoacoustic waves (typically, 100 kilohertz (kHz) to 10 megahertz (MHz)). Moreover, it is also desirable that a plurality of transducers be arranged side by side on the supporting member to form a flat surface or a curved surface, such as that called a 1D array, a 1.5D array, a 1.75D array, or a 2D array. Furthermore, in
The ultrasonic wave reception unit 120 can include an amplifier which amplifies time-series analog signals output from the transducers. Moreover, the ultrasonic wave reception unit 120 can include an analog-to-digital (A/D) converter which converts time-series analog signals output from the transducers into time-series digital signals. In other words, the ultrasonic wave reception unit 120 can include the signal acquisition unit 140.
Furthermore, to detect acoustic waves from various angles and thereby improve image resolution, a transducer arrangement that surrounds the subject 100 from all sides is desirable. Moreover, in a case where the subject 100 is too large to be surrounded from all sides, the transducers can be arranged on a hemispherical supporting member. A probe 180 including an ultrasonic wave reception unit 120 having such a shape is not a hand-held type probe, and is suitable for a mechanical-scanning type photoacoustic apparatus, which moves the probe relative to the subject 100. The movement of the probe can be performed with use of a scanning unit such as an XY stage. Furthermore, the arrangement and number of transducers and the shape of the supporting member are not limited to those mentioned above, and can be optimized according to the subject 100.
A medium which propagates photoacoustic waves can be arranged in a space between the ultrasonic wave reception unit 120 and the subject 100. This enables matching of acoustic impedances at a surface boundary between the subject 100 and the transducers. The medium includes, for example, water, oil, and ultrasonic gel.
The photoacoustic apparatus 1 can include a holding member which holds the subject 100 to stabilize the shape thereof. The holding member is desirably a member which is high in both light transmittivity and acoustic wave transmittivity. For example, polymethylpentene, polyethylene terephthalate, and acrylic can be used.
In a case where the apparatus according to the present exemplary embodiment not only generates a photoacoustic image but also generates an ultrasound image by transmission and reception of acoustic waves, the transducer can also function as a transmission unit that transmits acoustic waves. A transducer serving as a reception unit and a transducer serving as a transmission unit can be a single (common) transducer or can be separate configurations.
The signal acquisition unit 140 includes amplifiers, each of which amplifies an electrical signal that is an analog signal output from the ultrasonic wave reception unit 120 and generated with the light emission performed with the first period (sampling period), and A/D converters, each of which converts the analog signal output from the amplifier into a digital signal. The signal acquisition unit 140 can be configured with, for example, a Field Programmable Gate Array (FPGA) chip.
An operation of the signal acquisition unit 140 is described in more detail. Analog signals output from the plurality of transducers of the ultrasonic wave reception unit 120 arranged in an array-like manner are amplified by the respective corresponding amplifiers and are then converted by the respective corresponding A/D converters into digital signals. The A/D conversion rate is at least twice the bandwidth of the input signal. As mentioned above, if the frequency components of the photoacoustic waves are at 100 kHz to 10 MHz, the A/D conversion is performed at a frequency of 20 MHz or more, desirably at a frequency of 40 MHz. Furthermore, the signal acquisition unit 140 uses a light emission control signal to synchronize the timing of light radiation and the timing of signal acquisition processing. In other words, A/D conversion is started at the above-mentioned A/D conversion rate at the light emission time with every first period (sampling period), and the obtained analog signal is converted into a digital signal. As a result, a digital data string sampled at intervals equal to the reciprocal of the A/D conversion rate (at every A/D conversion interval) from the light emission time is able to be acquired with every first period (sampling period) from each of the plurality of transducers.
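The relation between signal bandwidth, A/D conversion rate, and the length of the digital data string can be illustrated with a small helper. The imaging depth and speed of sound in the usage example below are assumptions made for illustration, not parameters fixed by the apparatus.

```python
def adc_samples_per_pulse(bandwidth_hz, depth_m, speed_of_sound_m_s, rate_hz=None):
    """Return (A/D conversion rate, number of samples per light emission).

    The rate must satisfy the Nyquist criterion (at least twice the
    signal bandwidth), and the record must last long enough for
    photoacoustic waves from the deepest point of interest to arrive.
    Illustrative sketch; the function name is hypothetical.
    """
    if rate_hz is None:
        rate_hz = 2.0 * bandwidth_hz                 # Nyquist minimum
    record_time_s = depth_m / speed_of_sound_m_s     # time of flight from max depth
    return rate_hz, int(round(rate_hz * record_time_s))
```

For example, with a 10 MHz bandwidth, a 40 MHz conversion rate, an assumed imaging depth of 30 mm, and an assumed speed of sound of 1500 m/s, each light emission yields a 20 μs record of 800 samples per transducer.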
The signal acquisition unit 140 is also called a “data acquisition system (DAS)”. In the context of the present application, the electrical signal is a concept including not only an analog signal but also a digital signal.
As mentioned above, the signal acquisition unit 140 can be mounted inside the housing 181 of the probe 180. With such a configuration, information between the probe 180 and the computer 150 is transferred with a digital signal, so that noise resistance is improved. Moreover, as compared with the case of transferring an analog signal, using a high-speed digital signal enables reducing the number of wirings, so that the operability of the probe 180 is improved.
Moreover, an arithmetic mean operation to be described below can also be performed by the signal acquisition unit 140. In this case, it is desirable that the arithmetic mean operation be performed with use of hardware such as an FPGA.
The computer 150 includes a calculation unit (image generation unit) 151, a storage unit 152, and a control unit 153. A unit assuming a calculation function as the calculation unit 151 can be configured with a processor, such as a CPU or a graphics processing unit (GPU), or an arithmetic circuit, such as an FPGA chip. These units can be configured with a single processor or arithmetic circuit, or can be configured with a plurality of processors or arithmetic circuits.
The computer 150 performs the arithmetic mean operation described below with respect to each of the plurality of transducers. For each sample time measured from the light emission time, the computer 150 calculates the arithmetic mean of the digital data strings output from the signal acquisition unit 140 with every first period (sampling period). Then, the computer 150 stores, in the storage unit 152, the arithmetic-mean digital data string as an arithmetic-mean electrical signal (photoacoustic signal) derived from photoacoustic waves with every second period (the period of image capturing frame rate).
Then, the calculation unit 151 performs generation of photoacoustic image data (a structure image or a function image) using image reconstruction based on the arithmetic-mean photoacoustic signal stored in the storage unit 152 with every second period (the period of image capturing frame rate), and performs other various calculation processing operations. The calculation unit 151 can receive, from the input unit 170, various parameter inputs, such as the speed of sound of the subject and a configuration of the holding portion, and use the parameter inputs for the calculation operations.
A reconstruction algorithm with which the calculation unit 151 converts the electrical signal into three-dimensional volume data can be any method, such as a time-domain back projection method, a Fourier-domain back projection method, or a model-based method (iterative calculation method). The time-domain back projection method includes, for example, universal back-projection (UBP), filtered back-projection (FBP), and phasing and summing (delay-and-sum).
In a case where the light radiation unit 200 employs two wavelengths, the calculation unit 151 performs image reconstruction processing to generate a first initial sound pressure distribution from a photoacoustic signal derived from light of the first wavelength and to generate a second initial sound pressure distribution from a photoacoustic signal derived from light of the second wavelength. Moreover, the calculation unit 151 obtains a first absorption coefficient distribution by correcting the first initial sound pressure distribution with a light quantity distribution of the light of the first wavelength and obtains a second absorption coefficient distribution by correcting the second initial sound pressure distribution with a light quantity distribution of the light of the second wavelength. Additionally, the calculation unit 151 obtains an oxygen saturation distribution from the first and second absorption coefficient distributions. Furthermore, as long as the oxygen saturation distribution is eventually obtained, the contents or orders of calculation operations are not limited to those mentioned above.
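The oxygen saturation calculation outlined above amounts to solving a two-wavelength linear system: the absorption coefficient at each wavelength is a linear mixture of the oxy- and deoxyhemoglobin concentrations. The following Python sketch illustrates this; the function name is hypothetical, and the extinction-coefficient matrix must be filled with real values for the chosen wavelengths (the values used below are placeholders for illustration only).

```python
import numpy as np

def oxygen_saturation(mu_a1, mu_a2, ext):
    """Estimate oxygen saturation from absorption coefficients at two
    wavelengths.

    ext: 2x2 matrix of molar extinction coefficients,
         rows = wavelength 1, wavelength 2; columns = HbO2, Hb.
    Solves mu_a = ext @ [C_HbO2, C_Hb] for the concentrations, then
    returns SO2 = C_HbO2 / (C_HbO2 + C_Hb).
    """
    c_hbo2, c_hb = np.linalg.solve(np.asarray(ext, dtype=float),
                                   np.asarray([mu_a1, mu_a2], dtype=float))
    return c_hbo2 / (c_hbo2 + c_hb)
```

In practice, this operation is applied voxel by voxel to the first and second absorption coefficient distributions.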
The storage unit 152 is configured with a non-transitory storage medium, such as a volatile memory (e.g., a random access memory (RAM)), a read-only memory (ROM), a magnetic disc, or a flash memory. Furthermore, a storage medium storing a program is a non-transitory storage medium. Additionally, the storage unit 152 can be configured with a plurality of storage media.
The storage unit 152 is able to store various pieces of data, such as photoacoustic signals subjected to the arithmetic mean operation with the second period (the period of image capturing frame rate), photoacoustic image data generated by the calculation unit 151, and reconstructed image data that is based on photoacoustic image data.
The control unit 153 is configured with an arithmetic element such as a CPU. The control unit 153 controls the operation of each constituent component of the photoacoustic apparatus 1. The control unit 153 stores a plurality of radiation patterns and radiation modes described below, and, with every first period (sampling period), sends to the driver unit 210 a light emission control signal used to control light emission of the semiconductor light-emitting elements according to the plurality of radiation patterns of the designated radiation mode. Then, the semiconductor light-emitting elements perform light emission according to the plurality of radiation patterns of the designated radiation mode, thus irradiating the subject. The control unit 153 thus also serves as a light emission control unit which controls light emission according to the radiation patterns of a radiation mode. Moreover, the control unit 153 can have a function of selecting, according to an instruction from the user or automatically, a radiation mode to be used during acquisition of a reconstructed image from among the plurality of radiation modes, as described below.
Moreover, the control unit 153 reads out program code stored in the storage unit 152, and controls an operation of each constituent component of the photoacoustic apparatus 1 based on the program code.
Additionally, the control unit 153 performs, for example, adjustment of an image displayed on the display unit 160. With this, oxygen saturation distribution images are sequentially displayed as the probe 180 is moved and photoacoustic measurement proceeds.
The computer 150 can be a workstation exclusively designed for the present exemplary embodiment. The computer 150 can also be a general-purpose personal computer (PC) or workstation that is configured to operate according to instructions of a program stored in the storage unit 152. Moreover, the constituent components of the computer 150 can be configured with respective different pieces of hardware. Additionally, at least some constituent components of the computer 150 can be configured with a single piece of hardware.
Moreover, the computer 150 and the ultrasonic wave reception unit 120 can be provided as a configuration contained in a common casing. Additionally, some signal processing operations can be performed by a computer contained in a casing and the remaining signal processing operations can be performed by a computer provided outside the casing. In this case, the computers provided inside and outside the casing can be collectively referred to as a computer according to the present exemplary embodiment. In other words, pieces of hardware constituting a computer do not need to be contained in a single casing. An information processing apparatus provided in, for example, a cloud computing service and installed in a remote location can be used as the computer 150.
The computer 150 is equivalent to a processing unit in the present exemplary embodiment. In particular, the calculation unit 151 plays a central role in implementing the function of the processing unit.
The display unit 160 is a display such as a liquid crystal display or an organic electroluminescence (EL) display. The display unit 160 is a device which displays, for example, an image based on subject information obtained by the computer 150 and a numerical value at a specific position. The display unit 160 can also display a graphical user interface (GUI) used to operate an image or the apparatus. Image processing (for example, adjustment of a luminance value) can be performed by the display unit 160 or the computer 150.
An operation console which is able to be operated by the user and is configured with, for example, a mouse and a keyboard can be employed as the input unit 170. Moreover, the display unit 160 can be configured with a touch panel, so that the display unit 160 can be used as the input unit 170. The input unit 170 receives inputs, such as instructions and numerical values, from the user, and transmits the inputs to the computer 150.
Furthermore, the constituent components of the photoacoustic apparatus can be configured as respective separate apparatuses or can be configured as a single integrated apparatus. Moreover, at least some constituent components of the photoacoustic apparatus can be configured as a single integrated apparatus.
Moreover, the computer 150 also causes the control unit 153 to perform drive control of constituent components included in the photoacoustic apparatus. Additionally, the display unit 160 can display, in addition to an image generated by the computer 150, for example, a GUI. The input unit 170 is configured to allow the user to input information thereto. The user can use the input unit 170 to perform operations for starting and ending of measurement, designation of a radiation mode described below, and an instruction for storage of a generated image.
The subject 100 is not a component constituting the photoacoustic apparatus, but is described below. The photoacoustic apparatus according to the present exemplary embodiment is able to be used for the purpose of, for example, diagnosis of malignant tumor or blood vessel disease of a human being or an animal, or follow-up observation of chemotherapy. Therefore, the subject 100 is assumed to be a region targeted for diagnosis, such as a living body, specifically, a breast, each organ, a network of vessels, a head, a neck, an abdomen, or extremities including hands, fingers, and toes of a human body or an animal. For example, if a human body is an object to be measured, oxyhemoglobin, deoxyhemoglobin, a blood vessel containing a large amount of such hemoglobin, or a new blood vessel formed near a tumor can be set as a target serving as an optical absorber. Moreover, for example, plaque on the wall of a carotid artery can be set as a target serving as an optical absorber. If the subject is a human body, melanin, a pigment contained in the skin, may become the above-mentioned optical absorber which generates photoacoustic waves that cause noise. Moreover, a pigment such as methylene blue (MB) or indocyanine green (ICG), gold fine particles, or an externally introduced material obtained by integrating or chemically modifying those can be set as an optical absorber. Additionally, a puncture needle or an optical absorber applied to a puncture needle can be set as an observation object. The subject can also be an inanimate object such as a phantom or a test object.
The operation of causing the semiconductor light-emitting elements to sequentially perform light emission to acquire a reconstructed image as described above is described as follows. In the example illustrated in
To implement a radiation mode illustrated in
Since the quantity of light of each semiconductor light-emitting element is small, with a view to improving S/N, as indicated on line T2 illustrated in
Furthermore, the length of the sampling period tw1 is set in consideration of a maximum permissible exposure (MPE) with respect to the skin. This is because, the shorter the length of the sampling period tw1, the smaller the MPE value becomes. For example, in a case where the measurement wavelength is 750 nm, the pulse width of pulsed light is 1 μs, and the sampling period tw1 is 0.1 milliseconds (ms), the MPE value with respect to the skin is about 14 J/m2. On the other hand, in a case where the peak power of pulsed light radiated from the light radiation unit 200 is 2 kilowatts (kW) and the irradiation area from the light radiation unit 200 is 150 mm2, the light energy radiated from the light radiation unit 200 onto the subject 100, such as a human body, is about 13.3 J/m2. In this case, the light energy radiated from the light radiation unit 200 becomes equal to or less than the MPE value. In this way, if the sampling period tw1 is 0.1 ms or more, it can be assured that the light energy is equal to or less than the MPE value. In the above-described way, the light energy is set in a range that does not exceed the MPE value, based on the sampling period tw1, the peak power of pulsed light, and the irradiation area.
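The arithmetic in the above example can be verified as follows. This is an illustrative Python sketch; the function name is hypothetical, and the MPE value is the approximate figure quoted in the text for the stated conditions, not a computed safety limit.

```python
def fluence_per_pulse(peak_power_w, pulse_width_s, area_m2):
    """Radiant exposure per pulse on the skin, in J/m^2."""
    return peak_power_w * pulse_width_s / area_m2

# Values from the text: 2 kW peak power, 1 us pulse width, 150 mm^2 area.
fluence = fluence_per_pulse(2e3, 1e-6, 150e-6)  # about 13.3 J/m^2
mpe = 14.0  # approximate skin MPE at 750 nm for tw1 = 0.1 ms (from the text)
assert fluence <= mpe  # radiated energy stays within the MPE value
```

Lengthening the sampling period tw1 raises the applicable MPE value, so the margin above grows accordingly.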
Next, as indicated on line T2 to line T4 illustrated in
Next, as indicated on line T4 illustrated in
Then, the display unit 160 displays composite reconstructed image data.
Here, the sampling period tw1 and the period of image capturing frame rate tw2 are determined as follows.
As mentioned above, due to a restriction imposed by the MPE value, the sampling period tw1 is determined based on the peak power of pulsed light and the radiation area. Then, the number of times of the arithmetic mean operation is determined based on the ratio of the S/N of a photoacoustic signal acquired by one radiation of pulsed light to the S/N of a photoacoustic signal determined by the specified image quality. Since the S/N improves in proportion to the square root of the number of averaged signals, if, for example, the S/N of a photoacoustic signal acquired by one radiation of pulsed light is 1/5 of the S/N determined by the specified image quality, the S/N is required to be improved five times, and averaging is, therefore, performed 5² = 25 times. For example, if the sampling period tw1 is 0.1 ms, the period of the image capturing frame rate is 2.5 ms or more, in other words, the image capturing frame rate is 400 Hz or less.
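Since averaging N signals improves S/N by a factor of √N, the required averaging count follows directly from the required S/N gain. The following Python sketch (function and variable names are hypothetical) reproduces the numbers in the example above.

```python
import math

def averaging_count(snr_gain_required):
    """Number of averages needed for a given S/N improvement factor.

    S/N grows with the square root of the number of averaged signals,
    so a 5x improvement requires 5^2 = 25 averages.
    """
    return math.ceil(snr_gain_required ** 2)

n = averaging_count(5)       # 25 averages
tw1_ms = 0.1                 # sampling period from the text
frame_period_ms = n * tw1_ms  # 2.5 ms, i.e. a frame rate of 400 Hz or less
```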
Moreover, the sampling period tw1 is also restricted by heat generation of semiconductor light-emitting elements. More specifically, if the thermal resistance of the probe is determined, the temperature is determined based on the power consumption of semiconductor light-emitting elements. The sampling period tw1 is made longer in such a manner that the temperature of semiconductor light-emitting elements does not exceed the permissible temperature.
On the other hand, if the number of times of the arithmetic mean operation is made large, photoacoustic signals are subjected to the arithmetic mean operation over a long time, so that, in a case where the subject has, for example, a body motion, blurring occurs due to the motion. To reduce motion blur, it is advantageous to make the number of times of the arithmetic mean operation as small as possible. Specifically, it is desirable to design the number of times of the arithmetic mean operation in such a manner that the motion blur is restricted to 1/2 or less of the specified resolution. For example, assuming that the specified resolution is 0.2 mm and the body motion of the subject is 5 mm/sec, in a case where the sampling period tw1 is 0.1 ms, the number of times of the arithmetic mean operation is set to 200 times or less, in other words, the period of image capturing frame rate tw2 is set to 20 ms or less.
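The motion-blur bound above can be sketched as follows. The blur accumulated over N averages is the subject velocity multiplied by N × tw1, and this must stay at or below half the specified resolution. Function and variable names are hypothetical.

```python
def max_averages_for_blur(resolution_mm, velocity_mm_s, tw1_s):
    """Largest averaging count that keeps motion blur <= resolution / 2."""
    return int((resolution_mm / 2.0) / (velocity_mm_s * tw1_s))

# Values from the text: 0.2 mm resolution, 5 mm/s body motion, tw1 = 0.1 ms.
n_max = max_averages_for_blur(0.2, 5.0, 1e-4)  # 200 averages at most
tw2_max_ms = n_max * 0.1                       # frame period of 20 ms or less
```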
In consideration of such a plurality of conditions, the sampling period tw1 and the period of image capturing frame rate tw2 are determined. Naturally, in a case where it is impossible to satisfy all of the conditions, the priority for the conditions is determined and these parameters are thus determined.
The first exemplary embodiment of the invention has been described with a radiation mode which includes four radiation patterns as illustrated in
According to the first exemplary embodiment of the invention, providing a plurality of radiation patterns to cause semiconductor light-emitting elements to perform light emission in a time-division manner enables preventing a decrease in image quality caused by high-intensity photoacoustic waves generated near the subject surface.
Next, a second exemplary embodiment of the invention is described.
As mentioned above, according to the first exemplary embodiment, a good-quality composite reconstructed image can be obtained. However, since a composite reconstructed image is obtained by acquiring a plurality of pieces of reconstructed image data using a plurality of radiation patterns, a long time is required. The second exemplary embodiment of the invention is a configuration capable of updating a composite reconstructed image in a short time even in a radiation mode including a plurality of radiation patterns.
As indicated by line T4 to line T6 illustrated in
In this case, since a composite reconstructed image is formed while the pieces of reconstructed image data corresponding to the respective radiation patterns are sequentially updated, there is an advantage that a portion for which reconstructed image data has been acquired can be updated with a minimum delay.
According to the second exemplary embodiment of the invention, as with the first exemplary embodiment, a decrease in image quality caused by a high-intensity photoacoustic signal generated near the subject surface can be prevented. Moreover, updating of a composite reconstructed image can be performed in the shortest amount of time.
A third exemplary embodiment of the invention provides a photoacoustic apparatus having a plurality of radiation modes and capable of switching radiation modes according to an instruction from the user or automatically. The designation of a radiation mode by the user can be performed via the mouse 171 or the keyboard 172 of the input unit 170. Moreover, storing and execution of radiation modes and radiation patterns are performed by the computer 150.
The photoacoustic apparatus according to the third exemplary embodiment has, for example, three radiation modes. Radiation mode 1 serving as the first radiation mode is the radiation mode described in the first exemplary embodiment. In other words, the radiation mode 1 includes four radiation patterns in each of which two of eight semiconductor light-emitting elements perform light emission at the same time. Radiation mode 2 serving as the second radiation mode includes two radiation patterns illustrated in
Radiation mode 3 serving as the third radiation mode is a radiation mode including one radiation pattern in which eight semiconductor light-emitting elements perform light emission at the same time.
Characteristics of such three radiation modes are as follows. The radiation mode 3 is able to irradiate the entire region of the subject with light emission of one radiation pattern. Therefore, the radiation mode 3 is able to obtain a reconstructed image of the entire region of the subject at a speed higher than those of the other radiation modes. Moreover, the radiation mode 3 does not need to perform combining of reconstructed images. In a case where there are few optical absorbers near the subject surface, since deterioration of a reconstructed image is small, the radiation mode 3 is effective. As mentioned above, the radiation mode 1 is a mode capable of reducing deterioration of a reconstructed image in a case where an optical absorber is present near the subject surface. However, since reconstructed images acquired by radiation performed four times are combined to obtain a reconstructed image of the entire region of the subject, the speed of obtaining a reconstructed image becomes low. The radiation mode 2 is a radiation mode intermediate between the radiation mode 1 and the radiation mode 3, and has advantages of both modes.
In a photoacoustic apparatus having such three radiation modes, it is desirable that the photoacoustic apparatus be configured to allow the user to select a radiation mode. For example, in the case of a subject with a low melanin concentration in the skin, such as the skin of a fair-skinned person, the user can select the radiation mode 3. Moreover, in a case where the probe is applied to the skin of a dark-skinned person or to skin having a mole, since the melanin concentration in the skin is high, the user can select the radiation mode 1. Additionally, it is desirable that, with radiation modes configured to be switchable in real time, the user be allowed to select a radiation mode while viewing the obtained reconstructed image.
Furthermore, it is desirable that a radiation mode be automatically set according to the optical absorption coefficient of the skin. More specifically, the probe 180 can be additionally provided with a camera or reflectance measurement device used to observe the condition of the skin, and a radiation mode can be automatically selected according to the brightness of the skin. Naturally, in a case where the brightness of the skin is low (in a case where the optical absorption coefficient of the skin is large), the radiation mode 1 is selected. Moreover, without use of the camera or reflectance measurement device, the optical absorption coefficient of the skin can be estimated based on the magnitude of a photoacoustic signal on the skin surface received by the ultrasonic wave reception unit 120, so that a radiation mode can be selected based on the estimated optical absorption coefficient. For example, first, a photoacoustic signal is received in the radiation mode 3, and, if a signal at the time corresponding to the skin surface of the acquired photoacoustic signal is large, the computer 150 performs control to select the radiation mode 2, and, if the signal is much larger, the computer 150 performs control to select the radiation mode 1.
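The automatic selection based on the skin-surface photoacoustic signal described above can be sketched as simple threshold logic. This is an illustrative Python sketch; the function name and the two thresholds are hypothetical tuning parameters, not values given in the embodiment.

```python
def select_radiation_mode(skin_signal, t_mid, t_high):
    """Choose a radiation mode from the skin-surface signal amplitude
    measured in radiation mode 3 (t_mid < t_high are tuning thresholds)."""
    if skin_signal >= t_high:
        return 1   # strong surface absorber: fully time-divided mode
    if skin_signal >= t_mid:
        return 2   # intermediate case
    return 3       # weak surface signal: simultaneous emission mode
```

A camera- or reflectance-based estimate of skin brightness could feed the same decision in place of the photoacoustic amplitude.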
As described above, according to the third exemplary embodiment of the invention, an optimal radiation mode can be designated by the user or can be selected automatically according to, for example, the melanin concentration of the skin serving as a subject. As a result, a reconstructed image with less noise can be obtained in the shortest amount of time without depending on the melanin concentration of the skin.
A fourth exemplary embodiment of the invention is another exemplary embodiment of the hand-held type probe.
The probe 180, which is illustrated in
A light emission sequence of the semiconductor light-emitting elements 200a to 200d and irradiated regions thereof are illustrated in
In a case where the light radiation unit (light source) 200 is implemented by a plurality of semiconductor light-emitting elements, any variation in the light outputs of the respective semiconductor light-emitting elements induces a variation in luminance in a composite reconstructed image. A fifth exemplary embodiment of the invention is a configuration for correcting a variation in luminance of a reconstructed image caused by a variation in light outputs between the plurality of semiconductor light-emitting elements constituting the light radiation unit (light source) 200.
The fifth exemplary embodiment is described with reference to the configuration of the light radiation unit 200 including a plurality of semiconductor light-emitting elements 200a to 200h described in the first exemplary embodiment.
In a case where there is a variation in the quantity of light emission between a plurality of semiconductor light-emitting elements 200a to 200h, corrections are performed based on any of the following correction methods.
One correction method actually measures the quantity of light of an irradiated region for each of a plurality of radiation patterns, and corrects a reconstructed image acquired in each of the plurality of radiation patterns with the reciprocal of the quantity of light of an irradiated region for each of the plurality of radiation patterns. Such a correction can be easily performed in a case where a plurality of semiconductor light-emitting elements performs light emission according to radiation patterns. As illustrated in
Another correction method is a method of actually acquiring a composite reconstructed image of, for example, a phantom having no variation in optical absorption coefficient and generating correction data in such a manner that an unevenness in luminance does not occur in the composite reconstructed image. Moreover, a method of actually acquiring a composite reconstructed image with use of a phantom with a known optical absorption coefficient and generating correction data in such a manner that the luminance of the composite reconstructed image becomes a luminance corresponding to the known optical absorption coefficient can also be employed. In this way, a method of performing corrections on a composite reconstructed image can be employed.
This correction method is suitable for a case where a portion in which irradiated regions of semiconductor light-emitting elements overlap is large as in the above-mentioned radiation mode 3.
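The first correction method described above, which scales the reconstructed image of each radiation pattern by the reciprocal of its measured light quantity, can be sketched as follows. This is an illustrative Python sketch; the function name and the normalization convention are hypothetical.

```python
import numpy as np

def correct_reconstruction(image, measured_light_quantity,
                           reference_quantity=1.0):
    """Scale one radiation pattern's reconstructed image by the reciprocal
    of its measured light quantity, normalized to a reference quantity."""
    return (np.asarray(image, dtype=float)
            * (reference_quantity / measured_light_quantity))
```

Applying this per pattern before combining equalizes the luminance contributions of the patterns in the composite reconstructed image.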
As described in the fifth exemplary embodiment of the invention, even in a case where the light radiation unit (light source) 200 is implemented by a plurality of semiconductor light-emitting elements, a variation in a reconstructed image caused by a variation in light outputs of the plurality of semiconductor light-emitting elements can be corrected in an appropriate manner.
Radiation patterns which are effective in exemplary embodiments of the invention are described. As apparent from
The wavelength of light which the light radiation unit 200 emits can include a plurality of wavelengths as mentioned above. If a plurality of wavelengths is employed, oxygen saturation serving as functional information can be calculated. For example, exemplary embodiments of the invention can acquire photoacoustic signals while alternately switching two wavelengths with a period of image generation rate, calculate composite reconstructed image data, and calculate oxygen saturation based on two pieces of composite reconstructed image data. The calculation of oxygen saturation is described in detail in Japanese Patent Application Laid-Open No. 2015-142740, and the detailed description thereof is, therefore, omitted.
Moreover, a photoacoustic apparatus according to an exemplary embodiment of the invention can be additionally provided with the function of transmitting ultrasonic waves from transducers and performing measurement based on reflected waves. In this case, naturally, the light radiation unit 200 does not perform light emission.
In a photoacoustic apparatus according to an exemplary embodiment of the invention, since a plurality of semiconductor light-emitting elements is used as a light source, high-density packaging regarding light radiation positions can be easily implemented. Moreover, since a plurality of electrical signals derived from light radiated onto a subject in positions different from each other and at times different from each other is reconstructed independently from each other to generate a plurality of images and the generated plurality of images is combined, a good-quality image can be obtained.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2017-107062, filed May 30, 2017, which is hereby incorporated by reference herein in its entirety.