Embodiments of the present invention will be described with reference to the drawings.
Embodiment 1 will be described with reference to
As shown in
Also provided are an X-ray 3-dimensional helical computer tomography system 15, a 3-dimensional magnetic resonance imaging system 16, and a high-speed network 17 for optical communication or ADSL to which the X-ray 3-dimensional helical computer tomography system 15 and the 3-dimensional magnetic resonance imaging system 16 are connected. The X-ray 3-dimensional helical computer tomography system 15 and the 3-dimensional magnetic resonance imaging system 16 are connected to the image processing device 11 in the body cavity probe apparatus 1 via the network 17.
In order to be inserted into a body cavity such as the esophagus, the stomach, or the duodenum, the ultrasonic endoscope 2 has a rigid portion 21 located at its distal end and composed of a rigid material such as stainless steel, a long-sized flexible portion 22 located closer to the proximal end than the rigid portion 21 and composed of a flexible material, and an operation portion 23 located closer to the proximal end than the flexible portion 22 and composed of a rigid material. The rigid portion 21 and the flexible portion 22 form an insertion portion that is inserted into the body cavity.
The rigid portion 21 has image signal acquisition means fixed thereto to optically pick up images to acquire image signals as described below.
The rigid portion 21 has an optical observation window 24 formed of cover glass. An objective lens 25 and an image pickup device, for example, a CCD (Charge Coupled Device) camera 26, are provided inside the optical observation window 24; the objective lens 25 forms an optical image and the CCD camera 26 is located at the image formation position. Further, an illumination light irradiation window (illumination window; not shown) is provided adjacent to the optical observation window 24 to irradiate the interior of the body cavity with illumination light.
The CCD camera 26 is connected to the optical observation device 3 by a signal line 27. An image of the body cavity surface is formed on the CCD camera 26 through the optical observation window 24 via the objective lens 25. A CCD signal from the CCD camera 26 is outputted, via the signal line 27, to the optical observation device 3, which serves as image creation means for generating real-time optical images.
The rigid portion 21 also has image signal acquisition means fixed thereto to acoustically perform an image pick-up operation to acquire echo signals as image signals.
The rigid portion 21 has a group of annularly arrayed ultrasonic transducers at, for example, a cylindrical distal end thereof; the group of ultrasonic transducers is arranged around the periphery of the insertion shaft and formed by cutting the distal end into pieces like strips of paper. The group of ultrasonic transducers forms an ultrasonic transducer array 29.
Ultrasonic transducers 29a constituting the ultrasonic transducer array 29 are connected, via corresponding signal lines 30 passing through the operation portion 23, to the ultrasonic observation device 4, which serves as image creation means for generating real-time ultrasonic images. The center of the ring of the ultrasonic transducer array 29 corresponds to the pivoting center of an ultrasonic beam for radial scanning described below.
Here, orthonormal basis vectors (unit vectors in the respective directions) V, V3, and V12 fixed to the rigid portion 21 are defined as shown in
That is, the vector V is parallel to a longitudinal direction (insertion shaft direction) of the rigid portion 21 and corresponds to a normal vector in an ultrasonic tomogram. The vector V3, which is orthogonal to the vector V, is a three-o'clock direction vector, and the vector V12 is a twelve-o'clock direction vector.
In the rigid portion 21, an image position and orientation detecting coil 31, serving as an image position and orientation detecting device for the ultrasonic transducer array 29, is fixed at a position very close to the center of the ring of the ultrasonic transducer array 29. The image position and orientation detecting coil 31 has coils wound in the two directions (axes) of the vectors V and V3 and integrally formed so as to extend in the two axial directions. The image position and orientation detecting coil 31 is thus set to be able to detect both the directions of the vectors V and V3.
The flexible portion 22 contains a plurality of insertion shape detecting coils 32 arranged along the insertion shaft, for example, at given intervals, to detect the insertion shape of the flexible portion 22, which constitutes part of the insertion portion of the ultrasonic endoscope 2.
As shown in
Accordingly, to be more exact, the insertion shape detecting device is composed of the image position and orientation detecting coil 31 provided in the rigid portion 21 and the insertion shape detecting coil 32 provided in the flexible portion 22.
The plurality of the insertion shape detecting coils 32 as an insertion shape detecting device for detecting the insertion shape may be provided, for example, only at the distal end of the flexible portion 22 to detect the insertion shape of the distal end of the insertion portion of the ultrasonic endoscope 2.
The present embodiment adopts the plurality of insertion shape detecting coils 32 as an insertion shape detecting device to detect the insertion shape utilizing magnetic fields. This makes it possible to prevent the operator and the patient (subject) from being exposed to radiation while the insertion shape is detected.
A bendable bending portion is often provided in the vicinity of the distal end of the flexible portion 22. The plurality of insertion shape detecting coils 32 may be provided only in the vicinity of the bending portion.
The position and orientation calculation device 5, constituting detection means for detecting the position, orientation, and the like of the image position and orientation detecting coil 31, is connected, via signal lines, to the transmission antenna 6, a plurality of A/D units 9a, 9b, and 9c constituting the A/D unit portion 9, and the image processing device 11, containing insertion shape creation means, 3-dimensional image creation means, synthesis means, image index creation means and the like.
The position and orientation calculation device 5 and the image processing device 11 are connected by, for example, an RS-232C-conforming cable 33.
The transmission antenna 6 is composed of a plurality of transmission coils (not shown) with different winding axis orientations. The transmission coils are integrally housed in, for example, a rectangular housing. The plurality of transmission coils are connected to the position and orientation calculation device 5.
An A/D unit 9i (i=a to c) comprises an amplifier (not shown) that amplifies inputted analog signals and an analog/digital conversion circuit (not shown) that samples the amplified signals and converts the signals into digital data.
The A/D unit 9a is connected individually to the image position and orientation detecting coil 31 and the plurality of insertion shape detecting coils 32 via a signal line 34.
The A/D unit 9b is connected to the elongate body cavity contact probe 8 via a signal line 35. The A/D unit 9c is connected individually to the plurality of body surface detecting coils 7 via a signal line 36.
Arrow lines in
(a) First flow: dotted lines indicate the flow of signals and data for optical images.
(b) Second flow: dashed lines indicate the flow of signals and data for ultrasonic tomograms.
(c) Third flow: solid lines indicate the flow of signals and data for positions as well as the flow of data created by processing the signals and data.
(d) Fourth flow: alternate long and short dash lines indicate the flow of reference image data and data created by processing the reference image data.
(e) Fifth flow: thick lines indicate the flow of signals and data for a final display screen obtained by synthesizing ultrasonic tomogram data (described below) with 3-dimensional guide image data (described below).
(f) Sixth flow: curves indicate the flow of signals and data for other control operations.
The body surface detecting coil 7 comprises four coils, each wound in a single axial direction, which are releasably fixed by tapes, belts, bands, or the like to characteristic points on the body surface of the subject 37, specifically the surface of the abdomen (these characteristic points are hereinafter simply referred to as body surface feature points). The body surface detecting coil 7 is utilized to detect the positions of the body surface feature points using magnetic fields.
In normal upper endoscopic inspections, the subject 37 assumes what is called a left lateral position in which the subject 37 lies on his or her left side on a bed 38 and then has the endoscope inserted through his or her mouth. Accordingly, the left lateral position is shown in
In the description of the present embodiment, the body surface feature points are the “xiphoid process”, a characteristic point on the skeleton; the “left anterior superior iliac spine”, on the left side of the pelvis; the “right anterior superior iliac spine”, on the right side of the pelvis; and the “spinous process of vertebral body”, located on the spine between the left and right anterior superior iliac spines.
The operator can locate the positions of the four points through palpation. Further, the four points are not flush with one another; they define a non-orthogonal coordinate system having, as basis vectors, the three vectors extending from the xiphoid process, taken as the origin, to each of the other feature points. This non-orthogonal coordinate system is shown in
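The decomposition of an arbitrary point into such a non-orthogonal (oblique) basis can be sketched as follows. This is a minimal illustration only; the feature-point coordinates are hypothetical values, not figures from the present embodiment:

```python
import numpy as np

# Hypothetical positions (in mm) of the four body surface feature points.
xiphoid = np.array([0.0, 0.0, 0.0])            # origin of the oblique frame
left_iliac = np.array([120.0, -80.0, 10.0])
right_iliac = np.array([-110.0, -85.0, 12.0])
spinous = np.array([5.0, -90.0, -60.0])

# Basis vectors: from the xiphoid process to each of the other feature points.
B = np.column_stack([left_iliac - xiphoid,
                     right_iliac - xiphoid,
                     spinous - xiphoid])

# Any point p can be written as p = xiphoid + B @ c, where c holds the
# oblique (non-orthogonal) coordinates; solving the 3x3 system recovers c.
p = np.array([10.0, -50.0, -20.0])
c = np.linalg.solve(B, p - xiphoid)

# Reconstruction check: mapping the coordinates back must return p.
assert np.allclose(xiphoid + B @ c, p)
```

Because the four points are not coplanar, the matrix B is invertible and the decomposition is unique.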
As shown in
As shown in
The treatment instrument channel 46 is configured so that the body cavity contact probe 8 can be inserted through the forceps port 44 and project from the projection port 45. The opening direction of the projection port 45 is such that the body cavity contact probe 8 projects from the projection port 45 to fall within the optical visual field range of the optical observation window 24.
The image processing device 11 has a matching circuit 51, an image index creation circuit 52, an insertion shape creation circuit 53, a communication circuit 54, a reference image storage portion 55, an interpolation circuit 56, a 3-dimensional human body image creation circuit 57, a synthesis circuit 58, a rotational transformation circuit 59, 3-dimensional image creation circuits 60 (hereinafter referred to as a 3-dimensional guide image creation circuit A and a 3-dimensional guide image creation circuit B) that create 3-dimensional guide images in two different line-of-sight directions, a mixing circuit 61, a display circuit 62, and a control circuit 63.
Position and orientation data outputted by the position and orientation calculation device 5 is inputted to the matching circuit 51; the position and orientation calculation device 5 constitutes the detection means for detecting the positions and orientations of the insertion shape detecting device and the like.
The matching circuit 51 maps position and orientation data calculated in an orthogonal coordinate axis 0-xyz according to a predetermined conversion equation to calculate new position and orientation data in an orthogonal coordinate axis 0′-x′y′z′ as described below.
The matching circuit 51 outputs the new position and orientation data as position and orientation mapping data to the image index creation circuit 52, which creates image index data, and the insertion shape creation circuit 53, which creates insertion shape data.
The communication circuit 54 internally has a high-capacity, high-speed communication modem and is connected, via the network 17, to the X-ray 3-dimensional helical computer tomography system 15, which creates 3-dimensional data of the human body, and to the 3-dimensional magnetic resonance imaging system 16.
The reference image storage portion 55 comprises a hard disk drive or the like which can store a large volume of data. The reference image storage portion 55 stores a plurality of reference image data as anatomical image information.
As shown in
In picking up a tomogram of the subject 37, the exposure of the subject 37 to radiation can be reduced or avoided by using the 3-dimensional magnetic resonance imaging system 16 more often than the X-ray 3-dimensional helical computer tomography system 15.
The reference image data in the reference image storage portion 55 in
Here, as shown in
As shown in
Each of the volume memories VM is configured to be able to store a large volume of data. A voxel space is assigned to a partial storage region of the volume memory VM. As shown in
The 3-dimensional human body image creation circuit 57 and the rotational transformation circuit 59, both shown in
The display circuit 62 has a switch 62a that switches inputs to the display circuit 62. The switch 62a has an input terminal α, an input terminal β, an input terminal γ, and one output terminal. The input terminal α is connected to the reference image storage portion 55. The input terminal β is connected to an output terminal (not shown) of the optical observation device 3. The input terminal γ is connected to the mixing circuit 61. The output terminal is connected to the display device 14, which displays optical images, ultrasonic tomograms, and 3-dimensional guide images, and the like.
The control circuit 63 is connected to the portions and circuits in the image processing device 11 via signal lines so as to output instructions to the portions and circuits. The control circuit 63 is connected directly to the ultrasonic observation device 4, a mouse 12, and a keyboard 13 via control lines.
As shown in
Depressing any of the display switching keys 13α, 13β, and 13γ causes the control circuit 63 to output an instruction to the display circuit 62 to switch the switch 62a to the input terminal α, β, or γ, respectively. Depressing the display switching key 13α switches the switch 62a to the input terminal α, depressing the display switching key 13β switches it to the input terminal β, and depressing the display switching key 13γ switches it to the input terminal γ.
The signals and data described above in (a) first flow to (f) sixth flow will be sequentially described. (a) The operation of the present embodiment will be described along the first flow of signals and data for an optical image shown by a dotted line.
The illumination light irradiation window (not shown) of the rigid portion 21 irradiates the optical visual field range with illumination light. The CCD camera 26 picks up an image of an object within the optical visual field range and performs a photoelectric conversion to obtain a CCD signal. The CCD camera 26 then outputs the CCD signal to the optical observation device 3.
The optical observation device 3 creates data for a real-time image of the optical visual field range on the basis of the inputted CCD signal. The optical observation device 3 then outputs the data to input terminal β of the switch 62a of the display circuit 62 in the image processing device 11 as optical image data.
(b) The operation of the present embodiment will be described along the second flow of signals and data for an ultrasonic tomogram.
When the operator depresses the scan control key 66, the control circuit 63 outputs a scan control signal to the ultrasonic observation device 4 to instruct a radial scan described below to be controllably turned on and off.
The ultrasonic observation device 4 selects some of the ultrasonic transducers 29a constituting the ultrasonic transducer array 29 and transmits excitation signals shaped like pulse voltages to the selected ultrasonic transducers.
Each of the selected ultrasonic transducers 29a receives the corresponding excitation signal and converts it into an ultrasonic wave, that is, a compressional wave in the medium.
In this case, the ultrasonic observation device 4 delays the excitation signals so that the excitation signals reach the corresponding ultrasonic transducers 29a at different times. The value (delay amount) of the delay is adjusted so that ultrasonic waves excited by the ultrasonic transducers 29a form one ultrasonic beam when allowed to overlap one another in the subject 37.
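The delay adjustment described above can be illustrated with a short sketch. This is an assumption-laden geometric model for illustration only (element positions on a ring, a single focal point, an assumed soft-tissue sound speed); it is not the device's actual firing sequence:

```python
import numpy as np

SOUND_SPEED = 1540.0  # m/s, a typical soft-tissue value (assumed)

def focusing_delays(element_positions, focus):
    """Delay (in seconds) for each element so that the wavefronts from
    all elements arrive at the focal point simultaneously."""
    dists = np.linalg.norm(element_positions - focus, axis=1)
    # The farthest element fires first (zero delay); nearer ones wait.
    return (dists.max() - dists) / SOUND_SPEED

# Eight hypothetical elements on a 5 mm ring; focus 30 mm out along +x.
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
elements = 0.005 * np.column_stack(
    [np.cos(angles), np.sin(angles), np.zeros(8)])
focus = np.array([0.03, 0.0, 0.0])
delays = focusing_delays(elements, focus)

# Sanity check: delay plus travel time is identical for every element,
# so the individual waves overlap to form one beam at the focus.
tof = delays + np.linalg.norm(elements - focus, axis=1) / SOUND_SPEED
assert np.allclose(tof, tof[0])
```

The same delay profile, applied on reception, realigns the returning echoes before they are summed.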
The ultrasonic beam is emitted to the exterior of the ultrasonic endoscope 2. A reflected wave from the interior of the subject 37 returns to each ultrasonic transducer 29a along a path opposite to that of the ultrasonic beam.
Each ultrasonic transducer 29a converts the reflected wave into an electric echo signal and transmits the signal to the ultrasonic observation device 4 along a path opposite to that of the excitation signal.
The ultrasonic observation device 4 reselects a plurality of the ultrasonic transducers 29a to be involved in the formation of an ultrasonic beam such that the ultrasonic beam pivots in a plane (hereinafter referred to as a radial scan plane) which contains the center of the ring of the ultrasonic transducer array 29 and which is perpendicular to the rigid portion 21 and flexible portion 22. The ultrasonic observation device 4 then transmits excitation signals again to the selected ultrasonic transducers 29a. Thus, the transmission angle of the ultrasonic beam is varied. Repeating this allows what is called a radial scan to be achieved.
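One way to picture this reselection is as a sliding window of active elements stepped around the ring; each step pivots the beam by one element pitch. The array size and aperture width below are illustrative assumptions, not values from the present embodiment:

```python
N_ELEMENTS = 64   # hypothetical number of transducers in the ring array
APERTURE = 8      # hypothetical number of simultaneously excited elements

def active_elements(step):
    """Indices of the transducers excited at a given scan step.
    Advancing `step` from 0 to N_ELEMENTS - 1 pivots the beam one full turn,
    i.e. one radial scan; the modulo wraps the aperture around the ring."""
    return [(step + k) % N_ELEMENTS for k in range(APERTURE)]

# One full radial scan: the aperture slides once around the ring.
scan = [active_elements(s) for s in range(N_ELEMENTS)]
assert scan[0] == [0, 1, 2, 3, 4, 5, 6, 7]
assert scan[63] == [63, 0, 1, 2, 3, 4, 5, 6]   # wrap-around at the seam
```

Each echo line acquired at one step corresponds to one angular direction of the resulting tomogram.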
In this case, for each radial scan of the ultrasonic transducer array 29, the ultrasonic observation device 4 creates one frame of digitized ultrasonic tomogram data for a real-time image perpendicular to the insertion shaft of the rigid portion 21 from the echo signals into which the ultrasonic transducers 29a convert the reflected waves. The ultrasonic observation device 4 then outputs the ultrasonic tomogram data to the mixing circuit 61 in the image processing device 11. At this time, the ultrasonic observation device 4 processes the ultrasonic tomogram data into a square image.
Thus, in the present embodiment, the ultrasonic observation device 4 reselects a plurality of ultrasonic transducers 29a to be involved in the formation of an ultrasonic beam and transmits excitation signals again. Consequently, for example, the 12 o'clock direction of a square ultrasonic tomogram is determined by which ultrasonic transducer 29a the ultrasonic observation device 4 selects as the 12 o'clock direction in transmitting the excitation signals.
Thus, the normal vector V, 3 o'clock vector V3, and 12 o'clock vector V12 for the ultrasonic tomogram are defined. The ultrasonic observation device 4 further creates ultrasonic tomogram data obtained through observations from a direction -V opposite to that of the normal vector V.
The following are performed in real time: the radial scan by the ultrasonic transducer array 29, the creation of ultrasonic tomogram data by the ultrasonic observation device 4, and the output to the mixing circuit 61. In the present embodiment, ultrasonic tomograms are generated as real-time images.
(c) Now, the operation of the present embodiment will be described along the third flow of signals and data for positions and of data created by processing the signals and data.
The position and orientation calculation device 5 excites the transmission coils (not shown) in the transmission antenna 6, and the transmission antenna 6 generates alternating magnetic fields in the surrounding space. The following coils detect the alternating magnetic fields, convert them into positional electric signals, and output the signals to the A/D units 9a, 9b, and 9c, respectively: the two coils constituting the image position and orientation detecting coil 31 (wound in the directions of the vectors V and V3, with orthogonal winding axes), which detects the position and orientation (direction) of the image signal acquisition means for ultrasonic waves; the plurality of insertion shape detecting coils 32, which detect the insertion shape of the flexible portion 22; and the body cavity detecting coil 42 and the body surface detecting coil 7, serving as subject detecting devices.
In each of the A/D units 9a, 9b, and 9c, the amplifier amplifies the positional electric signal, and the analog/digital conversion circuit samples and converts the signal into digital data. Each of the A/D units 9a, 9b, and 9c then outputs the digital data to the position and orientation calculation device 5.
Then, on the basis of the digital data from the A/D unit 9a, the position and orientation calculation device 5 calculates the position of the image position and orientation detecting coil 31 and the directions of its orthogonal winding axes, that is, the vectors V and V3. The position and orientation calculation device 5 then calculates the outer product V×V3 of the vectors V and V3, thereby obtaining the vector V12 in the 12 o'clock direction, which corresponds to the remaining orthogonal direction. The position and orientation calculation device 5 thus obtains all three orthogonal directions, that is, the vectors V, V3, and V12.
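Recovering the 12 o'clock vector from the two measured winding-axis directions is a plain cross product; a minimal numerical sketch (the unit-vector values are illustrative):

```python
import numpy as np

# Measured unit vectors (hypothetical values): V along the insertion axis,
# V3 in the 3 o'clock direction, with V and V3 orthogonal.
V = np.array([0.0, 0.0, 1.0])
V3 = np.array([1.0, 0.0, 0.0])

# The outer (cross) product V x V3 yields the remaining orthogonal
# direction, i.e. the 12 o'clock vector V12.
V12 = np.cross(V, V3)

assert np.allclose(V12, [0.0, 1.0, 0.0])
# (V3, V12, V) then form a right-handed orthonormal triple.
assert np.allclose(np.cross(V3, V12), V)
```

Only two coil axes therefore need to be measured; the third direction follows for free.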
Then, on the basis of the digital data from the A/D units 9a to 9c, the position and orientation calculation device 5 calculates the position of each of the plurality of insertion shape detecting coils 32, the position of each body surface detecting coil 7, and the position of the body cavity detecting coil 42.
The position and orientation calculation device 5 then outputs the position and orientation of the image position and orientation detecting coil 31, the position of each of the plurality of insertion shape detecting coils 32, the position of each of the four body surface detecting coils 7, and the position of the body cavity detecting coil 42 to the matching circuit 51 in the image processing device 11 as position and orientation data.
Now, the position and orientation data will be described below in detail.
As shown in
The position of the image position and orientation detecting coil 31 is defined as 0″. The image position and orientation detecting coil 31 is fixed to a position very close to the center of the ring of the ultrasonic transducer array 29. Accordingly, the position 0″ aligns with the center of radial scanning and with the center of ultrasonic tomograms.
Here, the position and orientation data is defined as follows.
The directional components of the position vector 00″ of the position 0″ of the image position and orientation detecting coil 31 on the orthogonal coordinate axis 0-xyz:
(x0, y0, z0)
The angular components of an Euler angle (described below) indicating the orientation of the image position and orientation detecting coil 31 with respect to the orthogonal coordinate axis 0-xyz:
(ψ, θ, φ)
The directional components of the position vector of each of the plurality of insertion shape detecting coils 32 on the orthogonal coordinate axis 0-xyz:
(xi, yi, zi) (i denotes a natural number from 1 to the total number of the insertion shape detecting coils 32).
The directional components of the position vectors of the four body surface detecting coils 7 on the orthogonal coordinate axis 0-xyz:
(xa, ya, za), (xb, yb, zb), (xc, yc, zc), (xd, yd, zd)
The directional components of the position vector of the body cavity detecting coil 42 on the orthogonal coordinate axis 0-xyz:
(xp, yp, zp)
Here, the Euler angle is such that when the orthogonal coordinate axis 0-xyz in
i after the rotation=V3, j after the rotation=V12, and k after the rotation=V. ψ denotes the first rotation angle around the z axis, θ denotes the rotation angle around the y axis, and φ denotes the second rotation angle around the z axis.
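The z-y-z Euler rotation described above can be sketched as a product of three rotation matrices whose columns give the rotated basis (V3, V12, V). The intrinsic-rotation convention used below is an assumption for illustration; the text fixes only the z-y-z axis order:

```python
import numpy as np

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def euler_zyz(psi, theta, phi):
    """Rotation taking the fixed basis (i, j, k) to (V3, V12, V):
    first psi about z, then theta about y, then phi about z
    (intrinsic z-y-z convention, assumed here)."""
    return Rz(psi) @ Ry(theta) @ Rz(phi)

R = euler_zyz(0.3, 0.5, -0.2)
V3, V12, V = R[:, 0], R[:, 1], R[:, 2]   # images of i, j, k

# The rotated basis stays orthonormal and right-handed.
assert np.allclose(R @ R.T, np.eye(3))
assert np.allclose(np.cross(V3, V12), V)
```

The three angles (ψ, θ, φ) thus encode the orientation of the coil with three degrees of freedom, which together with the three positional components give the six degrees of freedom used later.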
In
The matching circuit 51 calculates, from the following first, second, third, and fourth data groups, a conversion equation that maps a position and orientation expressed on the orthogonal coordinate axis 0-xyz to a position and orientation in the voxel space expressed on the orthogonal coordinate axis 0′-x′y′z′.
The method for this calculation will be described below. The position and orientation data in the first and second data groups varies as the subject 37 moves, and new conversion equations are created in conjunction with that movement. The creation of a new conversion equation will also be described below.
A first data group included in the position and orientation data includes the directional components (xa, ya, za), (xb, yb, zb), (xc, yc, zc), and (xd, yd, zd) of the position vectors, on the orthogonal coordinate axis 0-xyz, of the body surface detecting coils 7 attached to the xiphoid process, the left anterior superior iliac spine, the right anterior superior iliac spine, and the spinous process of vertebral body of the subject 37.
A second data group included in the position and orientation data includes the directional components (xp, yp, zp) of the position vector of the body cavity detecting coil 42 on the orthogonal coordinate axis 0-xyz.
In
A third data group includes the coordinates (xa′, ya′, za′), (xb′, yb′, zb′), (xc′, yc′, zc′), and (xd′, yd′, zd′), on the orthogonal coordinate axis 0′-x′y′z′, of pixels on any of the reference image data nos. 1 to N which correspond to points on the body surface which are closest to the xiphoid process, the left anterior superior iliac spine, the right anterior superior iliac spine, and the spinous process of vertebral body.
The pixels are pre-specified on any of the reference image data nos. 1 to N by the operator. The method for specification will be described below.
A fourth data group includes the coordinates (xp″, yp″, zp″), on the orthogonal coordinate axis 0′-x′y′z′, of a pixel on any of the reference image data nos. 1 to N which corresponds to the duodenal papilla.
The pixel is pre-specified on any of the reference image data nos. 1 to N by the operator.
The method for specification will be described below. In
Then, the matching circuit 51 maps the position and orientation data calculated for the orthogonal coordinate axis 0-xyz according to the above conversion equation to calculate new position and orientation data for the orthogonal coordinate axis 0′-x′y′z′.
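Because the four body-surface feature points are not coplanar, the first and third data groups suffice to determine a unique affine map between the two coordinate systems (12 unknowns, 12 equations). The sketch below shows that construction with made-up coordinates; the present embodiment does not disclose the concrete form of its conversion equation, so this is one plausible illustration, not the actual equation:

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Solve for a 4x4 homogeneous matrix M with M @ [p; 1] = [q; 1]
    for four non-coplanar point pairs, giving an exact affine map."""
    P = np.vstack([np.asarray(src_pts, float).T, np.ones(4)])   # 4x4
    Q = np.vstack([np.asarray(dst_pts, float).T, np.ones(4)])   # 4x4
    return Q @ np.linalg.inv(P)

def apply_affine(M, p):
    """Map a point from 0-xyz into 0'-x'y'z' using the fitted matrix."""
    return (M @ np.append(np.asarray(p, float), 1.0))[:3]

# Hypothetical feature-point coordinates: measured on 0-xyz (src) and
# picked on the reference image data on 0'-x'y'z' (dst).
src = [[0, 0, 0], [120, -80, 10], [-110, -85, 12], [5, -90, -60]]
dst = [[10, 5, 2], [135, -70, 15], [-95, -78, 14], [17, -82, -55]]

M = fit_affine(src, dst)
# The fitted map reproduces every feature-point correspondence exactly.
for p, q in zip(src, dst):
    assert np.allclose(apply_affine(M, p), q)
```

Re-running `fit_affine` whenever the body surface detecting coils report new positions corresponds to creating a new conversion equation as the subject moves.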
The matching circuit 51 outputs the new position and orientation data as position and orientation mapping data to the image index creation circuit 52 and the insertion shape creation circuit 53.
The image index creation circuit 52 creates image index data from position and orientation mapping data with a total of six degrees of freedom including the directional components (x0, y0, z0) of the position vector 00″, on the orthogonal coordinate axis 0-xyz, of the position 0″ of the image position and orientation detecting coil 31 and the angular components (ψ, θ, φ) of the Euler angle indicating the orientation of the image position and orientation detecting coil 31 with respect to the orthogonal coordinate axis 0-xyz. The image index creation circuit 52 then outputs the image index data to the synthesis circuit 58.
This is shown in
The image index data is image data on the orthogonal coordinate axis 0′-x′y′z′ obtained by synthesizing a parallelogrammatic ultrasonic tomogram marker Mu with, for example, a blue distal direction marker Md (expressed in blue in
The insertion shape creation circuit 53 creates insertion shape data (through an interpolation and marker creation process) from the position and orientation mapping data including the directional components (x0, y0, z0) of the position vector 00″ of the position 0″ of the image position and orientation detecting coil 31 and the directional components (xi, yi, zi) of the position vector of each of the plurality of insertion shape detecting coils 32 on the orthogonal coordinate axis 0-xyz. The insertion shape creation circuit 53 then outputs the insertion shape data to the synthesis circuit 58.
This is shown in
(d) Now, the operation of the present embodiment will be described along the fourth flow of reference image data and data created by processing the reference image data.
The operator pre-acquires reference image data on the entire abdomen of the subject 37 using the X-ray 3-dimensional helical computer tomography system 15 or the 3-dimensional magnetic resonance imaging system 16.
The operator gives an instruction to acquire reference image data by depressing a predetermined key on the keyboard 13 or selecting from a menu on a screen using the mouse 12. At the same time, the operator indicates from where to acquire the data. In response to the instruction, the control circuit 63 instructs the communication circuit 54 to load the reference image data and indicates to the communication circuit 54 from where to acquire the data.
For example, if the data is to be acquired from the X-ray 3-dimensional helical computer tomography system 15, the communication circuit 54 loads a plurality of two-dimensional CT images through the network 17 as reference image data and stores the images in the reference image storage portion 55.
When the X-ray 3-dimensional helical computer tomography system 15 is used to pick up images, an X-ray contrast material is injected through a blood vessel in the subject 37 before image pickup. This allows blood vessels (in a broad sense, vessels) such as the aorta and the superior mesenteric vein, or organs containing a large number of blood vessels, to be displayed on the two-dimensional CT images at a high or medium luminance so as to be differentiated from surrounding organs of lower luminance.
If for example, the data is to be acquired from the 3-dimensional magnetic resonance imaging system 16, the communication circuit 54 loads a plurality of two-dimensional MRI images through the network 17 as reference image data and stores the images in the reference image storage portion 55.
When the 3-dimensional magnetic resonance imaging system 16 is used to pick up images, an MRI contrast material with a high nuclear magnetic resonance sensitivity is injected through a blood vessel in the subject 37 before image pickup. This allows blood vessels such as the aorta and the superior mesenteric vein, or organs containing a large number of blood vessels, to be displayed on the two-dimensional MRI images at a high or medium luminance so as to be differentiated from surrounding organs of lower luminance.
The operation performed when the operator selects the X-ray 3-dimensional helical computer tomography system 15 as the data source is similar to that performed when the operator selects the 3-dimensional magnetic resonance imaging system 16. Accordingly, description will be given only of the case where the X-ray 3-dimensional helical computer tomography system 15 is selected and the communication circuit 54 loads a plurality of two-dimensional CT images as reference image data.
The interpolation circuit 56 reads all the reference image data nos. 1 to N from the reference image storage portion 55. The interpolation circuit 56 sequentially fills the read reference image data into a voxel space in the interpolation memory 56a.
Specifically, the luminances of the pixels in the reference image data are outputted to voxels having addresses corresponding to the pixels. The interpolation circuit 56 then performs interpolation on the basis of the luminance values of the adjacent reference image data to fill empty voxels with the data. In this manner, all the voxels in the voxel space are filled with data (hereinafter referred to as voxel data) based on the reference image data.
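The filling of empty voxels can be pictured as linear blending between adjacent reference images; a simplified model assuming evenly spaced slices and an integer number of interpolated slices per gap (both assumptions for illustration):

```python
import numpy as np

def fill_voxels(slices, n_between):
    """Stack 2-D reference images into a voxel volume, inserting
    `n_between` linearly interpolated slices between each adjacent pair."""
    out = [slices[0]]
    for a, b in zip(slices, slices[1:]):
        for k in range(1, n_between + 1):
            t = k / (n_between + 1)
            out.append((1 - t) * a + t * b)   # luminance-weighted blend
        out.append(b)
    return np.stack(out)

# Two hypothetical 4x4 reference slices, one interpolated slice between.
s0 = np.zeros((4, 4))
s1 = np.full((4, 4), 100.0)
vol = fill_voxels([s0, s1], n_between=1)

assert vol.shape == (3, 4, 4)
assert np.allclose(vol[1], 50.0)   # midway slice averages its neighbors
```

After this step every voxel in the voxel space holds a luminance value derived from the reference image data.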
The 3-dimensional human body image creation circuit 57 extracts, from the voxel data in the interpolation circuit 56, voxels of a high luminance value (mostly indicating blood vessels) and voxels of a medium luminance value (mostly indicating organs, such as the pancreas, which contain a large number of blood vessels) according to their luminance value ranges. The 3-dimensional human body image creation circuit 57 then classifies the voxels by luminance and colors them.
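This extraction amounts to thresholding the voxel luminances into classes; a minimal sketch with arbitrary threshold values (the concrete luminance ranges are not specified in the present embodiment):

```python
import numpy as np

# Hypothetical luminance thresholds separating background, organ, and vessel.
ORGAN_MIN, VESSEL_MIN = 100, 200   # assumed values for illustration

def classify_voxels(vox):
    """Label each voxel: 0 = discard (low luminance), 1 = medium
    luminance (organ), 2 = high luminance (blood vessel)."""
    labels = np.zeros(vox.shape, dtype=np.uint8)
    labels[vox >= ORGAN_MIN] = 1    # medium and above
    labels[vox >= VESSEL_MIN] = 2   # high overrides medium
    return labels

vox = np.array([[30, 120],
                [210, 90]])
labels = classify_voxels(vox)
assert (labels == np.array([[0, 1], [2, 0]])).all()
```

Each nonzero class can then be assigned its own display color before the voxels are filled into the synthesis memory.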
The 3-dimensional human body image creation circuit 57 then sequentially fills the extracted voxels into a voxel space in the synthesis memory 58a in the synthesis circuit 58 as 3-dimensional human body image data. At this time, the 3-dimensional human body image creation circuit 57 fills the extracted voxels so that the address of each extracted voxel in the voxel space in the interpolation memory 56a is the same as that in the voxel space in the synthesis memory 58a.
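The luminance-range extraction and coloring performed by the 3-dimensional human body image creation circuit 57 can be illustrated as below. The thresholds and the label codes are hypothetical; the embodiment specifies only that high-luminance voxels (mostly blood vessels) and medium-luminance voxels (organs containing many blood vessels) are extracted, classified, and colored at unchanged voxel addresses.

```python
import numpy as np

def extract_and_color(voxels, high=200, medium=100):
    """Classify voxels by luminance range and assign color labels.

    Voxels at or above `high` (mostly vessels) get label 1, voxels in
    the medium range (vessel-rich organs such as the pancreas) get
    label 2, and all other voxels are discarded (label 0).  The
    classified voxels keep the same addresses as in the input volume.
    """
    out = np.zeros_like(voxels, dtype=np.uint8)
    out[voxels >= high] = 1                        # high luminance
    out[(voxels >= medium) & (voxels < high)] = 2  # medium luminance
    return out
```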
The 3-dimensional human body image creation circuit 57 also has the function of extraction means to extract the organ, blood vessels, and the like. The extraction means may be provided in the 3-dimensional guide image creation circuit A or B. Then, when a 3-dimensional guide image is to be created, the 3-dimensional guide image creation circuit A or B may be allowed to select the organ or the blood vessels.
The synthesis circuit 58 sequentially fills image index data and insertion shape data into the voxel space in the synthesis memory 58a. This is shown in
In
The rotational transformation circuit 59 reads the synthetic 3-dimensional data and executes a rotating process on the synthetic 3-dimensional data in accordance with a rotation instruction signal from the control circuit 63.
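For 90° rotation instructions such as the caudal-side view used later in this embodiment, the rotating process on the voxel data amounts to a lossless reindexing of the voxel array. The following Python sketch illustrates this under that assumption; arbitrary rotation angles would instead require resampling, which is omitted here.

```python
import numpy as np

def rotate_synthetic_data(vol, axis_pair, quarter_turns):
    """Rotate synthetic 3-D voxel data by multiples of 90 degrees.

    `axis_pair` selects the plane of rotation (two of the three voxel
    axes) and `quarter_turns` is the number of 90-degree turns, as
    might be requested by a rotation instruction signal.
    """
    return np.rot90(vol, k=quarter_turns, axes=axis_pair)
```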
The 3-dimensional guide image creation circuit A executes a rendering process such as hidden surface removal or shading on the synthetic 3-dimensional data to create image data (hereinafter referred to as 3-dimensional guide image data) that can be outputted to the screen.
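The hidden surface removal mentioned above can be sketched as a first-hit projection along the viewing axis: each ray keeps the first opaque voxel it meets and hides everything behind it. This minimal Python illustration stands in for the rendering process; shading and the actual viewing geometry are omitted, and the function name is illustrative.

```python
import numpy as np

def first_hit_projection(colored_voxels, background=0):
    """Project a volume of color labels along axis 0 (the assumed
    viewing direction), keeping the first non-background voxel on each
    ray so that voxels behind it are hidden."""
    image = np.full(colored_voxels.shape[1:], background,
                    dtype=colored_voxels.dtype)
    hit = np.zeros(colored_voxels.shape[1:], dtype=bool)
    for d in range(colored_voxels.shape[0]):   # march front to back
        plane = colored_voxels[d]
        take = (~hit) & (plane != background)  # first opaque voxel wins
        image[take] = plane[take]
        hit |= take
    return image
```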
By default, 3-dimensional guide image data is oriented as viewed from the ventral side of the body. Accordingly, the 3-dimensional guide image creation circuit A creates 3-dimensional guide image data based on the observation of the subject 37 from the ventral side. Alternatively, the 3-dimensional guide image creation circuit A may create 3-dimensional guide image data based on the observation of the subject 37 from the dorsal side or from another direction.
The 3-dimensional guide image creation circuit A outputs 3-dimensional guide image data based on the observation from the ventral side of the subject to the mixing circuit 61. The 3-dimensional guide image data is shown in
In the 3-dimensional guide image data in
For the other organs, the ultrasonic tomogram marker Mu is opaque so as to make invisible those parts of the organs which are located behind the ultrasonic tomogram marker Mu. In
The 3-dimensional guide image creation circuit B executes a rendering process such as hidden surface removal or shading on the rotated synthetic 3-dimensional data to create 3-dimensional guide image data that can be outputted to the screen.
In the present embodiment, by way of example, it is assumed that in response to an input provided by the operator via the mouse 12 and the keyboard 13, the control circuit 63 issues a rotation instruction signal to rotate the 3-dimensional guide image data by 90° so that the subject can be observed from the caudal side.
Thus, the 3-dimensional guide image creation circuit B creates 3-dimensional guide image data based on the observation from the caudal side of the subject.
The 3-dimensional guide image creation circuit B outputs 3-dimensional guide image data based on the observation from the caudal side of the subject to the mixing circuit 61. The 3-dimensional guide image data is shown in
In the 3-dimensional guide image data in
For the other organs, the ultrasonic tomogram marker Mu is opaque so that the rear side of the ultrasonic tomogram marker Mu cannot be viewed. In
The ultrasonic tomogram marker Mu shown in
(e) Now, the operation of the present embodiment will be described along the fifth flow of signals and data for a final display screen obtained by synthesizing ultrasonic tomogram data with 3-dimensional guide image data.
The mixing circuit 61 in
The display circuit 62 converts the mixture data into an analog video signal and outputs the signal to the display device 14.
On the basis of the analog video signal, the display device 14 properly arranges and displays, for comparison, the ultrasonic tomogram, the 3-dimensional guide image based on the observation of the subject 37 from the caudal side, and the 3-dimensional guide image based on the observation of the subject 37 from the ventral side.
As shown in
In the display example in
Further, as shown by white arrows in
(f) Now, the operation of the present embodiment will be described along the sixth flow of signals and data for control operations.
The following components of the image processing device 11 in
The control will be described below in detail.
A general description will be given below of how the image processing device 11, the keyboard 13, the mouse 12, and the display device 14 in accordance with the present embodiment work as the operator operates the apparatus.
The first step S1 corresponds to a process of specifying body surface feature points and body cavity feature points on reference image data. That is, in step S1, body surface feature points and body cavity feature points are specified on the reference image data.
In the next step S2, the operator fixes the body surface detecting coils 7 to the subject 37. The operator has the subject 37 lie on his or her left side, that is, has the subject 37 assume what is called a left lateral position. The operator palpates the subject 37 and fixes the body surface detecting coils 7 to positions on the body surface which are closest to the four body surface feature points: the xiphoid process, the left anterior superior iliac spine, the right anterior superior iliac spine, and the spinous process of vertebral body.
The next step S3 corresponds to a process of calculating a correction value.
In step S3, the image processing device 11 acquires position and orientation data on body cavity feature points to calculate a conversion equation that maps position and orientation data expressed on the orthogonal coordinate axis 0-xyz into position and orientation mapping data in the voxel space expressed on the orthogonal coordinate axis 0′-x′y′z′. The image processing device 11 further calculates a correction value for the conversion equation on the basis of the position and orientation data on the body cavity feature points.
The next step S4 executes a process of creating and displaying ultrasonic tomograms and 3-dimensional guide images. In step S4, ultrasonic tomograms and 3-dimensional guide images are created and displayed.
Now, a specific description will be given of the processing in step S1 in
In the first step S1-1, the operator depresses the display switching key 13α. The control circuit 63 gives an instruction to the display circuit 62. In response to the instruction, the switch 62a in the display circuit 62 is switched to the input terminal α.
In the next step S1-2, the operator uses the mouse 12 and the keyboard 13 to specify any of the reference image data nos. 1 to N.
In the next step S1-3, the control circuit 63 causes the display circuit 62 to read the specified one of the reference image data nos. 1 to N, stored in the reference image storage portion 55.
The display circuit 62 converts the reference image data from the reference image storage portion 55 into an analog video signal, and outputs the signal to the display device 14. The display device 14 displays the reference image data.
In the next step S1-4, the operator uses the mouse 12 and the keyboard 13 to specify body surface feature points on the reference image data. Specifically, the operator performs the following operation.
The operator selects reference image data so that the displayed data contains any of the four body surface feature points of the subject 37: the xiphoid process, the left anterior superior iliac spine, the right anterior superior iliac spine, and the spinous process of vertebral body. If the reference image data contains none of the feature points, the process returns to step S1-2, where the operator specifies other reference image data. The operator repeats the display in step S1-3 with different reference image data until the displayed data contains one of the feature points.
The operator uses the mouse 12 and the keyboard 13 to specify pixels on the displayed reference image data corresponding to points on the body surface of the subject 37 which are closest to the four points on the body surface, the xiphoid process, the left anterior superior iliac spine, the right anterior superior iliac spine, and the spinous process of vertebral body.
The specified points are shown by black circles and white circles ◯ in
In
In the next step S1-5, the operator uses the mouse 12 and the keyboard 13 to specify a body cavity feature point P″. In the present embodiment, by way of example, the body cavity feature point P″ is the duodenal papilla (the opening in the common bile duct leading to the duodenum). Specifically, the operator performs the following operation.
The operator uses the mouse 12 and the keyboard 13 to specify any of the reference image data nos. 1 to N.
The control circuit 63 causes the display circuit 62 to read, via a signal line (not shown), the specified one of the reference image data nos. 1 to N, stored in the reference image storage portion 55.
The display circuit 62 outputs the read reference image data to the display device 14. The display device 14 displays the reference image data. If the displayed reference image data does not contain the duodenal papilla, the body cavity feature point of the subject 37, the operator specifies another reference image data. The operator repeats displaying a different reference image data until the displayed reference image data contains the duodenal papilla.
The operator uses the mouse 12 and the keyboard 13 to specify a pixel on the displayed reference image data which corresponds to the duodenal papilla, a point in the body cavity of the subject 37.
The specified point is denoted by P″ in
In the next step S1-6, the control circuit 63 calculates the coordinates, on the orthogonal coordinate axis 0′-x′y′z′ in the voxel space, of each of the pixels corresponding to the body surface feature points specified in step S1-4 and of the pixel corresponding to the body cavity feature point P″ specified in step S1-5, on the basis of the addresses on the reference image data. The control circuit 63 then outputs the coordinates to the matching circuit 51.
The calculated values of the coordinates, on the orthogonal coordinate axis 0′-x′y′z′, of the pixels corresponding to the body surface feature points specified in step S1-4 are defined as (xa′, ya′, za′), (xb′, yb′, zb′), (xc′, yc′, zc′), and (xd′, yd′, zd′).
The calculated value of each of the coordinates, on the orthogonal coordinate axis 0′-x′y′z′, of the pixel corresponding to the body cavity feature point specified in step S1-5 is defined as (xp″, yp″, zp″).
The matching circuit 51 stores the coordinates. After step S1-6, the process proceeds to step S2 in
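The conversion from a pixel address on the reference image data to coordinates on the orthogonal coordinate axis 0′-x′y′z′ in step S1-6 can be sketched as follows. This is a hypothetical illustration assuming uniform pixel and slice pitches expressed in voxel-space units; the embodiment states only that the coordinates are calculated from the addresses on the reference image data.

```python
def pixel_to_voxel(slice_index, row, col, pixel_pitch, slice_pitch):
    """Map a pixel (row, col) on reference image no. `slice_index` to
    coordinates on the orthogonal coordinate axis 0'-x'y'z' in the
    voxel space, assuming uniform pitches (hypothetical parameters)."""
    x = col * pixel_pitch          # in-plane horizontal coordinate
    y = row * pixel_pitch          # in-plane vertical coordinate
    z = slice_index * slice_pitch  # position of the slice in the stack
    return (x, y, z)
```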
When the correction value calculation process in step S3 is started, in the first step S3-1, the operator depresses the display switching key 13β. In response to this operation, the control circuit 63 gives an instruction to the display circuit 62. The switch 62a in the display circuit 62 is switched to the input terminal β according to the instruction.
In the next step S3-2, the display circuit 62 converts optical image data from the optical observation device 3 into an analog video signal, and outputs the optical image to the display device 14. The display device 14 displays the optical image.
In the next step S3-3, the operator inserts the rigid portion 21 and flexible portion 22 of the ultrasonic endoscope 2 into the body cavity of the subject 37.
In the next step S3-4, while observing the optical image, the operator moves the rigid portion 21 to search for the body cavity feature point. Upon finding the body cavity feature point, the operator moves the rigid portion 21 to the vicinity of the body cavity feature point.
In the next step S3-5, while observing the optical image, the operator inserts the body cavity contact probe 8 through the forceps port 44 and projects the body cavity contact probe 8 from the projection port 45. The operator then brings the distal end of the body cavity contact probe 8 into contact with the body cavity feature point under the optical image field of view.
This is shown in
In the next step S3-6, the operator depresses the body cavity feature point specification key 65.
In the next step S3-7, the control circuit 63 gives an instruction to the matching circuit 51. In response to the instruction, the matching circuit 51 loads position and orientation data from the position and orientation calculation device 5 and stores the data. The position and orientation data contains the following two types of data as described above.
The directional components of each of the position vectors of the four body surface detecting coils 7 on the orthogonal coordinate axis 0-xyz, that is, in this case, the coordinates of the four body surface feature points on the orthogonal coordinate axis 0-xyz: (xa, ya, za), (xb, yb, zb), (xc, yc, zc), and (xd, yd, zd).
The directional components of the position vector of the body cavity detecting coil 42 on the orthogonal coordinate axis 0-xyz, that is, in this case, the coordinate of the body cavity feature point on the orthogonal coordinate axis 0-xyz: (xp, yp, zp).
In the next step S3-8, the matching circuit 51 creates a first conversion equation expressing a first map, from the coordinates of the body surface feature points. Specifically, this is carried out as follows.
First, the matching circuit 51 already stores the following contents:
First, the coordinates, on the orthogonal coordinate axis 0′-x′y′z′ in the voxel space, of the pixels corresponding to the body surface feature points specified in step S1: (xa′, ya′, za′), (xb′, yb′, zb′), (xc′, yc′, zc′), and (xd′, yd′, zd′).
Second, the coordinate, on the orthogonal coordinate axis 0′-x′y′z′ in the voxel space, of the pixel corresponding to the body cavity feature point specified in step S1: (xp″, yp″, zp″).
Third, the coordinates, on the orthogonal coordinate axis 0-xyz, of the body surface feature points loaded in step S3-7: (xa, ya, za), (xb, yb, zb), (xc, yc, zc), and (xd, yd, zd).
Fourth, the coordinate, on the orthogonal coordinate axis 0-xyz, of the body cavity feature point loaded in step S3-7: (xp, yp, zp).
The matching circuit 51 creates a first conversion equation that expresses first mapping from any point on the orthogonal coordinate axis 0-xyz to an appropriate point on the orthogonal coordinate axis 0′-x′y′z′ in the voxel space, from the third coordinates (xa, ya, za), (xb, yb, zb), (xc, yc, zc), and (xd, yd, zd) and the first coordinates (xa′, ya′, za′), (xb′, yb′, zb′), (xc′, yc′, zc′), and (xd′, yd′, zd′). The first mapping and the first conversion equation are defined as follows.
As shown in
The first mapping is mapping from the subject 37 to the voxel space such that the “coordinates of any point on the orthogonal coordinate axis 0-xyz expressed by the nonorthogonal coordinate system on the subject 37” is the same as the “coordinates of a resulting point on the orthogonal coordinate axis 0′-x′y′z′ whose coordinates are expressed by the nonorthogonal coordinate system in the voxel space”.
Further, the first conversion equation converts the “coordinates of any point on the orthogonal coordinate axis 0-xyz” into the “coordinates, on the orthogonal coordinate axis 0′-x′y′z′, of a point in the voxel space resulting from the first mapping”.
For example, as shown in
The coordinates of the point Q′ on the orthogonal coordinate axis 0′-x′y′z′ are defined as (x0′, y0′, z0′). The first conversion equation converts the coordinates (x0, y0, z0) of the point 0″ on the orthogonal coordinate axis 0-xyz into the coordinates (x0′, y0′, z0′) of the point Q′ on the orthogonal coordinate axis 0′-x′y′z′.
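Because the first mapping preserves coordinates expressed in the nonorthogonal basis spanned by the four body surface feature points, the first conversion equation is the unique affine map determined by the four point correspondences. The following Python sketch illustrates one way such a map could be constructed; the function names and the least-squares construction are assumptions for illustration, not the circuit's actual method.

```python
import numpy as np

def first_conversion(src_pts, dst_pts):
    """Build an affine map taking the four body surface feature points
    measured on the orthogonal coordinate axis 0-xyz (`src_pts`, 4x3)
    to the corresponding points specified in the voxel space on
    0'-x'y'z' (`dst_pts`, 4x3).  Any point expressed in the
    nonorthogonal basis of the source points is re-read in the basis
    of the destination points."""
    src = np.hstack([np.asarray(src_pts, float), np.ones((4, 1))])
    # Solve for the 4x3 affine matrix in homogeneous coordinates.
    A, *_ = np.linalg.lstsq(src, np.asarray(dst_pts, float), rcond=None)

    def convert(p):
        return np.append(np.asarray(p, float), 1.0) @ A
    return convert
```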
In the next step S3-9, the matching circuit 51 maps the body cavity feature point P to the point P′ in the voxel space on the basis of the first conversion equation, as shown in
In the next step S3-10, the matching circuit 51 calculates a vector P′P″ on the basis of the coordinates (xp′, yp′, zp′) of the point P′ on the orthogonal coordinate axis 0′-x′y′z′ in the voxel space and the coordinates (xp″, yp″, zp″), on the orthogonal coordinate axis 0′-x′y′z′ in the voxel space, of the point P″ corresponding to the body cavity feature point specified in step S1, as follows.
P′P″=(xp″, yp″, zp″)−(xp′, yp′, zp′)=(xp″−xp′, yp″−yp′, zp″−zp′)
In the step S3-11, the matching circuit 51 stores the vector P′P″. The vector P′P″ acts as a correction value used to correct the first conversion equation to create a second conversion equation during a process described below. After step S3-11, the process proceeds to the next step S4.
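The correction value computed in steps S3-10 and S3-11 is simply the componentwise difference given above. A minimal sketch, with a hypothetical function name:

```python
import numpy as np

def correction_vector(p_mapped, p_specified):
    """The vector P'P'' from the mapped body cavity feature point P'
    to the point P'' specified on the reference images; stored as the
    correction value for the second conversion equation."""
    return np.asarray(p_specified, float) - np.asarray(p_mapped, float)
```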
Now, description will be given of the process of creating and displaying ultrasonic tomograms and 3-dimensional guide images in step S4.
When the processing in step S4 is started, in the first step S4-1, the operator depresses the display switching key 13γ. The control circuit 63 gives an instruction to the display circuit 62. The switch 62a in the display circuit 62 is switched to the input terminal γ in response to this instruction.
In the next step S4-2, the operator depresses the scan control key 66.
In the next step S4-3, the control circuit 63 outputs a scan control signal to the ultrasonic observation device 4. Then, the ultrasonic transducer array 29 starts radial scanning.
In the next step S4-4, the control circuit 63 gives an instruction to the mixing circuit 61. In response to the instruction, the mixing circuit 61 sequentially loads ultrasonic tomogram data inputted by the ultrasonic observation device 4 in accordance with the radial scanning.
In the next step S4-5, the control circuit 63 gives an instruction to the matching circuit 51. The matching circuit 51 loads position and orientation data from the position and orientation calculation device 5 and stores the data. The loading is instantaneously performed. Thus, the matching circuit 51 loads the position and orientation data including the following data obtained at the moment when the mixing circuit 61 loads the ultrasonic tomogram data in step S4-4.
The directional components of the position of the image position and orientation detecting coil 31 on the orthogonal coordinate axis 0-xyz, that is, the position vector 00″ of the center of radial scanning and of the center 0″ of the ultrasonic tomogram:
(x0, y0, z0).
The angular components of the Euler angle indicating the orientation of the image position and orientation detecting coil 31, that is, the orientation of the ultrasonic tomogram, with respect to the orthogonal coordinate axis 0-xyz:
(ψ, θ, φ).
The directional components of the position vector of each of the plurality of insertion shape detecting coils 32 on the orthogonal coordinate axis 0-xyz:
(xi, yi, zi) (i is a natural number between 1 and the total number of the insertion shape detecting coils 32).
The direction components of the position vector of each of the four body surface detecting coils 7 on the orthogonal coordinate axis 0-xyz:
(xa, ya, za), (xb, yb, zb), (xc, yc, zc), (xd, yd, zd).
In the next step S4-6, the matching circuit 51 uses the directional components (xa, ya, za), (xb, yb, zb), (xc, yc, zc), (xd, yd, zd) of the position vector of each of the four body surface detecting coils 7 on the orthogonal coordinate axis 0-xyz, which are contained in the position and orientation data loaded in step S4-5, to update the first conversion equation stored in step S3.
The matching circuit 51 then combines the updated first conversion equation with the translation of the vector P′P″ stored in step S3 to create a new second conversion equation that expresses second mapping. The concept of the second mapping is as follows.
Second mapping=first mapping+translation of the vector P′P″
The translation of the vector P′P″ produces a correction effect described below. The vector P′P″ acts as a correction value.
The first mapping is mapping from the subject 37 to the voxel space such that the “coordinates of any point on the orthogonal coordinate axis 0-xyz expressed by the nonorthogonal coordinate system on the subject 37” is the same as the “coordinates of a resulting point on the orthogonal coordinate axis 0′-x′y′z′ whose coordinates are expressed by the nonorthogonal coordinate system in the voxel space”.
Ideally, the mapping point P′ of the body cavity feature point P created in the voxel space by the first mapping would align with the point P″ corresponding to the body cavity feature point specified in step S1. In practice, however, it is difficult to align these points accurately.
This is because various factors prevent the “spatial relationship between any point on the orthogonal coordinate axis 0-xyz and the nonorthogonal coordinate system on the subject 37” from completely matching the “spatial positional relationship between a point on the orthogonal coordinate axis 0′-x′y′z′ which anatomically corresponds to the above point and the nonorthogonal coordinate system in the voxel space”.
This is because, in the case of the present embodiment, although the first mapping and the first conversion equation are determined from the coordinates of the body surface feature points, which are the characteristic points on the skeleton, the duodenal papilla P, which is the body cavity feature point, does not always maintain the same relationship with the body surface feature points on the skeleton.
The main reason is that the X-ray 3-dimensional helical computer tomography system 15 and the 3-dimensional magnetic resonance imaging system 16 normally pick up images of the subject in a supine position, which is different from the left lateral position for inspections with the ultrasonic endoscope 2, thus displacing the organs in the subject 37 under the effect of the gravity.
Thus, the first mapping is combined with the translation of the vector P′P″ as a correction value to obtain second mapping. This aligns the mapping point of the body cavity feature point P with the point P″ corresponding to the body cavity feature point in the voxel space. Moreover, another point on the subject 37, for example, the center 0″ of the ultrasonic tomogram, is also anatomically more accurately aligned with the body cavity feature point by the second mapping.
In the next step S4-7, the matching circuit 51 uses the newly created second conversion equation to convert, into position and orientation mapping data, the directional components (x0, y0, z0) of the position vector 00″ of the center 0″ of the ultrasonic tomogram on the orthogonal coordinate axis 0-xyz, the angular components (ψ, θ, φ) of the Euler angle indicating the orientation of the image position and orientation detecting coil 31 with respect to the orthogonal coordinate axis 0-xyz, and the directional components (xi, yi, zi) (i is a natural number between 1 and the total number of the insertion shape detecting coils 32) of the position vector of each of the plurality of insertion shape detecting coils 32 on the orthogonal coordinate axis 0-xyz, all the directional components being contained in the position and orientation data loaded in step S4-5.
As shown in
Q′Q″=P′P″
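The composition of the second conversion equation from the first conversion equation and the translation can be sketched as follows. `first_map` stands for any callable implementing the first conversion equation; the function names are hypothetical.

```python
import numpy as np

def make_second_mapping(first_map, correction):
    """Compose the second conversion equation: apply the first mapping,
    then translate the result by the stored correction vector P'P''.
    Every mapped point, such as the tomogram center 0'', is shifted by
    the same vector, so Q'Q'' = P'P'' as stated above."""
    corr = np.asarray(correction, float)

    def second_map(p):
        return np.asarray(first_map(p), float) + corr
    return second_map
```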
In the next step S4-8, the image index creation circuit 52 creates image index data. The insertion shape creation circuit 53 creates insertion shape data.
The synthesis circuit 58 synthesizes 3-dimensional human body image data with image index data and insertion shape data to create synthetic 3-dimensional data.
The rotational transformation circuit 59 executes a rotation process on synthetic 3-dimensional data.
Each of the 3-dimensional guide image creation circuits A and B creates 3-dimensional guide image data.
The above processes are as described above.
In the next step S4-9, the mixing circuit 61 properly arranges the ultrasonic tomogram data and the 3-dimensional guide image data to create display mixture data.
The display circuit 62 converts the mixture data into an analog video signal.
On the basis of the analog video signal, the display device 14 properly arranges and displays the ultrasonic tomogram, the 3-dimensional guide image based on the observation of the subject 37 from the ventral side, and the 3-dimensional guide image based on the observation of the subject 37 from the caudal side, as shown in
The above processes are as described above.
In the next step S4-10, the control circuit 63 determines whether or not the operator has depressed the scan control key 66 again during steps S4-4 to S4-9.
If the operator has depressed the scan control key 66 again, the control circuit 63 ends the above process and outputs a scan control signal to the ultrasonic observation device 4 to instruct the radial scan control to be turned off. The ultrasonic transducer array 29 ends the radial scan.
If the operator has not depressed the scan control key 66 again, the process jumps to step S4-4.
The processing described in steps S4-4 to S4-9 is thus repeated. Then, the ultrasonic transducer array 29 performs one radial scan, and the ultrasonic observation device 4 creates ultrasonic tomogram data. Every time the ultrasonic observation device 4 inputs ultrasonic tomogram data to the mixing circuit 61, two new 3-dimensional guide images are created and shown on the display screen of the display device 14 together with a new ultrasonic tomogram; the 3-dimensional guide images are properly updated.
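The repeated processing of steps S4-4 to S4-9 can be summarized as the following loop skeleton. All five arguments are hypothetical callables standing in for the actual circuits; this is an editorial schematic of the control flow, not the device's implementation.

```python
def guide_loop(load_tomogram, load_pose, update_mapping, render,
               stop_requested):
    """Schematic of steps S4-4 to S4-9: for each radial scan, load one
    ultrasonic tomogram and the simultaneous position and orientation
    data, refresh the second conversion equation, and redraw the mixed
    display, until the scan control key is depressed again."""
    frames = 0
    while not stop_requested():        # step S4-10: key check
        tomogram = load_tomogram()     # step S4-4: mixing circuit
        pose = load_pose()             # step S4-5: matching circuit
        mapping = update_mapping(pose) # steps S4-6 and S4-7
        render(tomogram, mapping)      # steps S4-8 and S4-9
        frames += 1
    return frames
```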
That is, as shown in
The present embodiment produces the following effects.
According to the present embodiment, the ultrasonic endoscope 2 comprises the rigid portion 21 fixedly having the ultrasonic transducer array 29 that acquires signals for creating ultrasonic tomograms of the interior of the subject 37, the flexible portion 22 located closer to the proximal end than the rigid portion 21, the rigid portion 21 and the flexible portion 22 being provided on the side of the ultrasonic endoscope which is inserted into the body cavity, the ultrasonic observation device 4 that creates ultrasonic tomograms of the interior of the subject 37 from echo signals acquired by the ultrasonic transducers 29a, the image position and orientation detecting coil 31 the position of which is spatially fixed to the rigid portion 21, the plurality of insertion shape detecting coils 32 provided along the flexible portion 22, the plurality of body surface detecting coils 7 that can come into contact with the subject 37, the transmission antenna 6 and the position and orientation calculation device 5 which detect the six degrees of freedom of the position and orientation of the image position and orientation detecting coil 31, the position of each of the plurality of insertion shape detecting coils 32, and the position or orientation of the body surface detecting coil 7 to output position and orientation data, the image index creation circuit 52 that creates the ultrasonic tomogram marker Mu indicating the position and orientation of the ultrasonic tomogram of the interior of the subject 37 created by the ultrasonic observation device 4, the synthesis circuit 58 that synthesizes the insertion shape of the distal end of the flexible portion 22 with the ultrasonic tomogram marker Mu and 3-dimensional human body image data based on the position/orientation data outputted by the position and orientation calculation device 5, and the 3-dimensional guide image creation circuits A and B that guide the positions and orientations of the flexible portion 22 and 
the ultrasonic tomogram with respect to the subject 37.
Thus, the present embodiment can detect the insertion shapes of the rigid portion 21 and flexible portion 22 of the ultrasonic endoscope 2 and the direction of ultrasonic tomograms while minimizing invasive exposure to radiation, so as to create the 3-dimensional guide image including both of them.
Further, the present embodiment has the following arrangements and performs the following operations. The image index creation circuit 52 synthesizes the ultrasonic tomogram marker Mu with the blue distal end direction marker Md and the yellow-green arrow-shaped 6 o'clock direction marker Mt to create image index data. The synthesis circuit 58 synthesizes 3-dimensional human body image data, image index data, and insertion shape data in the same voxel space. The mixing circuit 61 creates display mixture data including ultrasonic tomogram data from the ultrasonic observation device 4 and 3-dimensional guide image data which are properly arranged. The display circuit 62 converts the mixture data into an analog video signal. The display device 14 properly arranges the ultrasonic tomograms and 3-dimensional guide images on the basis of the analog video signal.
Thus, the present embodiment can guide the positional relationship between ultrasonic tomograms and an area of interest such as the pancreas. The present embodiment can also guide how the radial scan surface of the ultrasonic endoscope 2, the flexible portion 22, and the rigid portion 21 are oriented and shaped with respect to the body cavity wall such as the digestive tract.
This enables the operator to visually determine these relationships and to easily perform diagnosis, treatment, and the like on the area of interest.
The present embodiment further has the following arrangements and performs the following operations. The matching circuit 51 repeats the processing described in steps S4-4 to S4-9 and further repeats the following process. The matching circuit loads the position and orientation data obtained at the moment when the mixing circuit 61 loads the ultrasonic tomogram data. The matching circuit 51 combines the first conversion equation with the translation of the vector P′P″ to newly create a second conversion equation that expresses second mapping. The matching circuit 51 converts, into position and orientation mapping data, the directional components (x0, y0, z0) of the position vector 00″ of the center 0″ of the ultrasonic tomogram on the orthogonal coordinate axis 0-xyz, the angular components (ψ, θ, φ) of the Euler angle indicating the orientation of the image position and orientation detecting coil 31 with respect to the orthogonal coordinate axis 0-xyz, and the directional components (xi, yi, zi) (i is a natural number between 1 and the total number of the insertion shape detecting coils 32) of the position vector of each of the plurality of insertion shape detecting coils 32 on the orthogonal coordinate axis 0-xyz.
The present embodiment thus has the following effect. Even if the posture of the subject 37 changes during inspections with the ultrasonic endoscope 2, unless the positional relationship between the body surface feature points and the organs changes, the ultrasonic tomogram marker Mu, the distal end direction marker Md, the 6 o'clock direction marker Mt, and the insertion shape marker Ms on the 3-dimensional guide image anatomically align more accurately with the ultrasonic tomogram, the flexible portion 22, and the rigid portion 21.
The X-ray 3-dimensional helical computer tomography system 15 and the 3-dimensional magnetic resonance imaging system 16 normally pick up images of the subject in the supine position, which is different from the left lateral position for inspections with the ultrasonic endoscope. However, with the arrangements and operations of the present embodiment, the matching circuit 51 combines the first mapping with the translation of the vector P′P″ as a correction value to create the second conversion equation that expresses the second mapping.
Consequently, even if the organs in the subject 37 are displaced under the effect of gravity during ultrasonic endoscopic inspections in the left lateral position, the present embodiment enables more anatomically accurate alignment with a point in the subject 37 by the second mapping, for example, the center 0″ of the ultrasonic tomogram, than the X-ray 3-dimensional helical computer tomography system 15 and the 3-dimensional magnetic resonance imaging system 16. This enables the 3-dimensional guide image to more accurately guide the ultrasonic tomogram.
According to the present embodiment, the arrangements and operations of the 3-dimensional guide image creation circuit A are such that the circuit A creates 3-dimensional guide image data showing the cranial side on the right of the image and the caudal side on the left, based on the observation of the subject 37 from the ventral side. For ultrasonic endoscopic inspections, the subject 37 is normally inspected in the left lateral position.
The present embodiment also displays 3-dimensional guide images in the left lateral position. This allows the subject 37 to be easily compared with 3-dimensional guide images, while allowing the operator to easily understand the 3-dimensional guide images. The present embodiment therefore can improve or properly support the operator's operations during diagnosis, treatment, or the like.
Further, according to the present embodiment, the 3-dimensional guide image creation circuits A and B create 3-dimensional guide images with the line of sight set in different directions. This enables the positional relationship between the ultrasonic tomogram and the area of interest such as the pancreas to be guided in the plurality of directions and also makes it possible to guide how the ultrasonic tomogram and the flexible portion 22 and rigid portion 21 of the ultrasonic endoscope 2 are oriented and shaped in the plurality of directions with respect to the body cavity wall such as the digestive tract. This makes the operator understand the images easily.
The present embodiment comprises the ultrasonic endoscope 2 including the treatment instrument channel 46 and the body cavity contact probe 8, which is inserted through the treatment instrument channel 46. However, the configuration is not limited to this.
Provided that the objective lens 25 focuses on the body cavity feature point via the optical observation window 24 and the rigid portion 21 itself can be accurately contacted with the body cavity feature point without using the body cavity contact probe 8, the image position and orientation detecting coil 31, fixed to the rigid portion 21, may be used instead of the body cavity detecting coil 42 in the body cavity contact probe 8.
In this case, the image position and orientation detecting coil 31 serves not only as an image position and orientation detecting device but also as a body cavity detecting device.
Furthermore, the present embodiment uses the electronic radial scanning ultrasonic endoscope 2 as an ultrasonic probe. However, it is possible to use a mechanical scanning ultrasonic endoscope such as a body cavity probe apparatus in accordance with the prior art disclosed in Japanese Patent Laid-Open No. 2004-113629, an electronic convex scanning ultrasonic endoscope having a fan-shaped group of ultrasonic transducers provided on one side of the insertion shaft, or a capsule-shaped ultrasonic probe. The present invention is not limited to the ultrasonic scanning scheme. Alternatively, an ultrasonic probe without the optical observation window 24 may be used.
In the present embodiment, in the rigid portion 21 of the ultrasonic endoscope 2, the ultrasonic transducer is cut into small pieces like strips of paper, which are arranged around the periphery of the insertion shaft as an annular array. However, the ultrasonic transducer array 29 may be provided all around the circumference through 360° or may be absent from a certain part of the circumference. For example, the ultrasonic transducer array 29 may be formed over a part spanning 270° or 180°.
Moreover, with the arrangements and operations of the present embodiment, the transmission antenna 6 and the reception coil are used as position detection means to detect positions and orientations on the basis of magnetic fields. However, the transmission and reception may be reversed. Utilizing magnetic fields to detect the position and orientation enables the formation of position (orientation) detection means of a simple configuration as well as a reduction in costs and sizes.
However, the position (orientation) detection means is not limited to the utilization of magnetic fields. The configuration and operation of the position (orientation) detection means may be such that the position and orientation are detected on the basis of acceleration or another means.
Further, the present embodiment sets the origin 0 at the particular position on the transmission antenna 6. However, the origin 0 may be set in another area having the same positional relationship as that of the transmission antenna 6.
Furthermore, the present embodiment fixes the image position and orientation detecting coil 31 to the rigid portion 21. However, the image position and orientation detecting coil 31 need not be provided inside the rigid portion 21 provided that the position of the image position and orientation detecting coil 31 is fixed with respect to the rigid portion 21.
Moreover, the present embodiment displays the organs on the 3-dimensional guide image data in different colors. However, the present invention is not limited to the use of the different colors (a variation in display color) but may use another aspect using luminance, lightness, chroma saturation, or the like. For example, the different organs may have the respective luminance values.
Further, with the arrangements and operations of the present embodiment, a plurality of two-dimensional CT or MRI images picked up by the X-ray 3-dimensional helical computer tomography system 15 and the 3-dimensional MRI system 16 are used as reference image data. However, it is possible to use 3-dimensional image data pre-acquired using another modality such as PET (Positron Emission Tomography). Alternatively, it is possible to use 3-dimensional image data pre-acquired using what is called an extracorporeal body cavity probe apparatus, that is, a body cavity probe apparatus which externally applies ultrasonic waves.
Furthermore, with the arrangements and operations of the present embodiment, image data obtained from the subject 37 by the X-ray 3-dimensional helical computer tomography system 15 or the like is used as reference image data. However, it is possible to use image data on another person of the same sex and a similar physique.
Moreover, in the present embodiment, the body surface detecting coil 7 comprises the four coils wound in one axial direction and releasably fixed to a plurality of body surface feature points on the subject's body surface using tapes, belts, bands, or the like, so as to simultaneously obtain position and orientation data on the body surface feature points. However, a single coil, for example, the body cavity detecting coil 42, may be used instead: the subject 37 is laid on the left side before inspections with the ultrasonic endoscope 2, and the distal end of the body cavity contact probe 8 is then sequentially contacted with the plurality of body surface feature points to sequentially obtain position and orientation data on the body surface feature points.
Further, according to the present embodiment, the position and orientation calculation device 5 calculates the positions of the body surface detecting coils 7 as position and orientation data. However, instead of the position, the direction of the winding axis may be calculated. Alternatively, both the position and the direction of the winding axis may be calculated. The increased degree of freedom calculated by the position and orientation calculation device 5 for each body surface detecting coil 7 enables a reduction in the number of body surface detecting coils 7 and thus can reduce the burden imposed on the operator and the subject 37 when the body surface detecting coil 7 is fixed to the subject 37 and during ultrasonic endoscopic inspections.
Furthermore, in the present embodiment, the body surface feature points have been described as the points on the body surface of the abdomen corresponding to the xiphoid process, the left anterior superior iliac spine, the right anterior superior iliac spine, and the spinous process of a vertebral body, and the body cavity feature point has been described as the duodenal papilla. However, the present invention is not limited to this example. The feature points may be located on the body surface of the chest or in the chest cavity, or any other example may be used. In general, the orientation of the ultrasonic tomogram marker Mu can be determined more accurately when the body surface feature points are taken at points associated with the skeleton, as in this example.
Moreover, according to the present embodiment, an input made by the operator via the mouse 12 and the keyboard 13 instructs the control circuit 63 to issue a rotation instruction signal to rotate the 3-dimensional guide image data by 90°, allowing the subject to be observed from the caudal side. The 3-dimensional guide image creation circuit B thus creates 3-dimensional guide image data based on the observation of the subject from the caudal side. However, the present invention is not limited to this example. Alternatively, an input made by the operator via the mouse 12 and the keyboard 13 may allow the 3-dimensional guide image to be rotated in real time about any axis and through any angle in response to the input.
Now, Embodiment 2 of the present invention will be described. The configuration of the present embodiment is the same as that of Embodiment 1. However, the present embodiment is different from Embodiment 1 only in the operation of the 3-dimensional guide image creation circuit B.
Now, the operation of the present embodiment will be described.
As described above, the present embodiment is different from Embodiment 1 only in the operation of the 3-dimensional guide image creation circuit B.
According to Embodiment 1, as shown in
Then, the following markers were moved or deformed on the 3-dimensional human body image data in conjunction with movement of the radial scan surface associated with the operator's manual operation of the flexible portion 22 and the rigid portion 21; the ultrasonic tomogram marker Mu, the distal marker Md, and the 6 o'clock direction marker Mt on the image index data as well as the insertion shape marker Ms and the coil position marker Mc on the insertion shape data.
According to the present embodiment, on the basis of the position and orientation mapping data, the 3-dimensional guide image creation circuit B creates guide images with the ultrasonic tomogram marker Mu oriented so that its normal coincides with the line of sight, that is, the normal of the screen of the display device 14, and with the 6 o'clock direction marker Mt oriented downward on the screen of the display device 14, as shown in
The 3-dimensional guide image data in
In the 3-dimensional guide image data in
For the other organs, the ultrasonic tomogram marker Mu is opaque so as to make invisible those parts of the organs which are located behind the ultrasonic tomogram marker Mu.
The remaining part of the operation is the same as that of Embodiment 1.
The present embodiment produces the following effects.
The arrangements and operations of the present embodiment are such that, on the basis of the position and orientation mapping data, the 3-dimensional guide image creation circuit B creates 3-dimensional guide images with the ultrasonic tomogram marker Mu oriented so that its normal coincides with the line of sight, that is, the normal of the screen of the display device 14, and with the 6 o'clock direction marker Mt oriented downward on the screen of the display device 14. This allows the direction of the 3-dimensional guide image to coincide with that of the ultrasonic tomogram placed next to the 3-dimensional guide image and displayed in real time on the screen of the display device 14. Thus, the operator can easily compare these images with each other to anatomically interpret the ultrasonic tomogram.
The other effects of the present embodiment are the same as those of Embodiment 1.
The variation described in Embodiment 1 is applicable as a variation of the present embodiment.
Now, Embodiment 3 of the present invention will be described.
The configuration of the present embodiment is the same as that of Embodiment 2. The present embodiment is different from Embodiment 2 only in the operation of the 3-dimensional guide image creation circuit B.
Now, the operation of the present embodiment will be described.
As described above, the present embodiment is different from Embodiment 2 only in the operation of the 3-dimensional guide image creation circuit B.
According to Embodiment 2, as shown in
According to the present embodiment, as shown in
For the pancreas, the area in front of the ultrasonic tomogram marker Mu (the area closer to the operator) is created in dark green, whereas the area behind the ultrasonic tomogram marker Mu is created in light green. For the blood vessel, the area in front of the ultrasonic tomogram marker Mu (the area closer to the operator) is created in dark red, whereas the area behind the ultrasonic tomogram marker Mu is created in light red.
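The front/behind distinction that drives this coloring can be sketched as a signed-distance test against the tomogram plane; the function name and the convention that the plane normal points toward the operator are illustrative assumptions, not features of the apparatus:

```python
import numpy as np

def shade_for_point(point, plane_point, plane_normal,
                    front_shade="dark", back_shade="light"):
    """Classify an organ point relative to the ultrasonic tomogram plane.

    Assumes the plane normal points toward the operator, so a positive
    signed distance means the point lies in front of the tomogram marker
    (drawn in the dark shade) and a non-positive distance means it lies
    behind (drawn in the light shade)."""
    signed = float(np.dot(np.asarray(point) - np.asarray(plane_point),
                          np.asarray(plane_normal)))
    return front_shade if signed > 0 else back_shade
```

Applying this test per organ voxel, with dark/light green for the pancreas and dark/light red for the blood vessel, reproduces the luminance variation described above.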
In
The remaining part of the operation is the same as that of Embodiment 2.
The present embodiment produces the following effects.
The arrangements and operations of the present embodiment are such that the 3-dimensional guide image creation circuit B creates 3-dimensional guide image data by setting the ultrasonic tomogram marker Mu among the image index data to be translucent, so that not only the 6 o'clock direction marker Mt and distal marker Md on the image index data and the insertion shape marker Ms and coil position marker Mc on the insertion shape data but also those parts of the other organs which are located behind the ultrasonic tomogram marker Mu can be seen through it, and by varying the luminance between the areas in front of and behind the ultrasonic tomogram marker Mu.
Thus, the operator can easily determine how to further move the flexible portion 22 and the rigid portion 21 in order to display the area of interest such as the diseased part on the ultrasonic tomogram. The operator can thus easily manipulate the flexible portion 22 and rigid portion 21 of the ultrasonic endoscope 2.
In particular, an organ such as the gallbladder which is flexible and mobile inside the subject 37 may not be shown on the ultrasonic tomogram though the organ is shown on the ultrasonic tomogram marker Mu. The 3-dimensional guide image in accordance with the present embodiment may serve as a landmark indicating that the operator can slightly further move the rigid portion 21 and the flexible portion 22 to display the gallbladder on the ultrasonic tomogram. The operator can thus easily manipulate the flexible portion 22 and rigid portion 21 of the ultrasonic endoscope 2.
The other effects are the same as those of Embodiment 1.
The arrangements and operations of the present embodiment are such that the ultrasonic tomogram marker Mu among the image index data is set to be translucent so that not only the 6 o'clock direction marker Mt and distal marker Md on the image index data and the insertion shape marker Ms and coil position marker Mc on the insertion shape data but also those parts of the other organs which are located behind the ultrasonic tomogram marker Mu can be seen through it. In a variation, the operator may freely vary the transparency by providing a selective input via the mouse 12 and the keyboard 13.
The variation of Embodiment 2 is applicable as another variation.
Now, Embodiment 4 of the present invention will be described. The configuration of the present embodiment is the same as that of Embodiment 3. The present embodiment is different from Embodiment 3 only in the operation of the 3-dimensional guide image creation circuit B.
Now, the operation of the present embodiment will be described.
As described above, the present embodiment is different from Embodiment 3 only in the operation of the 3-dimensional guide image creation circuit B.
According to Embodiment 3, as shown in
For the pancreas, the area in front of the ultrasonic tomogram marker Mu (the area closer to the operator) was created in dark green, whereas the area behind the ultrasonic tomogram marker Mu was created in light green. For the blood vessel, the area in front of the ultrasonic tomogram marker Mu (the area closer to the operator) was created in dark red, whereas the area behind the ultrasonic tomogram marker Mu was created in light red.
According to the present embodiment, as shown in
For the pancreas, the area on the ultrasonic tomogram marker Mu is created in dark green, whereas the area behind the ultrasonic tomogram marker Mu is created in light green. For the blood vessel, the area on the ultrasonic tomogram marker Mu is created in dark red, whereas the area behind the ultrasonic tomogram marker Mu is created in light red.
The remaining part of the operation is the same as that of Embodiment 3.
The present embodiment produces the following effects.
The arrangements and operations of the present embodiment are such that the 3-dimensional guide image creation circuit B creates 3-dimensional guide image data by not displaying one of the two areas separated from each other by the ultrasonic tomogram marker Mu among the image index data, that is, the area toward the distal end of the flexible portion 22 or the part of the screen of the display device 14 which is closer to the operator, and by varying the luminance between the area on the ultrasonic tomogram marker Mu and the area behind the ultrasonic tomogram marker Mu.
Thus, the present embodiment prevents the organs displayed closer to the operator from obstructing the operator's observation of the 3-dimensional guide images. This allows the 3-dimensional guide images to be more easily compared with ultrasonic tomograms displayed, in real time, on the screen of the display device 14 next to the 3-dimensional guide images. This in turn facilitates the anatomical interpretation of the ultrasonic tomograms.
The other effects are the same as those of Embodiment 3.
The variation of Embodiment 3 is applicable as a variation of the present embodiment.
Now, Embodiment 5 of the present invention will be described. The configuration of the present embodiment is the same as that of Embodiment 1. The present embodiment is different from Embodiment 1 only in the operation of the 3-dimensional guide image creation circuit B.
Now, the operation of the present embodiment will be described.
As described above, the present embodiment is different from Embodiment 1 only in the operation of the 3-dimensional guide image creation circuit B.
According to Embodiment 1, as shown in
According to the present embodiment, as shown in
For the pancreas, the area which is closer to the distal marker Md than the ultrasonic tomogram marker Mu is created in dark green, whereas the area opposite to the distal marker Md and close to the ultrasonic tomogram marker Mu is created in light green. For the blood vessel, the area which lies closer to the distal marker Md than the ultrasonic tomogram marker Mu is created in dark red, whereas the area opposite to the distal marker Md and close to the ultrasonic tomogram marker Mu is created in light red.
The remaining part of the operation is the same as that of Embodiment 1.
The present embodiment produces the following effects.
The arrangements and operations of the present embodiment are such that the 3-dimensional guide image creation circuit B creates 3-dimensional guide image data by setting the ultrasonic tomogram marker Mu among the image index data to be translucent, so that not only the 6 o'clock direction marker Mt and distal marker Md on the image index data and the insertion shape marker Ms and coil position marker Mc on the insertion shape data but also those parts of the other organs which are located behind the ultrasonic tomogram marker Mu can be seen through it, and by varying the luminance between the area in front of the ultrasonic tomogram marker Mu and the area behind it.
Thus, the operator can easily determine how to further move the flexible portion 22 and the rigid portion 21 in order to display the area of interest such as the diseased part on the ultrasonic tomogram. The operator can thus easily manipulate the ultrasonic endoscope 2.
In particular, an organ such as the gallbladder which is flexible and mobile inside the subject 37 may not be shown on the ultrasonic tomogram though the organ is shown on the ultrasonic tomogram marker Mu. The 3-dimensional guide image in accordance with the present embodiment may serve as a landmark indicating that the operator can slightly further move the rigid portion 21 and the flexible portion 22 to display the gallbladder on the ultrasonic tomogram. The operator can thus easily manipulate the ultrasonic endoscope 2.
The other effects are the same as those of Embodiment 1.
The arrangements and operations of the present embodiment are such that the ultrasonic tomogram marker Mu among the image index data is set to be translucent so that not only the 6 o'clock direction marker Mt and distal marker Md on the image index data and the insertion shape marker Ms and coil position marker Mc on the insertion shape data but also those parts of the other organs which are located behind the ultrasonic tomogram marker Mu can be seen through it. In a variation, the operator may freely vary the transparency via the mouse 12 and the keyboard 13.
The variation of Embodiment 1 is applicable as another variation.
Now, Embodiment 6 of the present invention will be described. Only differences from Embodiment 1 will be described.
With the image processing device 11 in accordance with Embodiment 1, the rigid portion 21 has the image position and orientation detecting coil 31 fixed at a position very close to the center of the ring of the ultrasonic transducer array 29.
According to the present embodiment, the rigid portion 21 has the image position and orientation detecting coil 31 fixed to a position very close to the CCD camera 26.
The direction in which the image position and orientation detecting coil 31 is fixed is the same as that in accordance with Embodiment 1. The CCD camera 26 has an optical axis which is present in a plane containing V and V12 in
The other arrangements are the same as those of Embodiment 1.
Now, the operation of the present embodiment will be described.
In the description of the image processing device 11 in accordance with Embodiment 1, the operator selects the X-ray 3-dimensional helical computer tomography system 15 as a data source. The communication circuit 54 loads a plurality of two-dimensional CT images as reference image data. Such reference image data as shown in
In the present embodiment, description will be given on an example in which the X-ray 3-dimensional helical computer tomography system 15 picks up images of the chest, particularly the trachea, the bronchus, and the carina, without contrast, and in which, in an area where the bronchus divides into two carinas, a carina a and a carina b, the ultrasonic endoscope 2 is inserted into the carina a.
The optical observation device 3 creates optical image data by aligning the 12 o'clock direction (upward direction) of optical images with a direction opposite to the direction in which V12 is projected on a plane containing V and V12 in
The 3-dimensional human body image creation circuit 57 extracts voxels with large luminance values (mainly the walls of the trachea, the bronchus, and the carina) from the voxel space in the interpolation memory 56a and colors the voxels. The 3-dimensional human body image creation circuit 57 then fills the extracted voxels into the voxel space in the synthesis memory 58a of the synthesis circuit 58 as 3-dimensional human body image data.
In this case, the 3-dimensional human body image creation circuit 57 fills the voxels so that the address of each extracted voxel in the voxel space in the interpolation memory 56a is the same as its address in the voxel space in the synthesis memory 58a. For the 3-dimensional human body image data, the trachea wall, bronchus wall, and carina wall with a high luminance are extracted and colored in a flesh color. The subject, with his or her head on the right and his or her feet on the left, is observed from the ventral side.
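The extraction of high-luminance voxels at unchanged voxel addresses can be sketched with a boolean mask over the voxel space; the threshold and display value below are illustrative assumptions:

```python
import numpy as np

def extract_high_luminance(volume, threshold, display_value):
    """Copy voxels whose luminance exceeds the threshold into a new voxel
    space of the same shape, at the same addresses, tagged with a display
    value; all other voxels remain empty (zero)."""
    out = np.zeros_like(volume)
    mask = volume > threshold
    out[mask] = display_value
    return out
```

Because the output array has the same shape as the input, each extracted voxel keeps the same address in both voxel spaces, as the filling step requires.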
The image index creation circuit 52 creates image index data from position and orientation mapping data with a total of six degrees of freedom including the directional components (x0, y0, z0) of the position vector 00″, on the orthogonal coordinate axis 0-xyz, of the position 0″ of the image position and orientation detecting coil 31 and the angular components (ψ, θ, φ) of the Euler angle indicating the orientation of the image position and orientation detecting coil 31 with respect to the orthogonal coordinate axis 0-xyz. The image index creation circuit 52 then outputs the image index data to the synthesis circuit 58.
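The six degrees of freedom can be collected into a single 4×4 pose matrix; a Z-Y-X Euler convention is assumed here purely for illustration, since the text does not specify which convention the Euler angle (ψ, θ, φ) follows:

```python
import numpy as np

def euler_to_matrix(psi, theta, phi):
    """Rotation matrix from Euler angles (psi, theta, phi).

    A Z-Y-X (yaw-pitch-roll) convention is assumed here for illustration;
    the apparatus may use a different Euler convention."""
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(phi), np.sin(phi)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def pose_matrix(position, psi, theta, phi):
    """4x4 pose combining the 3 translational and 3 rotational degrees
    of freedom of the image position and orientation data."""
    T = np.eye(4)
    T[:3, :3] = euler_to_matrix(psi, theta, phi)
    T[:3, 3] = position
    return T
```

Such a pose matrix determines where and in what orientation the image index markers are drawn in the voxel space.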
The image index data is image data on the orthogonal coordinate axis 0′-x′y′z′ obtained by synthesizing an orange optical-image visual-field direction marker indicating the optical axis with a yellow-green optical-image up direction marker indicating the 12 o'clock direction of optical images.
As is the case with Embodiment 1, the insertion shape creation circuit 53 creates insertion shape data from the position and orientation mapping data including the directional components (x0, y0, z0) of the position vector 00″ of the position 0″ of the image position and orientation detecting coil 31 and the directional components (xi, yi, zi) of the position vector of each of the plurality of insertion shape detecting coils 32 on the orthogonal coordinate axis 0-xyz. The insertion shape creation circuit 53 then outputs the insertion shape data to the synthesis circuit 58.
This is shown in
The synthesis circuit 58 sequentially fills image index data and insertion shape data into the voxel space in the synthesis memory 58a. The synthesis circuit 58 thus sequentially fills the 3-dimensional human body image data, the image index data, and the insertion shape data into the same voxel space in the same synthesis memory 58a to synthesize these data into a set of synthetic 3-dimensional data.
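The sequential filling of the same voxel space can be sketched as layered overwriting, where the non-zero voxels of each later data set replace whatever was filled before them; treating zero as an empty voxel is an assumption for illustration:

```python
import numpy as np

def synthesize(human_body, image_index, insertion_shape):
    """Overlay three voxel data sets in one voxel space, mimicking the
    sequential filling of the synthesis memory: non-zero voxels of each
    later layer overwrite the voxels filled before them."""
    out = np.array(human_body, copy=True)
    for layer in (image_index, insertion_shape):
        layer = np.asarray(layer)
        mask = layer != 0
        out[mask] = layer[mask]
    return out
```

The result is a single set of synthetic 3-dimensional data in which the human body image, image index, and insertion shape share one coordinate space.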
The rotational transformation circuit 59 reads the synthetic 3-dimensional data and executes a rotating process on the synthetic 3-dimensional data in accordance with a rotation instruction signal from the control circuit 63.
The 3-dimensional guide image creation circuit A executes a rendering process such as hidden surface removal or shading on the synthetic 3-dimensional data to create 3-dimensional guide image data that can be outputted to the screen. The default viewing direction of the 3-dimensional guide image data is from the ventral side of the human body.
Accordingly, the 3-dimensional guide image creation circuit A creates 3-dimensional guide image data based on the observation of the subject 37 from the ventral side. The 3-dimensional guide image creation circuit A outputs 3-dimensional guide image data based on the observation from the ventral side of the subject to the mixing circuit 61. The 3-dimensional guide image data is shown in
In the 3-dimensional guide image data in
The 3-dimensional guide image creation circuit B executes a rendering process such as hidden surface removal or shading on the synthetic 3-dimensional data subjected to a rotating process to create 3-dimensional guide image data that can be outputted to the screen.
In the present embodiment, by way of example, it is assumed that an input provided by the operator via the mouse 12 and the keyboard 13 instructs the control circuit 63 to issue a rotation instruction signal to rotate the 3-dimensional guide image data through 90° so that the subject can be observed from the caudal side.
Accordingly, the 3-dimensional guide image creation circuit B creates 3-dimensional guide image data based on the observation from the caudal side of the subject. The 3-dimensional guide image creation circuit B outputs 3-dimensional guide image data based on the observation from the caudal side of the subject to the mixing circuit 61. The 3-dimensional guide image data is shown in
In the 3-dimensional guide image data in
The mixing circuit 61 creates display mixture data by properly arranging the optical image data from the optical image observation device 3, the 3-dimensional guide image data from the 3-dimensional guide image creation circuit A based on the observation of the subject 37 from the ventral side, and the 3-dimensional guide image data from the 3-dimensional guide image creation circuit B based on the observation of the subject 37 from the caudal side.
The display circuit 62 converts the mixture data into an analog video signal.
On the basis of the analog video signal, the display device 14 properly arranges the optical image, the 3-dimensional guide image based on the observation of the subject 37 from the caudal side, and the 3-dimensional guide image based on the observation of the subject 37 from the ventral side for display.
As shown in
In the present embodiment, optical images are processed as real-time images.
Like Embodiment 1, the present embodiment creates and displays two new 3-dimensional guide images on the display screen of the display device 14 together with a new optical image while updating the images in real time. That is, as shown in
The remaining part of the operation is the same as that of Embodiment 1.
The present embodiment provides the following effects.
The arrangements and operations of the present embodiment are such that the 3-dimensional guide image data is created with the wall of the bronchus and the walls of the carinas a and b located beyond it rendered translucent, so that the optical-image visual-field direction marker and optical-image up direction marker on the image index data and the insertion shape marker Ms and coil position marker Mc on the insertion shape data are visible, and such that the mixing circuit 61 and the display device 14 properly arrange the optical image, the 3-dimensional guide image based on the observation of the subject 37 from the ventral side, and the 3-dimensional guide image based on the observation of the subject 37 from the caudal side for display.
Thus, the present embodiment can prevent the operator from inadvertently inserting the ultrasonic endoscope 2 (or an endoscope as described in the variation described below) into the carina b instead of the carina a.
The other effects are the same as those of Embodiment 1.
In the above description, the ultrasonic endoscope is inserted into the deep side of the bronchus. In other cases as well, because the 3-dimensional guide image data is created so that the optical-image visual-field direction marker and optical-image up direction marker on the image index data and the insertion shape marker Ms and coil position marker Mc on the insertion shape data are visible, the operator can insert the body cavity probe into the body cavity and smoothly perform diagnosis and treatment. A body cavity probe with which the operator can smoothly perform diagnosis and treatment is thus realized.
Like Embodiment 1, the present embodiment uses, as the body cavity probe, the electronic radial scanning ultrasonic endoscope 2 having the optical observation system (the optical observation window 24, the objective lens 25, the CCD camera 26, and the illumination light irradiation window (not shown)). However, the body cavity probe may be an endoscope simply having an optical observation system in place of the ultrasonic endoscope 2.
The variation of Embodiment 1 is applicable as another variation.
For example, embodiments in which the above embodiments and the like are partly combined also belong to the present invention. Further, the block configuration of the image processing device 11 shown in
Moreover, the present invention is not limited to the above embodiments. Of course, many variations and applications may be made to the embodiments without departing from the spirit of the present invention.
Obviously, according to the present invention, significantly different embodiments can be constructed on the basis of the present invention without departing from the spirit and scope of the present invention. The present invention is not limited by any particular embodiment thereof but only by the accompanying claims.
Number | Date | Country | Kind |
---|---|---|---|
2006-180435 | Jun 2006 | JP | national |