The present invention relates to an endoscope apparatus that generates and enables display of a three-dimensional model image of a subject at the time of performing endoscopic observation.
Endoscopic observation support techniques of generating a three-dimensional model image of a luminal organ and presenting an unobserved area to a surgeon on the generated three-dimensional model image are known.
For example, International Publication No. 2012/101888 describes a medical apparatus which generates an insertion route through which a distal end portion of an insertion portion is to be inserted as far as a target site, based on three-dimensional image data of a subject acquired in advance, and displays the generated insertion route superimposed on a tomographic image generated from the three-dimensional image data. The publication further describes that an insertion route which has already been passed through and an insertion route as far as the target position are displayed on the three-dimensional model image with different line types.
Japanese Patent Application Laid-Open Publication No. 2016-002206 describes a medical information processing system in which an observation image of a subject and information about an observation site included in past examination information about the subject are displayed on a display device, and site observation completion information showing that observation of the observation site corresponding to the information displayed on the display device has been completed is registered. Furthermore, the publication describes a technique in which sites for which observation has been completed, a site to be observed next and unobserved sites are displayed, for example, by square marks, a triangle mark and circle marks, respectively.
By using such endoscopic observation support techniques, it is possible to visually determine approximate positions of and an approximate number of unobserved areas, which is useful for preventing oversight.
An endoscope apparatus according to one aspect of the present invention includes: an endoscope configured to acquire an image of an inside of a subject; and a processor including hardware, wherein the processor generates three-dimensional model data of the subject; generates a three-dimensional model image visually confirmable in a predetermined line-of-sight direction, based on the generated three-dimensional model data; generates progress information enabling a progress state of observation by the endoscope to be visually confirmed as a ratio on an observation target, based on the three-dimensional model data; and associates the progress information with the three-dimensional model image and presents the progress information side by side with the three-dimensional model image.
Embodiments of the present invention will be described below with reference to drawings.
The endoscope apparatus is provided with an endoscope 1, a processing system 2 and a display device 4, and may be further provided with a database 3 as necessary. The description below takes the case where the database 3 is not provided as an example; the case where the database 3 is provided will be described where appropriate.
The endoscope 1 is an image acquisition apparatus which, in order to observe an inside of a subject having a three-dimensional shape, acquires an image of the inside of the subject and is provided with an image pickup portion 11, an illumination portion 12 and a position/orientation detecting portion 13. The image pickup portion 11, the illumination portion 12 and the position/orientation detecting portion 13 are, for example, arranged on a distal end portion of an insertion portion of the endoscope 1 which is to be inserted into a subject.
Note that though renal pelvis calyces of a kidney are given as an example of a subject having a three-dimensional shape in the present embodiment, the present embodiment is not limited to renal pelvis calyces but is widely applicable to any subject if the subject has a plurality of ducts and endoscopic observation can be performed for the subject.
The illumination portion 12 radiates illumination light to an inside of a subject.
The image pickup portion 11 forms, by an optical system, an optical image of the inside of the subject to which the illumination light is radiated and performs photoelectric conversion by an image pickup device and the like to generate a picked-up image signal.
The position/orientation detecting portion 13 detects a three-dimensional position of the distal end portion of the insertion portion of the endoscope 1 and outputs the three-dimensional position as position information, and detects a direction in which the distal end portion of the insertion portion of the endoscope 1 faces and outputs the direction as orientation information. For example, if an xyz coordinate system is set, the position information is indicated by (x, y, z) coordinates, and the orientation information is indicated by an angle around an x axis, an angle around a y axis and an angle around a z axis (therefore, the position/orientation detecting portion 13 is also called, for example, a 6D sensor). Note that the position information and the orientation information about the endoscope 1 may be indicated by any other appropriate method (for example, a polar coordinate system).
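As a purely illustrative aid (not part of the apparatus described here), the position information and orientation information output by such a 6D sensor could be held in a structure like the following; the class name and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EndoscopeTipPose:
    """Hypothetical container for one 6D-sensor reading of the insertion-portion tip."""
    x: float   # position along the x axis (e.g. in millimetres)
    y: float   # position along the y axis
    z: float   # position along the z axis
    rx: float  # rotation angle around the x axis (e.g. in degrees)
    ry: float  # rotation angle around the y axis
    rz: float  # rotation angle around the z axis

# Example reading: tip at (10.0, -2.5, 35.0), tilted 5 degrees around the y axis.
pose = EndoscopeTipPose(x=10.0, y=-2.5, z=35.0, rx=0.0, ry=5.0, rz=0.0)
print(pose)
```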
The processing system 2 controls the endoscope 1, communicates with the database 3 as necessary, processes the picked-up image signal, the position information and the orientation information acquired from the endoscope 1 to generate image data for display or image data for recording, and outputs the image data to the display device 4 and the like. Note that the processing system 2 may be configured as a single apparatus or may be configured with a plurality of apparatuses such as a light source apparatus and a video processor.
The processing system 2 is provided with an image processing portion 21, a three-dimensional model generating portion 22, an image generating portion 23, a presentation control portion 24, an illumination control portion 25 and a control portion 26.
The image processing portion 21 generates a picked-up image from the picked-up image signal outputted from the image pickup portion 11 and performs various kinds of image processing, such as demosaicking processing (or synchronization processing), white balance processing, color matrix processing and gamma conversion processing, on the generated picked-up image to generate an endoscopic image EI (see the drawings).
The three-dimensional model generating portion 22 generates three-dimensional model data of a subject. For example, the three-dimensional model generating portion 22 acquires, via the control portion 26 and for a plurality of frames, endoscopic images EI generated by the image processing portion 21 (or endoscopic images EI further processed by the image processing portion 21 for three-dimensional model generation), together with the position information and the orientation information detected by the position/orientation detecting portion 13 at the time when the picked-up images from which the endoscopic images EI were generated were picked up.
Then, the three-dimensional model generating portion 22 generates stereoscopic three-dimensional model data while adjusting a positional relationship among the endoscopic images EI of the plurality of frames based on the position information and the orientation information of each frame. In this case, the three-dimensional model data is gradually constructed as observation progresses, and therefore generation of a three-dimensional model image M3 (see the drawings) also proceeds gradually.
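The following is a loose, hypothetical sketch of the general idea of merging frame-wise data using per-frame position and orientation; it is not the reconstruction method of the apparatus, which would also involve estimating surface points from the endoscopic images and handling all three orientation angles. All names and numbers are illustrative assumptions.

```python
import math

def rotate_z(point, angle_deg):
    """Rotate a 3D point around the z axis (one of the three orientation angles)."""
    a = math.radians(angle_deg)
    x, y, z = point
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

def accumulate_model(frames):
    """Merge per-frame local surface points into one global point set, using each
    frame's tip position and (here, z-only) orientation to place the points."""
    model_points = []
    for frame in frames:
        px, py, pz = frame["position"]
        for local in frame["local_points"]:
            rx, ry, rz = rotate_z(local, frame["rz_deg"])
            model_points.append((rx + px, ry + py, rz + pz))
    return model_points

# Two hypothetical frames, each contributing one local surface point.
frames = [
    {"position": (0.0, 0.0, 0.0), "rz_deg": 0.0,  "local_points": [(1.0, 0.0, 0.0)]},
    {"position": (0.0, 0.0, 5.0), "rz_deg": 90.0, "local_points": [(1.0, 0.0, 0.0)]},
]
print(accumulate_model(frames))  # second point is approximately (0.0, 1.0, 5.0)
```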
The method of generating the three-dimensional model data by the three-dimensional model generating portion 22 is not limited to the above. For example, if the endoscopic examination of the subject is a second or subsequent endoscopic examination and three-dimensional model data generated in a past endoscopic examination is already recorded in the database 3, that three-dimensional model data may be used. Alternatively, if data acquired by performing contrast enhanced CT imaging of the subject is already recorded in the database 3, the three-dimensional model data may be generated using the contrast enhanced CT data.
In the database 3, a renal pelvis calyx model to be a basis of a progress map PM (see the drawings) may also be recorded in advance.
The image generating portion 23 generates a three-dimensional model image M3 (see the drawings) visually confirmable in a predetermined line-of-sight direction, based on the three-dimensional model data generated by the three-dimensional model generating portion 22.
The presentation control portion 24 associates progress information PI (see the drawings) generated by a progress information generating portion 27 described later with the three-dimensional model image M3, and outputs a presentation image including the endoscopic image EI, the three-dimensional model image M3 and the progress information PI to the display device 4.
The illumination control portion 25 controls turning on/off of the illumination light radiated by the illumination portion 12 and an amount of the illumination light. Here, the illumination control portion 25 and the illumination portion 12 may be a light source device and a light guide or the like, respectively. Alternatively, the illumination control portion 25 and the illumination portion 12 may be a light emission control circuit and a light emission source such as an LED, respectively.
The control portion 26 controls the whole processing system 2 and further controls the endoscope 1. The control portion 26 is connected to the image processing portion 21, the three-dimensional model generating portion 22, the image generating portion 23, the presentation control portion 24 and the illumination control portion 25 described above.
The control portion 26 is provided with a progress information generating portion 27 configured to generate progress information PI showing a progress state of observation of a subject by the endoscope 1. A specific example of the progress information PI generated by the progress information generating portion 27 will be described later with reference to the drawings.
The database 3 is connected to the processing system 2, for example, via an in-hospital system, and records contrast enhanced CT data of subjects, three-dimensional model data of the subjects generated based on the contrast enhanced CT data, three-dimensional model data of the subjects generated by past endoscopic examinations, or a renal pelvis calyx model to be a basis of a progress map PM.
The display device 4 is configured including one or more monitors and the like and displays a presentation image including an endoscopic image EI, a three-dimensional model image M3 and progress information PI outputted from the presentation control portion 24.
On the display screen 4i, an endoscopic image display portion 4a, a three-dimensional model image display portion 4b and a progress information display portion 4c are provided.
On the endoscopic image display portion 4a, an endoscopic image EI generated by the image processing portion 21 is displayed.
On the three-dimensional model image display portion 4b, a three-dimensional model image M3 generated by the image generating portion 23 is displayed. Since the three-dimensional model image M3 shown in the drawings is, for example, rotatable, the user can confirm the subject from a desired line-of-sight direction.
On the progress information display portion 4c, progress information PI is displayed. Note that though the progress information display portion 4c is a display portion that is a little smaller than the three-dimensional model image display portion 4b in the shown example, the display position and display size of each display portion may be changeable as described later.
The progress information PI includes, for example, a progress map PM and a calculus mark display PR.
The progress map PM is a display in which, for an observation target (here, for example, a kidney), the renal pelvis calyx structure of the observation target is modeled and displayed, and display aspects (for example, colors, patterns, combinations of color and pattern, or the like) of observed areas OR and unobserved areas UOR are caused to be different from each other.
More specifically, a kidney is provided with calyces which are a plurality of partial areas forming a duct structure. Therefore, for example, information showing a ratio of the number of observed calyces to the total number of calyces of the kidney (or the total number of calyces estimated to be included in the kidney) can be displayed by causing the display aspects to be different.
More particularly, the calyces are classified into superior calyces, middle calyces and inferior calyces; when progress information PI is displayed for each of these parts, a ratio of the number of observed calyces among the superior calyces to the total number of calyces existing as the superior calyces is displayed on the part for the superior calyces in the progress map PM, and results calculated similarly can be displayed for the middle calyces and the inferior calyces, respectively (see the drawings).
Thus, it is possible to, by seeing the progress map PM, intuitively and more easily determine what percentage of the total number of observation targets has been observed.
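As a hedged sketch of the count-based ratio just described (the grouping into superior, middle and inferior calyces follows the text, while the function name, data layout and all numbers are illustrative assumptions):

```python
def calyx_progress(observed: dict[str, int], totals: dict[str, int]) -> dict[str, float]:
    """Return, for each calyx group, the ratio of observed calyces to the total
    (or estimated total) number of calyces in that group."""
    return {group: observed.get(group, 0) / totals[group] for group in totals}

# Example: the totals per group are assumed/estimated values, not real anatomy data.
totals = {"superior": 4, "middle": 3, "inferior": 4}
observed = {"superior": 2, "middle": 3, "inferior": 1}
print(calyx_progress(observed, totals))
# {'superior': 0.5, 'middle': 1.0, 'inferior': 0.25}
```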
The progress information PI, however, is not limited to being calculated based on the ratio of the number of partial areas but may be calculated based on a ratio of volume or a ratio of area.
In the case of performing calculation based on a ratio of volume, a ratio of volume of observed areas OR to volume of a prespecified area of a subject, for example, volume of all areas of the subject (if it is not known, estimated volume of all the areas of the subject) can be calculated and used as progress information PI.
In the case of performing calculation based on a ratio of area, a ratio of area of the observed area OR to area of the prespecified area of the subject, for example, area of all areas of the subject (if it is not known, estimated area of all the areas of the subject) can be calculated and used as progress information PI.
Or instead of calculating a ratio as progress information PI, the total number of partial areas the subject is provided with and the number of observed partial areas may be used as progress information PI.
In addition, the number of unobserved partial areas may be displayed as progress information PI (together with the total number of partial areas as necessary). Here, the number of unobserved partial areas is calculated by subtracting the number of observed partial areas from an estimated total number of partial areas.
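The volume-, area- and count-based alternatives above amount to simple arithmetic; the following minimal sketch (hypothetical function names, illustrative numbers, and the fallback to an estimated total are assumptions) shows the idea:

```python
def ratio_progress(observed_amount: float, total_amount: float | None,
                   estimated_total: float) -> float:
    """Progress as a ratio of observed volume (or area) to total volume (or area);
    falls back to an estimated total when the true total is not known."""
    denom = total_amount if total_amount is not None else estimated_total
    return observed_amount / denom

def unobserved_count(estimated_total_areas: int, observed_areas: int) -> int:
    """Number of unobserved partial areas = estimated total - observed."""
    return estimated_total_areas - observed_areas

print(ratio_progress(observed_amount=120.0, total_amount=None, estimated_total=300.0))  # 0.4
print(unobserved_count(estimated_total_areas=11, observed_areas=7))                     # 4
```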
Note that the judgment that a calyx has been observed is not limited to a case where observation of the inside of the calyx has been completely (that is, 100%) performed. For example, the judgment may be made when 80% of the inside of the calyx has been observed, or an arbitrary ratio may be set beforehand.
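A minimal sketch of such a completion judgment with an adjustable threshold; the 80% value comes from the example above, and everything else is an assumption:

```python
def calyx_observed(observed_fraction: float, threshold: float = 0.8) -> bool:
    """Judge a calyx as 'observed' once the observed fraction of its inside
    reaches the preset threshold (80% by default, per the example in the text)."""
    return observed_fraction >= threshold

print(calyx_observed(0.85))        # True  (85% >= 80%)
print(calyx_observed(0.85, 1.0))   # False (strict 100% setting)
```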
The calculus mark display PR is a part where information showing the number of already marked targets relative to the number of targets to be marked is displayed. The targets to be marked in the present embodiment are, for example, calculi. That is, the number of calculi which have already been marked is displayed relative to the number of calculi acquired in advance by another method (for example, simple CT imaging).
Though one display screen 4i is provided in the example shown in the drawings, the display device 4 may include a plurality of monitors, and the endoscopic image EI, the three-dimensional model image M3 and the progress information PI may be displayed on separate monitors.
A second example of the progress information display portion 4c is shown in the drawings.
When the process is started, first, the total number of calyces based on the standard renal pelvis calyx model and the already known total number of calculi of the subject are acquired (step S1). Here, as for the number of calculi of the subject, it is desirable to acquire how many calculi exist in, for example, the superior calyces, the middle calyces and the inferior calyces, respectively, but only how many calculi exist in all the calyces may be acquired, as shown in the drawings.
Then, observation of the calyces by the endoscope 1 is started (step S2).
During the observation of the calyces, it is judged whether a new calyx different from the standard renal pelvis calyx model has been found or not (step S3). If a new calyx is found, the total number of calyces to be observed is updated (step S4).
If the process of step S4 is performed, or if it is judged at step S3 that a new calyx has not been found, it is judged whether a new calculus other than the calculi acquired at step S1 has been found or not (step S5). If a new calculus has been found, the total number of calculi is updated (step S6).
If the process of step S6 is performed, or if it is judged at step S5 that a new calculus has not been found, it is judged whether one calyx has been observed or not (step S7).
Here, if it is judged that one calyx has been observed, a progress map PM showing a ratio of the number of observed calyces to the total number of calyces is generated, and the display of the progress information display portion 4c is updated with the generated progress map PM (step S8). At this time, the display aspect of the part corresponding to the newly observed calyx on the progress map PM is changed, as shown in the drawings.
If the process of step S8 is performed, or if it is judged at step S7 that a calyx has not been observed, it is judged whether one calculus has newly been marked or not (step S9) while the flow proceeds around the loop from step S3 described above to step S11 described later. If one calculus has been marked, the calculus mark display PR is updated (step S10).
After that, it is judged whether or not to end the endoscopic observation (step S11). If the endoscopic observation is not to be ended, the flow returns to step S3 described above, and the endoscopic observation is continued.
On the other hand, if it is judged at step S11 that the endoscopic observation is to be ended, the process is ended.
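The loop from step S1 to step S11 can be sketched roughly as follows; this is only a hedged illustration of the described control flow, with the event strings and function name being hypothetical stand-ins for what the apparatus would actually detect:

```python
def run_progress_update_loop(events, total_calyces, total_calculi):
    """Hypothetical sketch of steps S1-S11: 'events' is a scripted list standing in
    for what the real apparatus would detect during observation."""
    observed_calyces = 0          # step S1: totals passed in, counters reset
    marked_calculi = 0
    for event in events:          # step S2: observation started; loop until S11
        if event == "new_calyx":          # steps S3-S4: update total number of calyces
            total_calyces += 1
        elif event == "new_calculus":     # steps S5-S6: update total number of calculi
            total_calculi += 1
        elif event == "calyx_observed":   # steps S7-S8: update progress map PM
            observed_calyces += 1
            print(f"PM: {observed_calyces}/{total_calyces} calyces observed")
        elif event == "calculus_marked":  # steps S9-S10: update calculus mark display PR
            marked_calculi += 1
            print(f"PR: {marked_calculi}/{total_calculi} calculi marked")
        elif event == "end":              # step S11: end of endoscopic observation
            break

# Scripted example run (all numbers are illustrative).
run_progress_update_loop(
    ["calyx_observed", "new_calyx", "calculus_marked", "calyx_observed", "end"],
    total_calyces=8, total_calculi=2)
```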
Note that though it is assumed in the above description that the accurate shape of the renal pelvis calyces of the subject is unknown at the stage of starting the endoscopic observation, in a case where the shape of the renal pelvis calyces is known beforehand, such as a case of a second or subsequent endoscopic observation or a case where contrast enhanced CT data has been acquired beforehand, it is possible to display the progress information PI more appropriately by using a renal pelvis calyx model adapted for the subject.
An example of using a renal pelvis calyx model adapted for a subject will be described with reference to the drawings.
In the third example shown in the drawings, a renal pelvis calyx model adapted for the subject is used as the progress map PM.
If the shape of the renal pelvis calyces of a subject is unknown, a standard renal pelvis calyx model is used as the progress map PM, and the progress information display shows only an approximate degree of progress. If the shape of the renal pelvis calyces of the subject is known before endoscopic observation, a ratio of volume (or area) of observed areas OR to volume (or area) of all areas of the subject is information showing a degree of progress with high accuracy, as described above. In this case, progress rates NV may be further displayed as progress information PI as shown in the drawings.
Note that though display by percentage is performed here for the superior calyces, the middle calyces and the inferior calyces, respectively, display by percentage may be performed in more detail, for each of all the calyces or only for the calyces in which calculi exist.
According to the first embodiment as described above, since progress information PI showing a progress state of observation of a subject by the endoscope 1 is generated and presented in association with a three-dimensional model image M3, it is possible to intuitively and more easily grasp a progress state of endoscopic observation, that is, to which stage the endoscopic observation has progressed, so that usability is improved.
Further, since the progress information PI is adapted to include information showing a ratio of volume of observed areas OR to volume of all areas of the subject, accurate progress state display based on a volume ratio becomes possible.
Or if the progress information PI is adapted to include information showing a ratio of area of the observed area OR to area of all the areas of the subject, accurate progress state display based on an area ratio becomes possible.
If the progress information PI is adapted to include information showing a ratio of the number of observed partial areas to the total number of partial areas that the subject is provided with, it becomes possible to grasp a remaining process of the endoscopic observation in units of the number of partial areas.
In addition, since the progress information PI is adapted to further include information showing the number of targets (here, calculi) which have already been marked relative to the number of targets (calculi) to be marked, it becomes possible to easily grasp which stage marking of targets has progressed to.
Since the progress information PI and the three-dimensional model image M3 are presented side by side, it is possible to grasp more appropriately, for a three-dimensional observation target, up to which part endoscopic observation has been performed. Thereby, it is possible to prevent oversight of an unobserved area UOR existing at a position that cannot be visually confirmed.
Furthermore, even if an unobserved area UOR is hidden on a back side of the three-dimensional model image M3 when the user sees the three-dimensional model image M3, the user can confirm the existence of the unobserved area UOR from the progress information PI. Thereby, it is also possible to prevent oversight of an unobserved area UOR existing at a position that cannot be visually confirmed.
In the second embodiment, parts similar to parts of the first embodiment described above are given the same reference numerals, and description will be appropriately omitted. Description will be made mainly only on different points.
As shown in the drawings, the processing system 2 of the present embodiment is further provided with an area dividing portion 28.
The area dividing portion 28 divides a three-dimensional model image M3 generated by the image generating portion 23 into a plurality of divided areas RG (see the drawings).
The progress information generating portion 27 performs image processing on at least one of the three-dimensional model image and the background image in a divided area RG including an unobserved area UOR, among the plurality of divided areas RG obtained by the area dividing portion 28, so that the divided area RG is distinguishable from the other divided areas RG not including an unobserved area UOR, thereby generating progress information PI. Since the progress information generating portion 27 generates information for grasping a progress state of endoscopic observation in a bird's eye view, the progress information generating portion 27 can also be called a bird's eye view information generating portion.
In the present embodiment, the three-dimensional model image M3 and the image-processed background image described above are used as a progress map PM, as shown in the drawings.
Note that since the three-dimensional model image M3 of the three-dimensional model image display portion 4b is, for example, rotatable as described above, such a configuration is also possible that, in the case of displaying a three-dimensional model image M3 similar to the three-dimensional model image of the three-dimensional model image display portion 4b on the progress information display portion 4c, the three-dimensional model image M3 of the progress information display portion 4c also rotates in synchronization with rotation of the three-dimensional model image M3 of the three-dimensional model image display portion 4b.
In the example shown in the drawings, the display aspect of the background image is caused to be different for a divided area RG including an unobserved area UOR, so that the divided area RG is distinguishable from the other divided areas RG.
Here, instead of causing the display aspect of the background image to be different, the display aspect of the three-dimensional model image M3 may be caused to be different, or the display aspects of the background image and the three-dimensional model image M3 may be caused to be different.
In addition, the display aspects may be caused to be gradually different according to sizes and the like of the unobserved areas UOR. That is, for a divided area RG including a small unobserved area UOR, the display aspect may be made only slightly different, and for a divided area RG including a large unobserved area UOR, the display aspect may be made significantly different. For example, a divided area RG including a small unobserved area UOR may be painted in a light color, and a divided area RG including a large unobserved area UOR may be painted in a deep color.
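One hedged way to realize such gradation is to map the size of the unobserved portion of a divided area RG to a paint intensity; the functions below are a hypothetical illustration, not the described implementation (the choice of a red highlight is arbitrary):

```python
def paint_intensity(unobserved_fraction: float) -> float:
    """Map the fraction of a divided area RG that is unobserved (0.0-1.0) to a paint
    intensity: 0.0 means no highlight, values near 1.0 mean a deep, conspicuous color."""
    return max(0.0, min(1.0, unobserved_fraction))

def rgba_for_divided_area(unobserved_fraction: float) -> tuple[int, int, int, float]:
    """Return an RGBA highlight whose opacity grows with the unobserved fraction,
    so small unobserved areas are painted lightly and large ones deeply."""
    alpha = paint_intensity(unobserved_fraction)
    return (255, 0, 0, round(alpha, 2))  # red highlight, illustrative choice

print(rgba_for_divided_area(0.1))  # (255, 0, 0, 0.1)  -> light paint
print(rgba_for_divided_area(0.8))  # (255, 0, 0, 0.8)  -> deep paint
```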
Note that in the case of adopting such a three-dimensional model image M3 that is constructed as endoscopic observation progresses as described above, only a constructed part may be divided into divided areas RG.
According to the second embodiment as described above, advantageous effects almost similar to the advantageous effects of the first embodiment described above are obtained; and since progress information PI is presented being superimposed on a three-dimensional model image M3, it is not necessary to compare the three-dimensional model image M3 and the progress information PI and it is possible to grasp a progress state of endoscopic observation only by seeing the three-dimensional model image M3.
Since a display aspect showing whether an unobserved area UOR is included or not is caused to be different for each divided area RG, it is possible to grasp a gradual progress state for each area.
In the third embodiment, parts similar to the first and second embodiments described above are given the same reference numerals, and description will be appropriately omitted. Description will be made mainly only on different points.
As shown in the drawings, the processing system 2 of the present embodiment is further provided with a duct length estimating portion 29.
The duct length estimating portion 29 detects lengths of one or more observed ducts among a plurality of ducts that a subject includes, and estimates a length of an unobserved duct based on the detected lengths of the observed ducts.
The progress information generating portion 27 generates core line information about the observed ducts, generates core line information about the unobserved duct based on the length of the unobserved duct estimated by the duct length estimating portion 29, and generates progress information PI in which the core line information about the observed ducts and the core line information about the unobserved duct are displayed in display aspects enabling the two pieces of core line information to be distinguished from each other. The progress information PI generated by the progress information generating portion 27 is displayed on the progress information display portion 4c as a progress map PM.
More specifically, it is assumed that calyces as ducts are observed by the endoscope 1, and that one calyx becomes an observed area OR as shown in the drawings.
In this case, the duct length estimating portion 29 detects a length L1 of the duct of the observed area OR as shown in the drawings.
If the observed area OR is such a range as indicated by a solid line in the drawings, the duct length estimating portion 29 estimates a length L2 of the duct of an unobserved calyx, which is an unobserved area UOR.
The estimation is performed on an assumption of L2=L1, for example, based on an assumption that sizes (or depths) of respective calyces are almost the same. If there are a plurality of observed calyces, and lengths of ducts of the plurality of calyces are already detected, an average value of the detected lengths, for example, can be set as the estimated length of the unobserved calyx.
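A minimal sketch of the estimation rule described above (use the single detected length, or the average when several ducts have been observed); the function name is a hypothetical placeholder:

```python
def estimate_unobserved_duct_length(observed_lengths: list[float]) -> float:
    """Estimate the length of an unobserved calyx duct as the average of the lengths
    already detected for observed ducts (L2 = L1 when only one has been observed)."""
    if not observed_lengths:
        raise ValueError("at least one observed duct length is required")
    return sum(observed_lengths) / len(observed_lengths)

print(estimate_unobserved_duct_length([12.0]))        # 12.0  (L2 = L1)
print(estimate_unobserved_duct_length([12.0, 14.0]))  # 13.0  (average of detected lengths)
```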
Then, the progress information generating portion 27 generates core line information CL about the calyx in the observed area OR as indicated by a solid line in the drawings.
Furthermore, based on the length L2 of the calyx in the unobserved area UOR estimated by the duct length estimating portion 29, the progress information generating portion 27 generates core line information about the calyx in the unobserved area UOR as indicated by a dotted line in the drawings.
At this time, the progress information generating portion 27 generates progress information PI by causing display aspects (for example, colors, patterns, or combinations of color and pattern as described above) of the core line of the observed area OR and the core line of the unobserved area UOR to be different so that the two core lines are distinguishable from each other. As an example, one of the core lines of the observed area OR and the unobserved area UOR is shown as a red line, and the other is shown as a blue line. It is also possible to cause the core line of the unobserved area UOR to be displayed blinking in order to further emphasize the unobserved area UOR.
By seeing the progress information PI as in the drawings, the user can grasp that one unobserved calyx is estimated to remain in addition to the observed calyx.
It is assumed that the observation of the calyces by the endoscope 1 has progressed to a state as shown in the drawings.
At this time, the duct length estimating portion 29 can estimate that there are two calyces in the unobserved area UOR. Therefore, based on the detected length L1 of the duct of the observed area OR, the duct length estimating portion 29 estimates that L2 = L1 and L3 = L1 hold for lengths L2 and L3 of the two calyces in the unobserved area UOR, which are unobserved ducts. Thereby, the progress information generating portion 27 generates the core line information CL as indicated by solid lines and dotted lines in the drawings.
By seeing the progress information PI as in the drawings, the user can grasp that two calyces are estimated to remain unobserved.
It is assumed that the observation of the calyces by the endoscope 1 has further progressed to a state as shown in the drawings.
At this time, based on the core line information about the observed area OR detected by the duct length estimating portion 29, the progress information generating portion 27 generates core line information CL as indicated by solid lines in the drawings.
By seeing the progress information PI as in the drawings, the user can confirm that all the calyces have been observed.
Note that since it is assumed in the above description that the core line information CL is generated based on three-dimensional model data constructed as endoscopic observation progresses, only one core line indicating an unobserved duct is displayed in the state shown in the drawings.
Though the core line information CL generated by the progress information generating portion 27 may be displayed as a progress map PM of the progress information display portion 4c (that is, side by side with the three-dimensional model image M3 of the three-dimensional model image display portion 4b), the core line information CL may also be displayed superimposed on the three-dimensional model image M3 of the three-dimensional model image display portion 4b, as shown in the drawings.
By seeing the display as shown in the drawings, the user can grasp the progress state of endoscopic observation directly on the three-dimensional model image M3.
According to the third embodiment as described above, advantageous effects almost similar to the advantageous effects of the first and second embodiments described above are obtained; and since a length of an unobserved duct is estimated based on a detected length of an observed duct, core line information about the observed and unobserved ducts is generated, and progress information PI is presented that displays the observed and unobserved ducts in display aspects enabling them to be distinguished from each other, it is possible to easily recognize a degree of progress of endoscopic observation.
Note that it is also possible to configure the endoscope apparatus such that any of the display aspect of the first embodiment, the display aspect of the second embodiment and the display aspect of the third embodiment as described above can be adopted so that, in one endoscopic examination, the user can select and switch to a desired display aspect. In this case, the user makes a setting for switching to the desired display aspect, for example, by operating an operation portion provided on the endoscope 1, which is not shown, or an operation portion provided on the processing system 2, which is not shown.
Each portion described above may be configured as a circuit. An arbitrary circuit may be implemented as a single circuit or as a combination of a plurality of circuits as long as the same function can be achieved. Furthermore, the arbitrary circuit is not limited to being configured as a dedicated circuit for achieving an intended function, but a configuration is also possible in which the intended function is achieved by causing a general-purpose circuit to execute a processing program.
Though description has been made above mainly on an endoscope apparatus, the present invention may include an operation method for causing an endoscope apparatus to operate as described above, a processing program for causing a computer to perform a process similar to a process of the endoscope apparatus, a computer-readable non-transitory recording medium in which the processing program is recorded, and the like.
Note that the present invention is not limited to the above embodiments as they are, but the components can be modified and embodied within a range not departing from the spirit of the invention at a stage of practicing the invention. Further, various aspects of the invention can be formed by appropriately combining a plurality of components disclosed in the above embodiments. For example, some components may be deleted from all the components shown in an embodiment. Furthermore, components from different embodiments may be appropriately combined. Thus, various modifications and applications are, of course, possible within a range not departing from the spirit of the invention.
Number | Date | Country | Kind |
---|---|---|---|
2016-104525 | May 2016 | JP | national |
This application is a continuation application of PCT/JP2017/011397 filed on Mar. 22, 2017 and claims benefit of Japanese Application No. 2016-104525 filed in Japan on May 25, 2016, the entire contents of which are incorporated herein by this reference.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2017/011397 | Mar 2017 | US
Child | 16156076 | | US