1. Field of the Invention
The present invention relates to an observation apparatus in which an inserting section is inserted into an insertion subject for observation, an observation supporting device for use in such an observation apparatus, an observation supporting method, and a non-transitory recording medium storing a program which allows a computer to execute a procedure of the observation supporting device.
2. Description of the Related Art
As a supporting device for a case where an inserting section is inserted into an insertion subject for observation, there is disclosed, for example in U.S. Pat. No. 6,846,286, a constitution to display a shape of an endoscope inserting section in a display section when the endoscope inserting section is inserted into a human body.
In this constitution, in an endoscope device, flexible bend detecting optical fibers having bend detecting portions, in which a quantity of transmitted light changes in accordance with the magnitude of a bend angle, are attached in parallel to a flexible band-like member, and the band-like member is inserted into and disposed in the endoscope inserting section along substantially the total length of the endoscope inserting section. Additionally, a bending state of the band-like member at the portion where each bend detecting portion is positioned is detected from the light transmission quantity of the corresponding bend detecting optical fiber, and is displayed as the bending state of the endoscope inserting section on a monitor screen.
In general, an insertion subject has only a few regions that serve as landmarks, and hence when it cannot easily be judged from an acquired image alone which region of the insertion subject is being observed, it also cannot easily be judged whether or not all required regions have been imaged (observed).
U.S. Pat. No. 6,846,286 mentioned above discloses that the shape of an inserting section is detected and displayed. However, no method has been suggested for detecting and displaying which region of the insertion subject is being imaged (observed).
The present invention has been developed in view of the above, and an object thereof is to provide an observation apparatus, an observation supporting device, an observation supporting method and a program that can supply, to an operator, information to judge which region of an insertion subject is being imaged.
According to a first aspect of the invention, there is provided an observation apparatus comprising: an inserting section to be inserted into an insertion subject and configured to include an image acquisition opening; an image acquisition section configured to receive light entering the image acquisition opening and to acquire an image; a relative position detecting section configured to detect a relative position, in relation to the insertion subject, of a portion of the inserting section which becomes a position detection object; an insertion subject shape acquiring section configured to acquire shape information of the insertion subject; an image acquisition position calculating section configured to calculate an image acquisition position that is at least one of an image acquisition region as a region of the insertion subject which is being imaged by the image acquisition section, a part of the image acquisition region, and a point in the image acquisition region, by use of the relative position and the shape information of the insertion subject; a display calculating section configured to calculate weighting information of the image acquisition position on the basis of a weighting index parameter, and to set a display format on the basis of the weighting information; and an output section configured to output the display format and the image acquisition position as display information.
According to a second aspect of the invention, there is provided an observation supporting device for use in an observation apparatus in which an inserting section is inserted into an insertion subject to acquire an image of the inside of the insertion subject, the observation supporting device comprising: a relative position information acquiring section configured to acquire relative position information, in relation to the insertion subject, of a portion of the inserting section which becomes a position detection object, on the basis of displacement amount information of the inserting section; an insertion subject shape acquiring section configured to acquire shape information of the insertion subject; an image acquisition position calculating section configured to calculate an image acquisition position that is at least one of an image acquisition region as a region of the insertion subject which is being imaged by the observation apparatus, a part of the image acquisition region, and a point in the image acquisition region, by use of the relative position information and the shape information of the insertion subject; a display calculating section configured to calculate weighting information of the image acquisition position on the basis of a weighting index parameter, and to set a display format on the basis of the weighting information; and an output section configured to output the display format and the image acquisition position as display information.
According to a third aspect of the invention, there is provided an observation supporting method for use in an observation apparatus in which an inserting section is inserted into an insertion subject to acquire an image of the inside of the insertion subject, the observation supporting method comprising: acquiring relative position information, in relation to the insertion subject, of a portion of the inserting section which becomes a detection object, on the basis of displacement amount information of the inserting section; acquiring shape information of the insertion subject; calculating an image acquisition position that is at least one of an image acquisition region as a region of the insertion subject which is being imaged by the observation apparatus, a part of the image acquisition region, and a point in the image acquisition region, by use of the relative position information and the shape information of the insertion subject; calculating weighting information of the image acquisition position on the basis of a weighting index parameter, and setting a display format on the basis of the weighting information; and outputting the display format and the image acquisition position as display information.
According to a fourth aspect of the invention, there is provided a non-transitory recording medium storing a program which allows a computer to execute: a position information acquiring procedure of acquiring relative position information, in relation to an insertion subject, of a portion of an inserting section which becomes a detection object, on the basis of displacement amount information of the inserting section in an observation apparatus in which the inserting section is inserted into the insertion subject to acquire an image of the inside of the insertion subject; an insertion subject shape acquiring procedure of acquiring shape information of the insertion subject; an image acquisition position calculating procedure of calculating an image acquisition position that is at least one of an image acquisition region as a region of the insertion subject which is being imaged by the observation apparatus, a part of the image acquisition region, and a point in the image acquisition region, by use of the relative position information and the shape information of the insertion subject; a display calculating procedure of calculating weighting information of the image acquisition position on the basis of a weighting index parameter, and setting a display format on the basis of the weighting information; and an output procedure of outputting the display format and the image acquisition position as display information.
According to the present invention, it is possible to supply information to judge which region of an insertion subject is being imaged, and hence an operator can easily judge which region of the insertion subject is being imaged and whether or not all required regions could be imaged. Therefore, it is possible to provide an observation apparatus, an observation supporting device, an observation supporting method and a program which can prevent oversight of observation regions.
Furthermore, according to the present invention, it is possible to display an image acquisition position in a display format based on weighting information of the image acquisition position, and hence it is possible for an operator to easily judge the importance of the image acquisition position.
Advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. Advantages of the invention may be realized and obtained by means of the instrumentalities and combinations particularly pointed out hereinafter.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention, and together with the general description given above and the detailed description of the embodiments given below, serve to explain the principles of the invention.
Hereinafter, a mode for carrying out the present invention will be described with reference to the drawings.
As shown in
The inserting tool 3 is, for example, an endoscope device. The inserting tool 3 includes the inserting section 31 and an operating section 32 constituted integrally with the inserting section 31.
The inserting section 31 is a flexible tubular member and is insertable into the insertion subject 2 from an insertion port 21 of the insertion subject 2. An image acquisition opening 33 is disposed in an end portion of the inserting section 31 in an inserting direction (hereinafter referred to as the inserting section distal end). Further, an image acquisition section 34 is included in the vicinity of the inserting section distal end in the inserting section 31. The image acquisition section 34 receives light entering the image acquisition opening 33 to acquire an image. The image acquired by the image acquisition section 34 is output to the display device 7 through the observation supporting device 6.
It is to be noted that the image acquisition section 34 may not be disposed in the vicinity of the inserting section distal end in the inserting section 31 but may be disposed in the operating section 32. In this case, the image acquisition section 34 is connected to the image acquisition opening 33 by a light guide or the like which guides the light entering the image acquisition opening 33 to the image acquisition section 34.
In addition, the inserting section 31 includes a bending portion 35 in the vicinity of the inserting section distal end. The bending portion 35 is coupled by a wire, though not especially shown in the drawing, with an operation lever 36 disposed in the operating section 32. In consequence, moving the operation lever 36 pulls the wire, thereby enabling a bending operation of the bending portion 35.
In addition, the fiber shape sensor 4 is disposed in the inserting section 31. The fiber shape sensor 4 includes optical fibers, and each optical fiber is provided with a bend detecting portion 41 in one portion thereof. In the bend detecting portion 41, the cladding of the optical fiber is removed to expose the core, and a light absorbing material is applied to the exposed core. In the bend detecting portion 41, as shown in
In the fiber shape sensor 4 of this constitution, for the purpose of detecting the bend in an X-axis direction and the bend in a Y-axis direction shown in
It is to be noted that a portion other than the bending portion 35 of the inserting section 31 bends freely in accordance with an internal structure of the insertion subject 2 due to the flexibility of the inserting section 31. Therefore, the bend detecting portions 41 are preferably disposed not only in the bending portion 35 of the inserting section 31 but also on the operating section side of the bending portion, so that a bending state of the portion other than the bending portion 35 of the inserting section 31 can also be detected.
It is to be noted that as shown in
In addition, as shown in
The inserting section 31 is irradiated with the light emitted from the light source 51 through the projection lens 52. The light reflected by the inserting section 31 is received through the light receiving lens 53 by the optical pattern detecting portion 54. The optical pattern detecting portion 54 continuously detects images of the surface of the inserting section 31, which form an optical pattern, at detection times t0, t1, t2, . . . , tn, . . . .
The displacement amount calculating portion 55 calculates a displacement amount by use of the optical patterns present in the images of two pieces of image data acquired by the optical pattern detecting portion 54 at different times. More specifically, as shown in
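By way of illustration only, the comparison of two optical-pattern images performed by the displacement amount calculating portion 55 can be sketched as a brute-force block matching over a small search window. The function name, search range and data layout below are assumptions for the sketch, not details taken from the specification.

```python
# Sketch: estimate the on-image displacement (dx, dy) between two
# consecutive optical-pattern images by minimizing the sum of squared
# differences over a small integer search window (illustrative only).
import numpy as np

def estimate_shift(img0, img1, search=5):
    """Return (dx, dy) such that img1 shifted back by (dx, dy)
    best matches img0 (wraparound boundaries for simplicity)."""
    best, best_err = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(img1, -dy, axis=0), -dx, axis=1)
            err = float(np.sum((shifted - img0) ** 2))
            if err < best_err:
                best, best_err = (dx, dy), err
    return best
```

A real implementation would use sub-pixel correlation rather than integer block matching, but the principle of comparing the optical patterns at two detection times is the same.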
In addition, as shown in
The relative position information acquiring section 61 acquires relative position information, in relation to the insertion subject 2, of a portion of the inserting section 31 which becomes a position detection object, on the basis of the displacement amount information of the inserting section 31 which is input from the fiber shape sensor 4 and the insertion and rotation detecting section 5. That is, the relative position information acquiring section 61 cooperates with the fiber shape sensor 4 and the insertion and rotation detecting section 5 to function as a relative position detecting section that detects a relative position, in relation to the insertion subject 2, of the portion of the inserting section 31 which becomes the position detection object. The insertion subject shape acquiring section 62 acquires the shape information of the insertion subject 2.
The image acquisition position calculating section 63 calculates an image acquisition position that is at least one of an image acquisition region as a region of the insertion subject 2 being imaged by the image acquisition section 34, a part of the image acquisition region and a point in the image acquisition region, by use of the above relative position and the above shape information of the insertion subject 2. The display calculating section 64 calculates weighting information of the image acquisition position on the basis of a weighting index parameter, and sets a display format on the basis of the weighting information. Furthermore, the output section 65 outputs this display format and the above image acquisition position as the display information. The display information output from the observation supporting device 6 is displayed by the display device 7.
Hereinafter, an operation of the observation supporting device 6 will be described in detail with reference to an operation flowchart of
First, the insertion subject shape acquiring section 62 acquires the shape information (insertion subject shape information) including position information of a range of the insertion subject 2 which becomes an image acquisition object (step S11). For example, this insertion subject shape information is constituted on the basis of data acquired from the outside or inside of the insertion subject 2 before the inserting section 31 is inserted into the insertion subject 2.
That is, the insertion subject shape information based on the data from the outside is constituted by utilizing an apparatus that can detect the information by use of the light transmitted through the insertion subject 2, for example, a CT diagnosis apparatus, an ultrasonic diagnosis apparatus or an X-ray apparatus.
In addition, the insertion subject shape information based on the data from the inside is constituted by utilizing locus data obtained when the inserting section 31 is moved in a space of the insertion subject 2 or by connecting position information obtained when the inserting section distal end comes in contact with the insertion subject 2. When the position information obtained during the contact between the inserting section distal end and the insertion subject 2 is utilized, a size of the space can be detected, and the insertion subject shape information can more exactly be acquired. Furthermore, when the insertion subject 2 is a human organ, the information may be constituted by presuming a physical constitution, and when the insertion subject 2 is a structure, the information may be constituted by inputting the shape through a drawing.
It is to be noted that when the insertion subject shape information is acquired by the insertion subject shape acquiring section 62, the insertion subject shape information may be acquired directly by connecting the apparatus, such as the CT diagnosis apparatus, that constitutes the insertion subject shape information, may be acquired by once storing the insertion subject shape information output from the apparatus in a storage medium and reading the stored information, or may be acquired by downloading the insertion subject shape information via a network. Furthermore, the insertion subject shape acquiring section 62 is not limited to such an interface or data reader; the acquiring section itself may be the apparatus that constitutes the insertion subject shape information.
The insertion subject shape information acquired by the insertion subject shape acquiring section 62 is output to the image acquisition position calculating section 63 and the display calculating section 64.
In addition, the relative position information acquiring section 61 acquires the displacement amount information of the inserting section 31 (step S12), and acquires a shape of the inserting section 31 and a position and a direction of the inserting section distal end in relation to the insertion subject 2 (step S13).
Specifically, the relative position information acquiring section 61 includes a function of obtaining the shape of the inserting section 31, a function of obtaining the insertion amount and the rotation amount of the inserting section 31, and a function of obtaining the position and direction of the inserting section distal end in relation to the insertion subject 2.
That is, such a relational equation between a change ΔQ of the light transmission quantity of the fiber shape sensor 4 and a bend amount φ of the bend detecting portion 41 as in the following equation (1) is beforehand obtained and stored in the relative position information acquiring section 61.
φ=f(ΔQ) (1)
Furthermore, the relative position information acquiring section 61 calculates the bend amount of each bend detecting portion 41 from the light transmission quantity given as the displacement amount information from the fiber shape sensor 4, in accordance with this stored equation (1). Furthermore, the shape of the inserting section 31 is obtained from the bend amount of each bend detecting portion 41 and the arrangement interval of the respective bend detecting portions 41, which is given as previously known information.
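As a minimal sketch of equation (1) and of the shape reconstruction just described: the calibration function f is assumed linear here purely for illustration (the actual relation between the light loss ΔQ and the bend amount φ would be measured beforehand), and the shape is approximated in two dimensions by chaining straight segments of the known arrangement interval, each turned by the local bend amount. All names and constants are assumptions.

```python
# Illustrative sketch only: bend amount per equation (1) and a simple
# 2-D chaining of bend detecting portions into an inserting-section shape.
import math

def bend_amount(delta_q, k=0.5):
    # Equation (1): phi = f(delta_Q); assumed linear with slope k here.
    return k * delta_q

def reconstruct_shape(bend_amounts, interval):
    """Chain segments of length `interval`; each segment turns the
    heading by the bend amount of the corresponding detecting portion."""
    x = y = heading = 0.0
    points = [(x, y)]
    for phi in bend_amounts:
        heading += phi
        x += interval * math.cos(heading)
        y += interval * math.sin(heading)
        points.append((x, y))
    return points
```

With all bend amounts zero, the reconstructed shape is a straight line of the expected total length, which is a convenient sanity check for such a reconstruction.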
In addition, coefficients a and b to convert the displacement amount on the image which is calculated by the displacement amount calculating portion 55 into an actual insertion amount and an actual rotation amount of the inserting section 31 are beforehand obtained and stored in the relative position information acquiring section 61. Furthermore, the relative position information acquiring section 61 multiplies the displacement amount on the image which is calculated by the displacement amount calculating portion 55 by the stored coefficients a and b as in the following equation (2) to calculate an insertion amount m and a rotation amount θ.
m=a×Δx
θ=b×Δy (2)
Afterward, the relative position information acquiring section 61 calculates the shape of the inserting section 31 in relation to the insertion subject 2 from the calculated shape of the inserting section 31 and the calculated insertion amount and rotation amount of the inserting section 31 in relation to the insertion subject 2. Furthermore, the relative position information acquiring section 61 calculates, from the shape of the inserting section 31 in relation to the insertion subject 2, the relative position information, in relation to the insertion subject 2, of the portion of the inserting section 31 which becomes the position detection object, i.e., the position and direction of the inserting section distal end in relation to the insertion subject 2 (the position of the image acquisition opening 33 and the direction opposite to the incident direction of the light). The relative position information in relation to the insertion subject 2 obtained in this manner is output to the image acquisition position calculating section 63. In addition, shape information indicating the shape of the inserting section 31 in relation to the insertion subject 2 and information of the inserting section distal end position in the above relative position information are output to the display calculating section 64.
Furthermore, the image acquisition position calculating section 63 calculates the image acquisition position from the relative position information obtained by the relative position information acquiring section 61 and the insertion subject shape information acquired by the insertion subject shape acquiring section 62 (step S14).
Specifically, for example, as shown in
In general, a region of interest in an observation object is at the center of the viewing field, and hence the center of the viewing field is often more important than the periphery thereof. It is to be noted that the description has been given here of the example where the intersection is obtained as the image acquisition position P, but the viewing field (the image acquisition region 83), that is, the region of the insertion subject 2 being imaged by the image acquisition section 34, may be calculated as the image acquisition position P. In consequence, the range in which an image is acquired by the image acquisition section 34 can be grasped. In addition, a partial region 84 or a point in the viewing field (the image acquisition region 83) may be calculated as the image acquisition position P. For example, when the image acquisition region 83 cannot exactly be detected, a small region is calculated in consideration of an error, so that a region that is not imaged can be prevented from being wrongly detected as an imaged region. That is, an omission of observation can be prevented.
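The intersection computation of step S14 can be sketched geometrically: cast a ray from the position of the image acquisition opening 33 along the viewing direction and intersect it with the insertion subject's wall. The sphere model of the wall below is purely for illustration; the actual insertion subject shape information would be whatever surface the insertion subject shape acquiring section 62 provides.

```python
# Illustrative sketch: image acquisition position P as the first
# intersection of the viewing-direction ray with a spherical wall model.
import math

def ray_sphere_intersection(origin, direction, center, radius):
    """First intersection (t >= 0) of origin + t*direction with a
    sphere; returns None if the ray misses. Viewing from inside the
    sphere uses the far quadratic root."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2.0 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4.0*a*c
    if disc < 0:
        return None
    t = (-b + math.sqrt(disc)) / (2.0*a)  # far root: inside-out viewing
    if t < 0:
        return None
    return tuple(origin[i] + t*d for i, d in enumerate((dx, dy, dz)))
```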
Image acquisition position information indicating the thus obtained image acquisition position P is output to the display calculating section 64.
Afterward, the display calculating section 64 calculates the weighting information of the image acquisition position P on the basis of the weighting index parameter, and executes an operation of setting the display format on the basis of the weighting information. Here, a case where a speed of the image acquisition position is used as the weighting index parameter is described as an example.
The display calculating section 64 first judges whether or not t is larger than 0, i.e., whether or not two or more pieces of data of the image acquisition position P are present (step S15). Here, when t is not larger than 0 (only one piece of information of the image acquisition position P is present), the step returns to the above step S12 to repeat the abovementioned operation. That is, immediately after the start of the operation of the observation apparatus 1, only one piece of information of the image acquisition position P has been obtained; in this case, the speed cannot be calculated, and hence the shape of the inserting section 31 and the image acquisition position P are calculated again.
When two or more pieces of information of the image acquisition position are present, the display calculating section 64 performs the calculation of a speed V of the image acquisition position (step S16). That is, the display calculating section 64 obtains a speed Vn of the image acquisition position P, which is the weighting index parameter, from the image acquisition position Pn at the current time tn and the image acquisition position Pn−1 obtained at the one previous image acquisition time tn−1 in accordance with the following equation (3):
Vn=(Pn−Pn−1)/(tn−tn−1) (3)
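Equation (3) is a finite difference of successive image acquisition positions. A direct transcription, with positions taken as 3-D points (an assumption of the sketch):

```python
# Illustrative transcription of equation (3): speed of the image
# acquisition position from two consecutive samples.
def acquisition_speed(p_n, p_prev, t_n, t_prev):
    """Component-wise (Pn - Pn-1) / (tn - tn-1)."""
    dt = t_n - t_prev
    return tuple((p_n[i] - p_prev[i]) / dt for i in range(len(p_n)))
```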
Furthermore, the display calculating section 64 calculates weighting information w of the image acquisition position from this obtained speed V (step S17). For example, as shown in
w=f(V) (4)
That is, when the moving speed of the image acquisition position is fast, it is judged that an operator cannot perform the observation or that the operator is not performing the observation but is just moving the inserting section distal end, and the weighting information is made smaller. Conversely, when the moving speed of the image acquisition position is slow, it is judged that the operator can perform the observation, and the weighting information is enlarged.
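The mapping of equation (4) and the judgment above can be sketched as a weight that decreases as the speed of the image acquisition position increases. The inverse-linear form and the saturation speed below are assumptions; the specification only requires that a fast-moving position receive a small weight and a slow-moving one a large weight.

```python
# Illustrative sketch of w = f(V): weight in [0, 1] that is 1 when the
# image acquisition position is stationary and 0 at or above v_max.
def weight_from_speed(speed_magnitude, v_max=50.0):
    """v_max is an assumed saturation speed, not a value from the
    specification."""
    return 1.0 - min(speed_magnitude, v_max) / v_max
```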
Furthermore, the display calculating section 64 sets the display format on the basis of this weighting information (step S18). That is, the display calculating section 64 holds the current image acquisition position P and locus information of the image acquisition position, which is a past image acquisition position, in an unshown internal memory or the like. Furthermore, the display format of the current image acquisition position and the locus of the image acquisition position is set so as to change on the basis of the weighting information of the image acquisition position. For example, as shown in
Furthermore, the output section 65 outputs at least the above display format and the above image acquisition position (the current image acquisition position and the locus of the image acquisition position) as the display information (step S19). Afterward, the processing returns to the above step S12 to repeat the above operation.
The above display information can further include the shape information indicating the shape of the inserting section 31 in relation to the insertion subject 2, the insertion subject shape information, the image acquired by the image acquisition section 34 and the like. That is, as shown in
Here, the first two-dimensional view 72 is a view showing a state where the shape of the insertion subject 2 is divided by a Y-Z plane and opened in a right-left direction in a coordinate system of the insertion subject 2 as shown in
Furthermore, there is prepared such display information as to display a current position display 74 showing the current image acquisition position, a position locus display 75 showing the locus of the image acquisition position and an inserting section shape schematic display 76 showing the shape of the inserting section 31, on these two-dimensional views 72 and 73.
In addition, as shown in
As described above, according to the present first embodiment, the observation supporting device 6 calculates the image acquisition position and the weighting information of the image acquisition position, and sets the display format of the image acquisition position on the basis of the weighting information of the image acquisition position, to output the display format and the image acquisition position as the display information. Therefore, it is possible for the operator to easily judge which region of the insertion subject 2 is being imaged and whether or not images of all required regions have been acquired, and oversight of image acquisition regions can be prevented. Furthermore, the importance of the image acquisition position can easily be judged by the operator.
In addition, the observation supporting device 6 detects the shape of the inserting section 31 with the fiber shape sensor 4, and detects the insertion amount and rotation amount of the inserting section 31 with the insertion and rotation detecting section 5. Therefore, the shape of the inserting section 31 in relation to the insertion subject 2 and a position and a direction of the image acquisition opening 33 can be detected.
It is to be noted that the fiber shape sensor 4 and the insertion and rotation detecting section 5 optically detect the shape of the inserting section 31 inserted into the insertion subject 2 and the position and direction of the image acquisition opening 33 as described above, but the same may be detected by another method. For example, a coil is disposed at least in the vicinity of the image acquisition opening 33 in the inserting section 31, and a current is passed through the coil to generate a magnetic field which is received on the outside, or a magnetic field distribution generated on the outside is received by the coil, so that the position or direction of the coil, i.e., the image acquisition opening 33, can be detected. It is to be noted that when coils are disposed along a longitudinal direction of the inserting section 31, the shape of the inserting section 31 can also be detected.
Furthermore, the weighting information is calculated in accordance with the speed of the image acquisition position. When the speed is fast, the weighting information is made smaller, and when the speed is slow, the weighting information is enlarged, so that it can be judged that an image acquisition position with small weighting information moved fast and therefore could not be observed.
In addition, the current position display 74 showing the image acquisition position and the position locus display 75 showing the locus of the image acquisition position can be changed on the basis of the weighting information of the image acquisition position. For example, when the weighting information is large, i.e., the speed of the image acquisition position is slow, the display colors of the current position display 74 and the position locus display 75 can be deepened. In addition, when the weighting information is small, i.e., the speed of the image acquisition position is fast, the display colors of the current position display 74 and the position locus display 75 can be lightened. When the display is changed in this manner, the importance of the information of the image acquisition position is easily recognized visually.
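One way to realize this color change, offered only as a sketch: map a weight in [0, 1] to the opacity of the locus color, so that a heavily weighted (slowly observed) position is drawn in a deeper color and a lightly weighted one in a paler color. The RGBA convention and base color are assumptions of the sketch.

```python
# Illustrative sketch: display color for the current position display
# and position locus display, with opacity proportional to the weight.
def locus_color(weight, base_rgb=(255, 0, 0)):
    """Clamp the weight to [0, 1] and map it to an alpha channel."""
    alpha = int(round(255 * max(0.0, min(1.0, weight))))
    return base_rgb + (alpha,)
```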
Additionally, it has been described that the weighting information is determined by a linear relation with the speed of the image acquisition position, but the weighting information may be represented by another relational equation such as an exponential function.
In addition, it has been described that the weighting index parameter is the speed of the image acquisition position, but the parameter may be the speed of the inserting section distal end, i.e., the image acquisition opening 33.
Furthermore, as the weighting index parameter, the following parameters are usable.
(a) That is, an image acquisition distance that is a distance D between the image acquisition position P and the position of the image acquisition opening 33 can be used as the weighting index parameter. For example, as shown in
Furthermore, concerning the distance D between the image acquisition opening 33 and the image acquisition position P, another way to attach weighting information may be as shown in
(b) Alternatively, as shown in
(c) In addition, a stop time of the position of the image acquisition section 34 in relation to the insertion subject 2 or a stop time of the image acquisition position may be used as the weighting index parameter. In this case, it is considered that the stop time is an observation time, and as shown in
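Variant (c) above, treating the stop time as an observation time, might be sketched as a weight that grows with the dwell time and saturates. The saturation time below is an assumed placeholder.

```python
# Illustrative sketch of variant (c): weight from the stop (dwell) time
# at an image acquisition position, saturating at 1 after t_full
# (an assumed constant, not a value from the specification).
def dwell_weight(stop_time, t_full=5.0):
    """Longer stop -> larger weight, on the assumption that a longer
    stop corresponds to a more careful observation."""
    return min(1.0, stop_time / t_full)
```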
(d) It is to be noted that the speed of the image acquisition position or the speed of the image acquisition opening 33 in relation to the insertion subject 2 may be a movement amount of the image acquisition position or the position of the image acquisition opening 33 in relation to the insertion subject 2 in an exposure time of the image acquisition section 34. When the movement amount in the exposure time is large, a blurring amount of the image is large and hence the weighting information is made smaller, and when the movement amount is small, the blurring amount is small and hence the weighting information is enlarged. In this case, the display calculating section 64 needs to include a function of calculating the exposure time.
(e) In addition, a temporal change of a bend amount of the bending portion 35 may be used as the weighting index parameter. That is, as shown in
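Item (d) above can be sketched in code. The following is a minimal illustration, not taken from the patent text: the function name and the scale constant blur_max are hypothetical, and the patent only specifies that a larger movement amount during the exposure time means more blur and hence smaller weighting information.

```python
def blur_weight(speed, exposure_time, blur_max=5.0):
    """Weighting information derived from the expected image blur.

    speed: speed of the image acquisition position (or opening 33)
    exposure_time: exposure time of the image acquisition section 34
    blur_max: hypothetical movement amount at which the weight reaches 0
    """
    # Movement amount of the image acquisition position during the exposure.
    movement = speed * exposure_time
    # Larger movement -> larger blurring amount -> smaller weighting
    # information, clamped to the range [0, 1].
    return max(0.0, 1.0 - movement / blur_max)
```

With this mapping, a stationary acquisition position receives the full weight of 1.0, and a position moving fast enough during the exposure receives a weight of 0.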
Furthermore, the weighting index parameter may be based on the image acquired by the image acquisition section 34. As the weighting index parameter in this case, the following parameters are usable.
(a) That is, as shown in
(b) In addition, the blurring amount of the image may be used as the weighting index parameter. That is, as shown in
In addition, when the image is used as the weighting index parameter, the whole image is not used, but a limited predetermined range in the image may be used as the weighting index parameter. For example, a region of interest in the insertion subject 2 is usually caught at a center of the image, and hence the center of the image is often more important than a periphery thereof. Therefore, as shown in
On the other hand, when the insertion subject 2 is tubular, the center of the image shows an inner part of the tube and often cannot be observed, and hence the range is limited to the periphery of the image. This can be realized by, for example, setting the predetermined range of the image in which the display calculating section 64 calculates the weighting information to a range of the image in which a distance between the image acquisition opening 33 and the insertion subject 2 in an image acquisition range 87 (see
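The idea of limiting the weighting index parameter to a predetermined range of the image can be sketched as follows. This is an illustration only, assuming the image is a 2-D array of pixel values and using a hypothetical center crop as the predetermined range; for a tubular subject the periphery would be selected instead, and the crop fraction is an assumed parameter.

```python
def center_crop(image, frac=0.5):
    """Return the central frac x frac region of a 2-D list of pixel values
    (a hypothetical 'predetermined range' of the acquired image)."""
    h, w = len(image), len(image[0])
    dh = int(h * (1 - frac) / 2)
    dw = int(w * (1 - frac) / 2)
    return [row[dw:w - dw] for row in image[dh:h - dh]]

def mean_brightness(region):
    """Example weighting index parameter computed only from that range."""
    pixels = [p for row in region for p in row]
    return sum(pixels) / len(pixels)
```

The display calculating section would then derive the weighting information from `mean_brightness(center_crop(image))` rather than from the whole image.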
In addition, as the display format on the basis of the weighting information, it has been described that the change of the display of the image acquisition position is the change of the depth of the color, but the change may be a change to another color or a change of transparency. Alternatively, a set of points may be displayed, and the change may be a change of a density of the points as shown in
In addition, it has been described that the inserting section 31 is the flexible tubular member, but the inserting section may be inflexible. When the inserting section 31 is inflexible in this manner, the fiber shape sensor 4 is not required, and the insertion and rotation detecting section 5 detects the position of the inserting section distal end in relation to the insertion subject 2. It is to be noted that the direction of the inserting section distal end can be obtained on the basis of, for example, a movement history of the image acquisition region 83 which is detected from the acquired image by the pattern recognition or the like.
In addition, the description has been given as to the example where one weighting index parameter is used, but a plurality of weighting index parameters may be set and the weighting information may be calculated from the respective weighting index parameters. For example, when the first weighting index parameter is the speed of the image acquisition position and the second weighting index parameter is the distance between the image acquisition opening 33 and the image acquisition position, an operation of the observation supporting device 6 is as shown in
That is, the above operation of the step S11 to the step S16 as described with reference to
w1 = f(V)   (5)
Additionally, in parallel with this calculation, the display calculating section 64 calculates the distance D between the image acquisition position P and the position of the image acquisition opening 33 (step S21). It is to be noted that when the speed V of the image acquisition position is obtained in the above step S16, the current acquired image and one previous acquired image are used, but this distance D is a distance at a time when the current image is acquired. Furthermore, the display calculating section 64 calculates second weighting information w2 from this obtained distance D (step S22). For example, the weighting is performed in accordance with a relation (the following equation (6)) in which, as shown in
w2 = f(D)   (6)
Furthermore, as represented by the following equation (7), a sum of the first weighting information w1 and the second weighting information w2 is calculated (a product may be calculated or another calculating method may be used) to obtain final weighting information w.
w = w1 + w2   (7)
Afterward, the processing advances to the abovementioned step S18, in which the display calculating section 64 sets the display format on the basis of the obtained final weighting information w.
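The flow of equations (5) to (7) can be sketched as follows. This is an illustrative sketch only: the patent does not specify the forms of f(V) and f(D), so the monotone mappings and their constants below are assumptions (weight decreasing with speed V, and largest near a preferred image acquisition distance).

```python
def f_speed(v, v_max=10.0):
    """Equation (5): first weighting information w1 from speed V.
    Slower movement -> larger weight, clamped to [0, 1]. v_max is assumed."""
    return max(0.0, 1.0 - v / v_max)

def f_distance(d, d_best=5.0, d_max=20.0):
    """Equation (6): second weighting information w2 from distance D.
    Closer to an assumed preferred distance d_best -> larger weight."""
    return max(0.0, 1.0 - abs(d - d_best) / d_max)

def final_weight(v, d):
    """Equation (7): final weighting information w as the sum w1 + w2
    (a product or another combining method may be used instead)."""
    return f_speed(v) + f_distance(d)
```

The display calculating section 64 would then select the display format (color depth, transparency, point density) from the value returned by `final_weight`.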
When the weighting is performed from the plurality of weighting index parameters in this manner, the accuracy of the importance of the image acquisition position information is enhanced.
The description has been given as to the example where the history of the image acquisition position information is displayed as the position locus display 75, but further, the history of the position of the inserting section distal end, e.g., the image acquisition opening 33 may be displayed. This fact will be described with reference to
In
When the position of the image acquisition opening 33 is recognized, the specific position reached in the image acquisition object is recognized. When the current position is exactly recognized, the observation or treatment to be carried out at the current position, or the investigation of a path from the current position to a target position, can be performed by using this information, without having to presume where the current position is. Therefore, it is not necessary to repeat trial and error in reaching the target position, nor is it necessary to confirm whether or not the target position has been reached by various methods including, for example, a method of observing the acquired image. As a result, there is a high possibility that the target position can be reached at one time by taking a path close to the shortest course from the current position to the target position, so that time can be reduced; furthermore, the situation concerning the position can be grasped, which leads to a calm and assured operation.
Furthermore, in addition to the history of the position of the image acquisition opening 33, a history of a one-dimensional direction in which the image acquisition opening 33 is directed may be displayed. The direction in which the image acquisition opening 33 is directed is, for example, the center of the viewing field (the image acquisition region 83).
It is to be noted that, although it depends on the optical system for the image acquisition, in the present example the direction in which the image acquisition opening 33 present at the inserting section distal end is directed is the center of the viewing field, i.e., the middle of the acquired image.
When the position and direction of the inserting section distal end are recognized, the position reached and the direction in the image acquisition object are recognized. The observation viewing field direction and the viewing field center are seen from the current position and direction. When the reached position and direction, or the observation viewing field direction and viewing field center, are exactly recognized, it is possible to perform the observation or treatment to be carried out in accordance with the current position and direction, or to investigate the path from the current position to the target position and the shape or operating method of the inserting section 31 during the movement, by use of this information, without having to presume the current position and direction. In particular, when the direction of the inserting section distal end is recognized, it is possible to investigate an operating method or procedure such as insertion/extraction or bending for the purpose of reaching the target position or direction.
The history of the direction in which the image acquisition opening 33 is directed may three-dimensionally be shown to indicate the direction including a posture or rotation of the inserting section distal end. As shown in
A second embodiment of the present invention is different from the above first embodiment in the following respects. That is, an observation supporting device 6 concerned with the present second embodiment sets a threshold value to a weighting index parameter and determines weighting information of an image acquisition position by comparison with the threshold value.
Hereinafter, a part different from the above first embodiment will only be described.
Afterward, similarly to the above first embodiment, the processing advances to the above step S18, in which the display calculating section 64 sets a display format on the basis of the obtained final weighting information w.
It is to be noted that as the display format, similarly to the above first embodiment, a change of a color, a change of a transparency or a change of a density of points may be used, but when information of the image acquisition position is divided into two types of weighting information by the comparison with the threshold value as in the present embodiment, as shown in
As described above, according to the present second embodiment, a weighting index parameter such as the speed V of the image acquisition position is compared with the threshold value (e.g., Vt) to calculate the weighting information, so that the information of the image acquisition position can be divided into the two types of weighting information. Therefore, for example, it can be seen whether the information is image acquisition position information when the speed is faster than that of the threshold value or image acquisition position information when the speed is slower.
In addition, when the threshold value Vt of the speed of the image acquisition position is set to the maximum speed of the image acquisition position at which the person (the operator) can recognize the image, it can be seen that the speed is in a range in which the image acquisition section 34 performs the image acquisition but the person cannot recognize, i.e., cannot observe, the image.
Furthermore, as to the image acquisition position, on the basis of the weighting information of the image acquisition position, a locus of the image acquisition position is displayed when the weighting information is large, i.e., the speed V of the image acquisition position is slower than the threshold value Vt, and the locus of the image acquisition position is not displayed when the weighting information is small, i.e., the speed V of the image acquisition position is faster than the threshold value Vt. In consequence, a locus of the range in which the movement of the image acquisition position is so fast that the operator cannot observe is not displayed as an observed range.
It is to be noted that the threshold value is not limited to one value, and a plurality of threshold values may be used.
In addition, the weighting index parameter is not limited to the speed of the image acquisition position, and needless to say, various parameters can be applied as described in the above first embodiment.
For example, when the weighting index parameter is a distance between an inserting section distal end and an insertion subject 2, the threshold value can be a range of a subject field depth. When the weighting index parameter is a brightness of the image, the threshold value can be presence/absence of halation and black defects. Furthermore, as the threshold value, the operator may input any value.
In addition, the description has been given as to the example where one weighting index parameter is used, but a plurality of weighting index parameters may be set and the weighting information may be calculated from the respective weighting index parameters. For example, when a first weighting index parameter is the speed of the image acquisition position and a second weighting index parameter is a distance between an image acquisition opening 33 and the image acquisition position, an operation of the observation supporting device 6 is as shown in
That is, the above operation of the step S11 to the step S16 as described with reference to
Additionally, in parallel with this determination, the display calculating section 64 calculates a distance D between the image acquisition position P and a position of the image acquisition opening 33 (step S21). Furthermore, the display calculating section 64 compares the distance D with a threshold value Dt beforehand stored in the display calculating section 64 (step S29), and determines second weighting information from the comparison result. That is, when the distance D between the image acquisition position P and the position of the image acquisition opening 33 is the threshold value Dt or less, the display calculating section 64 determines that the second weighting information is large (step S30), and when the distance D is larger than the threshold value Dt, the section determines that the second weighting information is small (step S31).
When the first and second weighting index parameters are compared with the threshold values in this manner, respectively, to determine the size of the first and second weighting information, the display calculating section 64 next judges whether or not both of the first and second weighting information are large (step S32). When both of the first and second weighting information are large, the display calculating section 64 determines that the final weighting information is large (step S33).
On the other hand, when both of the first and second weighting information are not large, the display calculating section 64 further judges whether or not both of the first and second weighting information are small (step S34). When both of the first and second weighting information are small, the display calculating section 64 determines that the final weighting information is small (step S35).
In addition, when both of the first and second weighting information are not small, i.e., when one information is large and the other information is small, the display calculating section 64 determines that the final weighting information is medium (step S36).
In consequence, the display calculating section 64 determines the final weighting information in three stages. Afterward, the processing advances to the abovementioned step S18, in which the display calculating section 64 sets the display format on the basis of the obtained final weighting information w.
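The threshold-comparison flow of steps S28 to S36 can be sketched as follows. This is an illustrative sketch only: the threshold values Vt and Dt and the function name are assumed, and the patent leaves their concrete values to the embodiment (e.g., Vt as the maximum observable speed, or an operator-input value).

```python
def stage_weight(speed_v, distance_d, vt=2.0, dt=10.0):
    """Determine final weighting information in three stages.

    speed_v: speed V of the image acquisition position (first parameter)
    distance_d: distance D to the image acquisition opening 33 (second)
    vt, dt: assumed threshold values Vt and Dt
    """
    # First weighting information is large when V is the threshold or less
    # (slow enough for the operator to observe) -- steps S28 and onward.
    w1_large = speed_v <= vt
    # Second weighting information is large when D is Dt or less
    # -- steps S29 to S31.
    w2_large = distance_d <= dt
    if w1_large and w2_large:
        return "large"    # step S33: both large
    if not w1_large and not w2_large:
        return "small"    # step S35: both small
    return "medium"       # step S36: one large, one small
```

The display calculating section 64 would then map the three stages (large/medium/small) onto the display format set in step S18.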
When the weighting is performed from the plurality of weighting index parameters in this manner, the accuracy of the importance of the image acquisition position information is enhanced.
The present invention has been described above on the basis of the embodiments, but needless to say, the present invention is not restricted to the abovementioned embodiments and various modifications or applications are possible within the gist of the present invention.
For example, a program of software to realize the function shown in the flowchart of
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative devices shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
2012-229255 | Oct 2012 | JP | national |
This application is a Continuation Application of PCT Application No. PCT/JP2013/077680, filed Oct. 10, 2013 and based upon and claiming the benefit of priority from the prior Japanese Patent Application No. 2012-229255, filed Oct. 16, 2012, the entire contents of which are incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/JP2013/077680 | Oct 2013 | US |
Child | 14688244 | US |