The present invention relates to an ultrasound endoscope system and a method of operating an ultrasound endoscope system that observe a state of an observation target part in a body of a subject using an ultrasonic wave.
For example, in an ultrasound endoscope system, for the primary purpose of observing the pancreas, a gallbladder, or the like through the digestive tract, an ultrasound endoscope having an endoscope observation part and an ultrasound observation part at a distal end is inserted into a digestive tract of a subject, and an endoscope image of the inside of the digestive tract and an ultrasound image of a part outside a wall of the digestive tract are captured.
In the ultrasound endoscope system, an observation target adjacent part inside the digestive tract is irradiated with illumination light from an illumination part provided at the distal end of the ultrasound endoscope, reflected light of illumination light is received by an imaging part provided at the distal end of the ultrasound endoscope, and an endoscope image is generated from an imaging signal of reflected light. Ultrasonic waves are transmitted and received to and from an observation target part, such as an organ outside the wall of the digestive tract, by a plurality of ultrasound transducers provided at the distal end of the ultrasound endoscope, and an ultrasound image is generated from reception signals of the ultrasonic waves.
In a case of observing the endoscope image, an operator (a user of the ultrasound endoscope system) can comparatively easily know the position of a distal end portion of the ultrasound endoscope in a digestive organ of the subject, the direction of the distal end portion, and the part being observed at the moment because, in a case where the ultrasound endoscope is inserted into the digestive tract of the subject, for example, the inner wall of the esophagus comes into view, and in a case where the distal end portion of the ultrasound endoscope is pushed further forward, the inner wall of the stomach comes into view.
In contrast, there is a problem in that an operator who is unaccustomed to an ultrasound image has great difficulty in knowing what is displayed in the ultrasound image, for example, whether it is the pancreas, a gallbladder, a blood vessel, a bile duct, or a pancreatic duct, and in knowing the range of an organ, such as the pancreas or a gallbladder, displayed in the ultrasound image. There is also a problem in that the operator who is unaccustomed to an ultrasound image does not know the position of the distal end portion of the ultrasound endoscope, the direction of the distal end portion, or the part being observed at the moment during observation of the ultrasound image, and gets lost in the body of the subject.
Here, examples of related art relevant to the invention include JP1994-233761A (JP-H06-233761A), JP2010-069018A, JP1990-045045A (JP-H02-045045A), and JP2004-113629A.
JP1994-233761A (JP-H06-233761A) describes that an intended part in an image of a diagnosis part inside a subject is roughly extracted, global information for recognizing the intended part is predicted using a neural network, a contour of the intended part is recognized using the global information, and a recognition result is displayed along with an original image.
JP2010-069018A describes that position and alignment data of a distal end portion of an ultrasound endoscope is generated based on an electric signal from a coil, insertion shape data for indicating an insertion shape of the ultrasound endoscope is generated from the position and alignment data, a guide image is generated by combining the insertion shape data with three-dimensional biological tissue model data of a tissue structure of an organ group or the like of a subject, and a video signal of a composite image, in which an ultrasound image and the guide image are composed, is generated and displayed on a monitor.
JP2010-069018A describes that the composite image is displayed such that a stereoscopic guide image and a cross-sectional guide image are disposed in a left region of a screen, and the ultrasound image is disposed in a right region of the screen.
JP2010-069018A describes a button for enlarging or reducing a display range of the ultrasound image.
JP1990-045045A (JP-H02-045045A) describes that an ultrasonic tomographic image and an optical image of a subject are displayed adjacently at one place within a screen of a display device such that both images can be observed simultaneously.
JP2004-113629A describes that an ultrasound image and a schematic view are displayed on the same screen, the schematic view is a schema diagram or an actual optical image of a human body, and a scanning plane and an insertion shape of an ultrasound endoscope are displayed together in the schematic view.
JP2004-113629A describes that a region of a scanning position of the ultrasound endoscope is detected from a signal of a position and a direction of the ultrasound endoscope detected using a coil to output ultrasound scanning region data, part name data corresponding to the ultrasound scanning region data is read from a part name storage unit, and a part name is superimposedly displayed on an ultrasound image.
JP1994-233761A (JP-H06-233761A) describes that a contour of an intended part in an image of a diagnosis part is recognized, but does not describe that a name of the intended part is recognized from the image.
JP2010-069018A and JP2004-113629A describe that the position and the orientation of the distal end portion of the ultrasound endoscope are detected using the coil, but do not describe that the position and the orientation of the distal end portion of the ultrasound endoscope are detected from the ultrasound image itself without needing an additional component, such as the coil.
JP2010-069018A, JP1990-045045A (JP-H02-045045A), and JP2004-113629A describe that a guide image, such as a schema diagram, an endoscope image, and an ultrasound image are displayed in certain combinations, but do not describe that the combination of displayed images is switched, in an easily viewable manner, in response to an instruction from a user. Furthermore, there is no description that the guide image, such as a schema diagram, the endoscope image, and the ultrasound image are all three displayed in combination.
JP2004-113629A describes that the part name is superimposedly displayed on the ultrasound image, but does not describe that the name of the part displayed in the ultrasound image is recognized from the ultrasound image without using additional data, such as part name data, and displayed.
JP1994-233761A (JP-H06-233761A), JP2010-069018A, JP1990-045045A (JP-H02-045045A), and JP2004-113629A do not describe that the name of the organ and the range of the organ displayed in the ultrasound image are displayed simultaneously, the position and the orientation of the distal end portion of the ultrasound endoscope and the range of the organ are displayed simultaneously, or the name of the organ displayed in the ultrasound image, the position and the orientation of the distal end portion of the ultrasound endoscope, and the range of the organ displayed in the ultrasound image are displayed simultaneously.
Accordingly, a first object of the invention is to provide an ultrasound endoscope system and a method of operating an ultrasound endoscope system capable of recognizing a name of an organ displayed in an ultrasound image, a range of the organ, and a position and an orientation of a distal end portion of an ultrasound endoscope from the ultrasound image, and displaying the recognized information on a monitor.
A second object of the invention is to provide an ultrasound endoscope system and a method of operating an ultrasound endoscope system capable of switching and displaying an endoscope image, an ultrasound image, and an anatomical schema diagram in an easily viewable manner in response to an instruction from a user.
To achieve the above-described objects, the invention provides an ultrasound endoscope system comprising an ultrasound endoscope that has an ultrasound transducer at a distal end, an ultrasound observation device that makes the ultrasound transducer transmit and receive an ultrasonic wave and generates an ultrasound image for diagnosis from a reception signal of the ultrasonic wave, an ultrasound image recognition unit that learns at least one of a relationship between an ultrasound image for learning and a name of an organ displayed in the ultrasound image for learning or a relationship between the ultrasound image for learning and a position of a distal end portion of the ultrasound endoscope at the time of imaging of the ultrasound image for learning, on a plurality of the ultrasound images for learning in advance, and recognizes at least one of a name of an organ displayed in the ultrasound image for diagnosis or a position of the distal end portion of the ultrasound endoscope from the ultrasound image for diagnosis based on a learning result, and a display controller that displays at least one of the name of the organ or the position of the distal end portion of the ultrasound endoscope recognized by the ultrasound image recognition unit, on a monitor.
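Although the summary above does not fix any particular learning algorithm, the two recognition tasks of the ultrasound image recognition unit can be illustrated with a short sketch. The following is a minimal sketch in Python with PyTorch, assuming a small convolutional backbone with two output heads; the class lists, layer sizes, and names are hypothetical illustrations, not the claimed implementation.

```python
# Minimal sketch, assuming a small CNN backbone with two output heads:
# one classifying the organ name, one classifying the probe position.
# All class lists, layer sizes, and names are hypothetical.
import torch
import torch.nn as nn

ORGAN_NAMES = ["pancreas", "gallbladder", "bile duct"]      # hypothetical
POSITION_CLASSES = ["esophagus", "stomach", "duodenum"]     # hypothetical

class UltrasoundRecognizer(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.organ_head = nn.Linear(32, len(ORGAN_NAMES))
        self.position_head = nn.Linear(32, len(POSITION_CLASSES))

    def forward(self, x):  # x: (batch, 1, H, W) ultrasound image
        feat = self.backbone(x)
        return self.organ_head(feat), self.position_head(feat)

model = UltrasoundRecognizer()
organ_logits, position_logits = model(torch.randn(1, 1, 256, 256))
print(ORGAN_NAMES[organ_logits.argmax(1).item()],
      POSITION_CLASSES[position_logits.argmax(1).item()])
```

In an actual system, the two heads would be trained in advance on the ultrasound images for learning annotated with organ names and probe positions, as described above.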
Here, it is preferable that the ultrasound endoscope system further comprises an instruction acquisition unit that acquires an instruction input from a user, and in response to an instruction from the user, the display controller superimposedly displays the name of the organ on the ultrasound image for diagnosis and superimposedly displays the position of the distal end portion of the ultrasound endoscope on an anatomical schema diagram.
It is preferable that, in response to an instruction from the user, the display controller displays two or more images including at least one of the ultrasound image for diagnosis with the name of the organ superimposedly displayed or the anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope superimposedly displayed from among the ultrasound image for diagnosis with the name of the organ not displayed, the ultrasound image for diagnosis with the name of the organ superimposedly displayed, the anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope not displayed, and the anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope superimposedly displayed, in parallel within a screen of the monitor.
It is preferable that the ultrasound endoscope further has an illumination part and an imaging part at the distal end, the ultrasound endoscope system further comprises an endoscope processor that makes the imaging part receive reflected light of illumination light emitted from the illumination part and generates an endoscope image for diagnosis from an imaging signal of the reflected light, and an instruction acquisition unit that acquires an instruction input from a user, and the display controller displays the endoscope image for diagnosis within a screen of the monitor in response to an instruction from the user.
It is preferable that the ultrasound endoscope system further comprises an endoscope image recognition unit that recognizes a lesion region displayed in the endoscope image for diagnosis from the endoscope image for diagnosis, and in response to an instruction from the user, the display controller displays the endoscope image for diagnosis with the lesion region superimposedly displayed, on the monitor.
It is preferable that, in response to an instruction from the user, the display controller displays two or more images including at least one of the ultrasound image for diagnosis with the name of the organ superimposedly displayed or an anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope superimposedly displayed from among the endoscope image for diagnosis with the lesion region not displayed, the endoscope image for diagnosis with the lesion region superimposedly displayed, the ultrasound image for diagnosis with the name of the organ not displayed, the ultrasound image for diagnosis with the name of the organ superimposedly displayed, an anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope not displayed, and an anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope superimposedly displayed, in parallel within the screen of the monitor.
It is preferable that one image of the two or more images displayed on the monitor is displayed, as an image of interest, so as to be larger than the other images.
It is preferable that the display controller switches and displays the image of interest from the one image to one of the other images in response to an instruction from the user.
It is preferable that, in response to an instruction from the user, the display controller displays the ultrasound image for diagnosis with the name of the organ not displayed, the ultrasound image for diagnosis with the name of the organ superimposedly displayed, and the anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope superimposedly displayed, in parallel within the screen of the monitor.
It is preferable that the ultrasound image recognition unit operates in a case where the ultrasound image for diagnosis or an anatomical schema diagram is displayed within the screen of the monitor, and the endoscope image recognition unit operates in a case where the endoscope image for diagnosis is displayed within the screen of the monitor.
It is preferable that the ultrasound image recognition unit learns at least one of the relationship between the ultrasound image for learning and the name of the organ displayed in the ultrasound image for learning or a relationship between the ultrasound image for learning and the position and an orientation of the distal end portion of the ultrasound endoscope at the time of imaging of the ultrasound image for learning, on the plurality of ultrasound images for learning in advance, and recognizes at least one of the name of the organ displayed in the ultrasound image for diagnosis or the position and an orientation of the distal end portion of the ultrasound endoscope from the ultrasound image for diagnosis based on a learning result, and the display controller displays at least one of the name of the organ recognized by the ultrasound image recognition unit or the position and the orientation of the distal end portion of the ultrasound endoscope recognized by the ultrasound image recognition unit, on the monitor.
It is preferable that the ultrasound endoscope system further comprises an instruction acquisition unit that acquires an instruction input from a user, and in response to an instruction from the user, the display controller displays an anatomical schema diagram with the position and the orientation of the distal end portion of the ultrasound endoscope superimposedly displayed, on the monitor.
It is preferable that the ultrasound image recognition unit further learns a relationship between the ultrasound image for learning and a range of the organ displayed in the ultrasound image for learning, on the plurality of ultrasound images for learning in advance, and recognizes the range of the organ displayed in the ultrasound image for diagnosis from the ultrasound image for diagnosis based on a learning result, and the display controller further displays the range of the organ recognized by the ultrasound image recognition unit, on the monitor.
It is preferable that the display controller colors an internal region of the range of the organ recognized by the ultrasound image recognition unit and displays the range of the organ with the internal region colored, on the monitor or provides a frame indicating the range of the organ recognized by the ultrasound image recognition unit, colors the frame, and displays the range of the organ with the frame colored, on the monitor.
It is preferable that the display controller colors the internal region or the frame in a different color for each type of organ with the range recognized by the ultrasound image recognition unit.
It is preferable that the ultrasound endoscope system further comprises an instruction acquisition unit that acquires an instruction input from a user, and a color registration unit that registers a relationship between the type of the organ and the color of the internal region or the frame in response to an instruction from the user, and the display controller colors the internal region or the frame in a color designated by the instruction from the user or colors the internal region or the frame in a color of the internal region or the frame corresponding to the type of the organ registered in the color registration unit.
It is preferable that the ultrasound image recognition unit further calculates a confidence factor of the name of the organ recognized by the ultrasound image recognition unit, and the display controller decides at least one of a display method of the name of the organ displayed on the monitor or a coloring method of the internal region or the frame depending on the confidence factor.
It is preferable that, in a case of superimposedly displaying the name of the organ on the ultrasound image for diagnosis, the display controller decides at least one of the color of the name of the organ or the color of the internal region or the frame depending on brightness of the ultrasound image for diagnosis displayed behind a display region of the name of the organ.
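As one way to realize the brightness-dependent color decision described above, the display controller could sample the image behind the planned label region and choose a contrasting color. A minimal sketch follows; the threshold value and the color names are assumptions for illustration.

```python
# Minimal sketch: measure the mean brightness of the ultrasound image
# behind the planned label region and pick a contrasting text color.
# The threshold (128) and the color names are illustrative assumptions.
import numpy as np

def label_color(ultrasound_image: np.ndarray, region) -> str:
    """region: (row, col, height, width) of the organ-name display area."""
    r, c, h, w = region
    mean_brightness = ultrasound_image[r:r + h, c:c + w].mean()
    return "black" if mean_brightness > 128 else "white"

image = (np.random.rand(480, 640) * 255).astype(np.uint8)  # dummy B mode image
print(label_color(image, (10, 10, 24, 120)))
```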
It is preferable that the ultrasound endoscope system further comprises an instruction acquisition unit that acquires an instruction input from a user, and the display controller switches whether to display only one, both, or none of the name of the organ recognized by the ultrasound image recognition unit and the range of the organ with the internal region or the frame colored, in response to an instruction from the user.
It is preferable that the display controller decides a position where the name of the organ recognized by the ultrasound image recognition unit is displayed on the monitor, depending on the range of the organ recognized by the ultrasound image recognition unit.
It is preferable that the display controller decides whether or not to display the name of the organ recognized by the ultrasound image recognition unit on the monitor, depending on the range of the organ recognized by the ultrasound image recognition unit.
It is preferable that the ultrasound endoscope system further comprises an instruction acquisition unit that acquires an instruction input from a user, and an organ registration unit that registers a type of an organ for displaying a range in response to an instruction from the user, and in a case where an organ with a range recognized by the ultrasound image recognition unit is an organ registered in the organ registration unit, the display controller displays the range of the organ recognized by the ultrasound image recognition unit, on the monitor.
It is preferable that the ultrasound endoscope system further comprises an instruction acquisition unit that acquires an instruction input from a user, and the display controller sequentially switches a type of an organ for displaying a range in response to an instruction from the user.
It is preferable that the ultrasound image recognition unit is incorporated in the ultrasound observation device.
It is preferable that the ultrasound endoscope further has an illumination part and an imaging part at the distal end, the ultrasound endoscope system further comprises an endoscope processor that makes the imaging part receive reflected light of illumination light emitted from the illumination part and generates an endoscope image for diagnosis from an imaging signal of the reflected light, and the ultrasound image recognition unit is incorporated in the endoscope processor.
It is preferable that the ultrasound endoscope further has an illumination part and an imaging part at the distal end, the ultrasound endoscope system further comprises an endoscope processor that makes the imaging part receive reflected light of illumination light emitted from the illumination part and generates an endoscope image for diagnosis from an imaging signal of the reflected light, and the ultrasound image recognition unit is provided outside the ultrasound observation device and the endoscope processor.
The invention provides a method of operating an ultrasound endoscope system, the method comprising a step of, with an ultrasound image recognition unit, learning at least one of a relationship between an ultrasound image for learning and a name of an organ displayed in the ultrasound image for learning or a relationship between the ultrasound image for learning and a position of a distal end portion of an ultrasound endoscope at the time of imaging of the ultrasound image for learning, on a plurality of the ultrasound images for learning in advance, a step of, with an ultrasound observation device, making an ultrasound transducer provided at a distal end of the ultrasound endoscope transmit and receive an ultrasonic wave and generating an ultrasound image for diagnosis from a reception signal of the ultrasonic wave, a step of, with the ultrasound image recognition unit, recognizing at least one of a name of an organ displayed in the ultrasound image for diagnosis or a position of the distal end portion of the ultrasound endoscope from the ultrasound image for diagnosis based on a learning result, and a step of, with a display controller, displaying at least one of the name of the organ or the position of the distal end portion of the ultrasound endoscope recognized by the ultrasound image recognition unit, on a monitor.
Here, it is preferable that, in response to an instruction from a user, the name of the organ is superimposedly displayed on the ultrasound image for diagnosis, and the position of the distal end portion of the ultrasound endoscope is superimposedly displayed on an anatomical schema diagram.
It is preferable that, in response to an instruction from the user, two or more images including at least one of the ultrasound image for diagnosis with the name of the organ superimposedly displayed or the anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope superimposedly displayed from among the ultrasound image for diagnosis with the name of the organ not displayed, the ultrasound image for diagnosis with the name of the organ superimposedly displayed, the anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope not displayed, and the anatomical schema diagram with the position of the distal end portion of the ultrasound endoscope superimposedly displayed are displayed in parallel within a screen of the monitor.
It is preferable that one image of the two or more images displayed on the monitor is displayed, as an image of interest, so as to be larger than the other images.
It is preferable that the image of interest is switched from the one image to one of the other images and displayed in response to an instruction from the user.
It is preferable that at least one of the relationship between the ultrasound image for learning and the name of the organ displayed in the ultrasound image for learning or a relationship between the ultrasound image for learning and the position and orientation of the distal end portion of the ultrasound endoscope at the time of imaging of the ultrasound image for learning is learned on the plurality of ultrasound images for learning in advance,
at least one of the name of the organ displayed in the ultrasound image for diagnosis or the position and the orientation of the distal end portion of the ultrasound endoscope is recognized from the ultrasound image for diagnosis based on a learning result, and at least one of the name of the organ recognized by the ultrasound image recognition unit or the position and the orientation of the distal end portion of the ultrasound endoscope recognized by the ultrasound image recognition unit is displayed on the monitor.
It is preferable that the method further comprises a step of, with the ultrasound image recognition unit, learning a relationship between the ultrasound image for learning and a range of the organ displayed in the ultrasound image for learning, on the plurality of ultrasound images for learning in advance, a step of, with the ultrasound image recognition unit, recognizing the range of the organ displayed in the ultrasound image for diagnosis from the ultrasound image for diagnosis based on a learning result, and a step of, with the display controller, further displaying the range of the organ recognized by the ultrasound image recognition unit, on the monitor.
It is preferable that the ultrasound image recognition unit, the display controller, the instruction acquisition unit, and the endoscope image recognition unit are hardware or a processor that executes a program, and it is preferable that the color registration unit and the organ registration unit are hardware or a memory.
According to the invention, since the name of the organ displayed in the ultrasound image for diagnosis and the range of the organ are displayed on the monitor, for example, even a user who is unaccustomed to an ultrasound image can correctly recognize what is displayed in the ultrasound image and the range of the organ displayed in the ultrasound image. Furthermore, since the position and the orientation of the distal end portion of the ultrasound endoscope are displayed on the monitor, for example, even a user who is unaccustomed to an ultrasound image can correctly recognize the position of the distal end portion of the ultrasound endoscope, the direction of the distal end portion, and the part being observed at the moment, and does not get lost in a body of a subject.
According to the invention, it is possible to switch and display an endoscope image, an ultrasound image, and an anatomical schema diagram in an easily viewable manner. While the image in which the user is interested changes from moment to moment, the user can switch the image of interest at any timing; thus, the user can display and view the image of current interest, as an image of interest, larger than the other images.
An ultrasound endoscope system according to an embodiment (the embodiment) of the invention will be described below in detail referring to preferred embodiments shown in the accompanying drawings.
The embodiment is a representative embodiment of the invention, but is merely an example and does not limit the invention.
The outline of an ultrasound endoscope system 10 according to the embodiment will be described referring to the accompanying drawings.
The ultrasound endoscope system 10 is used to observe (hereinafter, such observation is referred to as ultrasonography) a state of an observation target part in a body of a patient as a subject using ultrasonic waves. Here, the observation target part is a part that is hard to examine from a body surface side of the patient, and is, for example, the pancreas or a gallbladder. With the use of the ultrasound endoscope system 10, it is possible to perform ultrasonography of a state of the observation target part and the presence or absence of an abnormality by way of digestive tracts, such as the esophagus, stomach, duodenum, small intestine, and large intestine, which are body cavities of the patient.
The ultrasound endoscope system 10 acquires an ultrasound image and an endoscope image, and as shown in the accompanying drawings, comprises an ultrasound endoscope 12, an ultrasound observation device 14, an endoscope processor 16, a light source device 18, a monitor 20, a water supply tank 21a, a suction pump 21b, and a console 100.
The ultrasound endoscope 12 comprises an insertion part 22 that is inserted into the body cavity of the patient, an operating part 24 that is operated by an operator (user), such as a physician or a technician, and an ultrasound transducer unit 46 (see the accompanying drawings).
Here, the “endoscope image” is an image that is obtained by imaging a body cavity inner wall of the patient using an optical method. Furthermore, the “ultrasound image” is an image that is obtained by receiving reflected waves (echoes) of ultrasonic waves transmitted from the inside of the body cavity of the patient toward the observation target part and imaging reception signals.
The ultrasound endoscope 12 will be described below in detail.
The ultrasound observation device 14 is connected to the ultrasound endoscope 12 through a universal cord 26 and an ultrasound connector 32a provided in an end portion of the universal cord 26. The ultrasound observation device 14 performs control such that the ultrasound transducer unit 46 of the ultrasound endoscope 12 transmits the ultrasonic waves. The ultrasound observation device 14 generates the ultrasound image by imaging the reception signals when the ultrasound transducer unit 46 receives the reflected waves (echoes) of the transmitted ultrasonic waves. In other words, the ultrasound observation device 14 makes a plurality of ultrasound transducers 48 of the ultrasound transducer unit 46 transmit and receive ultrasonic waves and generates an ultrasound image for diagnosis (hereinafter, simply referred to as an ultrasound image) from reception signals of the ultrasonic waves.
The ultrasound observation device 14 will be described below in detail.
The endoscope processor 16 is connected to the ultrasound endoscope 12 through the universal cord 26 and an endoscope connector 32b provided in the end portion of the universal cord 26. The endoscope processor 16 acquires image data of an observation target adjacent part imaged by the ultrasound endoscope 12 (in detail, by the solid-state imaging element 86 described below) and executes predetermined image processing on the acquired image data to generate an endoscope image. In other words, the endoscope processor 16 makes the imaging part provided at the distal end of the ultrasound endoscope 12 receive reflected light of illumination light emitted from the illumination part, which is likewise provided at the distal end of the ultrasound endoscope 12, and generates an endoscope image for diagnosis (hereinafter, simply referred to as an endoscope image) from an imaging signal of the reflected light.
Here, the “observation target adjacent part” is a portion that is at a position adjacent to the observation target part in the inner wall of the body cavity of the patient.
In the embodiment, the ultrasound observation device 14 and the endoscope processor 16 are configured with two devices (computers) provided separately. However, the invention is not limited thereto, and both of the ultrasound observation device 14 and the endoscope processor 16 may be configured with one device.
The light source device 18 is connected to the ultrasound endoscope 12 through the universal cord 26 and a light source connector 32c provided in the end portion of the universal cord 26. The light source device 18 irradiates the observation target adjacent part with white light composed of the three primary colors of red light, green light, and blue light, or with light having a specific wavelength, in imaging the observation target adjacent part using the ultrasound endoscope 12. Light emitted from the light source device 18 propagates through the ultrasound endoscope 12 by way of a light guide (not shown) included in the universal cord 26 and is emitted from the ultrasound endoscope 12 (in detail, from the illumination windows 88 described below). With this, the observation target adjacent part is illuminated with light from the light source device 18.
The monitor 20 is connected to the ultrasound observation device 14 and the endoscope processor 16, and displays an anatomical schema diagram and the like in addition to an ultrasound image generated by the ultrasound observation device 14 and an endoscope image generated by the endoscope processor 16.
As a display method of the ultrasound image and the endoscope image, a method in which one image is switched to one of other images and displayed on the monitor 20 or a method in which two or more images are simultaneously arranged and displayed may be applied.
In the embodiment, although the ultrasound image and the endoscope image are displayed on one monitor 20, a monitor for ultrasound image display, a monitor for endoscope image display, and a monitor for an anatomical schema diagram may be provided separately. Alternatively, the ultrasound image and the endoscope image may be displayed in a display form other than the monitor 20, for example, in a form of being displayed on a display of a terminal carried with the operator.
The console 100 is an example of an instruction acquisition unit that acquires an instruction input from the operator (user), and is a device that is provided to allow the operator to input necessary information in a case of ultrasonography, to issue an instruction to start ultrasonography to the ultrasound observation device 14, and the like. The console 100 is configured with, for example, a keyboard, a mouse, a trackball, a touch pad, a touch panel, and the like. In a case where the console 100 is operated, a CPU (control circuit) 152 (see the accompanying drawings) of the ultrasound observation device 14 controls the respective units of the ultrasound observation device 14 according to the operation.
Specifically, the operator inputs examination information (for example, examination order information including date, an order number, and the like and patient information including a patient ID, a patient name, and the like) through the console 100 in a state before starting ultrasonography. After the input of the examination information is completed, in a case where the operator issues an instruction to start ultrasonography through the console 100, the CPU 152 of the ultrasound observation device 14 controls the respective units of the ultrasound observation device 14 such that ultrasonography is executed based on the input examination information.
Furthermore, the operator can set various control parameters through the console 100 in executing ultrasonography. As the control parameters, for example, a selection result of a live mode and a freeze mode, a set value of a display depth (depth), a selection result of an ultrasound image generation mode, and the like are exemplified.
Here, the “live mode” is a mode where ultrasound images (video) obtained at a predetermined frame rate are displayed successively (displayed in real time). The “freeze mode” is a mode where an image (static image) of one frame of ultrasound images (video) generated in the past is read from a cine memory 150 described below and displayed.
In the embodiment, a plurality of ultrasound image generation modes are selectable, and specifically include a brightness (B) mode, a color flow (CF) mode, and a pulse wave (PW) mode. The B mode is a mode where the amplitude of an ultrasound echo is converted into brightness and a tomographic image is displayed. The CF mode is a mode where an average blood flow speed, flow fluctuation, the intensity of a flow signal, flow power, or the like is mapped to various colors and superimposedly displayed on a B mode image. The PW mode is a mode where a speed (for example, a speed of a blood flow) of an ultrasound echo source detected based on transmission and reception of a pulse wave is displayed.
The above-described ultrasound image generation modes are merely examples, and modes other than the above-described three kinds of modes, for example, an amplitude (A) mode, a motion (M) mode, a contrast radiography mode, and the like may be further included.
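For reference, the B mode conversion described above (echo amplitude to brightness) can be illustrated with a short sketch; the Hilbert-transform envelope detection and the 60 dB dynamic range are common signal-processing choices assumed here for illustration, not values specified by the invention.

```python
# Minimal sketch of the B mode conversion: an RF echo line is
# envelope-detected and log-compressed into 8-bit brightness.
import numpy as np
from scipy.signal import hilbert

def b_mode_brightness(rf_line, dynamic_range_db=60.0):
    envelope = np.abs(hilbert(rf_line))                  # echo amplitude
    env_db = 20 * np.log10(envelope / envelope.max() + 1e-12)
    env_db = np.clip(env_db, -dynamic_range_db, 0.0)     # limit dynamic range
    return ((env_db + dynamic_range_db) / dynamic_range_db * 255).astype(np.uint8)

t = np.linspace(0, 1e-4, 2000)                           # one synthetic RF line
rf = np.sin(2 * np.pi * 5e6 * t) * np.exp(-t / 2e-5)     # decaying reflection
print(b_mode_brightness(rf)[:5])
```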
Next, the configuration of the ultrasound endoscope 12 will be described referring to the accompanying drawings.
As described above, the ultrasound endoscope 12 has the insertion part 22 and the operating part 24. As shown in the accompanying drawings, the insertion part 22 has, in order from the distal end side, a distal end portion 40, a bending portion 42, and a flexible portion 43.
Furthermore, as shown in the accompanying drawings, an ultrasound observation part 36 and an endoscope observation part 38 are provided in the distal end portion 40, and a treatment tool lead-out port 44 is formed in the distal end portion 40.
The bending portion 42 is a portion consecutively provided on a proximal end side (a side opposite to the side on which the ultrasound transducer unit 46 is provided) relative to the distal end portion 40, and is freely bent. The flexible portion 43 is a portion that connects the bending portion 42 and the operating part 24, has flexibility, and is provided in an elongated and extended state.
A plurality of pipe lines for air and water supply and a plurality of pipe lines for suction are formed inside each of the insertion part 22 and the operating part 24. In addition, a treatment tool channel 45 of which one end communicates with the treatment tool lead-out port 44 is formed inside each of the insertion part 22 and the operating part 24.
Next, the ultrasound observation part 36, the endoscope observation part 38, the water supply tank 21a, the suction pump 21b, and the operating part 24 among the components of the ultrasound endoscope 12 will be described in detail.
The ultrasound observation part 36 is a portion that is provided to acquire an ultrasound image, and is disposed on the distal end side in the distal end portion 40 of the insertion part 22. As shown in the accompanying drawings, the ultrasound observation part 36 has the ultrasound transducer unit 46, a plurality of coaxial cables 56, and a flexible printed circuit (FPC) 60.
The ultrasound transducer unit 46 corresponds to an ultrasound probe (probe), transmits ultrasonic waves using an ultrasound transducer array 50, in which a plurality of ultrasound transducers 48 described below are arranged, inside a body cavity of a patient, receives reflected waves (echoes) of the ultrasonic waves reflected by the observation target part, and outputs reception signals. The ultrasound transducer unit 46 according to the embodiment is a convex type, and transmits ultrasonic waves radially (in an arc shape). Note that the type (model) of the ultrasound transducer unit 46 is not particularly limited to the convex type, and other types may be used as long as ultrasonic waves can be transmitted and received. For example, a radial type, a linear type, or the like may be used.
As shown in the accompanying drawings, the ultrasound transducer unit 46 has a backing material layer 54, an ultrasound transducer array 50, an acoustic matching layer 74, and an acoustic lens 76.
The ultrasound transducer array 50 has a plurality of ultrasound transducers 48 arranged in a one-dimensional array. In more detail, the ultrasound transducer array 50 is configured by arranging N (for example, N=128) ultrasound transducers 48 at regular intervals in a convex bent shape along an axial direction of the distal end portion 40 (a longitudinal axis direction of the insertion part 22). The ultrasound transducer array 50 may be configured by arranging a plurality of ultrasound transducers 48 in a two-dimensional array.
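As an aside on geometry, the convex arrangement described above (N transducers at regular intervals along an arc) can be expressed compactly; the radius and the angle spanned by the array in the sketch below are assumed values for illustration and are not specified by the invention.

```python
# Minimal sketch of the convex layout: N transducers at regular angular
# intervals along an arc. Radius and span are illustrative assumptions.
import numpy as np

N = 128                      # number of ultrasound transducers
radius_mm = 6.0              # arc radius (assumed)
span_rad = np.deg2rad(150)   # angle spanned by the array (assumed)

angles = np.linspace(-span_rad / 2, span_rad / 2, N)
x = radius_mm * np.sin(angles)   # element positions along the convex arc
z = radius_mm * np.cos(angles)
print(x[:3], z[:3])
```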
Each of the N ultrasound transducers 48 is configured by disposing electrodes on both surfaces of a piezoelectric element (piezoelectric body). As the piezoelectric element, barium titanate (BaTiO3), lead zirconate titanate (PZT), potassium niobate (KNbO3), or the like is used.
The electrodes have an individual electrode (not shown) individually provided for each of a plurality of ultrasound transducers 48 and a transducer ground (not shown) common to a plurality of ultrasound transducers 48. The electrodes are electrically connected to the ultrasound observation device 14 through the coaxial cables 56 and the FPC 60.
A pulsed drive voltage is supplied as an input signal (transmission signal) from the ultrasound observation device 14 to each ultrasound transducer 48 through the coaxial cables 56. In a case where the drive voltage is applied to the electrodes of the ultrasound transducer 48, the piezoelectric element expands and contracts to drive (vibrate) the ultrasound transducer 48. As a result, a pulsed ultrasonic wave is output from the ultrasound transducer 48. In this case, the amplitude of the ultrasonic wave output from the ultrasound transducer 48 has magnitude according to intensity (output intensity) when the ultrasound transducer 48 outputs the ultrasonic wave. Here, the output intensity is defined as the magnitude of sound pressure of the ultrasonic wave output from the ultrasound transducer 48.
Each ultrasound transducer 48 vibrates (is driven) with reception of a reflected wave (echo) of the ultrasonic wave, and the piezoelectric element of each ultrasound transducer 48 generates an electric signal. The electric signal is output as a reception signal of the ultrasonic wave from the ultrasound transducer 48 toward the ultrasound observation device 14. In this case, the magnitude (voltage value) of the electric signal output from the ultrasound transducer 48 is magnitude according to reception sensitivity when the ultrasound transducer 48 receives the ultrasonic wave. Here, the reception sensitivity is defined as a ratio of the amplitude of the electric signal output from the ultrasound transducer 48 with reception of the ultrasonic wave to the amplitude of the ultrasonic wave transmitted from the ultrasound transducer 48.
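The two quantities just defined can be restated compactly as code; the following sketch simply encodes the verbal definitions above, and the numbers are illustrative.

```python
# The definitions above, restated as code; the numbers are illustrative.
def output_intensity(sound_pressure: float) -> float:
    # output intensity = magnitude of the sound pressure of the
    # ultrasonic wave output from the ultrasound transducer 48
    return abs(sound_pressure)

def reception_sensitivity(signal_amplitude: float,
                          transmitted_amplitude: float) -> float:
    # ratio of the amplitude of the electric signal output on reception
    # to the amplitude of the transmitted ultrasonic wave
    return signal_amplitude / transmitted_amplitude

print(output_intensity(-50.0), reception_sensitivity(0.02, 50.0))
```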
In the embodiment, the N ultrasound transducers 48 are driven sequentially by an electronic switch, such as a multiplexer 140 (see the accompanying drawings). In this case, a maximum of m (m < N) ultrasound transducers 48 whose channels are opened by the multiplexer 140, that is, the drive target transducers, are driven to transmit and receive ultrasonic waves.
Then, the above-described series of steps (that is, the supply of the drive voltage, the transmission and reception of the ultrasonic waves, and the output of the electric signal) are repeatedly performed while shifting the positions of the drive target transducers among the N ultrasound transducers 48 one by one (one ultrasound transducer 48 at a time). Specifically, the above-described series of steps are started from m drive target transducers on both sides of the ultrasound transducer 48 positioned at one end among the N ultrasound transducers 48. Then, the above-described series of steps are repeated each time the positions of the drive target transducers are shifted due to switching of the opening channel by the multiplexer 140. Finally, the above-described series of steps are repeatedly performed N times in total up to m drive target transducers on both sides of the ultrasound transducer 48 positioned at the other end among the N ultrasound transducers 48.
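The sliding selection of drive target transducers described in the preceding two paragraphs can be sketched as follows; the values of N and m and the clipping behavior at the array ends are illustrative assumptions.

```python
# Minimal sketch of the scan: for each of the N steps, a window of up
# to m drive target transducers is opened (the multiplexer role) and
# shifted by one transducer at a time, clipped at the array ends.
N, m = 128, 16

def open_channels(center: int) -> range:
    half = m // 2
    return range(max(0, center - half), min(N, center + half))

for center in range(N):                  # shift drive targets one by one
    drive_targets = list(open_channels(center))
    # 1) supply the drive voltage to drive_targets
    # 2) transmit and receive ultrasonic waves
    # 3) output the electric signals as reception signals

print(list(open_channels(0)), list(open_channels(64)))
```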
The backing material layer 54 supports each ultrasound transducer 48 of the ultrasound transducer array 50 from a rear surface side. Furthermore, the backing material layer 54 has a function of attenuating ultrasonic waves propagating to the backing material layer 54 side among ultrasonic waves emitted from the ultrasound transducers 48 or ultrasonic waves (echoes) reflected by the observation target part. A backing material is a material having rigidity, such as hard rubber, and an ultrasonic wave attenuation material (ferrite, ceramics, or the like) is added as necessary.
The acoustic matching layer 74 is superimposed on the ultrasound transducer array 50, and is provided for acoustic impedance matching between the body of the patient and the ultrasound transducer 48. The acoustic matching layer 74 is provided, whereby it is possible to increase the transmittance of the ultrasonic wave. As a material of the acoustic matching layer 74, various organic materials of which a value of acoustic impedance is closer to that of the body of the patient than the piezoelectric element of the ultrasound transducer 48 can be used. As the material of the acoustic matching layer 74, specifically, epoxy-based resin, silicone rubber, polyimide, polyethylene, and the like are exemplified.
The acoustic lens 76 superimposed on the acoustic matching layer 74 converges ultrasonic waves emitted from the ultrasound transducer array 50 toward the observation target part. The acoustic lens 76 is made of, for example, silicone-based resin (millable silicone rubber (HTV rubber), liquid silicone rubber (RTV rubber), or the like), butadiene-based resin, polyurethane-based resin, or the like, and powder of titanium oxide, alumina, silica, or the like is mixed as necessary.
The FPC 60 is electrically connected to the electrodes of each ultrasound transducer 48. Each of a plurality of coaxial cables 56 is wired to the FPC 60 at one end. Then, in a case where the ultrasound endoscope 12 is connected to the ultrasound observation device 14 through the ultrasound connector 32a, each of a plurality of coaxial cables 56 is electrically connected to the ultrasound observation device 14 at the other end (a side opposite to the FPC 60 side).
The endoscope observation part 38 is a portion that is provided to acquire an endoscope image, and is disposed on a proximal end side relative to the ultrasound observation part 36 in the distal end portion 40 of the insertion part 22. As shown in the accompanying drawings, the endoscope observation part 38 has an observation window 82, an objective lens 84, a solid-state imaging element 86, illumination windows 88, a cleaning nozzle 90, and a wiring cable 92.
The observation window 82 is attached in a state inclined with respect to the axial direction (the longitudinal axis direction of the insertion part 22) in the distal end portion 40 of the insertion part 22. Light reflected by the observation target adjacent part enters the observation window 82 and is formed into an image on an imaging surface of the solid-state imaging element 86 by the objective lens 84.
The solid-state imaging element 86 photoelectrically converts reflected light of the observation target adjacent part transmitted through the observation window 82 and the objective lens 84 and formed on the imaging surface, and outputs an imaging signal. As the solid-state imaging element 86, a charge coupled device (CCD), a complementary metal oxide semiconductor (CMOS), or the like can be used. A captured image signal output from the solid-state imaging element 86 is transmitted to the endoscope processor 16 through the wiring cable 92, which extends from the insertion part 22 to the operating part 24, and the universal cord 26.
The illumination windows 88 are provided at both side positions of the observation window 82. An exit end of the light guide (not shown) is connected to the illumination windows 88. The light guide extends from the insertion part 22 to the operating part 24, and an incidence end of the light guide is connected to the light source device 18 connected through the universal cord 26. Illumination light emitted from the light source device 18 is transmitted through the light guide, and the observation target adjacent part is irradiated with illumination light from the illumination windows 88.
The cleaning nozzle 90 is an ejection hole formed in the distal end portion 40 of the insertion part 22 to clean the surfaces of the observation window 82 and the illumination windows 88, and air or a cleaning liquid is ejected from the cleaning nozzle 90 toward the observation window 82 and the illumination windows 88. In the embodiment, the cleaning liquid ejected from the cleaning nozzle 90 is water, in particular, degassed water. Note that the cleaning liquid is not particularly limited, and other liquids, for example, normal water (water that is not degassed) may be used.
The water supply tank 21a is a tank that stores degassed water, and is connected to the light source connector 32c by an air and water supply tube 34a. Degassed water is used as the cleaning liquid that is ejected from the cleaning nozzle 90.
The suction pump 21b sucks aspirates (including degassed water supplied for cleaning) in the body cavity through the treatment tool lead-out port 44. The suction pump 21b is connected to the light source connector 32c by a suction tube 34b. The ultrasound endoscope system 10 may comprise an air supply pump that supplies air to a predetermined air supply destination, or the like.
Inside the insertion part 22 and the operating part 24, the treatment tool channel 45 and an air and water supply pipe line (not shown) are provided.
The treatment tool channel 45 connects the treatment tool insertion port 30 provided in the operating part 24 with the treatment tool lead-out port 44. Furthermore, the treatment tool channel 45 is connected to a suction button 28b provided in the operating part 24. The suction button 28b is connected to the suction pump 21b in addition to the treatment tool channel 45.
The air and water supply pipe line communicates with the cleaning nozzle 90 on one end side, and is connected to an air and water supply button 28a provided in the operating part 24 on the other end side. The air and water supply button 28a is connected to the water supply tank 21a in addition to the air and water supply pipe line.
The operating part 24 is a portion that is operated by the operator at the time of a start of ultrasonography, during diagnosis, at the time of an end of diagnosis, and the like, and has one end to which one end of the universal cord 26 is connected. Furthermore, as shown in the accompanying drawings, the operating part 24 has the air and water supply button 28a, the suction button 28b, a pair of angle knobs 29, and the treatment tool insertion port 30.
In a case where each of a pair of angle knobs 29 is moved rotationally, the bending portion 42 is remotely operated to be bent and deformed. With the deformation operation, it is possible to direct the distal end portion 40 of the insertion part 22, in which the ultrasound observation part 36 and the endoscope observation part 38 are provided, to a desired direction.
The treatment tool insertion port 30 is a hole formed such that a treatment tool (not shown), such as forceps, is inserted thereinto, and communicates with the treatment tool lead-out port 44 through the treatment tool channel 45. The treatment tool inserted into the treatment tool insertion port 30 is introduced from the treatment tool lead-out port 44 into the body cavity after passing through the treatment tool channel 45.
The air and water supply button 28a and the suction button 28b are two-stage switching type push buttons, and are operated to switch opening and closing of the pipe line provided inside each of the insertion part 22 and the operating part 24.
Here, although a detailed description of the configuration of the endoscope processor 16 is omitted, the endoscope processor 16 comprises an endoscope image recognition unit 170 in addition to general components known in the related art for capturing an endoscope image.
The endoscope image recognition unit 170 learns a relationship between an endoscope image for learning and a lesion region displayed in the endoscope image for learning, on a plurality of endoscope images for learning in advance and recognizes the lesion region displayed in the endoscope image for diagnosis from the endoscope image for diagnosis generated by the endoscope processor 16 based on a learning result.
The endoscope image for learning is an existing endoscope image that is used for the endoscope image recognition unit 170 to learn a relationship between an endoscope image and a lesion region displayed in the endoscope image, and for example, various endoscope images captured in the past can be used.
As shown in the accompanying drawings, the endoscope image recognition unit 170 comprises a lesion region detection unit 102, a positional information acquisition unit 104, a selection unit 106, and a lesion region detection controller 108.
The lesion region detection unit 102 detects the lesion region from the endoscope image for diagnosis based on the learning result. The lesion region detection unit 102 comprises a plurality of detection units corresponding to a plurality of positions in the body cavity. Here, as an example, as shown in the accompanying drawings, the lesion region detection unit 102 comprises first to eleventh detection units 102A to 102K.
The first to eleventh detection units 102A to 102K are learned models. The learned models are models trained using respective data sets of different endoscope images for learning. In detail, the learned models are models that have learned, in advance, a relationship between an endoscope image for learning and a lesion region displayed in the endoscope image for learning, using respective data sets of endoscope images for learning obtained by imaging different positions in the body cavity.
That is, the first detection unit 102A is a model trained using a data set of endoscope images for learning of the rectum, the second detection unit 102B is a model trained using a data set of endoscope images for learning of the S-shaped colon, the third detection unit 102C is a model trained using a data set of endoscope images for learning of the descending colon, the fourth detection unit 102D is a model trained using a data set of endoscope images for learning of the transverse colon, the fifth detection unit 102E is a model trained using a data set of endoscope images for learning of the ascending colon, the sixth detection unit 102F is a model trained using a data set of endoscope images for learning of the cecum, the seventh detection unit 102G is a model trained using a data set of endoscope images for learning of the ileum, the eighth detection unit 102H is a model trained using a data set of endoscope images for learning of the jejunum, the ninth detection unit 102I is a model trained using a data set of endoscope images for learning of the duodenum, the tenth detection unit 102J is a model trained using a data set of endoscope images for learning of the stomach, and the eleventh detection unit 102K is a model trained using a data set of endoscope images for learning of the esophagus.
A learning method is not particularly limited as long as it is possible to learn the relationship between the endoscope image and the lesion region from a plurality of endoscope images for learning, and to generate a learned model.
As the learning method, for example, deep learning that uses a hierarchical structure type neural network as an example of machine learning, which is one of artificial intelligence (AI) techniques, can be used.
Machine learning other than deep learning may be used, an artificial intelligence technique other than machine learning may be used, or a learning method other than an artificial intelligence technique may be used.
A learned model may be generated using only the endoscope images for learning. In this case, the learned model is not updated, and the same learned model can be used constantly.
Alternatively, a configuration may be made in which a learned model is generated using endoscope images for diagnosis in addition to the endoscope images for learning. In this case, a learned model is updated at any time by learning a relationship between an endoscope image for diagnosis and a lesion region displayed in the endoscope image for diagnosis.
Subsequently, the positional information acquisition unit 104 acquires information regarding the position in the body cavity of the endoscope image. Here, the operator, such as a physician, inputs information regarding the position using the console 100. The positional information acquisition unit 104 acquires information regarding the position input from the console 100.
As information regarding a position in a body cavity of an image, information, such as rectum, an S-shaped colon, a descending colon, a transverse colon, an ascending colon, a cecum, ileum, a jejunum, a duodenum, a stomach, and an esophagus, is input. A configuration may be made in which such position candidates are selectably displayed on the monitor 20, and the operator, such as a physician, selects the position using the console 100.
Subsequently, the selection unit 106 selects a detection unit corresponding to information regarding the position acquired by the positional information acquisition unit 104, from the lesion region detection unit 102. That is, the selection unit 106 selects the first detection unit 102A in a case where information regarding the position is rectum, selects the second detection unit 102B in a case where information regarding the position is an S-shaped colon, selects the third detection unit 102C in a case where information regarding the position is a descending colon, selects the fourth detection unit 102D in a case where information regarding the position is a transverse colon, selects the fifth detection unit 102E in a case where information regarding the position is an ascending colon, selects the sixth detection unit 102F in a case where information regarding the position is a cecum, selects the seventh detection unit 102G in a case where information regarding the position is ileum, selects the eighth detection unit 102H in a case where information regarding the position is a jejunum, selects the ninth detection unit 102I in a case where information regarding the position is a duodenum, selects the tenth detection unit 102J in a case where information regarding the position is a stomach, and selects the eleventh detection unit 102K in a case where information regarding the position is an esophagus.
Subsequently, the lesion region detection controller 108 makes the detection unit selected by the selection unit 106 detect a lesion region from the endoscope image. The lesion region herein is not limited to a region caused by illness, and includes a region in a state different from a normal state in appearance. Examples of the lesion region include a polyp, cancer, a colon diverticulum, inflammation, a treatment scar, such as an endoscopic mucosal resection (EMR) scar or an endoscopic submucosal dissection (ESD) scar, a clipped spot, a bleeding point, perforation, and angiodysplasia.
In the endoscope image recognition unit 170, the positional information acquisition unit 104 acquires information regarding the position in the body cavity of the endoscope image.
Subsequently, the selection unit 106 selects a detection unit corresponding to information regarding the position acquired by the positional information acquisition unit 104, from the lesion region detection unit 102.
Subsequently, the lesion region detection controller 108 performs control such that the detection unit selected by the selection unit 106 detects the lesion region from the endoscope image for diagnosis based on a learning result.
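Putting the three steps above together, the position-keyed selection of a detection unit can be sketched as follows; `DetectionUnit` and its return value are hypothetical stand-ins for the learned models 102A to 102K.

```python
# Minimal sketch of the flow above: acquire the position, select the
# matching detection unit, and run detection.
POSITIONS = ["rectum", "S-shaped colon", "descending colon",
             "transverse colon", "ascending colon", "cecum", "ileum",
             "jejunum", "duodenum", "stomach", "esophagus"]

class DetectionUnit:
    def __init__(self, site: str):
        self.site = site

    def detect(self, endoscope_image):
        # placeholder for a learned model trained on images of `site`
        return f"lesion regions detected in {self.site} image"

lesion_region_detectors = {site: DetectionUnit(site) for site in POSITIONS}

def detect_lesions(endoscope_image, position: str):
    detector = lesion_region_detectors[position]   # selection unit 106
    return detector.detect(endoscope_image)        # controller 108

print(detect_lesions(object(), "duodenum"))
```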
The ultrasound observation device 14 makes the ultrasound transducer unit 46 transmit and receive ultrasonic waves and generates an ultrasound image by imaging reception signals output from the ultrasound transducers 48 (in detail, the drive target transducers) at the time of reception of the ultrasonic waves. The ultrasound observation device 14 displays the endoscope image transferred from the endoscope processor 16, the anatomical schema diagram, and the like on the monitor 20, in addition to the generated ultrasound image.
As shown in
The reception circuit 142 and the transmission circuit 144 are electrically connected to the ultrasound transducer array 50 of the ultrasound endoscope 12. The multiplexer 140 selects a maximum of m drive target transducers from among the N ultrasound transducers 48 and opens the channels.
The transmission circuit 144 has a field programmable gate array (FPGA), a pulser (pulse generation circuit 158), a switch (SW), and the like, and is connected to the multiplexer 140 (MUX). An application-specific integrated circuit (ASIC) may be used instead of the FPGA.
The transmission circuit 144 is a circuit that supplies a drive voltage for ultrasonic wave transmission to the drive target transducers selected by the multiplexer 140 in response to a control signal sent from the CPU 152 for transmission of ultrasonic waves from the ultrasound transducer unit 46. The drive voltage is a pulsed voltage signal (transmission signal), and is applied to the electrodes of the drive target transducers through the universal cord 26 and the coaxial cables 56.
The transmission circuit 144 has a pulse generation circuit 158 that generates a transmission signal based on a control signal. Under the control of the CPU 152, the transmission circuit 144 generates a transmission signal for driving a plurality of ultrasound transducers 48 to generate ultrasonic waves using the pulse generation circuit 158 and supplies the transmission signal to a plurality of ultrasound transducers 48. In more detail, under the control of the CPU 152, in a case of performing ultrasonography, the transmission circuit 144 generates a transmission signal having a drive voltage for performing ultrasonography using the pulse generation circuit 158.
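As an illustration of the pulsed voltage signal described above, the following sketch generates a Gaussian-modulated sinusoid of the kind a pulse generation circuit might synthesize. The 7.5 MHz center frequency, 50 MHz sampling rate, and 60 V amplitude are illustrative assumptions, not values from the embodiment.

```python
import numpy as np
from scipy.signal import gausspulse

# Sketch of a pulsed transmission signal: a Gaussian-modulated sinusoid at an
# assumed transducer center frequency. Values below are assumptions.
fs = 50e6                                  # sampling frequency [Hz] (assumed)
fc = 7.5e6                                 # center frequency [Hz] (assumed)
t = np.arange(-2e-6, 2e-6, 1.0 / fs)       # time axis around the pulse
tx_pulse = gausspulse(t, fc=fc, bw=0.6)    # unit-amplitude drive waveform
drive_voltage = 60.0 * tx_pulse            # scaled to an assumed drive voltage [V]
```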
The reception circuit 142 is a circuit that receives electric signals output from the drive target transducers, which receive the ultrasonic waves (echoes), that is, reception signals. Furthermore, the reception circuit 142 amplifies reception signals received from the ultrasound transducers 48 in response to a control signal sent from the CPU 152 and delivers the signals after amplification to the A/D converter 146. The A/D converter 146 is connected to the reception circuit 142, converts the reception signals received from the reception circuit 142 from analog signals to digital signals and outputs the digital signals after conversion to the ASIC 148.
The ASIC 148 is connected to the A/D converter 146. As shown in
In the embodiment, although the above-described functions (specifically, the phase matching unit 160, the B mode image generation unit 162, the PW mode image generation unit 164, the CF mode image generation unit 166, and the memory controller 151) are realized by a hardware circuit, such as the ASIC 148, the invention is not limited thereto. The above-described functions may be realized by making a central processing unit (CPU) and software (computer program) for executing various kinds of data processing cooperate with each other.
The phase matching unit 160 executes processing of giving a delay time to the reception signals (reception data) digitized by the A/D converter 146 and performing phasing addition (performing addition after matching the phases of the reception data). With the phasing addition processing, sound ray signals in which the focus of the ultrasound echo is narrowed are generated.
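A minimal sketch of the phasing addition (delay-and-sum) processing follows, assuming the reception data is already digitized into an element-by-sample array. Real systems interpolate to sub-sample precision; this sketch shifts by whole samples for simplicity.

```python
import numpy as np

# Sketch of phasing addition (delay-and-sum): each element's digitized
# reception data is shifted by a per-element delay so that echoes from the
# focal point align in phase, then the channels are summed into one sound
# ray signal.
def phasing_addition(rf: np.ndarray, delays_s: np.ndarray, fs: float) -> np.ndarray:
    """rf: (n_elements, n_samples) reception data; delays_s: per-element delay [s]."""
    n_el, n_samp = rf.shape
    out = np.zeros(n_samp)
    for i in range(n_el):
        shift = int(round(delays_s[i] * fs))  # delay expressed in whole samples
        # np.roll wraps at the array ends; a real implementation would pad.
        out += np.roll(rf[i], shift)          # align the phase of this channel
    return out                                 # summed sound ray signal
```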
The B mode image generation unit 162, the PW mode image generation unit 164, and the CF mode image generation unit 166 generate an ultrasound image based on the electric signals (strictly, sound ray signals generated by phasing addition on the reception data) output from the drive target transducers among a plurality of ultrasound transducers 48 when the ultrasound transducer unit 46 receives the ultrasonic waves.
The B mode image generation unit 162 is an image generation unit that generates a B mode image as a tomographic image of the inside (the inside of the body cavity) of the patient. The B mode image generation unit 162 performs correction of attenuation due to a propagation distance on each of the sequentially generated sound ray signals according to a depth of a reflection position of the ultrasonic wave through sensitivity time gain control (STC). Furthermore, the B mode image generation unit 162 executes envelope detection processing and logarithm (Log) compression processing on the sound ray signal after correction to generate a B mode image (image signal).
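The B mode chain above can be sketched as follows: a depth-dependent gain approximating the STC correction, envelope detection via the analytic signal, and logarithm compression into a display range. The attenuation coefficient, sound speed, and dynamic range below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

# Sketch of one B mode scan line: STC-like depth gain, envelope detection,
# and log compression. Constants are assumptions, not embodiment values.
def b_mode_line(sound_ray: np.ndarray, fs: float, c: float = 1540.0,
                atten_db_per_m: float = 50.0, dyn_range_db: float = 60.0) -> np.ndarray:
    n = sound_ray.size
    depth_m = np.arange(n) * c / (2.0 * fs)           # reflection depth per sample
    gain = 10.0 ** (atten_db_per_m * depth_m / 20.0)  # attenuation correction (STC-like)
    env = np.abs(hilbert(sound_ray * gain))           # envelope detection
    env /= env.max() + 1e-12                          # normalize before compression
    db = 20.0 * np.log10(env + 1e-12)                 # logarithm (Log) compression
    return np.clip((db + dyn_range_db) / dyn_range_db, 0.0, 1.0)  # 0..1 image line
```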
The PW mode image generation unit 164 is an image generation unit that generates an image indicating a speed of a blood flow in a predetermined direction. The PW mode image generation unit 164 extracts a frequency component by performing fast Fourier transform on a plurality of sound ray signals in the same direction among the sound ray signals sequentially generated by the phase matching unit 160. Thereafter, the PW mode image generation unit 164 calculates the speed of the blood flow from the extracted frequency component and generates a PW mode image (image signal) indicating the calculated speed of the blood flow.
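A sketch of this PW mode estimate is shown below: a fast Fourier transform over repeated samples in the same direction yields the dominant Doppler shift, which the Doppler equation converts to a blood flow speed. The pulse repetition frequency, center frequency, and beam angle are assumed inputs.

```python
import numpy as np

# Sketch of the PW mode estimate: FFT over slow-time samples in the same
# direction gives a Doppler frequency, converted to a blood flow speed.
def pw_velocity(slow_time: np.ndarray, prf: float, fc: float,
                c: float = 1540.0, angle_rad: float = 0.0) -> float:
    spectrum = np.fft.fft(slow_time * np.hanning(slow_time.size))
    freqs = np.fft.fftfreq(slow_time.size, d=1.0 / prf)
    f_doppler = freqs[np.argmax(np.abs(spectrum))]          # dominant Doppler shift
    return f_doppler * c / (2.0 * fc * np.cos(angle_rad))   # Doppler equation: v = fd*c/(2*fc*cos(theta))
```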
The CF mode image generation unit 166 is an image generation unit that generates an image indicating information regarding a blood flow in a predetermined direction. The CF mode image generation unit 166 generates an image signal indicating information regarding the blood flow by obtaining autocorrelation of a plurality of sound ray signals in the same direction among the sound ray signals sequentially generated by the phase matching unit 160. Thereafter, the CF mode image generation unit 166 generates a CF mode image (image signal) as a color image, in which information relating to the blood flow is superimposed on the B mode image generated by the B mode image generation unit 162, based on the above-described image signal.
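The autocorrelation step can be sketched as below: a lag-one autocorrelation of complex (IQ) samples in the same direction gives a mean phase shift per pulse interval, from which a mean velocity follows. This is the commonly used Kasai-type estimator, named here only as an illustration, since the embodiment does not specify a particular method.

```python
import numpy as np

# Sketch of the CF mode estimate via lag-one autocorrelation (Kasai-type):
# the mean phase advance between successive pulses yields a mean velocity.
def cf_mean_velocity(iq_slow_time: np.ndarray, prf: float, fc: float,
                     c: float = 1540.0) -> float:
    r1 = np.mean(iq_slow_time[1:] * np.conj(iq_slow_time[:-1]))  # lag-1 autocorrelation
    mean_doppler = np.angle(r1) * prf / (2.0 * np.pi)            # mean Doppler shift [Hz]
    return mean_doppler * c / (2.0 * fc)                         # mean blood flow velocity
```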
The memory controller 151 stores the image signal generated by the B mode image generation unit 162, the PW mode image generation unit 164, or the CF mode image generation unit 166 in the cine memory 150.
The DSC 154 is connected to the ASIC 148, performs raster conversion on the signal of the image generated by the B mode image generation unit 162, the PW mode image generation unit 164, or the CF mode image generation unit 166 to convert it into an image signal compliant with a normal television signal scanning system, executes various kinds of necessary image processing, such as gradation processing, on the image signal, and then outputs the image signal to the ultrasound image recognition unit 168.
The ultrasound image recognition unit 168 learns at least one of a relationship between an ultrasound image for learning and a name of an organ (a name of an observation target part) displayed in the ultrasound image for learning or a relationship between the ultrasound image for learning and a position and an orientation of the distal end portion 40 of the ultrasound endoscope at the time of imaging of the ultrasound image for learning, on a plurality of ultrasound images for learning in advance, and recognizes at least one of a name of an organ displayed in an ultrasound image for diagnosis or the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 from the ultrasound image for diagnosis generated by the ultrasound observation device 14 based on the learning result.
The ultrasound image for learning is an existing ultrasound image that is used for the ultrasound image recognition unit 168 to learn the relationship of the ultrasound image, the name of the organ, and the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12, and for example, various ultrasound images captured in the past can be used.
As shown in
The organ name detection unit 112 detects the name of the organ displayed in the ultrasound image for diagnosis from the ultrasound image for diagnosis based on a learning result. The organ name detection unit 112 comprises a plurality of detection units corresponding to a plurality of positions to be an observation target part in the body of the subject. Here, as an example, the organ name detection unit 112 comprises first to eleventh detection units 112A to 112K. The first detection unit 112A corresponds to a confluence of an aorta, a celiac artery, and a superior mesenteric artery, the second detection unit 112B corresponds to a pancreatic body, the third detection unit 112C corresponds to a pancreatic tail, the fourth detection unit 112D corresponds to a confluence of a splenic vein, a superior mesenteric vein, and a portal vein, the fifth detection unit 112E corresponds to a pancreatic head, the sixth detection unit 112F corresponds to a gallbladder, the seventh detection unit 112G corresponds to a portal vein, the eighth detection unit 112H corresponds to a common bile duct, the ninth detection unit 112I corresponds to a gallbladder, the tenth detection unit 112J corresponds to a pancreatic uncinate process, and the eleventh detection unit 112K corresponds to a papilla.
The first to eleventh detection units 112A to 112K are learned models. The plurality of learned models are models learned using respective data sets having different ultrasound images for learning. In detail, the plurality of learned models are models that have learned, in advance, a relationship between an ultrasound image for learning and a name of an organ displayed in the ultrasound image for learning, using data sets having ultrasound images for learning obtained by imaging different positions each to be an observation target part in the body of the subject.
That is, the first detection unit 112A is a model learned using a data set having ultrasound images for learning of a confluence of an aorta, a celiac artery, and a superior mesenteric artery, the second detection unit 112B is a model learned using a data set having ultrasound images for learning of a pancreatic body, the third detection unit 112C is a model learned using a data set having ultrasound images for learning of a pancreatic tail, the fourth detection unit 112D is a model learned using a data set having ultrasound images for learning of a confluence of a splenic vein, a superior mesenteric vein, and a portal vein, the fifth detection unit 112E is a model learned using a data set having ultrasound images for learning of a pancreatic head, the sixth detection unit 112F is a model learned using a data set having ultrasound images for learning of a gallbladder, the seventh detection unit 112G is a model learned using a data set having ultrasound images for learning of a portal vein, the eighth detection unit 112H is a model learned using a data set having ultrasound images for learning of a common bile duct, the ninth detection unit 112I is a model learned using a data set having ultrasound images for learning of a gallbladder, the tenth detection unit 112J is a model learned using a data set having ultrasound images for learning of a pancreatic uncinate process, and the eleventh detection unit 112K is a model learned using a data set having ultrasound images for learning of a papilla.
An observation route (a movement route of the distal end portion 40 of the ultrasound endoscope 12) in the body in a case of capturing an ultrasound image and representative observation points (the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12) are generally determined. For this reason, it is possible to learn an ultrasound image at a representative observation point, a name of an organ displayed in the ultrasound image, and a position and an orientation of the distal end portion 40 of the ultrasound endoscope 12 at the observation point in association with one another.
Hereinafter, a representative observation point (the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12) in the body in a case of capturing an ultrasound image will be described.
Examples of the representative observation point in the body include (1) to (11) described below.
(1) confluence of aorta, celiac artery, and superior mesenteric artery
(2) pancreatic body
(3) pancreatic tail
(4) confluence of splenic vein, superior mesenteric vein, and portal vein
(5) pancreatic head
(6) gallbladder
(7) portal vein
(8) common bile duct
(9) gallbladder
(10) pancreatic uncinate process
(11) papilla
Here, (1) confluence of aorta, celiac artery, and superior mesenteric artery, (2) pancreatic body, (3) pancreatic tail, (4) confluence of splenic vein, superior mesenteric vein, and portal vein, (5) pancreatic head, and (6) gallbladder are representative observation points from a stomach, (7) portal vein, (8) common bile duct, and (9) gallbladder are representative observation points from a duodenal bulb, and (10) pancreatic uncinate process and (11) papilla are representative observation points from a pars descendens duodeni.
For example, an observation procedure in a case of performing observation in an order of (1) confluence of aorta, celiac artery, and superior mesenteric artery, (3) pancreatic tail, (4) confluence of splenic vein, superior mesenteric vein, and portal vein, and (5) pancreatic head as the observation points will be described.
In
In a case of observing (1) confluence of aorta, celiac artery, and superior mesenteric artery, in a case where the distal end portion 40 of the insertion part 22 of the ultrasound endoscope 12 is rotated clockwise while following the hepatic vein, the inferior vena cava is visualized. In a case where the distal end portion 40 is further rotated clockwise, the aorta is visualized. In a case where the distal end portion 40 is further pushed forward along the aorta, as shown in
In
In
Subsequently, in a case of observing (4) confluence of splenic vein, superior mesenteric vein, and portal vein, in a case where the distal end portion 40 is rotated counterclockwise in an up-angled state, a left adrenal gland is visualized. In a case where the distal end portion 40 is rotated counterclockwise from the pancreatic tail while following the splenic vein, the pancreatic body is visualized. In a case where the distal end portion 40 further follows the splenic vein, as shown in
In
Subsequently, in a case of observing (5) pancreatic head, in a case where the distal end portion 40 is pushed forward while rotating counterclockwise, and follows the pancreatic duct from the confluence of the splenic vein, the superior mesenteric vein, and the portal vein, as shown in
Although an example of observation points in a case of performing observation has been described, the invention is not limited thereto, and the operator can observe desired observation points in a desired order.
Subsequently, the position and orientation detection unit 114 detects the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 from the ultrasound image for diagnosis based on the learning result.
As the position of the distal end portion 40 of the ultrasound endoscope 12, for example, the above-described observation points, that is, (1) confluence of aorta, celiac artery, and superior mesenteric artery, (2) pancreatic body, (3) pancreatic tail, (4) confluence of splenic vein, superior mesenteric vein, and portal vein, (5) pancreatic head, and (6) gallbladder (representative observation points from the stomach), (7) portal vein, (8) common bile duct, and (9) gallbladder (representative observation points of the duodenal bulb), and (10) pancreatic uncinate process and (11) papilla (representative observation points of the pars descendens duodeni) are detected.
As the orientation of the distal end portion 40 of the ultrasound endoscope 12, the orientations of the distal end portion 40 of the ultrasound endoscope 12 in a case of observing the parts (1) to (11) described above are detected.
The position and orientation detection unit 114 is a learned model. The learned model is a model that has learned, in advance on a plurality of ultrasound images for learning, a relationship between an ultrasound image for learning and the position and the orientation of the distal end portion 40 of the ultrasound endoscope at the time of imaging of the ultrasound image for learning, using a data set having ultrasound images for learning obtained by imaging different positions each to be an observation target part in the body of the subject.
A learning method is not particularly limited as long as it is possible to learn the relationship of the ultrasound image, the name of the organ, and the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 from a plurality of ultrasound images for learning, and to generate a learned model.
As the learning method, for example, deep learning that uses a hierarchical structure type neural network as an example of machine learning, which is one of artificial intelligence (AI) techniques, can be used.
Machine learning other than deep learning may be used, an artificial intelligence technique other than machine learning may be used, or a learning method other than an artificial intelligence technique may be used.
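As one concrete form of the hierarchical structure type neural network mentioned above, the following sketch defines a small convolutional classifier that maps an ultrasound image to one of the eleven observation points. The layer sizes and the single-channel 224 by 224 input are illustrative assumptions, not a prescribed architecture.

```python
import torch
import torch.nn as nn

# Minimal sketch of a hierarchical (layered) neural network: a small CNN that
# classifies an ultrasound image into one of the eleven observation points.
class ObservationPointNet(nn.Module):
    def __init__(self, num_classes: int = 11):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Usage: logits = ObservationPointNet()(torch.randn(1, 1, 224, 224))
```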
A learned model may be generated using only the ultrasound images for learning. In this case, the learned model is not updated, and the same learned model can be used constantly.
Alternatively, a configuration may be made in which a learned model is generated using the ultrasound images for diagnosis in addition to the ultrasound images for learning. In this case, a learned model is updated at any time by learning a relationship of an ultrasound image for diagnosis, a name of an organ displayed in the ultrasound image for diagnosis, and a position and an orientation of the distal end portion 40 of the ultrasound endoscope 12 when the ultrasound image for diagnosis is captured.
It is not essential that the ultrasound image recognition unit 168 (more specifically, the position and orientation detection unit 114) detects the orientation of the distal end portion 40 of the ultrasound endoscope 12.
As in the embodiment, in a case where the ultrasound transducer unit 46 is a convex type, a transmission direction of an ultrasonic wave changes with the orientation of the distal end portion 40 of the ultrasound endoscope 12, and thus, it is desirable to detect the orientation of the distal end portion 40 of the ultrasound endoscope 12.
On the other hand, in a case where the ultrasound transducer unit 46 is a radial type, an ultrasonic wave is transmitted over the entire circumference in a radial direction of the ultrasound endoscope 12 regardless of the orientation of the distal end portion 40 of the ultrasound endoscope 12, and thus, there is no need to detect the orientation of the distal end portion 40 of the ultrasound endoscope 12.
Subsequently, the selection unit 116 selects a detection unit corresponding to the position of the distal end portion 40 of the ultrasound endoscope 12 detected by the position and orientation detection unit 114, from the organ name detection unit 112.
That is, the selection unit 116 selects the first detection unit 112A in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (1) confluence of aorta, celiac artery, and superior mesenteric artery, selects the second detection unit 112B in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (2) pancreatic body, selects the third detection unit 112C in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (3) pancreatic tail, selects the fourth detection unit 112D in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (4) confluence of splenic vein, superior mesenteric vein, and portal vein, selects the fifth detection unit 112E in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (5) pancreatic head, selects the sixth detection unit 112F in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (6) gallbladder, selects the seventh detection unit 112G in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (7) portal vein, selects the eighth detection unit 112H in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (8) common bile duct, selects the ninth detection unit 112I in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (9) gallbladder, selects the tenth detection unit 112J in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (10) pancreatic uncinate process, and selects the eleventh detection unit 112K in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (11) papilla.
Subsequently, the organ name detection controller 118 makes the detection unit selected by the selection unit 116 detect a name of an organ displayed in the ultrasound image for diagnosis from the ultrasound image for diagnosis. The name of the organ includes all observation target parts in the body of the subject that can be observed using the ultrasound observation device 14; for example, a liver, a pancreas, a spleen, a kidney, an adrenal gland, an aorta, a celiac artery, a splenic artery, a superior mesenteric artery, an inferior vena cava, a hepatic vein, a portal vein, a splenic vein, a superior mesenteric vein, a gallbladder, a common bile duct, a pancreatic duct, and a papilla can be exemplified.
In the ultrasound image recognition unit 168, the position and orientation detection unit 114 detects the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 from the ultrasound image for diagnosis based on the learning result.
Subsequently, the selection unit 116 selects a detection unit corresponding to the position of the distal end portion 40 of the ultrasound endoscope 12 detected by the position and orientation detection unit 114, from the organ name detection unit 112.
Subsequently, the organ name detection controller 118 performs control such that the detection unit selected by the selection unit 116 detects the name of the organ displayed in the ultrasound image for diagnosis from the ultrasound image for diagnosis based on the learning result.
Subsequently, the display controller 172 displays at least one of the name of the organ or the position of the distal end portion 40 of the ultrasound endoscope 12 recognized by the ultrasound image recognition unit 168, on the monitor 20. Alternatively, the display controller 172 similarly displays, on the monitor 20, at least one of the name of the organ recognized by the ultrasound image recognition unit 168 or the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 recognized by the ultrasound image recognition unit 168.
The display controller 172 superimposedly displays the lesion region on the endoscope image, superimposedly displays the name of the organ on the ultrasound image, or superimposedly displays the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 on the anatomical schema diagram in response to an instruction from the operator.
In other words, the display controller 172 displays one image or two or more images from among the endoscope image with the lesion region not displayed, the endoscope image with the lesion region superimposedly displayed, the ultrasound image with the name of the organ not displayed, the ultrasound image with the name of the organ superimposedly displayed, the anatomical schema diagram with the position of the distal end portion 40 of the ultrasound endoscope 12 not displayed, and the anatomical schema diagram with the position of the distal end portion 40 of the ultrasound endoscope 12 superimposedly displayed, in parallel within a screen of the monitor 20 in response to an instruction from the operator.
As an embodiment, the display controller 172 displays two or more images including at least one of the ultrasound image with the name of the organ superimposedly displayed or the anatomical schema diagram with the position of the distal end portion 40 of the ultrasound endoscope 12 superimposedly displayed, in parallel within the screen of the monitor 20.
The name of the organ is superimposedly displayed on the ultrasound image, for example, near the organ or on the organ, and the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 are superimposedly displayed, for example, on the anatomical schema diagram. The lesion region is superimposedly displayed, for example, on the endoscope image in a state in which the lesion region is surrounded by a frame line.
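A sketch of such superimposed display is given below using OpenCV drawing primitives: the organ name is drawn near the organ on the ultrasound image, and the lesion region is surrounded by a frame line on the endoscope image. The coordinates, colors, and function name are illustrative assumptions.

```python
import cv2
import numpy as np

# Sketch of the superimposed display: the organ name drawn near the organ on
# the ultrasound image, and the lesion region surrounded by a frame line on
# the endoscope image. Positions and colors are assumptions.
def draw_overlays(ultrasound_bgr: np.ndarray, organ_name: str, organ_xy: tuple,
                  endoscope_bgr: np.ndarray, lesion_box: tuple) -> None:
    # Organ name slightly above the detected organ position.
    x, y = organ_xy
    cv2.putText(ultrasound_bgr, organ_name, (x, max(y - 10, 0)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 255), 2)
    # Lesion region shown as a surrounding frame line.
    x0, y0, x1, y1 = lesion_box
    cv2.rectangle(endoscope_bgr, (x0, y0), (x1, y1), (0, 0, 255), 2)
```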
The cine memory 150 has a capacity for accumulating an image signal for one frame or image signals for several frames. An image signal generated by the ASIC 148 is output to the DSC 154, and is stored in the cine memory 150 by the memory controller 151. In a freeze mode, the memory controller 151 reads out the image signal stored in the cine memory 150 and outputs the image signal to the DSC 154. With this, an ultrasound image (static image) based on the image signal read from the cine memory 150 is displayed on the monitor 20.
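The cine memory behaves like a ring buffer over the most recent frames, as the following sketch illustrates. The capacity of eight frames and the class name are assumptions consistent with the stated capacity of one to several frames.

```python
from collections import deque

# Sketch of the cine memory as a ring buffer holding the most recent frames;
# in freeze mode the stored frame is read back out for static display.
class CineMemory:
    def __init__(self, capacity_frames: int = 8):    # capacity is an assumption
        self._frames = deque(maxlen=capacity_frames)

    def store(self, image_signal) -> None:
        self._frames.append(image_signal)            # oldest frame is dropped

    def read_latest(self):
        if not self._frames:
            raise RuntimeError("cine memory is empty")
        return self._frames[-1]                      # frame shown in freeze mode
```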
The CPU 152 functions as a controller that controls the respective units of the ultrasound observation device 14. The CPU 152 is connected to the reception circuit 142, the transmission circuit 144, the A/D converter 146, the ASIC 148, and the like, and controls the equipment. Specifically, the CPU 152 is connected to the console 100, and controls the respective units of the ultrasound observation device 14 in compliance with examination information, control parameters, and the like input through the console 100.
In a case where the ultrasound endoscope 12 is connected to the ultrasound observation device 14 through the ultrasound connector 32a, the CPU 152 automatically recognizes the ultrasound endoscope 12 by a plug and play (PnP) system or the like.
Next, as an operation example of the ultrasound endoscope system 10, a flow of a series of processing (hereinafter, also referred to as diagnosis processing) regarding ultrasonography will be described referring to
In a case where power is supplied to the respective units of the ultrasound endoscope system 10 in a state in which the ultrasound endoscope 12 is connected to the ultrasound observation device 14, the endoscope processor 16, and the light source device 18, the diagnosis processing is started with the power supply as a trigger. In the diagnosis processing, as shown in
Subsequently, in a case where there is a diagnosis start instruction from the operator (in S003, Yes), the CPU 152 performs control on the respective units of the ultrasound observation device 14 to perform the diagnosis step (S004). The diagnosis step progresses along the flow shown in
Subsequently, the CPU 152 determines whether or not the ultrasonography ends (S037). In a case where the ultrasonography does not end (in S037, No), the process returns to the diagnosis step S031, and the generation of the ultrasound image in each image generation mode is repeatedly performed until a diagnosis end condition is established. As the diagnosis end condition, for example, a condition that the operator gives an instruction to end diagnosis through the console 100, or the like is exemplified.
On the other hand, in a case where the diagnosis end condition is established and the ultrasonography ends (in S037, Yes), the diagnosis step ends.
Subsequently, returning to
Next, a display method of an endoscope image, an ultrasound image, and an anatomical schema diagram will be described.
The operator can display at least one of the endoscope image, the ultrasound image, or the anatomical schema diagram within the screen of the monitor 20 by operating the console 100 to give an instruction.
In this case, the display controller 172 displays one image or two or more images from among an endoscope image (with a lesion region displayed or not displayed), an ultrasound image (with a name of an organ displayed or not displayed), and the anatomical schema diagram (with a position and an orientation of the distal end portion 40 of the ultrasound endoscope 12 displayed or not displayed), in parallel within the screen of the monitor 20 in response to an instruction from the operator. The display controller 172 displays one image of the two or more images displayed on the monitor 20 as an image of interest to be greater than other images.
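One way to realize this parallel display is sketched below: the image of interest receives a larger region of the screen, and the remaining images share the rest. The two-thirds split and the left placement of the image of interest are assumptions; the embodiment only requires that the image of interest be displayed greater than the others.

```python
# Sketch of the parallel display layout: the image of interest occupies a
# larger region (left two-thirds here, an assumption) and the remaining
# images are stacked in the remaining third.
def layout(screen_w: int, screen_h: int, images: list, interest: str) -> dict:
    """images: list of image names; returns name -> (x, y, w, h) regions."""
    regions = {interest: (0, 0, screen_w * 2 // 3, screen_h)}
    others = [name for name in images if name != interest]
    if others:
        h = screen_h // len(others)
        for i, name in enumerate(others):
            regions[name] = (screen_w * 2 // 3, i * h, screen_w // 3, h)
    return regions

# Usage: layout(1920, 1080, ["endoscope", "ultrasound", "schema"], "ultrasound")
```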
In the ultrasound endoscope system 10, the ultrasound image recognition unit 168 operates in a case where the ultrasound image or the anatomical schema diagram is displayed within the screen of the monitor 20, and the endoscope image recognition unit 170 operates in a case where the endoscope image is displayed within the screen of the monitor 20.
With this, in response to an instruction from the operator, it is possible to display the endoscope image with the lesion region superimposedly displayed, on the monitor 20, to display the ultrasound image with the name of the organ superimposedly displayed, on the monitor 20, or to display the anatomical schema diagram with the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 superimposedly displayed, on the monitor 20.
It is not essential to superimposedly display the name of the organ on the ultrasound image or to superimposedly display the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 on the anatomical schema diagram. For example, the name of the organ may be displayed on the monitor 20 separately from the ultrasound image, and the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 may be displayed on the monitor 20 separately from the anatomical schema diagram.
In the ultrasound endoscope system 10, the name of the organ displayed in the ultrasound image is displayed on the monitor 20 to be superimposed, for example, on the ultrasound image, and thus, for example, even an operator who is unaccustomed to an ultrasound image can correctly recognize what is displayed in the ultrasound image. The position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 are displayed on the monitor 20 to be superimposed, for example, on the anatomical schema diagram, and thus, for example, even an operator who is unaccustomed to an ultrasound image can correctly recognize a position of the distal end portion 40 of the ultrasound endoscope 12, a direction of the distal end portion 40 of the ultrasound endoscope 12, and a part being observed at this moment, and does not get lost in the body of the subject. The lesion region is displayed on the monitor 20 to be superimposed on the endoscope image, and thus, it is possible to correctly recognize the lesion region.
For example, the operator can display the ultrasound image and the anatomical schema diagram in parallel within the screen of the monitor 20.
In this case, the display controller 172 displays the ultrasound image with the name of the organ superimposedly displayed and the anatomical schema diagram with the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 superimposedly displayed, in parallel within the screen of the monitor 20 in response to an instruction from the operator. Furthermore, one image of the ultrasound image and the anatomical schema diagram displayed on the monitor 20 is displayed as an image of interest to be greater than the other image.
As shown in
The operator can display an ultrasound image (first ultrasound image for diagnosis) with the name of the organ superimposedly displayed, an ultrasound image (second ultrasound image for diagnosis) that is the same ultrasound image as the first ultrasound image for diagnosis and on which the name of the organ is not displayed, and an anatomical schema diagram with the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 superimposedly displayed, within the screen of the monitor 20.
In this case, the display controller 172 displays, for example, the first ultrasound image for diagnosis, the second ultrasound image for diagnosis, and the anatomical schema diagram with the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 superimposedly displayed, in parallel within the screen of the monitor 20 in response to an instruction from the operator. Furthermore, one image of the first ultrasound image for diagnosis, the second ultrasound image for diagnosis, and the anatomical schema diagram displayed on the monitor 20 is displayed as an image of interest to be greater than other images.
As shown in
The operator can display an endoscope image, an ultrasound image, and the anatomical schema diagram within the screen of the monitor 20.
In this case, the display controller 172 displays, for example, the endoscope image, the ultrasound image with the name of the organ superimposedly displayed, and the anatomical schema diagram with the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 superimposedly displayed, in parallel within the screen of the monitor 20 in response to an instruction from the operator. Furthermore, one image of the endoscope image, the ultrasound image, and the anatomical schema diagram displayed on the monitor 20 is displayed as an image of interest to be greater than other images.
As shown in
Although a display method of an endoscope image, an ultrasound image, and the anatomical schema diagram has been described, the invention is not limited thereto; one image may be displayed within the screen of the monitor 20, or two or more images may be arbitrarily combined and displayed in parallel within the screen of the monitor 20.
As the initial screen, the positions where the endoscope image, the ultrasound image, and the anatomical schema diagram are disposed, and which image from among the images displayed on the monitor 20 is displayed to be greater than other images, can be arbitrarily set. For example, as the initial screen, the first ultrasound image for diagnosis and the second ultrasound image for diagnosis shown in
The operator can switch and display an image of interest from among the images displayed on the monitor 20.
For example, it is assumed that an endoscope image, an ultrasound image, and an anatomical schema diagram are displayed within the screen of the monitor 20.
In this case, the display controller 172 switches and displays an image of interest from one image from among the endoscope image, the ultrasound image, and the anatomical schema diagram displayed on the monitor 20 to one of other images in response to an instruction from the operator.
As shown in an upper left portion of
For example, in a case where the endoscope image is selected as an image of interest by the operator from the state in the upper left portion of
In a case where the anatomical schema diagram is selected as an image of interest by the operator from the state in the upper left portion of
An operation in a case where the anatomical schema diagram is selected as an image of interest by the operator from the state in the upper right portion of
In a case where the ultrasound image is selected as an image of interest by the operator from the state in the upper right portion of
An operation in a case where the ultrasound image is selected as an image of interest by the operator from the state in the lower portion of
In the ultrasound endoscope system 10, it is possible to switch and display an endoscope image, an ultrasound image, and an anatomical schema diagram in an easy-to-see manner. While the image in which the operator is interested changes occasionally, the operator can switch the image of interest at any timing, and thus, it is possible to allow the operator to display and view the image in which the operator is interested at that moment, as an image of interest to be greater than other images.
Although an example where an image of interest is switched among an endoscope image, an ultrasound image, and an anatomical schema diagram displayed on the monitor 20 has been described, the invention is not limited thereto, and the same operation is performed even in a case where an image of interest is switched between two or more images displayed on the monitor 20.
Next, disposition places of the ultrasound image recognition unit 168 and the endoscope image recognition unit 170 will be described.
In the embodiment, although the ultrasound image recognition unit 168 is incorporated in the ultrasound observation device 14, the invention is not limited thereto, and the ultrasound image recognition unit 168 may be incorporated, for example, in the endoscope processor 16 or may be provided outside the ultrasound observation device 14 and the endoscope processor 16.
As in the embodiment, in a case where the ultrasound image recognition unit 168 is incorporated in the ultrasound observation device 14, as shown in
In a case where the ultrasound image recognition unit 168 is incorporated in the endoscope processor 16, as shown in
In a case where the ultrasound image recognition unit 168 is provided outside the ultrasound observation device 14 and the endoscope processor 16, as shown in
In this case, the ultrasound image may be transferred from the ultrasound observation device 14 to the endoscope processor 16, and the endoscope image and the ultrasound image may be further transferred from the endoscope processor 16 to the ultrasound image recognition unit 168. Alternatively, the endoscope image may be transferred from the endoscope processor 16 to the ultrasound observation device 14, and the endoscope image and the ultrasound image may be further transferred from the ultrasound observation device 14 to the ultrasound image recognition unit 168.
The display controller 172 is disposed between the unit that outputs the final image signal to the monitor 20 and the monitor 20.
In a case where the ultrasound image recognition unit 168 is incorporated in the ultrasound observation device 14, the display controller 172 can be incorporated, for example, in the ultrasound observation device 14 or can be provided between the ultrasound observation device 14 and the monitor 20.
In a case where the ultrasound image recognition unit 168 is incorporated in the endoscope processor 16, the display controller 172 can be incorporated, for example, in the endoscope processor 16 or can be provided between the endoscope processor 16 and the monitor 20.
In a case where the ultrasound image recognition unit 168 is provided outside the ultrasound observation device 14 and the endoscope processor 16, the display controller 172 can be provided, for example, outside the ultrasound observation device 14 and the endoscope processor 16.
The display controller 172 displays one image or two or more images from among the endoscope image (with the lesion region displayed or not displayed), the ultrasound image (with the name of the organ displayed or not displayed), and the anatomical schema diagram (with the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 displayed or not displayed), in parallel within the screen of the monitor 20 in response to an instruction from the operator.
The disposition place of the endoscope image recognition unit 170 can be decided in the same manner as the disposition place of the ultrasound image recognition unit 168. That is, in the embodiment, although the endoscope image recognition unit 170 is incorporated in the endoscope processor 16, the invention is not limited thereto, and the endoscope image recognition unit 170 may be incorporated, for example, in the ultrasound observation device 14 or may be provided outside the ultrasound observation device 14 and the endoscope processor 16.
In this way, in the ultrasound endoscope system 10, the disposition places of the ultrasound image recognition unit 168 and the endoscope image recognition unit 170 are not fixed, and the ultrasound image recognition unit 168 and the endoscope image recognition unit 170 can be provided at any disposition places.
Next, an ultrasound endoscope system according to a second embodiment will be described referring to
The configuration of the ultrasound endoscope system of the second embodiment is the same as the configuration of the ultrasound endoscope system 10 of the first embodiment except that the ultrasound observation device 14B is provided instead of the ultrasound observation device 14 provided in the ultrasound endoscope system 10 of the first embodiment, and thus, detailed description of other identical components will not be repeated.
Hereinafter, the ultrasound observation device 14B will be described.
The configuration of the ultrasound observation device 14B shown in
The ultrasound image recognition unit 168B functions in the same manner as the ultrasound image recognition unit 168 of the first embodiment in regards to the learning and the recognition of the name of the organ displayed in the ultrasound image for diagnosis and the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12.
In addition, the ultrasound image recognition unit 168B learns a relationship between an ultrasound image for learning and a range of an organ (a region of an organ) displayed in the ultrasound image for learning in advance on a plurality of ultrasound images for learning, and recognizes a range of an organ displayed in an ultrasound image for diagnosis from the ultrasound image for diagnosis based on a learning result.
The ultrasound image for learning is an existing ultrasound image that is used for the ultrasound image recognition unit 168B to learn a relationship between an ultrasound image and a range of an organ, and for example, various ultrasound images captured in the past can be used.
The configuration of the ultrasound image recognition unit 168B shown in
The organ range detection unit 120 detects a range of an organ displayed in an ultrasound image for diagnosis from the ultrasound image for diagnosis based on a learning result. The organ range detection unit 120 comprises a plurality of detection units corresponding to a plurality of positions each to be an observation target part in the body of the subject. Here, as an example, the organ range detection unit 120 comprises first to eleventh detection units 120A to 120K. The first detection unit 120A corresponds to a confluence of an aorta, a celiac artery, and a superior mesenteric artery, the second detection unit 120B corresponds to a pancreatic body, the third detection unit 120C corresponds to a pancreatic tail, the fourth detection unit 120D corresponds to a confluence of a splenic vein, a superior mesenteric vein, and a portal vein, the fifth detection unit 120E corresponds to a pancreatic head, the sixth detection unit 120F corresponds to a gallbladder, the seventh detection unit 120G corresponds to a portal vein, the eighth detection unit 120H corresponds to a common bile duct, the ninth detection unit 120I corresponds to a gallbladder, the tenth detection unit 120J corresponds to a pancreatic uncinate process, and the eleventh detection unit 120K corresponds to a papilla.
The first to eleventh detection units 120A to 120K are learned models. The plurality of learned models are models learned using respective data sets having different ultrasound images for learning. In detail, the plurality of learned models are models that have learned, in advance, a relationship between an ultrasound image for learning and a range of an organ displayed in the ultrasound image for learning, using data sets having ultrasound images for learning obtained by imaging different positions each to be an observation target part in the body of the subject.
That is, the first detection unit 120A is a model learned using a data set having ultrasound images for learning of the confluence of the aorta, the celiac artery, and the superior mesenteric artery, the second detection unit 120B is a model learned using a data set having ultrasound images for learning of the pancreatic body, the third detection unit 120C is a model learned using a data set having ultrasound images for learning of the pancreatic tail, the fourth detection unit 120D is a model learned using a data set having ultrasound images for learning of the confluence of the splenic vein, the superior mesenteric vein, and the portal vein, the fifth detection unit 120E is a model learned using a data set having ultrasound images for learning of the pancreatic head, the sixth detection unit 120F is a model learned using a data set having ultrasound images for learning of the gallbladder, the seventh detection unit 120G is a model learned using a data set having ultrasound images for learning of the portal vein, the eighth detection unit 120H is a model learned using a data set having ultrasound images for learning of the common bile duct, the ninth detection unit 120I is a model learned using a data set having ultrasound images for learning of the gallbladder, the tenth detection unit 120J is a model learned using a data set having ultrasound images for learning of the pancreatic uncinate process, and the eleventh detection unit 120K is a model learned using a data set having ultrasound images for learning of the papilla.
As described above, the observation route in the body in a case of capturing the ultrasound image and the representative observation points are generally determined. For this reason, it is possible to learn an ultrasound image at a representative observation point and a range of an organ displayed in the ultrasound image in association with each other.
A learning method is not particularly limited as long as it is possible to learn the relationship between the ultrasound image and the range of the organ from a plurality of ultrasound images for learning, and to generate a learned model. An update method and the like of the learning method and the learned model are as described above.
The selection unit 116B functions in the same manner as the selection unit 116 of the first embodiment in regard to the selection of the detection unit from the organ name detection unit 112.
In addition, the selection unit 116B selects a detection unit corresponding to the position of the distal end portion 40 of the ultrasound endoscope 12 detected by the position and orientation detection unit 114, from the organ range detection unit 120.
That is, the selection unit 116B selects the first detection unit 120A in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (1) confluence of aorta, celiac artery, and superior mesenteric artery, selects the second detection unit 120B in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (2) pancreatic body, selects the third detection unit 120C in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (3) pancreatic tail, selects the fourth detection unit 120D in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (4) confluence of splenic vein, superior mesenteric vein, and portal vein, selects the fifth detection unit 120E in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (5) pancreatic head, selects the sixth detection unit 120F in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (6) gallbladder, selects the seventh detection unit 120G in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (7) portal vein, selects the eighth detection unit 120H in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (8) common bile duct, selects the ninth detection unit 120I in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (9) gallbladder, selects the tenth detection unit 120J in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (10) pancreatic uncinate process, and selects the eleventh detection unit 120K in a case where the position of the distal end portion 40 of the ultrasound endoscope 12 is (11) papilla.
The organ name and range detection controller 118B functions in the same manner as the organ name detection controller 118 of the first embodiment in regard to the control of the organ name detection unit 112.
In addition, the organ name and range detection controller 118B makes the detection unit selected by the selection unit 116B from the organ range detection unit 120 detect a range of an organ displayed in an ultrasound image for diagnosis from the ultrasound image for diagnosis.
As the organ having the range detected by the organ range detection unit 120, all observation target parts in the body of the subject that can be observed using the ultrasound observation device 14B are included.
In the ultrasound image recognition unit 168B, the position and orientation detection unit 114 detects the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 from the ultrasound image for diagnosis based on the learning result.
Subsequently, the selection unit 116B selects a detection unit corresponding to the position of the distal end portion 40 of the ultrasound endoscope 12 detected by the position and orientation detection unit 114, from the organ name detection unit 112 and the organ range detection unit 120.
Subsequently, the organ name and range detection controller 118B performs control such that the detection unit selected by the selection unit 116B detects the name of the organ displayed in the ultrasound image for diagnosis and the range of the organ from the ultrasound image for diagnosis based on the learning result.
In the ultrasound endoscope system of the second embodiment, for example, the name of the organ displayed in the ultrasound image and the range of the organ are displayed on the monitor 20 to be superimposed on the ultrasound image, and the position and the orientation of the distal end portion 40 of the ultrasound endoscope 12 are displayed on the monitor 20 to be superimposed on the anatomical schema diagram. For this reason, for example, even an operator who is unaccustomed to an ultrasound image can correctly recognize what is displayed in the ultrasound image and a range of an organ displayed in the ultrasound image. Furthermore, the operator can correctly recognize the position of the distal end portion 40 of the ultrasound endoscope 12, the direction of the distal end portion 40 of the ultrasound endoscope 12, and a part being observed, and does not get lost in the body of the subject.
Returning to
In addition, the display controller 172B displays the range of the organ recognized by the ultrasound image recognition unit 168B, on the monitor 20.
The color registration unit 174 registers a relationship between a type of an organ and a color of a range of an organ in response to an instruction from the operator. In more detail, a relationship between a type of an organ and a color of an internal region of a range of an organ or a frame indicating the range of the organ is registered. The relationship between the type of the organ and the color of the internal region or the frame of the organ registered in the color registration unit 174 is output to the display controller 172B.
The frame (hereinafter, referred to as the frame of the organ) indicating the range of the organ is a contour of the organ, and is a boundary line between the organ and another organ.
The internal region of the range of the organ (hereinafter, referred to as the internal region of the organ) is a region in a closed space surrounded by the frame of the organ.
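Given a recognized organ region as a binary mask, the frame and the internal region defined above can be derived as in the following sketch; the use of OpenCV contour extraction is an implementation assumption, not part of the embodiment.

```python
import cv2
import numpy as np

# Sketch relating a recognized organ region to the terms above: given a binary
# mask of the organ, the frame is its contour (boundary line) and the internal
# region is the filled closed space surrounded by that contour.
def frame_and_internal_region(mask: np.ndarray):
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    frame = np.zeros_like(mask, dtype=np.uint8)
    internal = np.zeros_like(mask, dtype=np.uint8)
    cv2.drawContours(frame, contours, -1, color=255, thickness=2)      # frame of the organ
    cv2.drawContours(internal, contours, -1, color=255, thickness=-1)  # filled internal region
    return frame, internal
```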
The organ registration unit 176 registers a type of an organ for displaying a range in response to an instruction from the operator. The type of the organ for displaying the range registered in the organ registration unit 176 is output to the display controller 172B.
Next, a display method of the name of the organ and the range of the organ will be described.
The operator can designate whether or not to display a range of an organ displayed in an ultrasound image.
In a case where designation is made to display the range of the organ in response to an instruction from the operator, as shown in
On the other hand, in a case where designation is made not to display the range of the organ in response to an instruction from the operator, the display controller 172B does not display the range of the organ.
The internal region or the frame of the organ is colored and displayed, whereby it is possible to allow the operator to clearly recognize the range of the organ.
The given color is a color that is set in advance in the display controller 172B, and is not particularly limited but is desirably a color other than white or black such that the operator easily recognizes the range of the organ in the ultrasound image.
It is desirable that the display controller 172B colors the internal region or the frame of the organ for each type of the organ having the range recognized by the ultrasound image recognition unit 168B, in a different color.
The display controller 172B can color, for example, the internal regions or the frames of two or more adjacent organs in colors having hues at equal intervals or colors in a given hue range including colors having hues at equal intervals such that the operator easily identifies a difference in color. For example, the display controller 172B can similarly color an internal region or a frame of a blood vessel, an internal region or a frame of a vessel in which a body fluid other than blood flows, and an internal region or a frame of an organ other than the blood vessel and the vessel, in colors having hues at equal intervals or colors in a given hue range including colors having hues at equal intervals.
For example, in a case where two organs are adjacent, an internal region or a frame of one organ is colored in a complementary color of a color of an internal region or a frame of the other organ or a color in a given hue range including the complementary color. In a case where three organs are adjacent, internal regions or frames of the three organs are colored in colors having hue at equal intervals, such as red (R), green (G), and blue (B).
With this, even though two or more organs are adjacent, the operator can clearly recognize the range of each organ based on the difference in color.
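The hue-spacing rule above can be sketched as follows: for n adjacent organs, hues are placed at equal intervals around the hue circle, so two organs receive complementary colors and three receive red, green, and blue. The helper name is hypothetical.

```python
import colorsys

# Sketch of the coloring rule: assign adjacent organs colors whose hues are
# spaced at equal intervals around the hue circle.
def equally_spaced_colors(n_organs: int):
    colors = []
    for i in range(n_organs):
        r, g, b = colorsys.hsv_to_rgb(i / n_organs, 1.0, 1.0)  # hue at equal intervals
        colors.append((int(r * 255), int(g * 255), int(b * 255)))
    return colors

# Usage: equally_spaced_colors(2) -> red and cyan (complementary colors);
#        equally_spaced_colors(3) -> red, green, and blue.
```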
The operator can color a range of an organ in a color designated by the operator or can color a range of an organ in a color registered in advance by the operator.
In a case where a color for coloring a range of an organ is designated in response to an instruction from the operator, the display controller 172B colors the internal region or the frame of the organ in the color designated by the instruction from the operator. Alternatively, in a case where a relationship between a type of an organ and a color of an internal region or a frame of an organ is registered in the color registration unit 174, the display controller 172B colors the internal region or the frame of the organ in a color of an internal region or a frame of an organ corresponding to the type of the organ registered in the color registration unit 174.
In this way, the operator can color the range of the organ in a color desired by the operator, and thus, it is possible for the operator to easily identify a type of an organ based on a color, in such a manner that, for example, red indicates a liver.
In a case of detecting the name of the organ displayed in the ultrasound image for diagnosis, the organ name detection unit 112 of the ultrasound image recognition unit 168B can calculate a confidence factor of the name of the organ recognized by the organ name detection unit 112.
The confidence factor of the name of the organ represents a probability that the name of the organ recognized by the organ name detection unit 112 is a correct name. For example, in a case where detection is made that the name of the organ is a liver, the confidence factor of the name of the organ is calculated in such a manner that the probability that the name of the organ displayed in the ultrasound image for diagnosis is a liver is 90%.
In this case, the display controller 172B can decide at least one of the display method of the name of the organ or the coloring method of the internal region or the frame of the organ displayed on the monitor 20 depending on the confidence factor calculated by the ultrasound image recognition unit 168B.
In a case of displaying the name of the organ, the display controller 172B can perform, depending on the confidence factor, at least one of an operation to decide the size of the name of the organ displayed on the monitor 20, an operation to decide the color for coloring the internal region or the frame of the organ, or an operation to decide whether or not to display a specific character.
For example, in a case where the confidence factor is comparatively low, compared to a case where the confidence factor is comparatively high, the display controller 172B decreases the size of the name of the organ, decreases the density of the color for coloring the internal region or the frame of the organ, or displays a specific character, such as “?”, to express that the probability that the name of the organ is correct is comparatively low, for example, displaying “liver?” in a case where the name of the organ is assumed to be a liver.
In a case of coloring the internal region or the frame of the organ, the display controller 172B can decide, for example, the density of the color for coloring the internal region or the frame of the organ depending on the confidence factor.
For example, in a case where the confidence factor is comparatively low, the density of the color for coloring the internal region or the frame of the organ is decreased, and in a case where the confidence factor is comparatively high, the density of the color is increased.
In this way, the display method of the name of the organ and the coloring method of the internal region or the frame of the organ are decided, whereby it is possible to allow the operator to determine whether or not the name of the organ recognized by the ultrasound image recognition unit 168B is correct.
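The following sketch expresses one possible mapping from the confidence factor to the label text, the relative size of the name, and the density (opacity) of the coloring; the 0.8 threshold and the scaling factors are assumptions of this illustration.

    def label_style(organ_name, confidence, high_confidence=0.8):
        """Return (text, font scale, color density) decided from the
        confidence factor calculated by the recognition unit."""
        if confidence >= high_confidence:
            return organ_name, 1.0, 1.0        # full-size name, dense color
        # Comparatively low confidence: smaller name, lighter coloring,
        # and the specific character "?" appended to the name.
        return organ_name + "?", 0.7, max(confidence, 0.2)

    print(label_style("liver", 0.9))   # ('liver', 1.0, 1.0)
    print(label_style("liver", 0.55))  # ('liver?', 0.7, 0.55)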
In a case where the name of the organ is displayed so as to be superimposed on the ultrasound image for diagnosis, it is desirable that the display controller 172B decides at least one of the color of the name of the organ or the color of the internal region or the frame of the organ depending on the brightness of the ultrasound image for diagnosis displayed behind the display region of the name of the organ.
In a case where the brightness of the ultrasound image behind the display region of the name of the organ is comparatively high, the display controller 172B increases the density of the color of the name of the organ or decreases the density of the color of the internal region or the frame of the organ.
With this, even in a case where the name of the organ is superimposed on the ultrasound image in the background, or on the colored internal region or frame of the organ in the background, it is possible to allow the operator to easily see the name of the organ.
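As one way to realize this brightness-dependent decision, the sketch below measures the mean brightness of the B-mode pixels behind the display region of the name and returns a density for the color of the name; the threshold and the two density values are illustrative assumptions.

    import numpy as np

    def name_color_density(ultrasound_image, label_box, bright_threshold=128):
        """Decide the density of the organ-name color from the brightness
        of the ultrasound image behind the label region."""
        x, y, w, h = label_box
        region = ultrasound_image[y:y + h, x:x + w]  # grayscale B-mode pixels
        if float(region.mean()) >= bright_threshold:
            return 1.0  # bright background: a dense name color keeps contrast
        return 0.6      # dark background: a lighter density suffices

    frame = np.zeros((480, 640), dtype=np.uint8)  # a dark dummy B-mode frame
    print(name_color_density(frame, (100, 100, 80, 20)))  # 0.6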
The operator can designate whether or not to display the name of the organ and whether or not to color the range of the organ.
In response to an instruction from the operator, the display controller 172B switches whether to display only one, both, or neither of the name of the organ recognized by the ultrasound image recognition unit 168B and the range of the organ with the internal region or the frame colored.
For example, in a case where designation is made to display only the name of the organ in response to an instruction from the operator, the display controller 172B displays only the name of the organ on the monitor 20 without coloring the internal region or the frame of the organ.
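A minimal sketch of this switching follows; the enumeration and the drawing callbacks are illustrative stand-ins for the display controller's internal state and drawing routines.

    from enum import Enum

    class OverlayMode(Enum):
        NAME_ONLY = 1       # display only the name of the organ
        RANGE_ONLY = 2      # display only the colored range of the organ
        NAME_AND_RANGE = 3  # display both
        NONE = 4            # display neither

    def draw_overlay(mode, draw_name, draw_range):
        if mode in (OverlayMode.NAME_ONLY, OverlayMode.NAME_AND_RANGE):
            draw_name()
        if mode in (OverlayMode.RANGE_ONLY, OverlayMode.NAME_AND_RANGE):
            draw_range()

    # In response to the operator's instruction to display only the name:
    draw_overlay(OverlayMode.NAME_ONLY,
                 lambda: print("draw organ name"),
                 lambda: print("color organ range"))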
The display controller 172B can determine a position where the name of the organ recognized by the ultrasound image recognition unit 168B is displayed on the monitor 20 or can determine whether or not to display the name of the organ on the monitor 20, depending on the range of the organ recognized by the ultrasound image recognition unit 168B.
For example, the display controller 172B does not display the name of the organ in a case where the range of the organ recognized by the ultrasound image recognition unit 168B is comparatively small, and displays the name of the organ in a case where the range of the organ is comparatively large. Alternatively, in a case where the range of the organ is comparatively small, the name of the organ is displayed near the range of the organ rather than within it, and in a case where the range of the organ is comparatively large, the name of the organ is displayed within the range of the organ.
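The placement and suppression rules above can be sketched as follows; the two area thresholds and the offset are assumptions of this illustration.

    def name_position(range_area, range_box, min_area=300, small_area=1500):
        """Return the display position of the organ name, or None when the
        recognized range is too small for the name to be displayed at all."""
        x, y, w, h = range_box
        if range_area < min_area:
            return None                      # do not display the name
        if range_area < small_area:
            return (x + w + 10, y)           # near the range, not within it
        return (x + w // 2, y + h // 2)      # within the (large) range

    print(name_position(200, (50, 50, 20, 10)))    # None
    print(name_position(800, (50, 50, 40, 20)))    # (100, 50): near the range
    print(name_position(5000, (50, 50, 100, 50)))  # (100, 75): within the range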
The operator can display a range of only an organ of a type registered in advance by the operator.
In a case where a type of an organ for displaying a range is registered in the organ registration unit 176, the display controller 172B displays the range recognized by the ultrasound image recognition unit 168B on the monitor 20 only in a case where the recognized organ is of a type registered in the organ registration unit 176.
In this way, the operator can display a range of only an organ of a desired type, whereby it is possible to allow the operator to easily recognize the organ of the desired type.
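A minimal sketch of this filtering follows, with a set standing in for the organ registration unit 176; the organ names and the pair representation are illustrative.

    registered_types = {"pancreas", "gallbladder"}  # registered in advance

    def ranges_to_display(recognized):
        """Keep only the organs whose type is registered; `recognized` is a
        list of (organ type, recognized range) pairs."""
        return [(t, r) for (t, r) in recognized if t in registered_types]

    result = ranges_to_display([("liver", "range A"), ("pancreas", "range B")])
    print(result)  # [('pancreas', 'range B')]: only the registered type remains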
The operator can designate a type of an organ for displaying a range.
For example, in a case where the type of the organ for displaying the range is sequentially designated in response to an instruction from the operator, the display controller 172B sequentially switches the type of the organ for displaying the range accordingly.
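For example, the sequential switching can be sketched with a cyclic iterator, where each instruction from the operator advances the displayed type by one; the list of types is an illustrative assumption.

    from itertools import cycle

    displayed_types = cycle(["liver", "pancreas", "gallbladder"])

    current = next(displayed_types)  # "liver" after the first instruction
    current = next(displayed_types)  # "pancreas" after the next instruction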
In the device of the invention, for example, the hardware configurations of processing units that execute various kinds of processing, such as the endoscope image recognition unit 170 (the lesion region detection unit 102, the positional information acquisition unit 104, the selection unit 106, and the lesion region detection controller 108), the ultrasound image recognition units 168 and 168B (the organ name detection unit 112, the organ range detection unit 120, the position and orientation detection unit 114, the selection unit 116, the selection unit 116B, the organ name detection controller 118, and the organ name and range detection controller 118B), the display controllers 172 and 172B, and the console (instruction acquisition unit) 100 may be dedicated hardware or may be various processors or computers that execute programs. The hardware configurations of the cine memory 150, the color registration unit 174, and the organ registration unit 176 may be dedicated hardware or may be a memory, such as a semiconductor memory.
Various processors include a central processing unit (CPU) that is a general-purpose processor executing software (program) to function as various processing units, a programmable logic device (PLD) that is a processor capable of changing a circuit configuration after manufacture, such as a field programmable gate array (FPGA), a dedicated electric circuit that is a processor having a circuit configuration dedicatedly designed for executing specific processing, such as an application specific integrated circuit (ASIC), and the like.
One processing unit may be configured of one of the various processors described above or may be configured of a combination of two or more processors of the same type or different types, for example, a combination of a plurality of FPGAs or a combination of an FPGA and a CPU. Furthermore, a plurality of processing units may be configured of one of the various processors, or two or more processing units may be collectively configured using one processor.
For example, as represented by a computer, such as a server or a client, there is a form in which one processor is configured of a combination of one or more CPUs and software, and this processor functions as a plurality of processing units. As represented by a system on chip (SoC) or the like, there is also a form in which a processor that implements the functions of the entire system, including the plurality of processing units, with a single integrated circuit (IC) chip is used.
In addition, the hardware configuration of various processors is, more specifically, an electric circuit (circuitry), in which circuit elements, such as semiconductor elements, are combined.
For example, a method according to the embodiment of the invention can be implemented by a program that causes a computer to execute respective steps. Furthermore, it is possible to provide a computer-readable recording medium having the program recorded thereon.
Although the invention has been described above in detail, the invention is not limited to the above-described embodiment, and various improvements and alterations may of course be made without departing from the spirit and scope of the invention.
10: ultrasound endoscope system
12: ultrasound endoscope
14, 14B: ultrasound observation device
16: endoscope processor
18: light source device
20: monitor
21a: water supply tank
21b: suction pump
22: insertion part
24: operating part
26: universal cord
28a: air and water supply button
28b: suction button
29: angle knob
30: treatment tool insertion port
32a: ultrasound connector
32b: endoscope connector
32c: light source connector
34a: air and water supply tube
34b: suction tube
36: ultrasound observation part
38: endoscope observation part
40: distal end portion
42: bending portion
43: flexible portion
44: treatment tool lead-out port
45: treatment tool channel
46: ultrasound transducer unit
48: ultrasound transducer
50: ultrasound transducer array
54: backing material layer
56: coaxial cable
60: FPC
74: acoustic matching layer
76: acoustic lens
82: observation window
84: objective lens
86: solid-state imaging element
88: illumination window
90: cleaning nozzle
92: wiring cable
100: console
102: lesion region detection unit
104: positional information acquisition unit
106, 116: selection unit
108: lesion region detection controller
102A to 102K: first to eleventh detection units
112: organ name detection unit
114: position and orientation detection unit
118: organ name detection controller
112A to 112K: first to eleventh detection units
140: multiplexer
142: reception circuit
144: transmission circuit
146: A/D converter
148: ASIC
150: cine memory
151: memory controller
152: CPU
154: DSC
158: pulse generation circuit
160: phase matching unit
162: B mode image generation unit
164: PW mode image generation unit
166: CF mode image generation unit
168, 168B: ultrasound image recognition unit
170: endoscope image recognition unit
172, 172B: display controller
174: color registration unit
176: organ registration unit
Number | Date | Country | Kind
---|---|---|---
2019-035846 | Feb 2019 | JP | national
2019-156614 | Aug 2019 | JP | national
This application is a Continuation of PCT International Application No. PCT/JP2019/045429 filed on Nov. 20, 2019, which claims priority under 35 U.S.C. § 119(a) to Japanese Patent Application No. 2019-035846 filed on Feb. 28, 2019 and Japanese Patent Application No. 2019-156614 filed on Aug. 29, 2019. Each of the above applications is hereby expressly incorporated by reference, in its entirety, into the present application.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/JP2019/045429 | Nov 2019 | US
Child | 17399837 | | US