The present invention relates to a system and a method for processing ultrasound images.
More particularly, the present invention relates to a system and a method for processing ultrasound images in interaction with a medical instrument for treating and diagnosing pathologies.
Standard ultrasound examinations are known which have been developed to enable a medical operator to intervene with medical instruments in order to treat and diagnose pathologies, such as, for example, a needle for amniocentesis/biopsy, simultaneously with the ultrasound examination.
An operator performing an ultrasound examination positions an ultrasound probe on the patient's body as close as possible to the area in which he or she will have to intervene with the instrument, so that the ultrasound image produced on the ultrasound beam plane is displayed on a monitor and serves to guide the medical operator during the intervention.
The operator who has to intervene on the patient inserts an instrument into the body following a path deduced from the ultrasound image produced and visible on the monitor; as is standard practice, the operator inserts the instrument into the body along the plane of the ultrasound beam in order to clearly visualize the needle and be able to act in an optimal manner.
A serious drawback of this approach is that when the instrument is not perfectly aligned with the ultrasound plane, the instrument is only partially visible on the monitor, and the visibility is often reduced to a single point of intersection with the ultrasound plane, making its use complex and dangerous.
The incomplete visibility of the instrument may cause the instrument to interfere with organs and tissues along the path of access to the target organ, and with the tissue that is the target of the diagnostic or therapeutic function for which the instrument is intended.
There is a strong need to improve the interaction between an instrument and the practice of ultrasound examinations in order to overcome the problems described, and in general to avoid interference with an internal organ/tissue not visualized/visualizable with a standard ultrasound scan.
The general object of the present invention is to provide an ultrasound image processing system/method that overcomes the problems of the prior art.
A specific object of the present invention is to provide an ultrasound image processing system/method that simplifies the interaction between the medical instrument and ultrasound images.
Another object of the present invention is to provide an ultrasound image processing system/method that is simple for a medical operator to use.
A further object of the present invention is to provide an ultrasound image processing system/method that is minimally risky for the patient.
In a first aspect of the invention, these and other objects are achieved by an ultrasound image processing system comprising:
Preferably, said processing unit is further configured to:
The ultrasound image processing system preferably comprises augmented and virtual reality glasses comprising said camera coupled to said display.
Said first position sensors are preferably provided on said ultrasound instrument. Said ultrasound instrument preferably comprises an ultrasound probe.
The ultrasound image processing system preferably comprises second position sensors provided in fixed positions relative to said ultrasound instrument.
Said third position sensors are preferably provided on said medical instrument. Said medical instrument preferably comprises an interventional needle.
Preferably, said virtualized image comprises:
A positional reference system is preferably defined by said second position sensors and by respective second position data.
Said georeferenced ultrasound image and said virtualized and georeferenced image of the medical instrument are preferably georeferenced relative to said positional reference system.
In a second aspect of the invention, these and other objects are achieved by an ultrasound image processing method comprising the steps of:
Preferably, there are provided the further steps of:
Preferably, there is provided a step of providing augmented and virtual reality glasses comprising said camera coupled to said display.
Preferably, there is provided a step of:
The invention as described enables an interaction between a medical instrument and the practice of ultrasound examination such as to overcome the problems illustrated, thus achieving the following technical effects:
The stated technical effects/advantages and other technical effects/advantages of the invention will emerge in greater detail from the description that follows, accompanied by an example of an embodiment, given by way of illustration and not limitation with reference to the appended drawings.
FIG. 1A1 is an overall view of one embodiment of the invention with respective sensors integral with an ultrasound instrument and with a medical instrument;
FIG. 1A2 is an overall view of a variant of the embodiment of the invention with sensors applied to a patient's body;
The invention describes a system for processing ultrasound images and tracking medical instruments.
A first aspect of the invention describes an ultrasound image processing system comprising an ultrasound instrument, a medical instrument, a camera in data connection with said instruments, and an image processing device configured to process images originating from the camera and/or from the instruments in order to define an overall image in which the reciprocal positioning of the entire medical instrument relative to a georeferenced ultrasound image determined by the ultrasound instrument is visible.
With particular reference to
The ultrasound instrument is coupled to first position sensors S1 (
The technical effect achieved is a real-time detection of the position of the ultrasound instrument and a consequent georeferencing relative to the instrument itself.
In one variant, second position sensors (markers) Mi (i=1 . . . n) are provided, in addition to those coupled to the ultrasound instrument, and also positioned in fixed positions relative to the patient; they are shown as M1, M2, M3 and M4 in FIG. 1A2.
In other words, the second position sensors Mi are applied to the patient's body. This allows the ultrasound image to be georeferenced relative to the patient's body.
The technical effect achieved is that the ultrasound image picked up is rendered stable relative to the positioning of the second position sensors Mi applied to the patient, even over repeated acquisitions.
Every image acquired by the ultrasound instrument thus becomes referenced and stable in position relative to the patient's body, independently of successive acquisitions by the ultrasound instrument.
The stable referencing operation can be repeated multiple times to render all the stable georeferenced images visible simultaneously.
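The stabilization described above can be illustrated with a minimal sketch. It assumes, purely by way of illustration and with hypothetical function names not taken from the source, that the second position sensors Mi are three optical markers whose 3D positions are available in camera coordinates; a patient-fixed frame is built from the markers, and any acquired point is re-expressed in that frame so that it remains stable if the patient or the camera moves:

```python
import numpy as np

def patient_frame_from_markers(m1, m2, m3):
    """Build an orthonormal patient-fixed frame (rotation basis + origin)
    from three body-mounted markers given in camera coordinates.
    The three-marker layout is a hypothetical simplification."""
    x = (m2 - m1) / np.linalg.norm(m2 - m1)
    z = np.cross(x, m3 - m1)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack([x, y, z]), m1  # basis, origin

def to_patient_frame(point_cam, basis, origin):
    """Re-express a camera-frame point in the patient-fixed frame, so a
    georeferenced point stays stable when patient and camera move together."""
    return basis.T @ (point_cam - origin)
```

Re-running the same acquisition after a rigid displacement of the markers (and of the acquired point with them) yields identical patient-frame coordinates, which is the stability property discussed above.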
In a preferred embodiment of the invention, the ultrasound instrument 20 comprises an ultrasound probe.
With particular reference to
The ultrasound image processing system further comprises a medical instrument 30, in particular adapted to be inserted into the patient's body.
The medical instrument 30 is coupled to third position sensors S2 (
According to the invention, the third position sensors S2 are positioned on the medical instrument in a position proximal to a handgrip thereof, thus remaining outside the patient's body while the images of interaction between the ultrasound and medical instruments are generated.
In this manner, the third position sensors S2 can be acquired directly by a camera, as described below, since the visual path between the third sensors and the camera is in no way obstructed. The technical effect achieved is a real-time detection of the position of the medical instrument.
The tip of the instrument that comes into contact with and penetrates the patient's body is situated in a position distal from the handgrip.
In a preferred embodiment of the invention, the medical instrument 30 comprises an interventional needle.
In particular, the interventional needle is a needle for biopsy/amniocentesis or the like.
With reference to
The camera 10, according to the invention, is adapted to capture a real scene (Img_Real) of the ultrasound instrument 20 and of the medical instrument 30 during a reciprocal interaction thereof.
In particular, according to the invention, for the purpose of detecting the reciprocal interaction of said instruments it is not necessary for their respective planes of action to be coplanar.
Associated with the camera 10 there is a display 11 configured to display the images/image stream acquired by the camera.
In one embodiment of the invention, shown in particular in
The ultrasound image processing system, according to the invention, further comprises an image processing device 50 (
The image processing device 50 is further configured to process the images 20_Eco_Img and 30_Img originating respectively from the ultrasound instrument 20 and from the camera 10.
The image processing device 50 is adapted to receive the ultrasound image 20_Eco_Img from the ultrasound instrument 20.
In other words, the ultrasound instrument 20 is adapted to transmit the ultrasound image 20_Eco_Img to the image processing device 50.
According to the invention, the camera 10 is adapted to pick up first position data DS1_pos of the ultrasound instrument 20 by means of the first position sensors S1 and to transmit the first position data DS1_pos to the image processing device 50 for a corresponding processing.
Alternatively or additionally, the camera 10 is adapted to pick up second position data DMi of the ultrasound instrument 20 by means of the second position sensors Mi and to transmit the second position data DMi to the image processing device 50 for a corresponding processing.
The camera 10 is further adapted to
According to the invention, the image processing device 50 comprises an electronic processing unit 150 configured to process the images originating from the camera 10 and/or from the instruments 20 and 30.
In the invention presented and in the subsequent claims, the electronic processing unit 150 is presented as divided into distinct functional modules (memory modules or operating modules) for the sole purpose of describing the functions thereof in a clear and complete manner.
The electronic processing unit 150 can consist of a single electronic device, suitably programmed to perform the functions described, and the different modules can correspond to hardware entities and/or routine software forming part of the programmed device.
Alternatively or additionally, said functions can be carried out by a plurality of electronic devices over which the aforesaid functional modules can be distributed. The electronic processing unit 150 can further rely on one or more processors to execute the instructions contained in the memory modules.
The aforesaid functional modules can also be distributed over various local or remote computers based on the architecture of the network they reside in.
The system of the invention enables numerous advantages/technical effects to be obtained for the different individuals involved.
According to the invention, with particular reference to
In other words, the processing unit 150 comprises a first processing module 152A1 configured to process the ultrasound image 20_Eco_Img and the first position data DS1_pos, thereby determining a georeferenced ultrasound image 20_Eco_Img_geo.
Preferably, the ultrasound image 20_Eco_Img_geo is georeferenced relative to a positional reference system Geo_Ref defined by the second position sensors Mi and respective second position data DMi together.
The second position sensors Mi are preferably optical or electromagnetic sensors.
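The georeferencing relative to the positional reference system Geo_Ref can be sketched as a composition of rigid transforms. In this illustrative fragment (the function names and the 4x4 homogeneous-matrix convention are assumptions, not taken from the source), the probe pose obtained from the first position sensors S1 and the patient frame obtained from the second position sensors Mi are both expressed in camera coordinates:

```python
import numpy as np

def rigid(R, t):
    """Assemble a 4x4 homogeneous rigid transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def georeference(T_probe_in_cam, T_patient_in_cam, p_img):
    """Map a point of the ultrasound image (probe coordinates) into the
    patient-fixed reference frame: (patient <- camera) composed with
    (camera <- probe)."""
    T = np.linalg.inv(T_patient_in_cam) @ T_probe_in_cam
    return (T @ np.append(p_img, 1.0))[:3]
```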
With particular reference to
In other words, the processing unit 150 comprises a graphic positioning module 152A2 configured to graphically position the georeferenced ultrasound image 20_Eco_Img_geo along the ultrasound projection beam coming out of the ultrasound instrument 20 along the ultrasound beam plane 20_P_beam, with the optical effect of displaying the georeferenced image 20_Eco_img_geo.
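The graphic positioning of the georeferenced image along the beam plane can be sketched as follows; the local axis convention (x across the probe face, z along the beam direction) and the function name are illustrative assumptions only:

```python
import numpy as np

def beam_plane_corners(R, t, width, depth):
    """Return the four 3D corners of the ultrasound image quad placed on
    the beam plane leaving the probe, given the probe pose (R, t).
    Assumed local convention: x across the probe face, z along the beam."""
    local = np.array([
        [-width / 2, 0.0, 0.0],
        [ width / 2, 0.0, 0.0],
        [ width / 2, 0.0, depth],
        [-width / 2, 0.0, depth],
    ])
    return (R @ local.T).T + t
```

Rendering the georeferenced ultrasound image as a texture on this quad produces the optical effect of the image lying on the ultrasound beam plane.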
Since the third position sensors S2 are positioned so as to remain outside the patient's body, in order to avoid invading the patient's body in any way with said sensors it is necessary to determine a correct theoretical position of the medical instrument inside the body also when it is not visible from the outside.
With particular reference to
In other words, the processing unit 150 comprises a processing module 152A31 configured to process 152A31 the image of the medical instrument 30_Img and the third position data DS2_pos, thereby determining a virtualized image 30_V_Img of the medical instrument 30.
In other words, with particular reference to
According to the invention, therefore, the processing module 152A31 is configured to graphically construct the virtualized image 30_V_Img on the basis of the position data picked up by the third position sensors S2 and a predefined conformation conf_30 of the medical instrument 30.
It follows that, in the case of a medical needle provided with third position sensors S2 and having a predefined shape conf_30, a virtualized image 30_V_Img of the needle, both inside and outside the body, is exactly reconstructed. The operator can thus see the position of the needle in the body and the continuation of the image of the needle outside the body, as if the body were open and the needle accessible.
The technical effect achieved is, therefore, that of rendering the position of the needle detectable and evaluable also in areas where this is not physically possible.
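A minimal sketch of this virtualization step, under the simplifying assumption of a straight, rigid needle of known length (the predefined conformation conf_30 reduced to a single length parameter), with the sensor pose taken near the handgrip; all names are hypothetical:

```python
import numpy as np

def virtualize_needle(sensor_pos, direction, needle_length, n_samples=10):
    """Reconstruct the entire (assumed straight) needle as sampled 3D
    points, from the sensor position near the handgrip and the known
    instrument length -- including the portion hidden inside the body."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    ts = np.linspace(0.0, needle_length, n_samples)
    return np.asarray(sensor_pos, float) + ts[:, None] * d  # handgrip -> tip
```

The last sample approximates the tip position, which is exactly the part of the instrument that is not physically observable once inside the body.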
The processing unit 150 is further configured to process 152A32 the virtualized image 30_V_Img and the third position data DS2_pos, thereby determining a virtualized and georeferenced image 30_V_Img_geo of the medical instrument 30, wherein the virtualized and georeferenced image 30_V_Img_geo lies along a second image plane P_30_Img.
It may be understood that the second image plane P_30_Img is identified in terms of position in space, as it is the plane in which the virtualized and georeferenced image 30_V_Img_geo lies. That image is in turn identified in terms of position in space by means of the virtualized image 30_V_Img, which is graphically constructed on the basis of the position data picked up by the third position sensors S2 and a predefined conformation conf_30 of the medical instrument 30, and by means of the third position data DS2_pos (picked up by means of the third position sensors S2).
In particular, the virtualized image 30_V_Img_geo of the medical instrument 30 is georeferenced in reference to the positional reference system Geo_Ref.
In other words, the processing unit 150 comprises a second processing module 152A32 (
With particular reference to
It may be understood that the second image plane P_30_Img intersects the ultrasound beam plane 20_P_beam, since the virtualized and georeferenced image of the medical instrument 30_V_Img_geo is overlaid onto the georeferenced ultrasound image 20_Eco_Img_geo. The virtualized and georeferenced image of the medical instrument 30_V_Img_geo is defined in terms of spatial coordinates by means of the virtualized image 30_V_Img and the third position data DS2_pos, and the georeferenced ultrasound image 20_Eco_Img_geo is defined in terms of spatial coordinates by means of the ultrasound image 20_Eco_Img and the first position data DS1_pos; consequently, the intersection between the second image plane P_30_Img and the ultrasound beam plane 20_P_beam is defined in terms of spatial coordinates.

In other words, the electronic processing unit 150 comprises an overlay module 152A4 configured to overlay the virtualized and georeferenced image of the medical instrument 30_V_Img_geo onto the georeferenced ultrasound image 20_Eco_Img_geo so that the second image plane P_30_Img intersects the ultrasound beam plane 20_P_beam.
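Once both planes are defined in spatial coordinates, computing their intersection reduces to standard geometry. A hedged sketch (hypothetical names; each plane given by a normal and a point on it, planes assumed non-parallel):

```python
import numpy as np

def plane_intersection(n1, p1, n2, p2):
    """Line of intersection of two planes, each given by a normal and a
    point on the plane. Returns (point_on_line, unit_direction).
    Assumes the planes are not parallel."""
    n1, p1 = np.asarray(n1, float), np.asarray(p1, float)
    n2, p2 = np.asarray(n2, float), np.asarray(p2, float)
    d = np.cross(n1, n2)  # direction of the intersection line
    # Point satisfying both plane equations, with zero component along d
    A = np.array([n1, n2, d])
    b = np.array([n1 @ p1, n2 @ p2, 0.0])
    return np.linalg.solve(A, b), d / np.linalg.norm(d)
```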
The technical effect achieved by said overlay is a determination of an overall image 20Eco_30 (
The major technical effect of this solution is that the operator always has at his or her disposal the reciprocal positions of the entire medical instrument and of the ultrasound plane and does not need to decide how to move the medical instrument to render it visible inside the body by means of an ultrasound image, as is the case in the prior art, based only on his or her own experience and in the absence of objective references. The operator can thus act upon the medical instrument with a full view of where it is positioned and also in reference to the georeferenced ultrasound image 20_Eco_Img_geo.
According to the invention, in fact, the needle is entirely visible and virtualized, i.e. both the part thereof outside the patient's body and the one inside the patient's body are visible.
With particular reference to
In other words, the processing unit 150 comprises a receiving module 151 configured to receive the real captured scene Img_Real from the camera 10. With particular reference to
In other words, the processing unit 150 comprises an overlay module 152B configured to graphically overlay the overall image 20Eco_30 onto the real scene Img_Real received, thereby determining the augmented reality overall image 20Eco_30_RA.
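The graphic overlay onto the real scene can be sketched as per-pixel alpha blending of the rendered overall image onto the camera frame; array shapes and names are illustrative assumptions, not the source's implementation:

```python
import numpy as np

def overlay_augmented(real_frame, overlay, alpha_mask):
    """Blend a rendered overall image onto the camera frame per pixel:
    where alpha is 0 the real scene shows through, where 1 the overlay
    wins. real_frame and overlay: (H, W, 3); alpha_mask: (H, W)."""
    a = alpha_mask[..., None].astype(float)  # (H, W) -> (H, W, 1)
    return a * overlay + (1.0 - a) * real_frame
```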
Furthermore, with reference to
In other words, the processing unit 150 comprises a sending module 152B configured to send the augmented reality overall image 20Eco_30_RA to the image display 11.
The image display 11 is adapted to display the augmented reality overall image 20Eco_30_RA.
The technical effect achieved is the possibility of seeing, with the glasses 1 or the display 11, both the first sensors S1 and the third sensors S2, and of reading the interaction between the ultrasound plane and the medical instrument.
Consequently, when the medical instrument, in particular the needle, disappears into the patient's body, the reciprocal positioning of the needle and of the ultrasound plane remains visible.
A second aspect of the invention relates to a method for processing ultrasound images, comprising the steps of:
Preferably, the method comprises the further steps, on the part of the image processing device 50, of:
The method further comprises the step of providing augmented and virtual reality glasses 1 comprising the camera 10 coupled to the display 11.
Further steps of the method correspond to the previously described functionalities of the ultrasound instrument and/or medical instrument and/or camera and/or image processing device.
It may be understood from the description that, in the system and in the method of the invention, the second image plane P_30_Img corresponds to the plane in which the virtualized and georeferenced image 30_V_Img_geo lies, thus being identified in terms of position in space.
It may be understood from the description that, in the system and in the method of the invention, the virtualized and georeferenced image 30_V_Img_geo is identified in terms of position in space by means of the virtualized image 30_V_Img and the third position data DS2_pos.
It may be understood from the description that, in the system and in the method of the invention, the virtualized image 30_V_Img is graphically constructed on the basis of the position data picked up by the third position sensors S2 and a predefined conformation conf_30 of the medical instrument 30.
It may be understood from the description that, in the system and in the method of the invention, the georeferenced ultrasound image 20_Eco_Img_geo is defined in terms of spatial coordinates by means of the ultrasound image 20_Eco_Img and the first position data DS1_pos.
It may be understood from the description that, in the system and in the method of the invention, the overlay between the virtualized and georeferenced images of the medical instrument 30_V_Img_geo and the georeferenced ultrasound image 20_Eco_Img_geo determines the intersection between the second image plane P_30_Img and the ultrasound beam plane 20_P_beam, defining the intersection in terms of spatial coordinates.
An inventive system and method for processing ultrasound images have been described.
The invention as described enables an innovative interaction between a medical instrument and the practice of ultrasound examination, achieving the following technical effects:
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 102020000023353 | Oct 2020 | IT | national |

| Filing Document | Filing Date | Country | Kind |
| --- | --- | --- | --- |
| PCT/IB2021/059109 | 10/5/2021 | WO | |