The present disclosure is generally directed to interventional medical articles (for example, needles, catheters, cannulas, sheaths, etc.) including features that provide enhanced ultrasound visibility during introduction and/or delivery into a body space, such as, for example, an artery, vein, vessel, body cavity, or drainage site, and more specifically directed to systems and methods for determining the location of the medical article within the body of a patient.
Ultrasonic imaging is used to examine the interior of living tissue, and the resulting images are used to aid in the performance of medical procedures on that tissue. One such procedure is the insertion of an interventional device, such as a needle, to a desired location in the tissue, for instance the insertion of a needle into a lesion or other anomaly in the tissue to take a biopsy, or to inject the tissue with a diagnostic or therapeutic material, such as a local anesthetic or nerve block. As the needle is inserted into the body of the patient, ultrasonic imaging is performed in conjunction with the insertion to illustrate on an associated display the position of the needle within the body of the patient relative to the tissue that is the target for the insertion of the needle.
In order to safely and effectively perform a procedure employing the needle, it is necessary to determine the exact location of the tip of the needle so that the tip can be directed into the desired area of the tissue that is the subject of the procedure. However, in some cases the body of the needle, and particularly the tip of the needle, is not readily apparent in the ultrasound image. For example, during insertion the tip of the needle may be inadvertently directed or deflected out of the imaging plane for the ultrasonic images being obtained. As a result, only the portion of the needle body behind the tip that remains in the imaging plane is visible in the displayed ultrasound image, while the actual position of the tip of the needle is ahead of the visible portion. The user may therefore mistake the portion of the needle illustrated in the ultrasound image for the true location of the needle tip, and further insertion of the needle into the body of the patient can then cause unintentional damage to other organs or unintended injections into vessels.
In the prior art, to enhance the ability of the ultrasound imaging system to provide an accurate display of the position of the needle, including the needle tip, within the body of the patient, needles have been developed that include an echogenic portion, such as those examples disclosed in US Patent Application Publication Nos. US2017/0043100, entitled Echogenic Pattern And Medical Articles Including Same, and US2012/0059247, entitled Echogenic Needle For Biopsy Device, the entireties of which are hereby expressly incorporated herein by reference for all purposes. In certain needles, the echogenic portion can be formed adjacent the tip of the needle in order to enhance the ultrasound imaging of the tip as it is inserted into the body of the patient.
However, even with enhanced echogenic features disposed on the needle, it is still possible for the tip of the needle, including the echogenic features, to be directed or deflected out of the imaging plane. In that situation, the user may still view an ultrasound image showing less than the entirety of the needle and may inadvertently insert the needle further into the patient, creating a highly undesirable situation.
Therefore, it is desirable to develop a system and method for the ultrasonic imaging of a needle inserted into the body of a patient that can provide the user with an accurate indication of the location of the tip of the needle when the needle tip is deflected or directed out of the imaging plane for the ultrasonic imaging system.
In one exemplary embodiment of the invention, an ultrasound imaging system for obtaining ultrasound images of an interior of an object includes a processing unit configured to receive and process acquired ultrasound image data to create ultrasound images derived from the ultrasound image data, the processing unit including a detection and recognition system configured to detect a pattern of echogenic features within the created ultrasound images, a memory unit operably connected to the processing unit and storing information regarding individual interventional devices, including stored echogenic patterns and specific locations and dimensions of the stored echogenic patterns on the individual interventional devices, a display operably connected to the processing unit to present the ultrasound images to a user, and an ultrasound imaging probe operably connected to the processing unit to acquire the ultrasound image data for use by the processing unit to form the ultrasound images, wherein the detection and recognition system is configured to detect a pattern of echogenic features within the ultrasound image data, to compare a detected echogenic pattern to the stored echogenic patterns within the memory unit, and to position an indicator within an ultrasound image on the display illustrating a position of viewable and non-viewable parts of the detected echogenic pattern.
In another exemplary embodiment of the invention, an ultrasound imaging system includes a processing unit configured to receive and process acquired ultrasound image data to create ultrasound images derived from the ultrasound image data, the processing unit including a detection and recognition system configured to detect a pattern of echogenic features within the created ultrasound images, a memory unit operably connected to the processing unit and storing information regarding individual interventional devices, including stored echogenic patterns and specific locations and dimensions of the stored echogenic patterns on the individual interventional devices, a display operably connected to the processing unit to present the ultrasound images to a user, an ultrasound imaging probe operably connected to the processing unit to acquire the ultrasound image data for use by the processing unit to form the ultrasound images, and an interventional device adapted to be inserted into an object adjacent the imaging probe, the interventional device including one or more echogenic portions thereon, the one or more echogenic portions including one or more echogenic features disposed thereon forming one or more echogenic patterns, wherein the detection and recognition system is configured to detect one or more echogenic patterns within the ultrasound image data, to compare a detected echogenic pattern to the stored echogenic patterns within the memory unit, and to position an indicator within an ultrasound image on the display illustrating a position of viewable and non-viewable parts of the detected echogenic pattern.
In still another exemplary embodiment of the method of the invention, a method for providing an indication of viewable and non-viewable parts of an interventional device in an ultrasound image includes the steps of providing an ultrasound imaging system having a processing unit configured to receive and process acquired ultrasound image data to create ultrasound images derived from the ultrasound image data, the processing unit including a detection and recognition system configured to detect a pattern of echogenic features within the created ultrasound images, a memory unit operably connected to the processing unit and storing information regarding individual interventional devices, including stored echogenic patterns and specific locations and dimensions of the stored echogenic patterns on the individual interventional devices, a display operably connected to the processing unit to present the ultrasound images to a user, an ultrasound imaging probe operably connected to the processing unit to acquire the ultrasound image data for use by the processing unit to form the ultrasound images, and an interventional device adapted to be inserted into an object adjacent the imaging probe, the interventional device including one or more echogenic portions thereon, the one or more echogenic portions including one or more echogenic features disposed thereon forming one or more echogenic patterns, wherein the detection and recognition system is configured to detect one or more echogenic patterns within the ultrasound image data, to compare a detected echogenic pattern to the stored echogenic patterns within the memory unit, and to position an indicator within an ultrasound image on the display illustrating a position of viewable and non-viewable parts of the detected echogenic pattern, inserting the interventional device into the object, obtaining ultrasound image data using the probe, matching one or more detected echogenic patterns to corresponding stored echogenic patterns in the memory unit, determining the dimensions of the stored echogenic patterns from the memory unit, determining the viewable and non-viewable parts of the one or more detected echogenic patterns by comparing the dimensions of the stored echogenic patterns with a representation of the one or more detected echogenic patterns in the ultrasound image data, and positioning the indicator within the ultrasound image on the display in alignment with the viewable and non-viewable parts of the one or more detected echogenic patterns on the interventional device.
It should be understood that the brief description above is provided to introduce in simplified form a selection of concepts that are further described in the detailed description. It is not meant to identify key or essential features of the claimed subject matter, the scope of which is defined uniquely by the claims that follow the detailed description. Furthermore, the claimed subject matter is not limited to implementations that solve any disadvantages noted above or in any part of this disclosure.
Referring to
Further, the system 100 includes a processing unit 120 communicatively coupled to the transmit circuitry 110, the beamformer 116, the probe 106, and/or the receive circuitry 114, over a wired or wireless communications network 118. The processing unit 120 may be configured to receive and process the acquired image data, for example, the RF signals according to a plurality of selectable ultrasound imaging modes in near real-time and/or offline mode.
Moreover, in one embodiment, the processing unit 120 may be configured to store the acquired volumetric images, the imaging parameters, and/or viewing parameters in a memory device 122. The memory device 122, for example, may include storage devices such as a random access memory, a read only memory, a disc drive, a solid-state memory device, and/or a flash memory. Additionally, the processing unit 120 may display the volumetric images and/or information derived from the images to a user, such as a cardiologist, for further assessment on an operably connected display 126, for manipulation using one or more connected user input-output devices 124 that communicate information and/or receive commands and inputs from the user, or for processing by a video processor 128 that may be connected and configured to perform one or more functions of the processing unit 120. For example, the video processor 128 may be configured to digitize the received echoes and output a resulting digital video stream on the display device 126.
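By way of illustration only, the relationship between these components can be sketched in software as follows. This is a minimal Python sketch; the class names, attributes, and placeholder processing step are hypothetical stand-ins, not part of any actual implementation of the system 100.

```python
# Hypothetical sketch of the component wiring described above; the
# placeholder processing step stands in for beamforming and image
# formation performed by the processing unit 120.
from dataclasses import dataclass, field


@dataclass
class MemoryUnit:  # stands in for memory device 122
    stored_patterns: dict = field(default_factory=dict)


@dataclass
class ProcessingUnit:  # stands in for processing unit 120
    memory: MemoryUnit

    def process(self, rf_samples: list) -> list:
        # Placeholder for converting received echoes into image data.
        return [abs(sample) for sample in rf_samples]


@dataclass
class Display:  # stands in for display 126
    def show(self, image: list) -> None:
        print(f"displaying frame with {len(image)} samples")


memory_122 = MemoryUnit()
processing_120 = ProcessingUnit(memory=memory_122)
Display().show(processing_120.process([-1.0, 0.5, 2.0]))
```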
Referring now to
In the exemplary illustrated embodiment of
Alternatively, as shown in the illustrated exemplary embodiment of
Referring now to
Looking now at
In an alternative exemplary embodiment, as a substitute for or a supplement to the automatic detection of the pattern 47 by the detection and recognition system 200 to identify the needle 32, the user interface/input device 124 can be operated to allow the user to select the type of needle 32 that is to be used in the procedure. The detection and recognition system 200 can then identify the pattern 47 for the needle 32 selected by the user and operate to locate that pattern 47 within the image data/ultrasound image 202. This information on the needle 32 to be used can be supplied to the detection and recognition system 200 in various manners, such as by manually entering identifying information for the needle 32 through the input device 124, or by scanning a barcode or RFID tag located on the packaging for the needle 32 that includes the identifying information.
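As a purely illustrative sketch of this selection path, the following Python fragment resolves a manually entered or scanned identifier to a stored pattern record; the catalogue contents and identifiers are hypothetical.

```python
# Hypothetical needle catalogue keyed by an identifier that could be typed
# in by the user or decoded from a barcode/RFID tag on the packaging.
NEEDLE_CATALOGUE = {
    "18G-90MM": {"segment_lengths_mm": [5.0, 3.0, 5.0]},
    "21G-70MM": {"segment_lengths_mm": [4.0, 2.0, 4.0]},
}


def select_needle(identifier: str) -> dict:
    """Resolve a user-supplied identifier to its stored pattern record."""
    try:
        return NEEDLE_CATALOGUE[identifier]
    except KeyError:
        raise ValueError(f"unknown needle identifier: {identifier!r}") from None


record = select_needle("18G-90MM")  # e.g., decoded from the package barcode
print(record["segment_lengths_mm"])
```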
If one or more echogenic portions 50 are determined to be present in the ultrasound image data/image 202, in block 302 the detection and recognition system 200 accesses the memory unit 122 containing the stored information on the different patterns of echogenic features associated with particular interventional devices 30/needles 32. The pattern 47 of the echogenic features 34 disposed on the echogenic portion(s) 50 detected by the detection and recognition system 200 is compared to the stored patterns in order to match the detected pattern 47 to the pattern utilized on a particular interventional device 30/needle 32.
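A minimal sketch of the comparison in block 302 follows, representing each pattern as a sequence of echogenic segment lengths and matching within a tolerance. The patterns, tolerance, and matching rule are simplified assumptions, not the actual matching logic of the detection and recognition system 200.

```python
# Hypothetical stored patterns, each expressed as segment lengths in mm
# (e.g., echogenic portion, plain band, echogenic portion).
STORED_PATTERNS = {
    "needle_A": [5.0, 3.0, 5.0],
    "needle_B": [4.0, 2.0, 4.0],
}


def match_pattern(detected_mm, tol_mm=0.5):
    """Return the device whose stored pattern matches the detected lengths."""
    for device, stored in STORED_PATTERNS.items():
        if len(stored) == len(detected_mm) and all(
            abs(s - d) <= tol_mm for s, d in zip(stored, detected_mm)
        ):
            return device
    return None


print(match_pattern([4.9, 3.1, 5.0]))  # -> "needle_A"
```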
Once the pattern 47 of the echogenic portion 50 detected in the ultrasound image data/image 202 is recognized and/or matched with a particular manufacturer, in block 304 the information stored in the memory unit 122 regarding the specific configuration of the particular interventional device 30/needle 32 including the recognized pattern 47 can be employed by the detection and recognition system 200 to determine the position of the needle 32 in relation to the ultrasound image 202. This is accomplished by the detection and recognition system 200 by comparing the location and/or dimensions (e.g., length) of the echogenic portion(s) 50,50′ and associated pattern(s) 47,47′ detected in the ultrasound image 202 and associated with a particular needle 32 with the dimensions of the needle 32 stored in the memory unit 122. For example, if the needle 32 detected in the ultrasound image 202 includes two echogenic portions 50,50′ spaced from one another by a band 52, with the echogenic portions 50,50′ and the band 52 each having a specified length, the detection and recognition system 200 can determine the length of the body 40 of the needle 32 that is present within the image 202 based on the lengths of the echogenic portion(s) 50,50′ and band(s) 52 visible in the image 202.
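Continuing the two-portion example above, the dimension comparison of block 304 can be sketched as follows. The stored lengths and the assumption that segments disappear from the tip end first are illustrative only.

```python
# Hypothetical stored dimensions for the patterned region of the needle:
# tip-side portion 50', band 52, and trailing portion 50 (lengths in mm).
STORED_SEGMENTS_MM = {"portion_50_prime": 5.0, "band_52": 3.0, "portion_50": 5.0}


def out_of_plane_length_mm(visible_segment_lengths_mm):
    """Length of the patterned region, measured from the tip end, that is
    missing from the imaging plane (assumes the tip end leaves first)."""
    stored_total = sum(STORED_SEGMENTS_MM.values())
    return max(stored_total - sum(visible_segment_lengths_mm), 0.0)


# Only the trailing portion 50 and part of the band 52 are visible, so the
# distal 6 mm (rest of the band plus tip-side portion 50') is out of plane.
print(out_of_plane_length_mm([5.0, 2.0]))  # -> 6.0
```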
Using this information, in block 306 the detection and recognition system 200 can provide an enhancement to the representation of the needle 32 within the ultrasound image 202. Referring to
For example, as illustrated in the exemplary embodiment of
If the length of the first echogenic portion 50′ stored in the memory unit 122 corresponds to the length of the first echogenic portion 50′ represented in the ultrasound image 202, the detection and recognition system 200 can provide the device indicator 400 illustrating that the entirety of the first echogenic portion 50′ and the tip 42 are visible within the ultrasound image 202.
Conversely, if the system 200 determines that the length of the first echogenic portion 50′ stored in the memory unit 122 does not correspond to the length of the first echogenic portion 50′ represented in the ultrasound image 202, the detection and recognition system 200 can provide a device indicator 400 illustrating that a portion of the first echogenic portion 50′ and the tip 42 are outside of the image plane/frame 402 represented in the ultrasound image 202.
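The comparison described in the preceding two paragraphs reduces to a simple length test, sketched below; the tolerance is a hypothetical allowance for measurement noise.

```python
def tip_in_plane(stored_length_mm, imaged_length_mm, tol_mm=0.3):
    """True when the full first echogenic portion 50' (and hence the tip
    42) is represented in the ultrasound image 202."""
    return abs(stored_length_mm - imaged_length_mm) <= tol_mm


print(tip_in_plane(5.0, 5.1))  # -> True: tip visible in the image plane
print(tip_in_plane(5.0, 2.0))  # -> False: tip deflected out of the plane
```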
As shown in the illustrated exemplary embodiment of
Alternatively, in the situation where the detection and recognition system 200 determines that less than the entire length of the first echogenic portion 50′ is represented or viewable within the ultrasound image 202, illustrating that the tip 42 has been directed and/or deflected out of the imaging plane for the image 202, the system 200 will position the boundary lines 404 along each side of the needle 32 represented in the ultrasound image 202. However, in this situation the boundary lines 404 presented by the system 200 will extend beyond the length of the needle 32 represented in the ultrasound image 202 to correspond in length and position to the actual position of the tip 42 of the needle 32 as determined by the detection and recognition system 200. As shown in the exemplary illustrated embodiment of
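One way to generate such boundary lines, sketched here in simplified one-dimensional form along the needle axis, is to split a row of dots 406 into the first portions 411 and second portions 412 discussed in the following paragraphs. The coordinates and dot spacing are hypothetical.

```python
def boundary_dots(visible_mm, total_mm, spacing_mm=1.0):
    """Return dot positions (mm along the needle axis) split into a first
    portion alongside the imaged needle and a second portion extending to
    the computed position of the tip 42."""
    dots = [i * spacing_mm for i in range(int(total_mm / spacing_mm) + 1)]
    first_411 = [d for d in dots if d <= visible_mm]   # within the image
    second_412 = [d for d in dots if d > visible_mm]   # out to the true tip
    return first_411, second_412


first_411, second_412 = boundary_dots(visible_mm=7.0, total_mm=13.0)
print(len(first_411), "dots along the imaged needle;",
      len(second_412), "dots extending to the actual tip position")
```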
Further, in block 308 these first portions 411 and second portions 412 can be altered in orientation and/or length by the system 200 as the tip 42 of the needle 32 is moved, e.g., closer to or further away from the plane of the ultrasound image 202, as determined by the system 200 based on the portion(s) 50,50′ and/or band(s) 52 of the needle 32 that are viewable within the ultrasound image 202. As the ultrasound images 202 are presented to the user on the display 126, which can be done in a user-controlled frame rate or cine manner, or as a real-time video display of the current position and movement of the needle 32 within the patient 20, the detection and recognition system 200 can alter the device indicator 400/boundary lines 404 to reflect the real-time position of the needle 32 and tip 42 within and/or relative to the frame/image plane 402 represented by the images 202.
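The per-frame update of block 308 can be sketched as a simple loop; the frame source, drift model, and pacing are hypothetical stand-ins for the live acquisition.

```python
import time

TOTAL_PATTERN_MM = 13.0  # hypothetical stored pattern length


def visible_length_for_frame(frame_index):
    # Stand-in for per-frame detection: here the tip drifts out of plane
    # by 1 mm per frame from a fully visible pattern.
    return max(TOTAL_PATTERN_MM - frame_index, 0.0)


for frame_index in range(4):  # stand-in for the live acquisition loop
    hidden_mm = TOTAL_PATTERN_MM - visible_length_for_frame(frame_index)
    print(f"frame {frame_index}: extend indicator {hidden_mm:.0f} mm beyond image")
    time.sleep(0.01)  # placeholder for frame pacing
```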
In addition to the length of the portions 411,412 of the boundary lines 404, the detection and recognition system 200 can enhance the indication of the location of the tip 42 out of the plane 402 of the ultrasound image 202 using the boundary lines 404. For example, the system 200 can render the second portions 412 of the boundary lines 404 in a color different from that of the first portions 411, such as by changing the color of the dots 406 forming the second portions 412 of the boundary lines 404 as shown in
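A minimal sketch of this color treatment follows; the RGB values are arbitrary placeholders.

```python
def dot_color_406(in_plane):
    """Hypothetical styling: white for dots in the first portions 411,
    red for dots in the second portions 412 beyond the image plane."""
    return (255, 255, 255) if in_plane else (255, 64, 64)


print(dot_color_406(True))   # first portion 411
print(dot_color_406(False))  # second portion 412
```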
As best shown in
In addition to the device indicator 400/boundary lines 404 and path indicator 500, the detection and recognition system 200 can directly enhance the representation of the tip 42 within the ultrasound image 202 because the position of the tip 42 is now known. More specifically, in one exemplary embodiment, the detection and recognition system 200 can brighten the representation and/or the expected area or location of the tip 42 within the ultrasound image 202, or provide another icon 403 aligned with the position of the tip 42 within the ultrasound image 202 as determined by the system 200. Alternatively, the detection and recognition system 200 can detect the motion of the tip 42 to brighten it, such as by changing scan parameters in the area of the image data/ultrasound image 202 where motion of the tip 42 is detected in order to achieve higher resolution in that area.
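A sketch of the brightening enhancement follows, operating on a small grayscale array; the gain, window radius, and image values are hypothetical.

```python
def brighten_tip(image, tip_row, tip_col, radius=1, gain=1.5):
    """Scale pixel intensities in a small window around the computed
    location of the tip 42 within the image."""
    rows, cols = len(image), len(image[0])
    for r in range(max(0, tip_row - radius), min(rows, tip_row + radius + 1)):
        for c in range(max(0, tip_col - radius), min(cols, tip_col + radius + 1)):
            image[r][c] = min(int(image[r][c] * gain), 255)
    return image


frame = [[10, 10, 10], [10, 100, 10], [10, 10, 10]]
print(brighten_tip(frame, tip_row=1, tip_col=1))
```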
With the system 200 additionally being provided with the exact location of the target tissue 102 within the patient 20, the system 200 can also provide information on the display 126 regarding the distance between the tip 42 and the target tissue 102, such as a line extending between the tip 42 and the tissue 102 and/or a real-time measurement 600 (
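The distance readout reduces to a straight-line measurement between the computed tip position and the target location, sketched below with hypothetical coordinates in millimetres.

```python
import math


def tip_to_target_mm(tip_xyz, target_xyz):
    """Straight-line distance between the tip 42 and target tissue 102."""
    return math.dist(tip_xyz, target_xyz)


tip_42 = (12.0, 4.5, 1.0)       # computed tip location (mm), hypothetical
target_102 = (18.0, 8.0, 1.0)   # target tissue location (mm), hypothetical
print(f"{tip_to_target_mm(tip_42, target_102):.1f} mm to target")
```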
The written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.