Medical imaging techniques generally can be used to collect data and generate in-vivo visualization of anatomical areas of interest. One such example is intravascular imaging, where vascular structures and lumens may be imaged. For instance, intravascular imaging may be used to produce one or more images of the coronary artery lumen, coronary artery wall morphology, and devices, such as stents, at or near the coronary artery wall. Images generated using medical imaging techniques can be useful for diagnostic purposes, such as identifying diagnostically significant characteristics of a vessel. However, generally information collected during medical imaging can include data that may not be relevant to the purpose for which the imaging is being performed, and, in some cases, may even obscure clinically useful data.
Systems and methods are disclosed to reduce near-field artifacts from intravascular images. Catheter based intravascular imaging devices are susceptible to interference caused by a catheter body which can introduce near-field image artifacts in an intravascular image. Disclosed systems and methods are adapted to identify near-field image artifacts in imaging data and generate an enhanced intravascular image by reducing near-field image artifacts. One or more techniques can be used to reduce near-field image artifacts including spatial, circumferential, and/or radial filtering, as well as employing shader techniques.
The following drawings are illustrative of particular embodiments of the invention and therefore do not limit the scope of the invention. The drawings are not necessarily to scale, unless so stated. The drawings are intended for use in conjunction with the explanations in the following detailed description. Embodiments of the invention will hereinafter be described in conjunction with the appended drawings, wherein like numerals denote like elements.
The following detailed description is exemplary in nature and is not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the following description provides some practical illustrations for implementing exemplary embodiments of the present invention. Examples of constructions, materials, and processes are provided for selected elements, and all other elements employ that which is known to those of ordinary skill in the field of the invention. Those skilled in the art will recognize that many of the noted examples have a variety of suitable alternatives.
Catheter assembly 102 can include an intravascular imaging device 108 adapted to generate imaging data. Intravascular imaging device 108 can be in communication with imaging engine 140. In some examples, intravascular imaging device 108 is an ultrasonic device adapted to emit and receive ultrasound energy and generate ultrasound data. In some examples, intravascular imaging device 108 is an optical coherence tomography (OCT) device adapted to emit and receive light and generate OCT data.
Translation device 119 can be configured to translate intravascular imaging device 108 of catheter assembly 102. The translation device 119 may comprise a linear translation system (LTS) 122. As is discussed elsewhere herein, LTS 122 may be mechanically engaged with catheter assembly 102 and configured to translate the catheter assembly 102 a controlled distance within the patient 144 during a translation operation, for example a pullback or push-forward operation. System 100 may comprise a patient interface module (PIM) 120 configured to interface the translation device 119 with the catheter assembly 102.
Imaging engine 140 can be in communication with intravascular imaging device 108 and translation device 119. According to some examples, the imaging engine 140 may comprise at least one programmable processor. In some examples, the imaging engine 140 may comprise a computing machine including one or more processors configured to receive commands from a system user 142 and/or display data acquired from catheter assembly 102 via a user interface. The computing machine may include computer peripherals (e.g., keyboard, mouse, electronic display) to receive inputs from the system user 142 and output system information and/or signals received from catheter assembly 102 (e.g., rendered images). In some examples, the user interface of the computing machine may be a touchscreen display configured to act as both an input device and an output device. In some examples, imaging engine 140 may include memory modules for storing instructions, or software, executable by the one or more processors.
The structure of imaging engine 140 can take a variety of forms. In some embodiments, the imaging engine can comprise an integrated machine configured to displace blood and to generate the screening and blood-displaced images. In some embodiments, the imaging engine can include separate injection and imaging apparatuses. In some such embodiments, the injection apparatus can be configured to displace blood, and the imaging apparatus can be configured to generate the screening and blood-displaced images. In some embodiments involving separate injection and imaging apparatuses, the two separate apparatuses can be configured to communicate and synchronize with one another. In some embodiments involving separate injection and imaging apparatuses, the injection apparatus can include a manual injection apparatus.
PIM 230 can provide an electromechanical interface between catheter assembly 240 and imaging engine 210. In some examples, PIM 230 may provide a catheter interface 232 to secure catheter assembly 240 to system 200. The PIM 230 may include a motor 234 configured to provide mechanical energy to rotate an intravascular imaging device of catheter assembly 240. According to some examples, PIM 230 may provide an electrical interface that transmits signals to the intravascular imaging device of catheter assembly 240 and receives return signals. In some examples, the intravascular imaging device may be electrically rotated via a phased array of ultrasound transducers.
Translation device 220 can be configured to provide longitudinal translation of catheter assembly 240. Translation device 220 may comprise a Linear Translation System (LTS). The translation device 220 may be configured to mate with PIM 230 and catheter assembly 240 to enable controlled pullback of an intravascular imaging device of catheter assembly 240.
According to some examples, translation device 220 may feature a translation user interface 222 which may comprise a translation display configured to display translation data associated with the translation of the intravascular imaging device to a user of system 200. In some examples, translation data may include linear distance traversed and/or translation speed. The translation user interface 222 may be configured to receive inputs from a user to control starting/stopping translation, setting translation speed, resetting linear distance traversed to zero, and/or switching to manual mode. In manual mode, a user may freely move the intravascular imaging device of the catheter assembly forward and backward (e.g., distally and proximally). In some examples, the translation device 220 may be configured to enable both pullback and push-forward of the intravascular imaging device at a controlled rate. In another example, the translation device 220 may be configured to oscillate, or cycle, the intravascular imaging device by alternately performing pullback and push-forward operations. In some examples, translation device 220 may include a position sensor configured to measure a distance of a translation operation.
Injection system 250 can be configured to deliver fluid into a vessel of a patient via catheter assembly 240. Injection system 250 may comprise an injector pump 252 configured to deliver one or more fluids (e.g., contrast, saline, therapeutic agent(s)) into the patient. In some examples, the injector pump 252 may be automated, in electrical communication with, and controlled by imaging engine 210. According to some examples, injector pump 252 may comprise a manual pump (e.g., syringe injection) configured to allow a user to manually deliver one or more fluids into the patient. As is discussed elsewhere herein, the injection system 250 may be in fluid communication with an intravascular blood displacement fluid port, which may be associated with catheter assembly 240, such that fluid from the injection system 250 is delivered into a patient's vasculature via the intravascular blood displacement fluid port. As can be appreciated, the injection system 250 may be configured to deliver any number of fluids and any quantity of fluid as appropriate for a specific application of system 200. In some examples, the quantity of blood displacement fluid may comprise a contrast media. In some examples, the quantity of blood displacement fluid may comprise saline.
As noted, in some examples, an injection system may deliver a quantity of fluid (e.g., a bolus of fluid) through an intravascular blood displacement fluid port into a vessel of a patient. In some such examples, catheter assembly 300 may include an injection cannula 342 in fluid communication with the injection system upstream of point 340. The injection cannula 342 can include an injection cannula lumen 344 and an intravascular blood displacement fluid port 346 for delivering the fluid into the vessel. The injection system may deliver small boluses of fluid (e.g., saline, contrast dye, therapeutic agent(s)) into the injection cannula lumen 344, out the intravascular blood displacement fluid port 346, and into the vessel. In other examples, the catheter assembly 300 need not include the injection cannula 342. Instead, the catheter assembly 300 can directly utilize the lumen in which the injection cannula 342 is disposed for conveying a quantity of fluid into the vessel at fluid port 346. The blood displacement fluid port 346 may be located in a proximal section 320 of the catheter assembly 300 upstream of imaging element 308 such that the injected bolus will travel along with the blood flow within the vessel (i.e., left to right with reference to
The injection system can be used to deliver a quantity of fluid (e.g., a bolus of fluid) into a vessel of a patient using the space defined between the catheter sheath 303 and guide catheter 372. In such embodiments, the space defined between the catheter sheath 303 and guide catheter 372 can serve the function of the injection cannula as described for
Referring back to
Computer storage article 214 can be adapted to store instructions executable by processor 212 (e.g., software). In some examples, computer storage article 214 can include one or more non-transitory computer readable storage media which may include volatile and/or non-volatile memory including, e.g., random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, a hard disk, a CD-ROM, a floppy disk, a cassette, magnetic media, optical media, or other computer readable media as appropriate for a specific purpose. In some examples, instructions can be embedded or encoded in computer storage article 214 which can cause processor 212 to perform a method, e.g., when the instructions are executed. For example, computer storage article 214 can store program modules adapted to execute on processor 212. As will be discussed further herein, in some examples the stored program modules can include a receiving module, a detection module, and an image processing module.
System 200 can be adapted to reduce near-field artifacts in an intravascular image. In some examples, near-field artifacts can be caused by interference associated with a catheter. For example, with reference to
As noted above, near-field artifacts can be caused when a catheter body interferes with the emission and reception of wave-based energy. More specifically, catheter body 522 can interfere with the emission and reception of wave-based energy from imaging device 525 to cause near-field artifacts in artifact area 521. In some examples, artifact area 521 can be a subset of the imaging data that includes, or is likely to include, a near-field artifact. The size of artifact area 521 can correspond with an area within artifact distance 530 of imaging device 525. As will be discussed further below, artifact distance 530, and consequently the size of artifact area 521, can be predetermined or dynamically calculated to facilitate reduction of near-field artifacts.
As in
In some examples, catheter assembly 610 can be adapted to provide the option to displace blood 612 from a vessel before imaging the vessel. Where catheter assembly 610 is adapted to displace blood, catheter assembly 610 can be adapted to clear 614 the vessel before generating 616 image data. Clearing 614 the vessel of blood can reduce interference for ultrasound-based imaging elements or provide a line of sight for OCT-based imaging elements.
Receiving module 630 of imaging engine 620 can be adapted to receive 632 imaging data. In some examples, receiving module 630 is adapted to receive 632 imaging data from an intravascular imaging device of catheter assembly 610. Receiving module 630 can also receive imaging parameters/settings from catheter assembly 610. In such examples, imaging parameters/settings can be used to facilitate reduction of near-field artifacts, as the manifestation of near-field artifacts in imaging data can vary depending on the imaging parameters/settings. Imaging parameters/settings that can be taken into account when reducing near-field artifacts can include, but are not limited to, a type of catheter body (e.g., construction/material), whether the vessel is cleared of blood, and imaging frequency.
Detection module 640 of imaging engine 620 can be adapted to provide the option to automatically determine 642 an artifact area and automatically set 644 an artifact area. In some examples, detection module 640 can be adapted to automatically set 644 the artifact area based on imaging parameters passed to imaging engine 620 from catheter assembly 610. In some examples, detection module 640 can be adapted to automatically set 644 the artifact area by analyzing the imaging data to identify a subset of imaging data wherein near-field artifacts are present and setting the artifact area to include the subset of imaging data.
Automatically setting 644 the artifact area can vary depending on whether the application involves a blood-filled lumen or a blood-cleared lumen. In examples associated with blood-filled lumens, detection module 640 can be adapted to detect artifacts having arcs greater than 45 degrees in a Cartesian image. In similar examples, detection module 640 can be adapted to detect artifacts in a blood-filled lumen having an angular spatial frequency less than 0.8 radians. Similarly, detection module 640 can be adapted to detect artifacts having a radial spatial frequency between 6/mm and 8/mm in a polar format image. In some examples, detection module 640 can be adapted to detect artifacts that repeat every 12 to 18 points in a radial direction of a polar format image. In examples associated with blood-cleared lumens, detection module 640 can be adapted to detect artifacts having arcs greater than 10 degrees in a Cartesian image. In similar examples, detection module 640 can be adapted to detect artifacts in a blood-cleared lumen having an angular spatial frequency less than 0.2 radians.
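The arc-based detection above can be sketched as a row-wise check on polar-format imaging data (rows = radial depth, columns = angle). This is a minimal sketch, not the disclosed implementation: the function name, the intensity gating against the row mean, and the 360-sample angular resolution are all illustrative assumptions.

```python
import numpy as np

def detect_artifact_rows(polar_frame, arc_threshold_deg=45.0):
    """Flag rows of a polar-format frame whose longest bright run spans an arc
    wider than arc_threshold_deg. The 45-degree default follows the
    blood-filled-lumen example; 10 degrees would suit a blood-cleared lumen."""
    n_rows, n_angles = polar_frame.shape
    min_run = int(n_angles * arc_threshold_deg / 360.0)
    flagged = np.zeros(n_rows, dtype=bool)
    for r in range(n_rows):
        bright = polar_frame[r] > polar_frame[r].mean()  # crude intensity gate
        # longest circumferential run of bright samples (wrap-around ignored)
        run, best = 0, 0
        for b in bright:
            run = run + 1 if b else 0
            best = max(best, run)
        flagged[r] = best > min_run
    return flagged
```

A row containing a bright arc wider than the threshold is flagged as part of the artifact area, while rows without such a run are left alone.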
The feature of providing the option to automatically determine 642 and set 644 an artifact area provides the advantage of minimizing the processing power and time required to reduce near-field artifacts within the artifact area. More specifically, steps 642 and 644 help to ensure that resources are used to enhance only imaging data likely to include near-field artifacts. This benefit is particularly advantageous where catheter assembly 610 is adapted to provide a live view of a vessel, where increased processing time can increase latency of the image.
Another advantage provided by steps 642 and 644 is that imaging engine 620 can be configurable to work together with a plurality of catheter assemblies. It has been found that the size and/or nature of near-field artifacts can vary based on the method of intravascular imaging. For example, it has been discovered that near-field artifacts are larger when imaging a blood-cleared lumen as compared to a blood-filled lumen. Similarly, the manifestation of near-field artifacts in imaging data can be affected by size, shape, and thickness of a catheter body, material from which a catheter body and/or imaging window is made, imaging frequency of the imaging element, a position of the transducer, and/or wall thickness. Accordingly, the imaging engine can be used together with a variety of catheter assemblies, where imaging engine 620 is adapted to account for these various factors when reducing near-field imaging artifacts.
In examples where detection module 640 does not automatically determine the artifact area, the artifact area can be set 643 manually or predefined by a user. In one application, an imaging device can have a field of view with a radius of approximately 4 mm, measured from a center of the imaging device. Thus, the imaging device can collect image data relating to items within this 4 mm radius (or 8 mm diameter) at the particular longitudinal location within the vessel. A distance from the center of the imaging device to the catheter body can be approximately 0.5 mm along the field of view radius, such that the imaging device's field of view outside the catheter is approximately 3.5 mm. In one example, an artifact distance of 1.5 mm can be manually set or predefined. In other examples, an artifact distance of 1 mm can be manually set or predefined. In some examples, an artifact distance between 0.5 and 1 mm can be manually set or predefined. In some examples, an artifact distance between 0.25 and 1.5 mm can be manually set or predefined. In some examples, an artifact area can correspond with a number of rows of imaging data adjacent to an imaging element, for example as depicted in a polar format image. In some examples, the artifact area corresponds with the first 76 rows of the imaging data adjacent to the imaging element. In other examples, the artifact area corresponds with the first 100 rows of the imaging data adjacent to the imaging element. The ranges disclosed above can be associated with near-field artifacts in intravascular ultrasound imaging. It can be appreciated that other artifact distances are contemplated and can vary as necessary for a particular application/use and are within the spirit of this disclosure. For instance, a manually set or predefined artifact area can be a function of the particular catheter assembly being used (e.g., dimensions of the catheter assembly, such as a distance from a center of the imaging device to the catheter body).
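The mapping from an artifact distance (in mm) to a row count in a polar image can be sketched as below. The 3.5 mm field of view follows the example above; the 512 samples per radial line is an assumed value chosen for illustration. With these assumptions a 0.5 mm artifact distance maps to roughly 73 rows, on the order of the 76-row example.

```python
import numpy as np

def artifact_row_count(artifact_distance_mm, field_of_view_mm=3.5,
                       samples_per_line=512):
    """Convert a radial artifact distance into a number of polar-image rows
    adjacent to the imaging element. samples_per_line is an assumed sampling
    density, not a value from the disclosure."""
    mm_per_row = field_of_view_mm / samples_per_line
    return int(round(artifact_distance_mm / mm_per_row))
```

The artifact area of a polar frame would then simply be `frame[:artifact_row_count(d)]`.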
Image processing module 650 of imaging engine 620 can be adapted to generate 652 enhanced imaging data. In some examples, enhanced imaging data can be generated 652 by reducing a near field artifact from the imaging data using a variety of techniques. In some examples, image processing module can be adapted to generate 654 an enhanced image from the enhanced imaging data and display the enhanced image on a display. The enhanced image generated 654 can be in any suitable format, for example Cartesian format or polar format.
Where a filter is used to reduce near-field artifacts from imaging data, step 668 determines an appropriate filter type. An appropriate filter can include, as one example, application of a sufficient blur to accomplish a low-pass filter. In this example, method 660 is adapted to apply 670 a circumferential filter, apply 672 a radial filter, and/or apply 673 a spatial filter. In some examples, the filters can be iteratively applied. As noted above, in some examples radial filters and spatial filters can be used to reduce near-field artifacts in polar format images, and circumferential filters can be used to reduce near-field artifacts in Cartesian images.
An image processing module can be adapted to apply 670 a circumferential filter. In some examples, a circumferential filter can be adapted to reduce near-field artifacts by filtering an arc associated with the near-field artifact from the imaging data. The circumferential filter can be adapted to filter arcs greater than 45 degrees for imaging data associated with blood-filled lumens, and greater than 10 degrees in blood-cleared lumens. Similarly, the circumferential filter can be adapted to filter artifacts having an angular spatial frequency less than 0.8 radians for imaging data associated with blood-filled lumens, and less than 0.2 radians in blood-cleared lumens.
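One way to estimate the wide-arc (low angular frequency) content a circumferential filter targets is a low-pass along the angular axis with wrap-around padding, since the angular dimension is periodic. This is a sketch under assumptions: the moving-average kernel and its width are illustrative, not the disclosed filter.

```python
import numpy as np

def circumferential_lowpass(polar_frame, kernel_angles=31):
    """Low-pass filter along the angular (column) axis of a polar-format
    frame using a wrap-around moving average. kernel_angles is an assumed
    width; wider kernels keep only wider arcs."""
    kernel = np.ones(kernel_angles) / kernel_angles
    # pad circularly so the filter wraps at the 0/360-degree seam
    padded = np.concatenate(
        [polar_frame[:, -kernel_angles:], polar_frame, polar_frame[:, :kernel_angles]],
        axis=1)
    out = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, padded)
    return out[:, kernel_angles:-kernel_angles]
```

Subtracting this low-pass estimate from the original rows of the artifact area would then suppress the wide-arc component while leaving narrow features largely intact.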
An image processing module can be adapted to apply 672 a radial filter. In some examples, a radial filter can be adapted to filter artifacts having a radial spatial frequency between 6/mm and 8/mm in a polar format image. Similarly, a radial filter can be adapted to filter artifacts that repeat every 12 to 18 points in a radial direction of a polar format image.
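A radial filter targeting content that repeats every 12 to 18 samples in the radial direction can be sketched as an FFT notch along each angular column of a polar frame. The notch implementation is an assumption for illustration; the disclosure does not specify how the radial filter is realized.

```python
import numpy as np

def radial_notch(polar_frame, period_range=(12, 18)):
    """Suppress components repeating every 12-18 samples along the radial
    (row) direction of a polar frame via an FFT notch per angular column."""
    n_rows = polar_frame.shape[0]
    spec = np.fft.rfft(polar_frame, axis=0)
    freqs = np.fft.rfftfreq(n_rows, d=1.0)            # cycles per radial sample
    lo, hi = 1.0 / period_range[1], 1.0 / period_range[0]
    notch = (freqs >= lo) & (freqs <= hi)             # band to zero out
    spec[notch, :] = 0.0
    return np.fft.irfft(spec, n=n_rows, axis=0)
```

A periodic ring pattern with a 15-sample period, for instance, falls inside the notch and is removed, while the DC (tissue brightness) level passes through.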
In some examples, an image processing module can be adapted to apply 673 a spatial filter. For instance, a high-pass spatial filtered image can be calculated by first utilizing a low-pass spatial filter (e.g., low-pass Gaussian filter) to obtain low-pass filtered image data. The low-pass filtered image data can then be subtracted from the original image data (e.g., generated by the imaging device) to obtain high-pass filtered image data. In various embodiments, utilizing a spatial high-pass filter can be beneficial because the image data corresponding to near-field artifacts may have relatively more low frequency content as compared to other items of the imaged vessel (e.g., blood, tissue).
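The low-pass-then-subtract construction described above can be sketched directly. The Gaussian is implemented with NumPy separable convolution to keep the example self-contained; the sigma value and kernel radius are illustrative assumptions.

```python
import numpy as np

def spatial_highpass(image, sigma=3.0):
    """High-pass spatial filter as described: compute a low-pass (Gaussian)
    filtered image, then subtract it from the original image data."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    g = np.exp(-0.5 * (x / sigma) ** 2)
    g /= g.sum()                                      # normalized Gaussian kernel
    # separable Gaussian low-pass: filter rows, then columns
    lowpass = np.apply_along_axis(lambda v: np.convolve(v, g, mode="same"), 1, image)
    lowpass = np.apply_along_axis(lambda v: np.convolve(v, g, mode="same"), 0, lowpass)
    return image - lowpass
```

Because near-field artifacts carry relatively more low-frequency content, they contribute mostly to the subtracted low-pass term, so the high-pass result retains fine structure (e.g., tissue texture) while attenuating the artifact.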
In some examples of the method 660, an image processing module can be adapted to apply a multi-pass per data point of the appropriate one or more filters selected at step 668 (e.g., a shader technique) to reduce one or more near-field artifacts. A shader technique can involve a multi-pass per data point application of one or more filters in a parallel (e.g., vectorized) operation. Shader techniques may be amenable to operations performed via a graphics processing unit (GPU) included in the image processing module, as opposed to non-shader techniques (e.g., a single pass per data point) performed in non-parallel manner on a central processing unit (CPU). In one example, applying a shader technique can include applying a first filter to image data (e.g., a data point). The first filter can be, for instance, a radial, spatial, or circumferential filter. Then, a second filter can be applied to the image data upon which the first filter was applied. This process can be repeated for an appropriate number of passes (e.g., a third filter can be applied to the image data upon which the second filter was applied) to accomplish multi-pass filtering of the same image data. Thus, the method 660 can include multiple uses of a filter on a single data point.
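The multi-pass idea can be sketched as repeated application of a filter chain, where each filter operates on the whole frame as one vectorized operation (the CPU analogue of a GPU shader pass). The function name and pass count are illustrative assumptions.

```python
import numpy as np

def multipass_filter(frame, filters, passes=3):
    """Multi-pass filtering in the spirit of a shader technique: each pass
    applies every filter to the whole frame as a vectorized operation, and
    the output of one pass feeds the next."""
    out = frame.astype(float)
    for _ in range(passes):
        for f in filters:
            out = f(out)          # e.g., radial, spatial, or circumferential
    return out
```

In a GPU implementation each pass would run as a fragment shader over all data points in parallel, with the render target of one pass bound as the input texture of the next.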
Depending on the application in which imaging data is generated, it may be useful to apply one or more of the described operations across multiple frames of imaging data to reduce near-field artifacts. As detailed previously, an imaging device can be configured to generate and receive wave-based energy by rotating at a specific longitudinal location within a vessel. A frame of imaging data can include imaging data generated at the specific longitudinal location within the vessel (e.g., based on a 360 degree rotation of the imaging device at that location). The imaging device can translate longitudinally within the vessel, to a different longitudinal location within the vessel, and generate a second frame of imaging data representative of the new longitudinal location within the vessel. As will be appreciated, a period of time elapses between the times at which the first and second frames are generated. Furthermore, the greater the longitudinal distance within the vessel between any two frames, the greater the period of time will be between these frames.
In utilizing multiple frames of imaging data to reduce near-field artifacts, a first frame of imaging data is designated as a key frame (e.g., a first frame). One or more other frames can be designated as reference frames (e.g., second and third frames). In one instance, the key frame and one or more reference frames may be neighboring frames that are adjacent one another along the longitudinal direction of the vessel. For example, the key frame can have first and second reference frames as the respective forward and backward immediately adjacent frames. In other instances, the key frame and one or more reference frames can be spaced apart by any number of other frames. In some examples, the key frame can be spaced apart from one or more reference frames by 2, 3, 4, 5, 10, 20, 25, 50, or 100 frames as examples. In some cases, the key frame can be spaced apart from one or more reference frames by between 2 and 50 frames, 2 and 25 frames, 2 and 10 frames, or 2 and 5 frames. The further spaced apart the key frame is from the one or more reference frames (e.g., the greater the number of frames between the key frame and the one or more reference frames), the greater the period of time will be between generation of the imaging data of the key and reference frame(s).
In one exemplary embodiment, a key frame and two reference frames can be selected, where multiple other frames are generated at longitudinal locations between the longitudinal locations corresponding to each of the key and reference frames (e.g., the key and reference frames are not adjacent frames). As such, a period of time may pass between the imaging data generated for the selected key and reference frames (e.g., a greater period of time as compared to the selected key and reference frames being adjacent frames). Selecting key and reference frames that are spaced apart in time can allow the selected key and reference frames to capture useful vessel information. For instance, key and reference frames that are far apart in time can include image data representing tissue movement ascertainable when the key and reference frames are compared. Such information can be used in generating enhanced image data.
A filter can be applied to each of the selected key and two reference frames. In particular, the filter may be applied to the artifact area of each of the selected key and two reference frames. As described previously, the artifact area of each frame may be determined automatically or set manually. In one example, the filter applied to the artifact area of each of the selected key and two reference frames can, for instance, be a low-pass filter. Applying the low-pass filter to the artifact area of each of the selected key and two reference frames can include applying circumferential, radial, and/or spatial filters, and in some cases can include, additionally or alternatively, the use of a shader technique.
For instance, in one application a radial filter is applied to the artifact area of each of the selected key and two reference frames. In addition, a circumferential filter is applied to the artifact area of each of the selected key and two reference frames. In such an example, an additional filter can be applied to the artifact area of each of the selected key and two reference frames on a per pixel basis across the selected frames. The per pixel filter can be taken, for instance, using a minimum pixel value. Once the described filtering of the present example has been applied, the resulting filtered key and two filtered reference frames can each be subtracted from the originally generated image data corresponding to the selected key and two reference frames.
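One plausible reading of the per-pixel step above is sketched below: low-pass each selected frame, take a per-pixel minimum across the filtered frames as a shared artifact estimate, and subtract that estimate from each frame's original data. This interpretation, the function name, and the use of an identity placeholder for the low-pass chain are all assumptions layered on the description, not the disclosed implementation.

```python
import numpy as np

def reduce_artifacts_multiframe(key, ref_a, ref_b, lowpass):
    """Multi-frame artifact reduction sketch: low-pass each selected frame,
    combine with a per-pixel minimum, subtract the estimate from each
    original frame. lowpass would be a radial/circumferential filter chain."""
    filtered = [lowpass(f) for f in (key, ref_a, ref_b)]
    artifact_estimate = np.minimum.reduce(filtered)   # per-pixel minimum
    return [f - artifact_estimate for f in (key, ref_a, ref_b)]
```

The minimum is a conservative combiner: a pixel is only attributed to the artifact if it is bright in every filtered frame, which favors keeping tissue that moves between frames.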
Filtering across multiple frames can allow for determination of movement of low frequency data. If movement of low frequency data is detected, then this can indicate that the identified portion of image data represents, for example, tissue that may not be desirable to filter out of an enhanced image in many applications. Near-field artifacts generally will not substantially move across multiple frames, and ascertaining such information in the imaging data can allow near-field artifacts to be filtered out when generating an enhanced image. Thus, in many applications low frequency image data that does not move between frames indicates that such data represents an artifact that can be filtered out when generating an enhanced image.
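The motion test above can be sketched as a per-pixel variation check on the low-pass content of several frames: pixels whose low-frequency value barely changes between frames are candidate artifacts, while pixels that move (e.g., tissue) are preserved. The tolerance value and function name are illustrative assumptions.

```python
import numpy as np

def static_lowfreq_mask(frames, lowpass, tol=1e-3):
    """Flag pixels whose low-frequency content does not move between frames.
    Static low-frequency data is treated as a near-field artifact candidate;
    moving low-frequency data is kept as likely tissue."""
    low = np.stack([lowpass(f) for f in frames])
    spread = low.max(axis=0) - low.min(axis=0)  # per-pixel variation over frames
    return spread < tol
```

The resulting boolean mask could then gate which pixels the near-field filters are allowed to suppress when generating the enhanced image.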
Various examples of the invention have been described. Although the present invention has been described in considerable detail with reference to certain disclosed embodiments, the embodiments are presented for purposes of illustration and not limitation. Other embodiments incorporating the invention are possible. One skilled in the art will appreciate that various changes, adaptations, and modifications may be made without departing from the spirit of the invention and the scope of the appended claims.
2488107 | Aug 2012 | EP |
62221335 | Sep 1987 | JP |
H09000522 | Jan 1997 | JP |
2001333902 | Dec 2001 | JP |
2002530143 | Sep 2002 | JP |
2004180784 | Jul 2004 | JP |
2006014938 | Jan 2006 | JP |
2007029520 | Feb 2007 | JP |
2007229015 | Sep 2007 | JP |
2008508970 | Mar 2008 | JP |
2008536638 | Sep 2008 | JP |
2009545406 | Dec 2009 | JP |
4648652 | Mar 2011 | JP |
2013507227 | Mar 2013 | JP |
2015104463 | Jun 2015 | JP |
0101864 | Jan 2001 | WO |
2006015877 | Feb 2006 | WO |
2006102511 | Sep 2006 | WO |
2006113857 | Oct 2006 | WO |
2006122001 | Nov 2006 | WO |
2007098209 | Aug 2007 | WO |
2008016992 | Feb 2008 | WO |
2008110013 | Sep 2008 | WO |
2011046903 | Apr 2011 | WO |
2014186268 | Nov 2014 | WO |
2017100274 | Jun 2017 | WO |
Entry |
---|
Dumane et al., “Use of Frequency Diversity and Nakagami Statistics in Ultrasonic Tissue Characterization,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 48, No. 5, Sep. 2001, pp. 1139-1146. |
Foster, “Transducer Materials and Probe Construction,” Ultrasound in Medicine and Biology, vol. 26, Supp. 1, 2000, pp. S2-55. |
Frijlink et al., “High Frequency Harmonic Imaging in Presence of Intravascular Stents,” IEEE Ultrasonics Symposium, 2003, pp. 208-211. |
Garcia-Garcia et al., "Imaging of coronary atherosclerosis: intravascular ultrasound," European Heart Journal, vol. 31, 2010, pp. 2456-2469. |
Seo et al., “Sidelobe Suppression in Ultrasound Imaging Using Dual Apodization with Cross-Correlation,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 55, No. 10, Oct. 2008, pp. 2198-2210. |
Shankar et al., “Computer-Aided Classification of Breast Masses in Ultrasonic B-Scans Using a Multiparameter Approach,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 50, No. 8, Aug. 2003, pp. 1002-1009. |
Smith et al., “The Maltese Cross Processor: Speckle Reduction for Circular Transducers,” Ultrasonic Imaging, vol. 10, No. 3, Jul. 1988, pp. 153-170. |
U.S. Appl. No. 61/218,177, titled "Vector Domain Image Enhancement for Mechanically Rotating Imaging Catheters," filed Jun. 18, 2009. |
Van Der Steen et al., “IVUS Harmonic Imaging,” Ultrasound in Medicine and Biology, vol. 26, Supp. 2, 2000, p. A90. |
Wang et al., “Optimizing the Beam Pattern of a Forward-Viewing Ring-Annular Ultrasound Array for Intravascular Imaging,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 49, No. 12, Dec. 2002, pp. 1652-1664. |
Waters et al., “Development of a High-Definition Intravascular Ultrasound Imaging System and Catheter,” IEEE International Ultrasonics Symposium Proceedings, Oct. 18, 2011, 4 pages. |
International Patent Application No. PCT/US2016/054589, International Search Report & Written Opinion dated Dec. 16, 2016, 15 pages. |
Moore et al., “Intravascular Ultrasound Image Processing of Blood-Filled or Blood-Displaced Lumens,” U.S. Appl. No. 15/704,710, filed Sep. 14, 2017, 49 pages. |
Cardinal, M. et al., "Intravascular Ultrasound Image Segmentation: A Fast-Marching Method," Lecture Notes in Computer Science, vol. 2879, Springer, Berlin, Heidelberg, 2003, pp. 432-439. |
Chalana, V. et al., "A Methodology for Evaluation of Boundary Detection Algorithms on Medical Images," IEEE Transactions on Medical Imaging, vol. 16, No. 5, Oct. 1997, pp. 643-645. |
Number | Date | Country |
---|---|---|
20170103498 A1 | Apr 2017 | US |