The present disclosure pertains to medical imaging and to systems and methods for medical imaging. More particularly, the present disclosure pertains to systems and methods for vascular imaging, including intravascular imaging, extravascular imaging, and co-registration thereof.
A wide variety of medical imaging systems and methods have been developed for medical use, for example, use in imaging vascular anatomy. Some of these systems and methods include intravascular imaging modalities and extravascular imaging modalities for imaging vasculature. These systems and methods include various configurations and may operate or be used according to any one of a variety of methods. Of the known vascular imaging systems and methods, each has certain advantages and disadvantages. Accordingly, there is an ongoing need to provide alternative systems and methods for vascular imaging and assessment, and co-registration of imaging.
This disclosure provides alternative medical imaging systems and methods. An example includes a method for vascular imaging co-registration. The method comprises obtaining extravascular imaging data of a portion of a blood vessel. The extravascular imaging data includes an extravascular image showing an intravascular imaging device disposed within the vessel, with an imaging element of the intravascular imaging device disposed at a starting location for a translation procedure during which the imaging element is translated within the blood vessel from the starting location to an ending location. The extravascular imaging data also includes an extravascular contrast image showing the portion of the blood vessel with contrast and showing a visualized anatomical landmark. The method also comprises obtaining intravascular imaging data from the intravascular imaging device during the translation procedure, the intravascular imaging data including one or more intravascular images showing a detected anatomical landmark. The method also comprises marking the starting location and the ending location of the imaging element on the extravascular imaging data; marking a predicted location of the detected anatomical landmark on the extravascular imaging data; and aligning the predicted location of the detected anatomical landmark with the visualized anatomical landmark.
Alternatively or additionally to any of the embodiments above, wherein the extravascular imaging data includes one or both of angiographic image data and fluoroscopic image data.
Alternatively or additionally to any of the embodiments above, wherein the angiographic image data is selected from one or more of two-dimensional angiographic image data; three-dimensional angiographic image data; or computed tomography angiographic image data.
Alternatively or additionally to any of the embodiments above, wherein the extravascular imaging data is video including the extravascular image showing the intravascular imaging device and the extravascular contrast image showing the portion of the blood vessel with contrast.
Alternatively or additionally to any of the embodiments above, wherein the extravascular imaging data is a series of images including the extravascular image showing the intravascular imaging device and the extravascular contrast image showing the portion of the blood vessel with contrast.
Alternatively or additionally to any of the embodiments above, wherein the intravascular imaging data is selected from one or more of intravascular ultrasound data and optical coherence tomography data.
Alternatively or additionally to any of the embodiments above, wherein marking the starting location and the ending location includes using image pattern recognition software.
Alternatively or additionally to any of the embodiments above, wherein marking the starting location and the ending location includes allowing a user to manually mark the starting location and the ending location.
Alternatively or additionally to any of the embodiments above, further including identifying the visualized anatomical landmark on the extravascular imaging data.
Alternatively or additionally to any of the embodiments above, wherein identifying the visualized anatomical landmark on the extravascular imaging data includes using image pattern recognition software.
Alternatively or additionally to any of the embodiments above, wherein identifying the visualized anatomical landmark on the extravascular imaging data includes allowing a user to manually mark the visualized anatomical landmark on the extravascular imaging data.
Alternatively or additionally to any of the embodiments above, wherein identifying the visualized anatomical landmark on the extravascular imaging data includes the image pattern recognition software marking the visualized anatomical landmark on the extravascular imaging data.
Alternatively or additionally to any of the embodiments above, wherein marking the predicted location of the detected anatomical landmark on the extravascular imaging data includes using image pattern recognition software.
Alternatively or additionally to any of the embodiments above, wherein marking the predicted location of the detected anatomical landmark on the extravascular imaging data includes allowing a user to manually mark the predicted location of the detected anatomical landmark on the extravascular imaging data.
Alternatively or additionally to any of the embodiments above, wherein aligning the predicted location of the detected anatomical landmark with the visualized anatomical landmark is performed automatically using software.
Alternatively or additionally to any of the embodiments above, wherein aligning the predicted location of the detected anatomical landmark with the visualized anatomical landmark includes allowing a user to manually align the predicted location of the detected anatomical landmark with the visualized anatomical landmark.
Alternatively or additionally to any of the embodiments above, wherein the translation procedure is performed using an automatic translation system.
Alternatively or additionally to any of the embodiments above, wherein the translation procedure is a pullback.
Alternatively or additionally to any of the embodiments above, wherein during the translation procedure, the imaging element is translated within the blood vessel from the starting location to the ending location at a known speed.
Alternatively or additionally to any of the embodiments above, further including: calculating a path on the extravascular imaging data that the imaging element of the intravascular imaging device will travel during the translation procedure from the starting location to the ending location.
Alternatively or additionally to any of the embodiments above, further including: determining the predicted location of the detected anatomical landmark on the extravascular imaging data using the known speed that the imaging element is translated within the blood vessel from the starting location to the ending location.
Alternatively or additionally to any of the embodiments above, wherein marking the predicted location of the detected anatomical landmark on the extravascular imaging data includes: calculating a path on the extravascular imaging data that the imaging element of the intravascular imaging device will travel during the translation procedure from the starting location to the ending location; and determining the predicted location of the detected anatomical landmark on the extravascular imaging data using the known speed that the imaging element is translated within the blood vessel from the starting location to the ending location.
Alternatively or additionally to any of the embodiments above, further including: estimating accuracy of the imaging co-registration.
Alternatively or additionally to any of the embodiments above, further including: generating a visual indicator representing the estimated accuracy of the imaging co-registration.
Alternatively or additionally to any of the embodiments above, further including: displaying the visual indicator overlaid on the portion of the blood vessel on the extravascular imaging data.
Alternatively or additionally to any of the embodiments above, further including: estimating accuracy of the imaging co-registration for one or more segments of the portion of the blood vessel, generating a visual indicator representing the estimated accuracy of the imaging co-registration for the one or more segments, and displaying the visual indicator on the one or more segments on the extravascular imaging data.
Alternatively or additionally to any of the embodiments above, wherein the visual indicator includes one or more color coded indicators.
Alternatively or additionally to any of the embodiments above, wherein the extravascular imaging data further includes an intermediate extravascular image obtained during the translation procedure showing the intravascular imaging device disposed within the vessel with the imaging element disposed at an intermediate location during the translation procedure between the starting location and the ending location; and the method further includes marking the intermediate location of the imaging element on the extravascular imaging data.
Alternatively or additionally to any of the embodiments above, wherein marking the intermediate location of the imaging element on the extravascular imaging data includes using image pattern recognition software.
Alternatively or additionally to any of the embodiments above, wherein marking the intermediate location of the imaging element on the extravascular imaging data includes allowing a user to manually mark the intermediate location of the imaging element on the extravascular imaging data.
Alternatively or additionally to any of the embodiments above, wherein the extravascular contrast image also shows a second visualized anatomical landmark, and the intravascular imaging data includes one or more additional intravascular images showing a second detected anatomical landmark, the method further including: marking a predicted location of the second detected anatomical landmark on the extravascular imaging data; and aligning the predicted location of the second detected anatomical landmark with the second visualized anatomical landmark.
Alternatively or additionally to any of the embodiments above, wherein the extravascular contrast image also shows a third visualized anatomical landmark, and the intravascular imaging data includes one or more additional intravascular images showing a third detected anatomical landmark, the method further including: marking a predicted location of the third detected anatomical landmark on the extravascular imaging data; and aligning the predicted location of the third detected anatomical landmark with the third visualized anatomical landmark.
Alternatively or additionally to any of the embodiments above, wherein the extravascular contrast image also shows a fourth visualized anatomical landmark, and the intravascular imaging data includes one or more additional intravascular images showing a fourth detected anatomical landmark, the method further including: marking a predicted location of the fourth detected anatomical landmark on the extravascular imaging data; and aligning the predicted location of the fourth detected anatomical landmark with the fourth visualized anatomical landmark.
A computer readable medium having stored thereon in a non-transitory state a program code for use by a computing device, the program code causing the computing device to execute the method of any one of the embodiments above.
A system for vascular imaging co-registration, the system comprising: one or more input port for receiving imaging data; one or more output port; and a controller in communication with the input port and the output port, the controller configured to execute the method of any one of the embodiments above.
Alternatively or additionally to any of the embodiments above, wherein the input port can support one or more of live video and DICOM.
Alternatively or additionally to any of the embodiments above, wherein the output port is configured to output to one or more of a display and a data archive.
A system for intravascular imaging registration, the system comprising: an intravascular imaging device; a computer; and a computer readable medium having stored thereon in a non-transitory state a program code for use by the computer, the program code causing the computer to execute the method of any one of the embodiments above.
A controller comprising: a processor; and memory including instructions executable by the processor to perform the method of any one of the embodiments above.
The above summary of some embodiments is not intended to describe each disclosed embodiment or every implementation of the present disclosure. The Figures, and Detailed Description, which follow, more particularly exemplify these embodiments.
The disclosure may be more completely understood in consideration of the following detailed description in connection with the accompanying drawings, in which:
While the disclosure is amenable to various modifications and alternative forms, specifics thereof have been shown by way of example in the drawings and will be described in detail. It should be understood, however, that the intention is not to limit the invention to the particular embodiments described. On the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure.
For the following defined terms, these definitions shall be applied, unless a different definition is given in the claims or elsewhere in this specification.
All numeric values are herein assumed to be modified by the term “about”, whether or not explicitly indicated. The term “about” generally refers to a range of numbers that one of skill in the art would consider equivalent to the recited value (e.g., having the same function or result). In many instances, the term “about” may include numbers that are rounded to the nearest significant figure.
The recitation of numerical ranges by endpoints includes all numbers within that range (e.g. 1 to 5 includes 1, 1.5, 2, 2.75, 3, 3.80, 4, and 5).
As used in this specification and the appended claims, the singular forms “a”, “an”, and “the” include plural referents unless the content clearly dictates otherwise. As used in this specification and the appended claims, the term “or” is generally employed in its sense including “and/or” unless the content clearly dictates otherwise.
It is noted that references in the specification to “an embodiment”, “some embodiments”, “other embodiments”, etc., indicate that the embodiment described may include one or more particular features, structures, or characteristics. However, such recitations do not necessarily mean that all embodiments include the particular features, structures, or characteristics. Additionally, when particular features, structures, or characteristics are described in connection with one embodiment, it should be understood that such features, structures, or characteristics may also be used in connection with other embodiments whether or not explicitly described unless clearly stated to the contrary.
The following detailed description should be read with reference to the drawings in which similar elements in different drawings are numbered the same. The drawings, which are not necessarily to scale, depict illustrative embodiments and are not intended to limit the scope of the invention.
A number of different medical imaging modalities may be used to evaluate or treat blood vessels. Two general types of imaging modalities include extravascular imaging modalities and intravascular imaging modalities. This disclosure relates to the use and co-registration of these modalities.
Extravascular imaging modalities, such as various forms of radiological imaging, provide extravascular imaging data of a portion of a blood vessel. Some examples include angiography or fluoroscopy imaging modalities, such as two-dimensional angiography/fluoroscopy; three-dimensional angiography/fluoroscopy; or computed tomography angiography/fluoroscopy. Angiography typically involves rendering a radiological view of one or more blood vessels, often with the use of radiopaque contrast media. An angiographic image can also be viewed in real time by fluoroscopy. In general, fluoroscopy uses less radiation than angiography, and is often used to guide medical devices including radiopaque markers within or through vessels. Extravascular imaging data of blood vessels may provide useful information about the blood vessel, the anatomy, or the location or positioning of devices within the blood vessel or anatomy. For example, extravascular imaging data (e.g. angiograms) may provide a comprehensive overall image or series of images or a video of the blood vessel(s) of interest, and may provide a “roadmap” with good temporal resolution for the general assessment of the blood vessel(s) or navigation of devices within blood vessels.
Intravascular imaging modalities provide intravascular imaging data of a portion of a blood vessel. Some examples of intravascular imaging modalities include intravascular ultrasound (IVUS) and optical coherence tomography (OCT). These modalities typically include imaging the vessel itself using a device-mounted intravascular probe including an imaging element disposed within the vessel. Several types of device systems have been designed to track through a vasculature to provide intravascular image data. These can include, but are not limited to, intravascular ultrasound (IVUS) devices and optical coherence tomography (OCT) devices (e.g. catheters, guidewires, etc.). In operation, intravascular device-mounted probes including an imaging element are moved along a blood vessel in the region where imaging is desired. As the probe passes through an area of interest, sets of intravascular image data are obtained that correspond to a series of “slices” or cross-sections of the vessel, the lumen, and surrounding tissue. These devices may include radiopaque material or markers. Such markers are generally positioned near a distal tip or near or on the probe. Therefore, the approximate location of the imaging probe or imaging element can be discerned by observing the procedure on either a fluoroscope or an angiographic image or images. Typically, such imaging devices are connected to a dedicated processing unit or control module, including processing hardware and software, and a display. The raw image data is received by the console, processed to render an image including features of concern, and rendered on the display device. Intravascular imaging data of blood vessels may provide useful information about the blood vessel that is different from or in addition to the information provided by the extravascular imaging data. For example, intravascular imaging data may provide data regarding the cross-section of the lumen, the thickness of deposits on a vessel wall, the diameter of the non-diseased portion of a vessel, the length of diseased sections, the makeup of deposits or plaque on the wall of the vessel, assessment of plaque burden, or assessment of stent deployment.
These two general types of imaging modalities provide different imaging data, and therefore may be complementary to each other. As such, in certain circumstances, it may be desirable to provide or use both general types of medical imaging modalities to evaluate or treat blood vessels. Additionally, it may be useful for the locations of the acquired intravascular imaging data/images to be correlated with their locations on the vessel roadmap obtained by the extravascular imaging data/images. It may be useful to coordinate or “register” (e.g. co-register) the imaging data rendered by the two different modalities. It may also be useful to display the co-registered extravascular imaging data and intravascular imaging data together, for example, on a common display monitor. Some example embodiments disclosed herein may include or relate to some or all of these aspects.
In accordance with some embodiments of the present disclosure, example method(s), system(s), device(s), or software are described herein. These examples include image data acquisition equipment and data/image processors, and associated software, for obtaining and registering (e.g. co-registering) imaging data rendered by the two distinct imaging modalities (e.g. extravascular imaging data and intravascular imaging data). Additionally, or alternatively, example method(s), system(s) or software may generate views on a single display that simultaneously provides extravascular images with positional information and intravascular images associated with an imaging probe (e.g., an IVUS or OCT probe) mounted upon an intravascular device.
The extravascular imaging data may be radiological image data obtained by the angiography/fluoroscopy system 104. Such angiography/fluoroscopy systems are generally well known in the art. The angiography/fluoroscopy system 104 may include an angiographic table 110 that may be arranged to provide sufficient space for the positioning of an angiography/fluoroscopy unit c-arm 114 in an operative position in relation to a patient 100 on the table 110. Raw radiological image data acquired by the angiography/fluoroscopy c-arm 114 may be passed to an extravascular data input port 118 via a transmission cable 116. The input port 118 may be a separate component or may be integrated into or be part of the computer system/sub-system 130. The angiography/fluoroscopy input port 118 may include a processor that converts the raw radiological image data received thereby into extravascular image data (e.g. angiographic/fluoroscopic image data), for example, in the form of live video, DICOM, or a series of individual images. The extravascular image data may be initially stored in memory within the input port 118, or may be stored within the computer 130. If the input port 118 is a separate component from the computer 130, the extravascular image data may be transferred to the computer 130 through the cable 117 and into an input port in the computer 130. In some alternatives, the communications between the devices or processors may be carried out via wireless communication, rather than by cables.
The intravascular imaging data may be, for example, IVUS data or OCT data obtained by the intravascular imaging system/sub-system 106 (e.g. an IVUS or OCT system). Such IVUS and OCT systems are generally well known in the art. The intravascular sub-system 106 may include an intravascular imaging device such as an imaging catheter 120, for example an IVUS or OCT catheter. The imaging device 120 is configured to be inserted within the patient 100 so that its distal end, including a diagnostic assembly or probe 122 (e.g. an IVUS or OCT probe), is in the vicinity of a desired imaging location of a blood vessel. A radiopaque material or marker 123 located on or near the probe 122 may provide indicia of a current location of the probe 122 in a radiological image.
By way of example, in the case of IVUS intravascular imaging data, the diagnostic probe 122 generates ultrasound waves, and receives ultrasound echoes representative of a region proximate the diagnostic probe 122. The probe 122 or catheter 120 may convert the ultrasound echoes into corresponding signals, such as electrical or optical signals. The corresponding signals are transmitted along the length of the imaging catheter 120 to a proximal connector 124. The proximal connector 124 of the catheter 120 is communicatively coupled to the processing unit or control module 126. IVUS versions of the probe 122 come in a variety of configurations, including single and multiple transducer element arrangements. It should be understood that in the context of IVUS, a transducer may be considered an imaging element. In the case of multiple transducer element arrangements, an array of transducers may be arranged linearly along a lengthwise axis of the imaging catheter 120, curvilinearly about the lengthwise axis of the catheter 120, circumferentially around the lengthwise axis, etc.
One example of an IVUS intravascular imaging catheter 120 is shown in
As shown in
Referring back to
Raw intravascular image data (e.g. raw IVUS or OCT data) may be acquired by the imaging catheter 120 and may be passed to the control module 126, for example via connector 124. The control module 126 may be a separate component or may be integrated into or be part of the computer system/sub-system 130. The control module 126 may include a processor that converts or is configured to convert the raw intravascular image data received via the catheter 120 into intravascular image data (e.g. IVUS or OCT image data), for example, in the form of live video, DICOM, or a series of individual images. The intravascular imaging data may include transverse cross-sectional images of vessel segments. Additionally, the intravascular imaging data may include longitudinal cross-sectional images corresponding to slices of a blood vessel taken along the blood vessel's length. The control module 126 may be considered an input port for the computer system/subsystem 130, or may be considered to be connected to an input port of the computer 130, for example, via cable 119 or a wireless connection. The intravascular image data may be initially stored in memory within the control module 126, or may be stored within memory in the computer system/subsystem 130. If the control module 126 is a separate component from the computer system/sub-system 130, the intravascular image data may be transferred to the computer 130, for example through the cable 119, and into an input port in the computer 130. Alternatively, the communications between the devices or processors may be carried out via wireless communication, rather than by cable 119.
The control module 126 may also include one or more components that may be configured to operate the imaging device 120 or control the collection of intravascular imaging data. For example, in the case of an IVUS system, the control module 126 may include one or more of a processor, a memory, a pulse generator, a motor drive unit, or a display. As another example, in the case of an OCT system, the control module 126 may include one or more of a processor, a memory, a light source, an interferometer, optics, a motor drive unit, or a display. In some cases, the control module 126 may be or include a motor drive unit that is configured to control movement of the imaging catheter 120. Such a motor drive unit may control rotation or translation of the imaging catheter 120 or components thereof. In some instances, the control module 126 or motor drive unit may include an automatic translation system that may be configured to translate the imaging catheter 120 in a controlled/measured manner within the patient 100. Such an automatic translation system may be used such that during a translation procedure, the imaging catheter 120 (including an imaging element) is translated within the blood vessel from a starting location to an ending location at a constant or known speed (e.g. the imaging catheter 120 is translated at a specific rate for a known amount of time). In other embodiments, the translation may be done manually. Translation procedures may be, for example, a “pullback” procedure (where the catheter 120 is pulled through the vessel) or a “push-through” procedure (where the catheter 120 is pushed through the vessel). The control module 126 may also include or be formed from hardware and software configured to control intravascular imaging and data collection. For example, the control module 126 may include control features to turn on/off imaging or data collection from/to the catheter 120.
The computer system/sub-system 130 can include one or more controller or processor, one or more memory, one or more input port, one or more output port and/or one or more user interface. The computer 130 obtains or is configured to obtain intravascular image data from or through the intravascular imaging system/sub-system 106 (e.g. IVUS or OCT) and extravascular image data from or through the extravascular imaging system/sub-system 104 (e.g. angiography/fluoroscopy system). The computer 130, or the components thereof, can include software and hardware designed to be integrated into standard catheterization procedures and automatically acquire both extravascular imaging data (e.g. angiography/fluoroscopy) and intravascular imaging data (e.g. IVUS or OCT) through image or video acquisition.
The computer system/sub-system 130, or the components thereof, can include software or hardware that is configured to execute a method for vascular imaging co-registration of the obtained extravascular imaging data and the obtained intravascular imaging data. In that context, the computer 130 may include computer readable instructions or software to execute the method for vascular imaging co-registration as disclosed herein. For example, in some respects the computer may include a processor or a memory which includes software including program code causing the computer to execute the method for vascular imaging co-registration as disclosed herein. For example, the computer/computing device can include a processor or memory including instructions executable by the processor to perform the method for vascular imaging co-registration as disclosed herein. In that context, it can also be appreciated that also disclosed herein is a computer readable medium having stored thereon in a non-transitory state a program code for use by the computer/computing device 130, the program code causing the computing device 130 to execute the method for vascular imaging co-registration as disclosed herein. Additionally, the computer/computing device 130 may be part of or include a system for intravascular imaging registration that includes one or more input port for receiving imaging data; one or more output port; and a controller in communication with the input port and the output port, the controller configured to execute the method for intravascular imaging registration as disclosed herein.
The computer system/sub-system 130 can also include software and hardware that is configured for rendering or displaying imaging, including, for example, extravascular imaging or intravascular imaging derived from the received image data or co-registration method. In some cases, the computer 130 or software can be configured to render both extravascular imaging and intravascular imaging on a single display. In that regard, the system may include a display 150 configured for simultaneously displaying extravascular image data and intravascular image data rendered by the computer 130. The display 150 may be part of the computer system 130 or may be a separate component in communication with the computer system 130, for example through an output port on the computer 130 and a transmission cable 121. In some other cases, however, the communication through the output port may be wireless, rather than by cable. In some examples, the computer 130 or display 150 may be configured to simultaneously provide an angiogram, an IVUS transverse plane view, and an IVUS longitudinal plane view, which may or may not all be co-registered. In other examples, the display may be configured to simultaneously provide an angiogram, an OCT transverse plane view, and an OCT longitudinal plane view, which may or may not be co-registered.
The computer system/sub-system 130 can also include one or more additional output ports for transferring data to other devices. For example, the computer can include an output port to transfer data to a data archive or memory 131. The computer system/sub-system 130 can also include a user interface that may include software and hardware that is configured for allowing an operator to use or interact with the system.
The components of the system 102 may be used cooperatively during a vascular imaging method or procedure that involves the collection of extravascular imaging data and intravascular imaging data during a translation procedure. In the context of performing such a procedure, and obtaining the requisite imaging data, an example method for intravascular imaging registration may be executed or performed.
For example, the patient 100 may be arranged on the table 110 for extravascular imaging of a portion of a blood vessel of interest. The patient 100 or the table may be arranged or adjusted to provide for the desired view of the vessel of interest, in preparation for the collection of extravascular imaging data. Additionally, the intravascular imaging catheter 120 may be introduced intravascularly into the portion of the blood vessel of interest, in preparation for a translation procedure to collect intravascular imaging data. The intravascular imaging catheter 120 can be navigated, and positioned (often under fluoroscopy) within the vessel such that the imaging element is located at a desired starting location for the translation procedure. A guide catheter may be used to aid in navigation. Once in the proper position, a translation procedure may be executed or performed. Before or during the translation procedure, requisite extravascular and intravascular imaging data may be obtained. In this context, or as part of this process, an example method for vascular imaging co-registration or registration may be executed or performed.
Obtaining Extravascular Imaging Data
One aspect of the example method for vascular imaging co-registration includes obtaining extravascular imaging data of a portion of the blood vessel. One component of the obtained extravascular imaging data includes an extravascular image showing the intravascular imaging device disposed within the vessel. This may be represented by
During the translation procedure, the imaging element 182 will be translated within the blood vessel 10 from the starting location 20 to an ending location 30. This extravascular image 151 may also show the ending location 30 for the translation procedure. For example, as indicated above, a guide catheter 190 including a distal end 191 may be used during the procedure. The distal end 191 may include a radiopaque material or marker, and may be visualized or shown on the obtained extravascular image 151, and used as the ending location for the translation procedure. Because the ending location 30 has been determined by identifying the actual location from the obtained angiographic/fluoroscopic image 151, it is also an actual/known location within the vessel 10, and may also be useful during registration. In other embodiments, other reference points shown on the extravascular image 151 may be used to define the starting or ending locations. For example, other devices, stents, anatomical markers, etc., that are shown on the extravascular image 151 may be used.
The obtained extravascular imaging data may also include an extravascular contrast image showing the portion of the blood vessel with contrast and showing one or more visualized anatomical landmark(s). This may be represented, for example, by
The obtained extravascular imaging data may include video data including both the extravascular device image 151 (e.g. image showing the starting position of the intravascular imaging device 120), and the extravascular contrast image 251 (e.g. the “roadmap”). In some cases, the extravascular device image 151 and the extravascular contrast image 251 may be separate individual images that may be combined or superimposed. These images may be obtained automatically by the system as part of a program, which may be initiated by a user, or may be manually requested or obtained by a user interacting with the system, for example, through a user interface.
The order in which the extravascular imaging data is obtained may also vary. For example, as in the shown embodiment of
Obtaining Intravascular Imaging Data
Another aspect of the example method for vascular imaging co-registration includes obtaining intravascular imaging data from the intravascular imaging device during the translation procedure, the intravascular imaging data including one or more intravascular images showing one or more detected anatomical landmark. An example of this may be represented/described with reference to
In this example, the translation procedure is a pullback, where the imaging element 182 is translated within the blood vessel from the starting location 20 to the ending location 30. As may be appreciated, in this example, the intravascular imaging data is intravascular ultrasound data (IVUS). However, as discussed above, in other embodiments, the intravascular imaging data may be generated using other intravascular modalities, for example, optical coherence tomography (OCT), or the like. The obtained intravascular imaging data, in the form of video or a series of images, will include, over the progression of the translation procedure, one or more intravascular images that show one or more detected anatomical landmark. The individually rendered intravascular image frames may be appropriately tagged (e.g., frame number, pullback distance, time stamp, sequence number, etc.) which may be useful to help correlate image data frames, for example, to correlate intravascular image frames and corresponding extravascular (e.g. radiopaque marker) image data frames or to correlate longitudinal cross-sectional intravascular images with transverse cross-sectional intravascular images. As will be seen, in this example, the detected anatomical landmarks that will be detected on IVUS include the four side branches that were shown in the angiogram of this portion of the vessel (e.g. side branches 12, 14, 16, 18 on the angiogram in
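Purely as an illustrative sketch (the field names and values below are hypothetical assumptions, not part of the disclosure), the per-frame tagging described above can be modeled as a simple record carrying a frame number, time stamp, and pullback distance, which is the kind of information later used to correlate intravascular frames with extravascular locations:

    from dataclasses import dataclass

    @dataclass
    class IvusFrameTag:
        """Illustrative per-frame tags for an intravascular image (hypothetical fields)."""
        frame_number: int            # sequence number assigned by the control module
        timestamp_s: float           # acquisition time relative to the start of the pullback
        pullback_distance_mm: float  # distance traveled from the starting location

    # Example: tag frames acquired at 30 frames/s during a 0.5 mm/s automatic pullback.
    FRAME_RATE_HZ = 30.0
    PULLBACK_SPEED_MM_S = 0.5
    tags = [
        IvusFrameTag(
            frame_number=i,
            timestamp_s=i / FRAME_RATE_HZ,
            pullback_distance_mm=PULLBACK_SPEED_MM_S * i / FRAME_RATE_HZ,
        )
        for i in range(5)
    ]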
Each of
Additionally, each of
During the translation procedure, intermediate extravascular imaging data may also be optionally/periodically obtained, for example, to track the actual progression of the intravascular imaging device 120 during the translation procedure. These intermediate extravascular image(s) may be obtained during the translation procedure, and may show the actual/known location of the intravascular imaging device 120 disposed within the vessel 10 with the imaging element disposed at an intermediate location during the translation procedure, between the starting location 20 and the ending location 30. For example, periodically during the translation procedure, the angiography/fluoroscopy system 104 may be activated (e.g. fluoroscopy may be activated) either manually, or automatically, to generate one or more intermediate angiographic/fluoroscopic images, which are obtained by the computer 130, and displayed at that time in the image output 153 on display 150. In at least some embodiments, the translation procedure or the collection of intravascular images or the co-registration method may occur free of continuous angiography/fluoroscopy. In other words, the angiography/fluoroscopy system 104 may be periodically activated, but significant portions or periods of the translation may be performed without active angiography/fluoroscopy. In such embodiments, the actual/known location of the intravascular imaging device 120 is not continuously tracked under angiography/fluoroscopy during the translation procedure.
Each of
Briefly running through the progression of
In the upper right portion of the display 150 of
In the upper right portion of the display 150, image output 152 shows a corresponding intravascular transverse cross-sectional image 452 of the vessel that corresponds to the ending location in the vessel.
Marking One or More Known Location on the Extravascular Imaging Data
Another aspect of the example method for vascular imaging co-registration includes marking the extravascular imaging data for registration. For example, the extravascular imaging data may be marked with actual/known registration points. For example, some embodiments involve marking the starting location 20 and/or the ending location 30 of the translation procedure on the extravascular imaging data.
For example, as described herein, the obtained extravascular imaging data includes an extravascular image 151 (e.g.
In some embodiments, the data from the extravascular image 151, showing actual/known locations or registration points, such as the starting location 20 and the ending location 30 for a translation procedure, may be combined with or superimposed onto the extravascular contrast image 251. In some cases, this process may be done automatically by the system. For example, the computer 130 may include software or hardware that is configured to perform image processing and image-recognition designed to combine or superimpose the data in the images. In other cases, the images may be combined or superimposed manually by a user, for example, through a user interface.
Combining or overlaying/superimposing the data from the extravascular image 151 (e.g. fluoroscopy image) with the data from the extravascular contrast image 251 (e.g. angiogram with contrast) may result in a combined or enhanced extravascular image 651 (e.g. enhanced angiogram), which is extravascular imaging data that is or can be marked. One example of such an extravascular image 651 is depicted in
As can be appreciated in
Other extravascular data or images including or showing other actual/known locations of the imaging element of the intravascular imaging device 120 during the translation procedure may also be combined with or superimposed onto the extravascular contrast image 251 or the extravascular image 651. For example, other extravascular images, such as intermediate extravascular image 351 (
As discussed herein, at least parts of the translation procedure during the collection of intravascular images may occur or be performed free of continuous angiography/fluoroscopy. In other words, the angiography/fluoroscopy system 104 may not be activated, or may only periodically be activated, during the translation procedure, for example to obtain actual/known locations, but significant portions or periods may be performed without angiography/fluoroscopy. In such cases, the actual/known location of the intravascular imaging device 120 is not continuously tracked under angiography/fluoroscopy during the translation procedure.
In embodiments where angiography/fluoroscopy is inactive during significant portions of the translation procedure, the system may be configured to calculate an approximate or predicted location of the imaging element 182 (e.g. as indicated by the radiopaque marker 123) for those portions of the translation procedure when the angiography/fluoroscopy is inactive. For example, such calculations may be based upon its last registered position/location (e.g. the last registered actual/known location of the imaging element 182) and other indicators of catheter movement or location, such as a known pullback distance and speed, a calculated path, or other non-visual position data, or the like.
For example, if an initial location of the imaging element 182 is known (e.g. through one or more actual/known locations or registration points, such as the starting location 20, the ending location 30, or one of the intermediate locations obtained during the translation procedure), and the catheter 120 is pulled by an automatic pullback system at a specific rate for a known amount of time, the calculated/predicted location will lie a distance from the initial location along the path of travel, that distance being the product of the pullback rate and the elapsed time. The computer 130, or components thereof, can include software or hardware designed to make such calculations, and output the results, for example showing or marking the calculated/predicted location on displayed images, as desired. For example, a calculated/predicted location for a particular point during the translation procedure may be superimposed upon the extravascular image 651 or a co-registered image such as that shown in
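As a minimal sketch of the calculation just described (hypothetical names and values; not the disclosed implementation), the predicted travel from the last actual/known registration point is simply the product of the pullback rate and the elapsed time:

    def predicted_travel_mm(pullback_rate_mm_per_s: float, elapsed_time_s: float) -> float:
        """Distance traveled from the last actual/known registration point, assuming
        a constant, known pullback rate (e.g. from an automatic pullback system)."""
        return pullback_rate_mm_per_s * elapsed_time_s

    # Example: a 0.5 mm/s pullback, 12 s after the last fluoroscopically confirmed
    # location, places the imaging element about 6.0 mm farther along the path of travel.
    distance_mm = predicted_travel_mm(0.5, 12.0)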
In some embodiments, a calculated path that the imaging element 182 takes (e.g. calculated path of travel) during the translation procedure may be determined or used or displayed. For example, a predicted/calculated path may extend between the starting location 20 and the ending location 30, and may generally extend along the imaged vessel lumen shown on extravascular contrast imaging data. Data regarding the calculated path may also be used or considered when calculating an approximate or predicted location of the imaging element 182. Some examples of methods that may be used to determine a calculated path include: user-specified points or manual path specification; image pattern recognition; automated two-dimensional and three-dimensional path calculations; user assisted automated path calculations; and combinations of manual and automated calculations of a path. The computer 130, or the components thereof, can include software or hardware designed to make or facilitate such calculations, and output the results, for example showing or marking the calculated path on displayed images, as desired. For example, the calculated path may be superimposed upon the extravascular image 651 or a co-registered image such as that shown in
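One hedged illustration of how a calculated path could be combined with such a predicted travel distance (the polyline representation and helper below are assumptions made for the sketch, not the disclosed algorithm): the path can be stored as a list of image-space points between the starting and ending locations and walked by arc length, converting a predicted distance into a point on the extravascular image.

    import math

    def point_at_distance(path, distance):
        """Walk a polyline path (list of (x, y) image points ordered from the starting
        location to the ending location) by arc length and return the interpolated
        point at the given distance; positions past the end clamp to the ending point."""
        traveled = 0.0
        for (x0, y0), (x1, y1) in zip(path, path[1:]):
            segment = math.hypot(x1 - x0, y1 - y0)
            if traveled + segment >= distance:
                t = (distance - traveled) / segment
                return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
            traveled += segment
        return path[-1]

    # Example: a coarse calculated path between a starting and an ending location.
    path = [(100.0, 200.0), (130.0, 180.0), (170.0, 175.0), (220.0, 190.0)]
    print(point_at_distance(path, 45.0))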
As may be appreciated, there may be error between the calculated/predicted location and the actual/known location. For example, it is expected that at certain periods during which fluoroscopy is inactive, foreshortening issues may be present and cause error between the calculated/predicted location and the actual/known location, especially in a tortuous/winding vessel. However, each subsequent time that the fluoroscope is activated and actual/known location data is acquired and presented to the processor, error between the actual/known location and the predicted/calculated location may be reduced or eliminated by replacing the calculated/predicted position with the actual/known location. Additionally, another aspect of the example method for vascular imaging co-registration disclosed herein includes aligning the predicted location of a particular detected anatomical landmark with a corresponding visualized anatomical landmark, as will be discussed in more detail below. This aspect may also help to alleviate or reduce error/misalignment.
Marking a Predicted Location of a Detected Anatomical Landmark on the Extravascular Imaging Data
Another aspect of the example method for vascular imaging co-registration includes marking a predicted location of detected anatomical landmark(s) on the extravascular imaging data. The intravascular imaging data obtained during the translation procedure will include one or more intravascular images showing one or more detected anatomical landmarks. As can be appreciated, these intravascular images showing the detected anatomical landmarks are included in the intravascular imaging data, obtained using the intravascular imaging device during the translation procedure. For co-registration purposes, the locations of these detected anatomical landmarks (e.g. from IVUS or OCT data) will be correspondingly identified and/or marked and/or registered on the extravascular imaging data (e.g. the angiography/fluoroscopy data). It is useful to know the location (either actual/known location or calculated/predicted location) of the imaging element 182 of the intravascular imaging device 120 when it detected a particular detected anatomical landmark during the translation procedure, which can then be used to mark and/or register that location (of the detected anatomical landmark) on the extravascular imaging data (e.g. the angiography/fluoroscopy data).
In certain specific situations, where it happens that the extravascular imaging device (angiography/fluoroscopy) is active at the same time that the imaging element 182 detects a particular detected anatomical landmark, then the location of the detected anatomical landmark is actual/known. The location of that particular detected anatomical landmark can be marked and/or registered at that location on the extravascular imaging data, using the actual/known location provided by the extravascular imaging.
However, as discussed herein, at least portions of, if not the majority or all of, the translation procedure when intravascular images are collected are performed free of continuous extravascular imaging (e.g. free of angiography/fluoroscopy). In such cases, the actual/known location of the imaging element 182 on the intravascular imaging device 120 as it detects a particular detected anatomical landmark during the translation procedure will not be known. As such, a calculated/predicted location of the imaging element 182 on the intravascular imaging device 120 as it detects a particular detected anatomical landmark during the translation procedure will be used. Methods and/or systems for determining the calculated/predicted location of the imaging element 182 are described above, and may be used in this context. The predicted location(s) of detected anatomical landmark(s) are then marked on the extravascular imaging data.
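Continuing the hypothetical sketches above (and reusing the point_at_distance helper and path from the earlier path sketch; the millimeter-to-pixel scale is likewise an assumption), a landmark detected in the intravascular frame acquired a known time into the pullback could be placed on the extravascular image at the corresponding predicted distance along the calculated path:

    def predicted_landmark_point(path, pullback_rate_mm_per_s, detection_time_s, mm_per_pixel):
        """Predicted image-space location of a detected anatomical landmark, given the
        time during the pullback at which it was detected."""
        distance_mm = pullback_rate_mm_per_s * detection_time_s
        return point_at_distance(path, distance_mm / mm_per_pixel)  # convert mm to pixels

    # Example: a side branch detected 20 s into a 0.5 mm/s pullback, with 0.2 mm per pixel.
    mark = predicted_landmark_point(path, 0.5, 20.0, 0.2)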
For example,
It is also noted that in the extravascular image 751, the visualized anatomical landmarks (e.g. side branches) are identified and/or identifiable—either manually or by the system. In some embodiments, the visualized anatomical landmarks (e.g. side branches 12, 14, 16, 18) may simply be identified manually by a user, for example, by the user evaluating the image on the screen. In some embodiments, the visualized anatomical landmarks (e.g. side branches 12, 14, 16, 18) may be identified automatically by the system. For example, the computer 130 may include software or hardware that is configured to perform image processing and image-recognition. Using the extravascular image data (e.g. angiographic data), the system may perform image processing and image-recognition to identify the visualized anatomical landmarks (e.g. side branches 12, 14, 16, 18).
The visualized anatomical landmarks may also be marked on the extravascular imaging data. For example, the visualized anatomical landmarks (e.g. side branches 12, 14, 16, 18) may be marked with appropriate markers and/or labels on the extravascular image 751. In this case, side branch 12 is marked as SB1, side branch 14 is marked as SB2, side branch 16 is marked as SB3, and side branch 18 is marked as SB4. In some cases, this identification or marking process may be done automatically by the system. For example, the computer 130 may include software or hardware that is configured to perform image processing and image-recognition configured to identify and mark the visualized anatomical landmarks. In other cases, marking the visualized anatomical landmarks may be done manually by a user, for example, through a user interface.
In some cases, these lines/markings/labels may be interactive. For example, through a user interface, a user may actuate one of the lines/markings/labels, and the lines/markings/labels may become highlighted and/or activated. When a line/marking/label for a particular detected anatomical landmark is highlighted or activated, the corresponding marked predicted location for that particular detected anatomical landmark is shown on the extravascular imaging data (e.g. the extravascular image 751), and the corresponding intravascular transverse cross-sectional image for that particular detected anatomical landmark is shown in the image output 152 in the upper right portion of the display 150. For example, as seen in
Aligning the Predicted Location of the Detected Anatomical Landmark with the Visualized Anatomical Landmark
Another aspect of the example method for vascular imaging co-registration includes aligning the predicted location of the detected anatomical landmark with the visualized anatomical landmark.
As discussed herein, the visualized anatomical landmark(s) are identified and/or identifiable on the extravascular imaging data—either manually or automatically by the system. The visualized anatomical landmarks may also, optionally, be marked on the extravascular imaging data—either manually or by the system. Further, the predicted location(s) of detected anatomical landmark(s) may be calculated and marked on the extravascular imaging data—either manually or automatically by the system. However, as discussed herein, there may be some misalignment/error/discrepancy between the predicted and marked location(s) of detected anatomical landmark(s) on the extravascular imaging data, and the corresponding visualized anatomical landmark on the extravascular imaging data. The disclosed method for vascular imaging co-registration may include aligning the predicted location of the detected anatomical landmark with the visualized anatomical landmark, and may help to alleviate this misalignment/error/discrepancy.
For example,
Similar alignment steps may also be performed for any other predicted locations of additional detected anatomical landmarks and corresponding visualized anatomical landmarks that are misaligned.
For example,
Similarly,
Similarly,
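A minimal sketch of one way such an alignment could be carried out (assuming positions are parameterized as arc length along the vessel and that landmark pairs have already been matched, manually or automatically; this is an illustration, not the disclosed algorithm): once each predicted landmark position is paired with its visualized counterpart, the positions assigned to the intravascular frames can be remapped by piecewise-linear interpolation so that every pair coincides, with the actual/known starting and ending locations serving as fixed anchor pairs.

    import numpy as np

    def align_positions(frame_positions_mm, predicted_marks_mm, visualized_marks_mm):
        """Remap per-frame positions so each predicted landmark position lands on its
        paired visualized landmark; positions between pairs are adjusted by
        piecewise-linear interpolation (pairs must be given in increasing order)."""
        return np.interp(frame_positions_mm, predicted_marks_mm, visualized_marks_mm)

    # Example: anchor pairs at the starting (0 mm) and ending (40 mm) locations, plus
    # three side-branch pairs whose predicted and visualized positions disagree slightly.
    predicted = [0.0, 10.0, 22.0, 35.0, 40.0]
    visualized = [0.0, 11.0, 20.0, 36.0, 40.0]
    frame_positions = np.linspace(0.0, 40.0, 9)
    aligned = align_positions(frame_positions, predicted, visualized)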
As a result of the methods and/or systems described herein, the intravascular images generated by the intravascular catheter during the translation procedure are co-registered to the extravascular images, so that every intravascular image (e.g. IVUS or OCT image) is linked to its corresponding location on the extravascular image (e.g. an angiographic image). The resulting co-registered images can be used to render and present a co-registered display including both the extravascular images and the corresponding intravascular images. The co-registered extravascular and intravascular images can be simultaneously displayed, alongside one another, upon the display 150. The co-registered image data may also be stored in the computer 130 or in a long-term storage device 131 for later review, for example in a session separate from the procedure that acquired the extravascular and intravascular image data. The co-registered display may also be rendered in playback mode.
In some embodiments, the system may include software or hardware that is configured to allow a user to scroll and/or track through the series of co-registered images. For example, the system may include a configuration that allows a user to scroll through one of the sets of co-registered images, and as the scrolling occurs, the processor acquires and displays corresponding co-registered images for the other sets of images. For example, with reference to
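As one hedged illustration of how such linked scrolling might consume the co-registration result (the data structures here are hypothetical), each intravascular frame can carry its co-registered point on the extravascular image, so that selecting a frame highlights the paired point and selecting a position on the vessel retrieves the nearest co-registered frame:

    import bisect

    # Hypothetical co-registration result: per-frame position along the vessel (mm),
    # sorted in increasing order, and the paired (x, y) point on the extravascular image.
    frame_positions_mm = [0.0, 1.5, 3.0, 4.5, 6.0]
    angio_points = [(100, 200), (112, 195), (125, 188), (140, 184), (152, 186)]

    def nearest_frame(position_mm):
        """Index of the intravascular frame co-registered closest to a selected position."""
        i = bisect.bisect_left(frame_positions_mm, position_mm)
        if i == 0:
            return 0
        if i == len(frame_positions_mm):
            return len(frame_positions_mm) - 1
        before, after = frame_positions_mm[i - 1], frame_positions_mm[i]
        return i if (after - position_mm) < (position_mm - before) else i - 1

    # Scrolling to frame 3 highlights angio_points[3]; selecting 2.9 mm returns frame 2.
    print(angio_points[3], nearest_frame(2.9))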
Another aspect of the method for vascular imaging co-registration may include estimating the accuracy of the imaging registration. As discussed herein, there may be some misalignment/error/discrepancy between the predicted and marked location(s) of detected anatomical landmark(s) on the extravascular imaging data, and the corresponding visualized anatomical landmark on the extravascular imaging data. As disclosed herein, one aspect of the method for vascular imaging co-registration may include aligning the predicted location of the detected anatomical landmark with the visualized anatomical landmark, and this may help alleviate some of this misalignment/error/discrepancy in the co-registration. However, the fact that this misalignment/error/discrepancy occurred in the first place (e.g. prior to aligning the predicted location of the detected anatomical landmark with the visualized anatomical landmark) may suggest a desire, and in some cases may provide a mechanism, to estimate the accuracy of the image co-registration. The system may include software or hardware that is configured to estimate the accuracy of the co-registration, and in some cases, display and/or otherwise indicate an estimated level of accuracy for portions of or all of the co-registration.
A variety of methods may be used to estimate the accuracy of the co-registration. For example, the error between the predicted locations of detected anatomical landmarks and the corresponding visualized anatomical landmarks can be measured, either individually, in groups, or as a whole over the entire co-registration. The magnitude of these measurements may indicate an estimated level of accuracy of parts or all of the co-registration. For example, if the magnitude of the measured error is large and/or exceeds certain predetermined thresholds, the predicted level of accuracy of the co-registration may be low. On the other hand, if the magnitude of the measured error is small and/or falls below certain predetermined thresholds, the predicted level of accuracy of the co-registration may be high. These estimations of accuracy may be made for portions of, or for the entire, co-registration. Other example factors that may be used in estimating the accuracy of the co-registration may include: the total number of actual/known locations or registration points used in the co-registration (e.g. obtained via fluoroscopy); the distance between actual/known locations or registration points; the total number of alignments performed (e.g. aligning a predicted location of a detected anatomical landmark with a visualized anatomical landmark); the distance between the performed alignments; and the degree of tortuosity of the vessel being analyzed. Other methods of estimating accuracy may involve using curve-based algorithms or formulas, foreshortening prediction formulas or models, and the like.
The estimated level of accuracy may be determined for all, or portions of, the co-registration and/or over all or segments of the portion of the vessel being analyzed. Once an estimated level of accuracy is determined, the estimated level of accuracy for all, or portions of, the co-registration may be displayed. For example, the system may include software or hardware that is configured to generate a visual indicator representing the estimated accuracy of the imaging co-registration. In some embodiments, the visual indicator may be displayed and/or overlaid on all or portions of the illustrated blood vessel on the extravascular imaging data. The visual indicator may include a visual characteristic such as a color, a symbol, an intensity, or the like, and may be overlaid/superimposed on or otherwise associated with segments of the vessel shown. In some cases, different colors, symbols, or intensity levels may be coded and/or used to indicate the level of accuracy. For example, if the level of accuracy is determined to be high for a particular segment, one color (e.g. green) may be overlaid on that segment. If, alternatively, the level of accuracy is determined to be low for another particular segment, another color (e.g. red) may be overlaid on that segment. As can be appreciated, these are given by way of example only, and a broad variety of other configurations are contemplated.
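A minimal sketch of such color coding, assuming a hypothetical overlay callback and the coarse accuracy levels from the previous sketch:

```python
# Illustrative color-coding sketch; the mapping and the overlay callback are
# hypothetical and merely show one way segments could be visually coded.
ACCURACY_COLORS = {"high": "green", "medium": "yellow", "low": "red", "unknown": "gray"}

def overlay_accuracy_indicators(per_segment_accuracy, overlay_segment):
    """Overlay a color on each vessel segment according to its estimated accuracy.

    per_segment_accuracy: dict of segment name -> "high"/"medium"/"low"/"unknown".
    overlay_segment: assumed callback that draws a colored overlay on a segment
    of the vessel shown in the extravascular image.
    """
    for segment, level in per_segment_accuracy.items():
        overlay_segment(segment, ACCURACY_COLORS.get(level, "gray"))
```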
In another aspect, it can be appreciated that there may be error between an actual/known location and a calculated location, and this error may be determined or measured. In some embodiments, this error may be taken into account and used to adjust the co-registration. For example, an error/total-travel-distance ratio may be used as a scaling factor to recalculate and adjust previously calculated/predicted positions on the extravascular image (e.g. angiographic image) for the entire preceding period in which the fluoroscope was inactive.
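A hedged sketch of this adjustment, assuming the error is redistributed proportionally along the recorded travel (the helper name and the exact form of the ratio are illustrative assumptions):

```python
# Hedged sketch of the scaling-factor adjustment described above; expressing the
# error/total-travel-distance ratio as a multiplicative rescaling is an assumption.
def adjust_predicted_distances(predicted_distances_mm, actual_end_distance_mm):
    """Rescale predicted travel distances so the last prediction matches the
    actual/known location observed when the fluoroscope became active again.

    predicted_distances_mm: cumulative predicted travel distances (from the
    starting location) for the period in which the fluoroscope was inactive.
    actual_end_distance_mm: measured cumulative travel distance at the end of
    that period, taken from the extravascular (e.g. fluoroscopic) image.
    """
    predicted_end = predicted_distances_mm[-1]
    if predicted_end == 0:
        return list(predicted_distances_mm)
    scale = actual_end_distance_mm / predicted_end
    return [d * scale for d in predicted_distances_mm]

# Example: predictions drifted short of the known 30 mm pullback position.
adjusted = adjust_predicted_distances([0.0, 9.0, 18.0, 27.0], actual_end_distance_mm=30.0)
```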
It should be understood that this disclosure is, in many respects, only illustrative. Changes may be made in details, particularly in matters of shape, size, and arrangement of steps without exceeding the scope of the disclosure. This may include, to the extent that it is appropriate, the use of any of the features of one example embodiment being used in other embodiments. The invention's scope is, of course, defined in the language in which the appended claims are expressed.
This application claims priority under 35 U.S.C. § 119 to U.S. Provisional Application Serial No. 63/157,427 filed Mar. 5, 2021, the entirety of which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3273447 | Wallace | Sep 1966 | A |
3963323 | Arnold | Jun 1976 | A |
4112941 | Larimore | Sep 1978 | A |
4487206 | Aagard | Dec 1984 | A |
4711246 | Alderson | Dec 1987 | A |
4771782 | Millar | Sep 1988 | A |
4893630 | Roberts, Jr. | Jan 1990 | A |
4953553 | Tremulis | Sep 1990 | A |
5005584 | Little | Apr 1991 | A |
5106455 | Jacobsen et al. | Apr 1992 | A |
5135503 | Abrams | Aug 1992 | A |
5178159 | Christian | Jan 1993 | A |
5238004 | Sahatjian et al. | Aug 1993 | A |
5280786 | Wlodarczyk | Jan 1994 | A |
5313957 | Little | May 1994 | A |
5322064 | Lundquist | Jun 1994 | A |
5414507 | Herman | May 1995 | A |
5421195 | Wlodarczyk | Jun 1995 | A |
5422969 | Eno | Jun 1995 | A |
5425371 | Mischenko | Jun 1995 | A |
5427114 | Colliver et al. | Jun 1995 | A |
5437288 | Schwartz | Aug 1995 | A |
5438873 | Wlodarczyk et al. | Aug 1995 | A |
5450853 | Hastings | Sep 1995 | A |
5573520 | Schwartz et al. | Nov 1996 | A |
5633963 | Rickenbach et al. | May 1997 | A |
5748819 | Szentesi et al. | May 1998 | A |
5755668 | Itoigawa et al. | May 1998 | A |
5772609 | Nguyen et al. | Jun 1998 | A |
5779698 | Clayman | Jul 1998 | A |
5797856 | Frisbie et al. | Aug 1998 | A |
5836885 | Schwager | Nov 1998 | A |
5865801 | Houser | Feb 1999 | A |
5872879 | Hamm | Feb 1999 | A |
5873835 | Hastings et al. | Feb 1999 | A |
5902248 | Millar et al. | May 1999 | A |
5916177 | Schwager | Jun 1999 | A |
5938624 | Akerfeldt et al. | Aug 1999 | A |
5949929 | Hamm | Sep 1999 | A |
5964714 | Lafontaine | Oct 1999 | A |
6112598 | Tenerz et al. | Sep 2000 | A |
6120457 | Coombes et al. | Sep 2000 | A |
6139510 | Palermo | Oct 2000 | A |
6162182 | Cole | Dec 2000 | A |
6167763 | Tenerz et al. | Jan 2001 | B1 |
6196980 | Akerfeldt et al. | Mar 2001 | B1 |
6248083 | Smith et al. | Jun 2001 | B1 |
6265792 | Granchukoff | Jul 2001 | B1 |
6312380 | Hoek et al. | Nov 2001 | B1 |
6394986 | Millar | May 2002 | B1 |
6398738 | Millar | Jun 2002 | B1 |
6409677 | Tulkki | Jun 2002 | B1 |
6428336 | Akerfeldt | Aug 2002 | B1 |
6461301 | Smith | Oct 2002 | B2 |
6506313 | Fetterman et al. | Jan 2003 | B1 |
6508803 | Horikawa et al. | Jan 2003 | B1 |
6565514 | Svanerudh et al. | May 2003 | B2 |
6575911 | Schwager | Jun 2003 | B2 |
6579246 | Jacobsen et al. | Jun 2003 | B2 |
6579484 | Tiernan et al. | Jun 2003 | B1 |
6585660 | Dorando et al. | Jul 2003 | B2 |
6589164 | Flaherty | Jul 2003 | B1 |
6615067 | Hoek et al. | Sep 2003 | B2 |
6663570 | Mott et al. | Dec 2003 | B2 |
6766720 | Jacobsen et al. | Jul 2004 | B1 |
6767327 | Corl et al. | Jul 2004 | B1 |
6776720 | Bartlett | Aug 2004 | B2 |
6908442 | von Malmborg et al. | Jun 2005 | B2 |
6918873 | Millar et al. | Jul 2005 | B1 |
6918882 | Skujins et al. | Jul 2005 | B2 |
6974422 | Millar | Dec 2005 | B1 |
6976965 | Corl et al. | Dec 2005 | B2 |
6993974 | Tenerz et al. | Feb 2006 | B2 |
6994695 | Millar | Feb 2006 | B1 |
7071197 | Leonardi et al. | Jul 2006 | B2 |
7134994 | Alpert et al. | Nov 2006 | B2 |
7162926 | Guziak et al. | Jan 2007 | B1 |
7187453 | Belleville | Mar 2007 | B2 |
7244244 | Racenet et al. | Jul 2007 | B2 |
7259862 | Duplain | Aug 2007 | B2 |
7265847 | Duplain et al. | Sep 2007 | B2 |
7274956 | Mott et al. | Sep 2007 | B2 |
7331236 | Smith et al. | Feb 2008 | B2 |
7532920 | Ainsworth et al. | May 2009 | B1 |
7618379 | Reynolds et al. | Nov 2009 | B2 |
7684657 | Donlagic et al. | Mar 2010 | B2 |
7689071 | Belleville et al. | Mar 2010 | B2 |
7715903 | Hartley et al. | May 2010 | B2 |
7724148 | Samuelsson et al. | May 2010 | B2 |
7731664 | Millar | Jun 2010 | B1 |
7759633 | Duplain et al. | Jul 2010 | B2 |
7783338 | Ainsworth et al. | Aug 2010 | B2 |
7878984 | Jacobsen et al. | Feb 2011 | B2 |
7930014 | Huenneckens et al. | Apr 2011 | B2 |
7946997 | Hübinette | May 2011 | B2 |
8025623 | Millar | Sep 2011 | B1 |
8029447 | Kanz et al. | Oct 2011 | B2 |
8174395 | Samuelsson et al. | May 2012 | B2 |
8216151 | Smith | Jul 2012 | B2 |
8298156 | Manstrom et al. | Oct 2012 | B2 |
8317715 | Belleville et al. | Nov 2012 | B2 |
8343076 | Sela et al. | Jan 2013 | B2 |
8393802 | Stanley et al. | Mar 2013 | B2 |
8410940 | Samuelsson et al. | Apr 2013 | B2 |
8419648 | Corl et al. | Apr 2013 | B2 |
8461997 | Samuelsson et al. | Jun 2013 | B2 |
8485985 | Manstrom et al. | Jul 2013 | B2 |
8491484 | Lewis | Jul 2013 | B2 |
8555712 | Narvaez et al. | Oct 2013 | B2 |
8556820 | Alpert et al. | Oct 2013 | B2 |
8562537 | Alpert et al. | Oct 2013 | B2 |
8583218 | Eberle | Nov 2013 | B2 |
8585613 | Nagano | Nov 2013 | B2 |
8636659 | Alpert et al. | Jan 2014 | B2 |
8641633 | Smith | Feb 2014 | B2 |
8641639 | Manstrom et al. | Feb 2014 | B2 |
8676299 | Schmitt et al. | Mar 2014 | B2 |
8698638 | Samuelsson et al. | Apr 2014 | B2 |
8752435 | Belleville et al. | Jun 2014 | B2 |
8757893 | Isenhour et al. | Jun 2014 | B1 |
8764683 | Meller et al. | Jul 2014 | B2 |
8781193 | Steinberg et al. | Jul 2014 | B2 |
8855744 | Tolkowsky et al. | Oct 2014 | B2 |
8920870 | Weber | Dec 2014 | B2 |
8936401 | Belleville et al. | Jan 2015 | B2 |
8998823 | Manstrom et al. | Apr 2015 | B2 |
9010286 | Novak | Apr 2015 | B2 |
RE45534 | Huennekens et al. | Jun 2015 | E |
9052466 | Belleville et al. | Jun 2015 | B2 |
9095313 | Tolkowsky et al. | Aug 2015 | B2 |
9110255 | Lin et al. | Aug 2015 | B2 |
9149230 | Caron | Oct 2015 | B2 |
9289137 | Corl | Mar 2016 | B2 |
9339348 | Davies et al. | May 2016 | B2 |
9364153 | Merritt et al. | Jun 2016 | B2 |
9375164 | Tolkowsky et al. | Jun 2016 | B2 |
9629571 | Tolkowsky et al. | Apr 2017 | B2 |
RE46562 | Huennekens et al. | Oct 2017 | E |
9855384 | Cohen et al. | Jan 2018 | B2 |
9907527 | Dascal et al. | Mar 2018 | B2 |
9974443 | Merritt et al. | May 2018 | B2 |
10028666 | Gregorich | Jul 2018 | B2 |
10076301 | Millett et al. | Sep 2018 | B2 |
10098702 | Merritt et al. | Oct 2018 | B2 |
10130310 | Alpert et al. | Nov 2018 | B2 |
20020013527 | Hoek | Jan 2002 | A1 |
20030031422 | Inagaki et al. | Feb 2003 | A1 |
20030069522 | Jacobsen et al. | Apr 2003 | A1 |
20030120175 | Ehr | Jun 2003 | A1 |
20030159518 | Sawatari | Aug 2003 | A1 |
20040006277 | Langenhove et al. | Jan 2004 | A1 |
20040073141 | Hartley et al. | Apr 2004 | A1 |
20040181174 | Davis et al. | Sep 2004 | A2 |
20040258370 | Bush | Dec 2004 | A1 |
20050000294 | Tenerz et al. | Jan 2005 | A1 |
20050141817 | Yazaki et al. | Jun 2005 | A1 |
20060052700 | Svanerudh | Mar 2006 | A1 |
20060074318 | Ahmed et al. | Apr 2006 | A1 |
20060122537 | Reynolds et al. | Jun 2006 | A1 |
20070010726 | Loeb | Jan 2007 | A1 |
20070038061 | Huennekens et al. | Feb 2007 | A1 |
20070055162 | Vlahos | Mar 2007 | A1 |
20080119758 | Samuelsson et al. | May 2008 | A1 |
20080285909 | Younge et al. | Nov 2008 | A1 |
20090082678 | Smith | Mar 2009 | A1 |
20090088650 | Corl | Apr 2009 | A1 |
20090116020 | Wu et al. | May 2009 | A1 |
20090192412 | Sela et al. | Jul 2009 | A1 |
20090226128 | Donlagic et al. | Sep 2009 | A1 |
20100022950 | Anderson et al. | Jan 2010 | A1 |
20100087605 | Yamamoto et al. | Apr 2010 | A1 |
20100145308 | Layman et al. | Jun 2010 | A1 |
20100234698 | Manstrom et al. | Sep 2010 | A1 |
20100241008 | Belleville et al. | Sep 2010 | A1 |
20110046477 | Hulvershorn et al. | Feb 2011 | A1 |
20110071407 | Hübinette et al. | Mar 2011 | A1 |
20110098572 | Chen et al. | Apr 2011 | A1 |
20110152721 | Sela | Jun 2011 | A1 |
20110178413 | Schmitt et al. | Jul 2011 | A1 |
20110186294 | Narvaez et al. | Aug 2011 | A1 |
20110229094 | Isenhour et al. | Sep 2011 | A1 |
20110245808 | Voeller et al. | Oct 2011 | A1 |
20110319773 | Kanz et al. | Dec 2011 | A1 |
20120004529 | Tolkowsky | Jan 2012 | A1 |
20120059241 | Hastings et al. | Mar 2012 | A1 |
20120083794 | Martin et al. | Apr 2012 | A1 |
20120122051 | Hackel et al. | May 2012 | A1 |
20120210797 | Yu et al. | Aug 2012 | A1 |
20120227505 | Belleville et al. | Sep 2012 | A1 |
20120238869 | Schmitt et al. | Sep 2012 | A1 |
20120245457 | Crowley | Sep 2012 | A1 |
20120259273 | Moshinsky et al. | Oct 2012 | A1 |
20120265102 | Leo et al. | Oct 2012 | A1 |
20130046190 | Davies | Feb 2013 | A1 |
20130051731 | Belleville et al. | Feb 2013 | A1 |
20130190633 | Dorando | Jul 2013 | A1 |
20130218032 | Belleville | Aug 2013 | A1 |
20130296718 | Ranganathan et al. | Nov 2013 | A1 |
20130296722 | Warnking et al. | Nov 2013 | A1 |
20130317372 | Eberle et al. | Nov 2013 | A1 |
20130345574 | Davies et al. | Dec 2013 | A1 |
20140005558 | Gregorich | Jan 2014 | A1 |
20140058275 | Gregorich et al. | Feb 2014 | A1 |
20140066789 | Nishigishi et al. | Mar 2014 | A1 |
20140081244 | Voeller et al. | Mar 2014 | A1 |
20140094691 | Steinberg et al. | Apr 2014 | A1 |
20140094693 | Cohen et al. | Apr 2014 | A1 |
20140103273 | Nakajima | Apr 2014 | A1 |
20140107624 | Belleville | Apr 2014 | A1 |
20140121475 | Alpert et al. | May 2014 | A1 |
20140135633 | Anderson et al. | May 2014 | A1 |
20140180028 | Burkett | Jun 2014 | A1 |
20140205235 | Benjamin et al. | Jul 2014 | A1 |
20140207008 | Davies | Jul 2014 | A1 |
20140241669 | Belleville et al. | Aug 2014 | A1 |
20140248021 | Belleville et al. | Sep 2014 | A1 |
20140275996 | Stigall | Sep 2014 | A1 |
20140276109 | Gregorich | Sep 2014 | A1 |
20140276142 | Dorando | Sep 2014 | A1 |
20140309533 | Yamashika | Oct 2014 | A1 |
20140350414 | McGowan et al. | Nov 2014 | A1 |
20150003783 | Benjamin et al. | Jan 2015 | A1 |
20150003789 | Webler | Jan 2015 | A1 |
20150025330 | Davies et al. | Jan 2015 | A1 |
20150025398 | Davies et al. | Jan 2015 | A1 |
20150032011 | McGowan et al. | Jan 2015 | A1 |
20150051499 | McGowan | Feb 2015 | A1 |
20150078714 | Isenhour et al. | Mar 2015 | A1 |
20150080749 | Anderson et al. | Mar 2015 | A1 |
20150112210 | Webler | Apr 2015 | A1 |
20150119705 | Tochterman et al. | Apr 2015 | A1 |
20150133800 | McCaffrey | May 2015 | A1 |
20150141842 | Spanier | May 2015 | A1 |
20150161790 | Takashi et al. | Jun 2015 | A1 |
20150164467 | Suetoshi et al. | Jun 2015 | A1 |
20150198774 | Lin et al. | Jul 2015 | A1 |
20150230713 | Merritt et al. | Aug 2015 | A1 |
20150230714 | Davies et al. | Aug 2015 | A1 |
20150301288 | Thornton, Jr. | Oct 2015 | A1 |
20150305633 | McCaffrey | Oct 2015 | A1 |
20150323747 | Leigh et al. | Nov 2015 | A1 |
20160008084 | Merritt et al. | Jan 2016 | A1 |
20160135757 | Anderson et al. | May 2016 | A1 |
20160135787 | Anderson et al. | May 2016 | A1 |
20160136392 | Wenderow et al. | May 2016 | A1 |
20160157787 | Merritt et al. | Jun 2016 | A1 |
20160157802 | Anderson | Jun 2016 | A1 |
20160157803 | Keller | Jun 2016 | A1 |
20160157807 | Anderson et al. | Jun 2016 | A1 |
20160166327 | Keller | Jun 2016 | A1 |
20160206214 | Davies et al. | Jul 2016 | A1 |
20160262627 | Hecker et al. | Sep 2016 | A1 |
20170065225 | Hanson | Mar 2017 | A1 |
20170164925 | Marshall et al. | Jun 2017 | A1 |
20180078170 | Panescu et al. | Mar 2018 | A1 |
20180168732 | Trousset et al. | Jun 2018 | A1 |
20180192983 | Dascal et al. | Jul 2018 | A1 |
20180228387 | Park et al. | Aug 2018 | A1 |
20180263507 | Merritt et al. | Sep 2018 | A1 |
20180354106 | Moore | Dec 2018 | A1 |
20190083046 | Alpert et al. | Mar 2019 | A1 |
Number | Date | Country |
---|---|---|
102469943 | May 2012 | CN |
202014100938 | Mar 2014 | DE |
0235992 | Sep 1987 | EP |
0738495 | Oct 1996 | EP |
0879615 | Nov 1998 | EP |
0879617 | Nov 1998 | EP |
1039321 | Sep 2000 | EP |
0750879 | Nov 2000 | EP |
1136032 | Sep 2001 | EP |
1136036 | Sep 2001 | EP |
1136036 | Feb 2003 | EP |
1136032 | Sep 2003 | EP |
1479407 | Nov 2004 | EP |
1925958 | May 2008 | EP |
1927316 | Jun 2008 | EP |
1440761 | Jun 1976 | GB |
2300978 | Nov 1996 | GB |
S53141644 | Dec 1978 | JP |
H08257128 | Oct 1996 | JP |
H08280634 | Oct 1996 | JP |
H10501339 | Feb 1998 | JP |
H10337280 | Dec 1998 | JP |
H1172399 | Mar 1999 | JP |
H11258476 | Sep 1999 | JP |
2005291945 | Oct 2005 | JP |
2008304731 | Dec 2008 | JP |
200910182 | Jan 2009 | JP |
2010233883 | Oct 2010 | JP |
2013132886 | Jul 2013 | JP |
201442645 | Mar 2014 | JP |
2014061268 | Apr 2014 | JP |
2014511114 | May 2014 | JP |
2017526407 | Sep 2017 | JP |
2018057835 | Apr 2018 | JP |
9313707 | Jul 1993 | WO |
9533983 | Dec 1995 | WO |
9626671 | Sep 1996 | WO |
9945352 | Sep 1999 | WO |
2007058616 | May 2007 | WO |
2007130163 | Nov 2007 | WO |
2008034010 | Mar 2008 | WO |
2008076931 | Jun 2008 | WO |
2009042865 | Apr 2009 | WO |
200807693 | Feb 2010 | WO |
2011027282 | Mar 2011 | WO |
2011090744 | Jul 2011 | WO |
2011123689 | Oct 2011 | WO |
2012000798 | Jan 2012 | WO |
2012090210 | Jul 2012 | WO |
2012091783 | Jul 2012 | WO |
2013033489 | Mar 2013 | WO |
2014025255 | Feb 2014 | WO |
2015059311 | Apr 2015 | WO |
2016005944 | Jan 2016 | WO |
2016187231 | Nov 2016 | WO |
2017013020 | Jan 2017 | WO |
2017056007 | Apr 2017 | WO |
Entry |
---|
International Search Report and Written Opinion dated May 29, 2017 for International Application No. PCT/US2017/018905. |
Matsuo et al.; “Visualization of the Improvement of Myocardial Perfusion after Coronary Intervention using Motorized Fractional Flow Reserve Pullback Curve,” Cardiovascular Intervention and Therapeutics, vol. 33, pp. 99-108, 2016. |
International Search Report and Written Opinion dated Oct. 22, 2018 for International Application No. PCT/US2018/044153. |
Van't Veer et al., “Comparison of Different Diastolic Resting Indexes to iFR. Are They Equal?”, Journal of the American College of Cardiology, 70(25): 3088-3096, Dec. 18, 2017. |
Jaroslaw et al., “Two Stage EMG Onset Detection Method”, Archives of Control Sciences, 22(4): 427-440, Dec. 1, 2012. |
International Search Report and Written Opinion dated May 7, 2019 for International Application No. PCT/US2019/019247. |
International Search Report and Written Opinion dated May 20, 2022 for International Application No. PCT/US2022/018456. |
International Search Report and Written Opinion dated Jun. 19, 2019 for International Application No. PCT/US2019/027512. |
International Search Report and Written Opinion dated Jul. 3, 2019 for International Application No. PCT/US2019/023488. |
International Search Report and Written Opinion dated Jul. 8, 2019 for International Application No. PCT/US2019/026055. |
Number | Date | Country | |
---|---|---|---|
20220284606 A1 | Sep 2022 | US |
Number | Date | Country | |
---|---|---|---|
63157427 | Mar 2021 | US |