The present disclosure relates generally to coregistration of intraluminal and extraluminal data. In particular, intraluminal data is coregistered to an x-ray image obtained without contrast injection.
Physicians use many different medical diagnostic systems and tools to monitor a patient's health and diagnose and treat medical conditions. Different modalities of medical diagnostic systems may provide a physician with different images, models, and/or data relating to internal structures within a patient. These modalities include invasive devices and systems, such as intravascular systems, and non-invasive devices and systems, such as external ultrasound systems or x-ray systems. Using multiple diagnostic systems to examine a patient's anatomy provides a physician with added insight into the condition of the patient.
In the field of intravascular imaging and physiology measurement, co-registration of data from invasive devices (e.g. intravascular ultrasound (IVUS) devices) with images collected non-invasively (e.g. via x-ray angiography and/or x-ray venography) is a powerful technique for improving the efficiency and accuracy of vascular catheterization procedures. Co-registration identifies the locations of intravascular data measurements along a blood vessel by mapping the data to an x-ray image of the vessel. A physician may then see on an angiography image exactly where along the vessel a measurement was made, rather than estimate the location.
Coregistration of intravascular data to locations along a blood vessel typically requires introduction of a contrast agent into the patient vasculature. The contrast agent makes otherwise non-radiopaque blood vessels appear in x-ray images. When presented to a user, the locations of the intravascular data are displayed along the contrast-filled vessel in the x-ray image. Introducing contrast agent, however, can be time-consuming and prone to error. Some patients may also not tolerate contrast agent well, which can cause discomfort.
Embodiments of the present disclosure are systems, devices, and methods for coregistering intraluminal data and/or annotations to locations along a vessel of an x-ray image obtained without contrast. In a no-contrast x-ray image, the vessel itself is not visible in the image. Aspects of the present disclosure advantageously allow a user to perform coregistration with a no-contrast x-ray image or a low-dose contrast x-ray image. This advantageously allows coregistration procedures to be performed for patients with Chronic Kidney Disease (CKD), or other sensitivities to x-ray contrast agent, without exposing them to contrast dyes. This also allows patients, particularly those with CKD, to be discharged after a coregistered intraluminal procedure the same day with less concern for the development of Contrast Induced Nephropathy (CIN). Same-day discharge is cost effective and safer for patients and has been shown to be safe even after the most complex interventions.
Aspects of the present invention may include zero-contrast coregistration and/or an optimized co-registration workflow for interventional vascular procedures performed under x-ray without contrast injection. Multiple zero-contrast x-ray images are obtained during an intravascular procedure. A radiopaque portion of an intravascular device is seen in each zero-contrast x-ray image. The positions of the device in each image form a pathway. The pathway is then processed and a motion-corrected centerline pathway is determined. This motion-corrected centerline pathway is overlaid on one of the zero-contrast x-ray images and displayed to a user. The user may edit the shape of the pathway and/or confirm that the shape of the pathway is correct. The positions at which intravascular data was collected may then be associated with locations along the pathway, allowing a physician to see where the intravascular data was obtained within an x-ray image.
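By way of non-limiting illustration only, the following Python sketch outlines one possible realization of this workflow under simplifying assumptions. The function names (detect_radiopaque_tip, motion_corrected_centerline, coregister) and the specific image-processing choices (darkest-pixel tip detection, moving-average smoothing, nearest-in-time matching) are placeholders chosen to keep the example short; they are not the particular algorithms of the present disclosure.

```python
# Hypothetical sketch of a zero-contrast coregistration pipeline.
import numpy as np

def detect_radiopaque_tip(frame: np.ndarray) -> tuple[int, int]:
    """Locate the darkest pixel as a stand-in for the radiopaque device tip."""
    row, col = np.unravel_index(np.argmin(frame), frame.shape)
    return int(row), int(col)

def motion_corrected_centerline(points: np.ndarray, window: int = 5) -> np.ndarray:
    """Smooth the raw tip positions with a moving average to suppress
    periodic (e.g., cardiac/respiratory) motion; a stand-in for the
    motion-compensation step described above."""
    kernel = np.ones(window) / window
    return np.column_stack([
        np.convolve(points[:, i], kernel, mode="same") for i in range(2)
    ])

def coregister(xray_frames, xray_times, ivus_times):
    """Associate each intravascular acquisition time with a location on the
    motion-corrected pathway derived from zero-contrast x-ray frames."""
    xray_times = np.asarray(xray_times, dtype=float)
    ivus_times = np.asarray(ivus_times, dtype=float)
    tips = np.array([detect_radiopaque_tip(f) for f in xray_frames], dtype=float)
    pathway = motion_corrected_centerline(tips)
    # For each IVUS acquisition time, pick the x-ray frame nearest in time.
    indices = np.abs(ivus_times[:, None] - xray_times[None, :]).argmin(axis=1)
    return pathway, pathway[indices]

# Example with synthetic data: 20 frames of 64x64 "x-ray" images.
rng = np.random.default_rng(0)
frames = []
for k in range(20):
    img = rng.uniform(0.5, 1.0, (64, 64))
    img[10 + k, 30] = 0.0  # simulated radiopaque tip moving down the image
    frames.append(img)
xray_t = np.linspace(0.0, 2.0, 20)
ivus_t = np.linspace(0.0, 2.0, 50)
pathway, ivus_locations = coregister(frames, xray_t, ivus_t)
print(pathway.shape, ivus_locations.shape)  # (20, 2) (50, 2)
```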
In an exemplary aspect, a system is provided. The system includes a processor circuit configured for communication with an extraluminal imaging device and an intraluminal catheter or guidewire, wherein the processor circuit is configured to: receive a first extraluminal image obtained by the extraluminal imaging device; receive a plurality of second extraluminal images obtained by the extraluminal imaging device during movement of the intraluminal catheter or guidewire within a body lumen of a patient, wherein the plurality of second extraluminal images are obtained without a contrast agent within the body lumen; receive a plurality of intraluminal data points obtained by the intraluminal catheter or guidewire during the movement; determine, based on the plurality of second extraluminal images, a curve representative of at least one of a shape or a location of the body lumen; determine if the first extraluminal image was obtained without the contrast agent within the body lumen; in response to the determination that the first extraluminal image was obtained without the contrast agent within the body lumen: assign the curve to be a centerline of the body lumen in the first extraluminal image; co-register the plurality of intraluminal data points to positions along the curve; output, to a display in communication with the processor circuit, a first screen display comprising: the first extraluminal image; a visual representation of an intraluminal data point of the plurality of intraluminal data points; and a marking overlaid on the first extraluminal image at a corresponding position of the intraluminal data point.
In one aspect, in response to the determination that the first extraluminal image was obtained without the contrast agent within the body lumen, the processor circuit is configured to: output, to the display, a second screen display comprising: the first extraluminal image; and the curve overlaid on the first extraluminal image. In one aspect, the second screen display comprises a plurality of user input options to at least one of accept the centerline, correct the centerline, or draw a new centerline. In one aspect, when a user input option to correct the centerline is selected, the processor circuit is configured to receive a user input to identify a region of the curve and select a new location within the first extraluminal image corresponding to a corrected location of the region. In one aspect, the processor circuit is configured to perform the co-registration and output the first screen display only after receiving a user input via the plurality of user input options. In one aspect, the processor circuit is configured for communication with a touchscreen display, the processor circuit is configured to output the first screen display to the touchscreen display, and the processor circuit is configured to receive the user input from the touchscreen display. In one aspect, the extraluminal imaging device comprises an x-ray imaging device. In one aspect, the first extraluminal image is obtained with a first radiation dose and the plurality of second extraluminal images are obtained with a second radiation dose smaller than the first radiation dose. In one aspect, the processor circuit is configured to: receive a plurality of first extraluminal images obtained by the extraluminal imaging device; and select the first extraluminal image from among the plurality of first extraluminal images. In one aspect, the processor circuit is configured to determine if the first extraluminal image was obtained without the contrast agent automatically, without receiving a user input to identify that the first extraluminal image was obtained without the contrast agent. In one aspect, the plurality of second extraluminal images show a radiopaque portion of the intraluminal catheter or guidewire, and the processor circuit is configured to determine the curve based on the radiopaque portion shown in the plurality of second extraluminal images. In one aspect, the plurality of second extraluminal images are obtained during a plurality of anatomical cycles such that the intraluminal catheter or guidewire experiences periodic motion during the movement of the intraluminal catheter or guidewire through the body lumen, and to determine the curve, the processor circuit is configured to perform motion compensation. In one aspect, to perform the motion compensation, the processor circuit is further configured to locate the curve along a center of a shape generated by the movement of the intraluminal catheter or guidewire within the body lumen while the intraluminal catheter or guidewire experiences the periodic motion. In one aspect, the first extraluminal image is one of the plurality of second extraluminal images. In one aspect, the processor circuit is further configured to assign the curve to be a centerline of the body lumen in the first extraluminal image without identifying the body lumen in the first extraluminal image and without identifying the centerline in the first extraluminal image.
In an exemplary aspect, a method is provided. The method includes receiving, with a processor circuit in communication with an extraluminal imaging device, a first extraluminal image obtained by the extraluminal imaging device; receiving, with the processor circuit, a plurality of second extraluminal images obtained by the extraluminal imaging device during movement of an intraluminal catheter or guidewire within a body lumen of a patient, wherein the plurality of second extraluminal images are obtained without a contrast agent within the body lumen; receiving, with the processor circuit, a plurality of intraluminal data points obtained by the intraluminal catheter or guidewire during the movement through the body lumen, wherein the processor circuit is in communication with the intraluminal catheter or guidewire; determining, with the processor circuit, a curve representative of at least one of a shape or a location of the body lumen, based on the plurality of second extraluminal images; determining, with the processor circuit, if the first extraluminal image was obtained without the contrast agent within the body lumen; in response to the processor circuit determining that the first extraluminal image was obtained without the contrast agent within the body lumen (for example, the processor circuit, having made the determination that the first extraluminal image was obtained without the contrast agent, performs the following steps): assigning, with the processor circuit, the curve to be a centerline of the body lumen in the first extraluminal image without identifying the body lumen in the first extraluminal image and without identifying the centerline in the first extraluminal image; co-registering, with the processor circuit, the plurality of intraluminal data points to positions along the curve; outputting, to a display in communication with the processor circuit, a first screen display comprising: the first extraluminal image; a visual representation of an intraluminal data point of the plurality of intraluminal data points; and a marking overlaid on the first extraluminal image at a corresponding position of the intraluminal data point.
In an exemplary aspect, a system is provided. The system includes a processor circuit configured for communication with an extraluminal imaging device and an intraluminal catheter or guidewire, wherein the processor circuit is configured to: receive a first extraluminal image obtained by the extraluminal imaging device, wherein the first extraluminal image is obtained without a contrast agent within a body lumen of a patient; receive a plurality of second extraluminal images obtained by the extraluminal imaging device during movement of the intraluminal catheter or guidewire within the body lumen, wherein the plurality of second extraluminal images are obtained without the contrast agent within the body lumen; receive a plurality of intraluminal data points obtained by the intraluminal catheter or guidewire during the movement; co-register the plurality of intraluminal data points to the first extraluminal image based on the plurality of second extraluminal images such that the co-registration is performed without an extraluminal image obtained with contrast agent within the body lumen; output, to a display in communication with the processor circuit, a first screen display comprising: the first extraluminal image; a visual representation of an intraluminal data point of the plurality of intraluminal data points; and a marking overlaid on the first extraluminal image at a corresponding position of the intraluminal data point.
In an exemplary aspect, a system is provided. The system includes an intravascular imaging catheter; and a processor circuit configured for communication with an x-ray imaging device and the intravascular imaging catheter, wherein the processor circuit is configured to: receive a first x-ray image obtained by the x-ray imaging device; receive a plurality of second x-ray images obtained by the x-ray imaging device during movement of the intravascular imaging catheter within a blood vessel of a patient, wherein the plurality of second x-ray images are obtained without a contrast agent within the blood vessel; receive a plurality of intravascular images obtained by the intravascular imaging catheter during the movement; determine, based on the plurality of second x-ray images, a curve representative of at least one of a shape or a location of the blood vessel; determine if the first x-ray image was obtained without the contrast agent within the blood vessel; in response to the determination that the first x-ray image was obtained without the contrast agent within the blood vessel: assign the curve to be a centerline of the blood vessel in the first x-ray image without identifying the blood vessel in the first x-ray image and without identifying the centerline in the first x-ray image; co-register the plurality of intravascular images to positions along the curve; output, to a display in communication with the processor circuit, a first screen display comprising: the first x-ray image; an intravascular image of the plurality of intravascular images; and a marking overlaid on the first x-ray image at a corresponding position of the intravascular image.
Additional aspects, features, and advantages of the present disclosure will become apparent from the following detailed description.
Illustrative embodiments of the present disclosure will be described with reference to the accompanying drawings.
For the purposes of promoting an understanding of the principles of the present disclosure, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It is nevertheless understood that no limitation to the scope of the disclosure is intended. Any alterations and further modifications to the described devices, systems, and methods, and any further application of the principles of the present disclosure are fully contemplated and included within the present disclosure as would normally occur to one skilled in the art to which the disclosure relates. In particular, it is fully contemplated that the features, components, and/or steps described with respect to one embodiment may be combined with the features, components, and/or steps described with respect to other embodiments of the present disclosure. For the sake of brevity, however, the numerous iterations of these combinations will not be described separately.
Aspects of the present disclosure seek to optimize workflow, user interface, and algorithmic aspects associated with co-registration of intraluminal data and extraluminal images that does not use contrast.
The intraluminal imaging system 101 may be in communication with the extraluminal imaging system 151 through any suitable components. Such communication may be established through a wired cable, through a wireless signal, or by any other means. In addition, the intraluminal imaging system 101 may be in continuous communication with the x-ray system 151 or may be in intermittent communication. For example, the two systems may be brought into temporary communication via a wired cable, or brought into communication via a wireless communication, or through any other suitable means at some point before, after, or during an examination. In addition, the intraluminal system 101 may receive data such as x-ray images, annotated x-ray images, metrics calculated with the x-ray imaging system 151, information regarding dates and times of examinations, types and/or severity of patient conditions or diagnoses, patient history or other patient information, or any suitable data or information from the x-ray imaging system 151. The x-ray imaging system 151 may also receive any of these data from the intraluminal imaging system 101. In some embodiments, and as shown in
In some embodiments, the system 100 may not include a control system 130 in communication with the intraluminal imaging system 101 and the x-ray imaging system 151. Instead, the system 100 may include two separate control systems. For example, one control system may be in communication with or be a part of the intraluminal imaging system 101 and an additional separate control system may be in communication with or be a part of the x-ray imaging system 151. In this embodiment, the separate control systems of both the intraluminal imaging system 101 and the x-ray imaging system 151 may be similar to the control system 130. For example, each control system may include various components or systems such as a communication interface, processor, and/or a display. In this embodiment, the control system of the intraluminal imaging system 101 may perform any or all of the coregistration steps described in the present disclosure. Alternatively, the control system of the x-ray imaging system 151 may perform the coregistration steps described.
The intraluminal imaging system 101 can be an ultrasound imaging system. In some instances, the intraluminal imaging system 101 can be an intravascular ultrasound (IVUS) imaging system. The intraluminal imaging system 101 may include an intraluminal imaging device 102, such as a catheter, guide wire, or guide catheter, in communication with the control system 130. The control system 130 may include a display 132, a processor 134, and a communication interface 140, among other components. The intraluminal imaging device 102 can be an ultrasound imaging device. In some instances, the device 102 can be an IVUS imaging device, such as a solid-state IVUS device. In some aspects, a user input device and the display 132 can be integrated into one housing, or they may be separate devices.
At a high level, the IVUS device 102 emits ultrasonic energy from a transducer array 124 included in a scanner assembly, also referred to as an IVUS imaging assembly, mounted near a distal end of the catheter device. The ultrasonic energy is reflected by tissue structures in the surrounding medium, such as a vessel 120, or another body lumen surrounding the scanner assembly 110, and the ultrasound echo signals are received by the transducer array 124. In that regard, the device 102 can be sized, shaped, or otherwise configured to be positioned within the body lumen of a patient. The communication interface 140 transfers the received echo signals to the processor 134 of the control system 130 where the ultrasound image (including flow information in some embodiments) is reconstructed and displayed on the display 132. The control system 130, including the processor 134, can be operable to facilitate the features of the IVUS imaging system 101 described herein. For example, the processor 134 can execute computer readable instructions stored on the non-transitory tangible computer readable medium.
The communication interface 140 facilitates communication of signals between the control system 130 and the scanner assembly 110 included in the IVUS device 102. This communication includes the steps of: (1) providing commands to integrated circuit controller chip(s) included in the scanner assembly 110 to select the particular transducer array element(s), or acoustic element(s), to be used for transmit and receive, (2) providing the transmit trigger signals to the integrated circuit controller chip(s) included in the scanner assembly 110 to activate the transmitter circuitry to generate an electrical pulse to excite the selected transducer array element(s), and/or (3) accepting amplified echo signals received from the selected transducer array element(s) via amplifiers included on the integrated circuit controller chip(s) of the scanner assembly 110. In some embodiments, the communication interface 140 performs preliminary processing of the echo data prior to relaying the data to the processor 134. In examples of such embodiments, the communication interface 140 performs amplification, filtering, and/or aggregating of the data. In an embodiment, the communication interface 140 also supplies high- and low-voltage DC power to support operation of the device 102 including circuitry within the scanner assembly 110.
The processor 134 receives the echo data from the scanner assembly 110 by way of the communication interface 140 and processes the data to reconstruct an image of the tissue structures in the medium surrounding the scanner assembly 110. The processor 134 outputs image data such that an image of the lumen 120, such as a cross-sectional image of the vessel 120, is displayed on the display 132. The lumen 120 may represent fluid filled or surrounded structures, both natural and man-made. The lumen 120 may be within a body of a patient. The lumen 120 may be a blood vessel, such as an artery or a vein of a patient's vascular system, including cardiac vasculature, peripheral vasculature, neural vasculature, renal vasculature, and/or any other suitable lumen inside the body. For example, the device 102 may be used to examine any number of anatomical locations and tissue types, including without limitation, organs including the liver, heart, kidneys, gall bladder, pancreas, lungs; ducts; intestines; nervous system structures including the brain, dural sac, spinal cord and peripheral nerves; the urinary tract; as well as valves within the blood, chambers or other parts of the heart, and/or other systems of the body. In addition to natural structures, the device 102 may be used to examine man-made structures such as, but without limitation, heart valves, stents, shunts, filters and other devices.
In some embodiments, the IVUS device includes some features similar to traditional solid-state IVUS catheters, such as the EagleEye® catheter, Visions PV 0.014P RX catheter, Visions PV 0.018 catheter, Visions PV 0.035 catheter, and Pioneer Plus catheter, each of which is available from Koninklijke Philips N.V., and those disclosed in U.S. Pat. No. 7,846,101, which is hereby incorporated by reference in its entirety. For example, the IVUS device 102 includes the scanner assembly 110 near a distal end of the device 102 and a transmission line bundle 112 extending along the longitudinal body of the device 102. The transmission line bundle or cable 112 can include a plurality of conductors, including one, two, three, four, five, six, seven, or more conductors. It is understood that any suitable gauge wire can be used for the conductors. In an embodiment, the cable 112 can include a four-conductor transmission line arrangement with, e.g., 41 AWG gauge wires. In an embodiment, the cable 112 can include a seven-conductor transmission line arrangement utilizing, e.g., 44 AWG gauge wires. In some embodiments, 43 AWG gauge wires can be used.
The transmission line bundle 112 terminates in a patient interface module (PIM) connector 114 at a proximal end of the device 102. The PIM connector 114 electrically couples the transmission line bundle 112 to the communication interface 140 and physically couples the IVUS device 102 to the communication interface 140. In some embodiments, the communication interface 140 may be a PIM. In an embodiment, the IVUS device 102 further includes a guide wire exit port 116. Accordingly, in some instances the IVUS device 102 is a rapid-exchange catheter. The guide wire exit port 116 allows a guide wire 118 to be inserted towards the distal end to direct the device 102 through the vessel 120.
In some embodiments, the intraluminal imaging device 102 may acquire intravascular images of any suitable imaging modality, including optical coherence tomography (OCT) and intravascular photoacoustic (IVPA).
In some embodiments, the intraluminal device 102 is a pressure sensing device (e.g., pressure-sensing guidewire) that obtains intraluminal (e.g., intravascular) pressure data, and the intraluminal system 101 is an intravascular pressure sensing system that determines pressure ratios based on the pressure data, such as fractional flow reserve (FFR), instantaneous wave-free ratio (iFR), and/or other suitable ratio between distal pressure and proximal/aortic pressure (Pd/Pa). In some embodiments, the intraluminal device 102 is a flow sensing device (e.g., flow-sensing guidewire) that obtains intraluminal (e.g., intravascular) flow data, and the intraluminal system 101 is an intravascular flow sensing system that determines flow-related values based on the flow data, such as coronary flow reserve (CFR), flow velocity, flow volume, etc.
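As a simple, non-limiting illustration of the pressure ratio mentioned above, the Python sketch below computes a mean distal-to-aortic pressure ratio (Pd/Pa). The whole-recording averaging and the function name pressure_ratio are assumptions made for brevity; clinical FFR and iFR calculations apply specific gating over the cardiac cycle (e.g., the wave-free period for iFR).

```python
# Minimal sketch of a distal-to-aortic pressure ratio (Pd/Pa) computation.
# Whole-recording averaging is an illustrative simplification.
import numpy as np

def pressure_ratio(distal_pressure: np.ndarray, aortic_pressure: np.ndarray) -> float:
    """Ratio of mean distal pressure to mean proximal/aortic pressure."""
    return float(np.mean(distal_pressure) / np.mean(aortic_pressure))

pd_samples = np.array([68.0, 70.0, 69.0, 71.0])  # distal pressure samples (mmHg)
pa_samples = np.array([90.0, 92.0, 91.0, 93.0])  # aortic pressure samples (mmHg)
print(f"Pd/Pa = {pressure_ratio(pd_samples, pa_samples):.2f}")  # ~0.76
```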
The x-ray imaging system 151 may include an x-ray imaging apparatus or device 152 configured to perform x-ray imaging, angiography, fluoroscopy, radiography, or venography, among other imaging techniques. The x-ray imaging system 151 can generate a single x-ray image (e.g., an angiogram or venogram) or multiple (e.g., two or more) x-ray images (e.g., a video and/or fluoroscopic image stream) based on x-ray image data collected by the x-ray device 152. The x-ray imaging device 152 may be of any suitable type. For example, it may be a stationary x-ray system such as a fixed c-arm x-ray device, a mobile c-arm x-ray device, a straight arm x-ray device, or a u-arm device. The x-ray imaging device 152 may additionally be any suitable mobile device. The x-ray imaging device 152 may also be in communication with the control system 130. In some embodiments, the x-ray system 151 may include a digital radiography device or any other suitable device.
The x-ray device 152 as shown in
The x-ray source 160 may include an x-ray tube adapted to generate x-rays. Some aspects of the x-ray source 160 may include one or more vacuum tubes including a cathode in connection with a negative lead of a high-voltage power source and an anode in connection with a positive lead of the same power source. The cathode of the x-ray source 160 may additionally include a filament. The filament may be of any suitable type or constructed of any suitable material, including tungsten or rhenium tungsten, and may be positioned within a recessed region of the cathode. One function of the cathode may be to expel electrons from the high voltage power source and focus them into a well-defined beam aimed at the anode. The anode may also be constructed of any suitable material and may be configured to create x-radiation from the emitted electrons of the cathode. In addition, the anode may dissipate heat created in the process of generating x-radiation. The anode may be shaped as a beveled disk and, in some embodiments, may be rotated via an electric motor. The cathode and anode of the x-ray source 160 may be housed in an airtight enclosure, sometimes referred to as an envelope.
In some embodiments, the x-ray source 160 may include a radiation object focus which influences the visibility of an image. The radiation object focus may be selected by a user of the system 100 or by a manufacturer of the system 100 based on characteristics such as blurring, visibility, heat-dissipating capacity, or other characteristics. In some embodiments, an operator or user of the system 100 may switch between different provided radiation object foci in a point-of-care setting.
The detector 170 may be configured to acquire x-ray images and may include the input screen 174. The input screen 174 may include one or more intensifying screens configured to absorb x-ray energy and convert the energy to light. The light may in turn expose a film. The input screen 174 may be used to convert x-ray energy to light in embodiments in which the film may be more sensitive to light than x-radiation. Different types of intensifying screens within the image intensifier may be selected depending on the region of a patient to be imaged, requirements for image detail and/or patient exposure, or any other factors. Intensifying screens may be constructed of any suitable materials, including barium lead sulfate, barium strontium sulfate, barium fluorochloride, yttrium oxysulfide, or any other suitable material. The input screen 174 may be a fluorescent screen or a film positioned directly adjacent to a fluorescent screen. In some embodiments, the input screen 174 may also include a protective screen to shield circuitry or components within the detector 170 from the surrounding environment. In some embodiments, the x-ray detector 170 may include a flat panel detector (FPD). The detector 170 may be an indirect conversion FPD or a direct conversion FPD. The detector 170 may also include charge-coupled devices (CCDs). The x-ray detector 170 may additionally be referred to as an x-ray sensor.
The object 180 may be any suitable object to be imaged. In an exemplary embodiment, the object may be the anatomy of a patient. More specifically, the anatomy to be imaged may include the chest, abdomen, pelvic region, neck, legs, head, feet, a region with cardiac vasculature, or a region containing the peripheral vasculature of a patient, and may include various anatomical structures such as, but not limited to, organs, tissue, blood vessels and blood, gases, or any other anatomical structures or objects. In other embodiments, the object may be or include man-made structures.
In some embodiments, the x-ray imaging system 151 may be configured to obtain x-ray images without contrast. In some embodiments, the x-ray imaging system 151 may be configured to obtain x-ray images with contrast (e.g., angiogram or venogram). In such embodiments, a contrast agent or x-ray dye may be introduced to a patient's anatomy before imaging. The contrast agent may also be referred to as a radiocontrast agent, contrast material, contrast dye, or contrast media. The contrast dye may be of any suitable material, chemical, or compound and may be a liquid, powder, paste, tablet, or of any other suitable form. For example, the contrast dye may be iodine-based compounds, barium sulfate compounds, gadolinium-based compounds, or any other suitable compounds. The contrast agent may be used to enhance the visibility of internal fluids or structures within a patient's anatomy. The contrast agent may absorb external x-rays, resulting in decreased exposure on the x-ray detector 170.
In some embodiments, the extraluminal imaging system 151 could be any suitable extraluminal imaging system, such as a computed tomography (CT) system or a magnetic resonance imaging (MRI) system.
When the control system 130 is in communication with the x-ray system 151, the communication interface 140 facilitates communication of signals between the control system 130 and the x-ray device 152. This communication includes providing control commands to the x-ray source 160 and/or the x-ray detector 170 of the x-ray device 152 and receiving data from the x-ray device 152. In some embodiments, the communication interface 140 performs preliminary processing of the x-ray data prior to relaying the data to the processor 134. In examples of such embodiments, the communication interface 140 may perform amplification, filtering, and/or aggregating of the data. In an embodiment, the communication interface 140 also supplies high- and low-voltage DC power to support operation of the device 152 including circuitry within the device.
The processor 134 receives the x-ray data from the x-ray device 152 by way of the communication interface 140 and processes the data to reconstruct an image of the anatomy being imaged. The processor 134 outputs image data such that an image is displayed on the display 132. In an embodiment in which the contrast agent is introduced to the anatomy of a patient and a venogram is to be generated, the particular areas of interest to be imaged may be one or more blood vessels or other section or part of the human vasculature. The contrast agent may identify fluid filled structures, both natural and/or man-made, such as arteries or veins of a patient's vascular system, including cardiac vasculature, peripheral vasculature, neural vasculature, renal vasculature, and/or any other suitable lumen inside the body. For example, the x-ray device 152 may be used to examine any number of anatomical locations and tissue types, including without limitation all the organs, fluids, or other structures or parts of an anatomy previously mentioned. In addition to natural structures, the x-ray device 152 may be used to examine man-made structures such as any of the previously mentioned structures.
The processor 134 may be configured to receive an x-ray image that was stored by the x-ray imaging device 152 during a clinical procedure. The images may be further enhanced by other information such as patient history, patient record, IVUS imaging, pre-operative ultrasound imaging, pre-operative CT, or any other suitable data.
The flexible substrate 214, on which the transducer control logic dies 206 and the transducer elements 212 are mounted, provides structural support and interconnects for electrical coupling. The flexible substrate 214 may be constructed to include a film layer of a flexible polyimide material such as KAPTON™ (trademark of DuPont). Other suitable materials include polyester films, polyimide films, polyethylene napthalate films, or polyetherimide films, liquid crystal polymer, other flexible printed semiconductor substrates as well as products such as Upilex® (registered trademark of Ube Industries) and TEFLON® (registered trademark of E.I. du Pont). In the flat configuration illustrated in
The set of transducer control logic dies 206 is a non-limiting example of a control circuit. The transducer region 204 is disposed at a distal portion 221 of the flexible substrate 214. The control region 208 is disposed at a proximal portion 222 of the flexible substrate 214. The transition region 210 is disposed between the control region 208 and the transducer region 204. Dimensions of the transducer region 204, the control region 208, and the transition region 210 (e.g., lengths 225, 227, 229) can vary in different embodiments. In some embodiments, the lengths 225, 227, 229 can be substantially similar; the length 227 of the transition region 210 may be less than the lengths 225 and 229; or the length 227 of the transition region 210 can be greater than the lengths 225, 229 of the transducer region and the control region, respectively.
The control logic dies 206 are not necessarily homogenous. In some embodiments, a single controller is designated a master control logic die 206A and contains the communication interface for cable 112, between a processing system, e.g., processing system 106, and the flexible assembly 110. Accordingly, the master control circuit may include control logic that decodes control signals received over the cable 112, transmits control responses over the cable 112, amplifies echo signals, and/or transmits the echo signals over the cable 112. The remaining controllers are slave controllers 206B. The slave controllers 206B may include control logic that drives a plurality of transducer elements 512 positioned on a transducer element 212 to emit an ultrasonic signal and selects a transducer element 212 to receive an echo. In the depicted embodiment, the master controller 206A does not directly control any transducer elements 212. In other embodiments, the master controller 206A drives the same number of transducer elements 212 as the slave controllers 206B or drives a reduced set of transducer elements 212 as compared to the slave controllers 206B. In an exemplary embodiment, a single master controller 206A and eight slave controllers 206B are provided with eight transducers assigned to each slave controller 206B.
To electrically interconnect the control logic dies 206 and the transducer elements 212, in an embodiment, the flexible substrate 214 includes conductive traces 216 formed in the film layer that carry signals between the control logic dies 206 and the transducer elements 212. In particular, the conductive traces 216 providing communication between the control logic dies 206 and the transducer elements 212 extend along the flexible substrate 214 within the transition region 210. In some instances, the conductive traces 216 can also facilitate electrical communication between the master controller 206A and the slave controllers 206B. The conductive traces 216 can also provide a set of conductive pads that contact the conductors 218 of cable 112 when the conductors 218 of the cable 112 are mechanically and electrically coupled to the flexible substrate 214. Suitable materials for the conductive traces 216 include copper, gold, aluminum, silver, tantalum, nickel, and tin, and may be deposited on the flexible substrate 214 by processes such as sputtering, plating, and etching. In an embodiment, the flexible substrate 214 includes a chromium adhesion layer. The width and thickness of the conductive traces 216 are selected to provide proper conductivity and resilience when the flexible substrate 214 is rolled. In that regard, an exemplary range for the thickness of a conductive trace 216 and/or conductive pad is between 1-5 μm. For example, in an embodiment, 5 μm conductive traces 216 are separated by 5 μm of space. The width of a conductive trace 216 on the flexible substrate may be further determined by the width of the conductor 218 to be coupled to the trace or pad.
The flexible substrate 214 can include a conductor interface 220 in some embodiments. The conductor interface 220 can be in a location of the flexible substrate 214 where the conductors 218 of the cable 112 are coupled to the flexible substrate 214. For example, the bare conductors of the cable 112 are electrically coupled to the flexible substrate 214 at the conductor interface 220. The conductor interface 220 can be a tab extending from the main body of the flexible substrate 214. In that regard, the main body of the flexible substrate 214 can refer collectively to the transducer region 204, the controller region 208, and the transition region 210. In the illustrated embodiment, the conductor interface 220 extends from the proximal portion 222 of the flexible substrate 214. In other embodiments, the conductor interface 220 is positioned at other parts of the flexible substrate 214, such as the distal portion 221, or the flexible substrate 214 may lack the conductor interface 220. A value of a dimension of the tab or conductor interface 220, such as a width 224, can be less than the value of a dimension of the main body of the flexible substrate 214, such as a width 226. In some embodiments, the substrate forming the conductor interface 220 is made of the same material(s) and/or is similarly flexible as the flexible substrate 214. In other embodiments, the conductor interface 220 is made of different materials and/or is comparatively more rigid than the flexible substrate 214. For example, the conductor interface 220 can be made of a plastic, thermoplastic, polymer, hard polymer, etc., including polyoxymethylene (e.g., DELRIN®), polyether ether ketone (PEEK), nylon, Liquid Crystal Polymer (LCP), and/or other suitable materials.
Depending on the application and embodiment of the presently disclosed invention, transducer elements 212 may be piezoelectric transducers, single crystal transducers, or PZT (lead zirconate titanate) transducers. In other embodiments, the transducer elements of transducer array 124 may be flexural transducers, piezoelectric micromachined ultrasonic transducers (PMUTs), capacitive micromachined ultrasonic transducers (CMUTs), or any other suitable type of transducer element. In such embodiments, transducer elements 212 may comprise an elongate semiconductor material or other suitable material that allows micromachining or similar methods of disposing extremely small elements or circuitry on a substrate.
In some embodiments, the transducer elements 212 and the controllers 206 can be positioned in an annular configuration, such as a circular configuration or in a polygon configuration, around a longitudinal axis 250 of a support member 230. It is understood that the longitudinal axis 250 of the support member 230 may also be referred to as the longitudinal axis of the scanner assembly 110, the flexible elongate member 121, or the device 102. For example, a cross-sectional profile of the imaging assembly 110 at the transducer elements 212 and/or the controllers 206 can be a circle or a polygon. Any suitable annular polygon shape can be implemented, such as one based on the number of controllers or transducers, flexibility of the controllers or transducers, etc. Some examples may include a pentagon, hexagon, heptagon, octagon, nonagon, decagon, etc. In some examples, the transducer controllers 206 may be used for controlling the ultrasound transducers 512 of transducer elements 212 to obtain imaging data associated with the vessel 120.
The support member 230 can be referenced as a unibody in some instances. The support member 230 can be composed of a metallic material, such as stainless steel, or a non-metallic material, such as a plastic or polymer as described in U.S. Provisional Application No. 61/985,220, “Pre-Doped Solid Substrate for Intravascular Devices,” filed Apr. 28, 2014, the entirety of which is hereby incorporated by reference herein. In some embodiments, support member 230 may be composed of 303 stainless steel. The support member 230 can be a ferrule having a distal flange or portion 232 and a proximal flange or portion 234. The support member 230 can be tubular in shape and define a lumen 236 extending longitudinally therethrough. The lumen 236 can be sized and shaped to receive the guide wire 118. The support member 230 can be manufactured using any suitable process. For example, the support member 230 can be machined and/or electrochemically machined or laser milled, such as by removing material from a blank to shape the support member 230, or molded, such as by an injection molding process or a micro injection molding process.
Referring now to
Stands 242, 243, and 244 that extend vertically are provided at the distal, central, and proximal portions respectively, of the support member 230. The stands 242, 243, and 244 elevate and support the distal, central, and proximal portions of the flexible substrate 214. In that regard, portions of the flexible substrate 214, such as the transducer portion 204 (or transducer region 204), can be spaced from a central body portion of the support member 230 extending between the stands 242, 243, and 244. The stands 242, 243, 244 can have the same outer diameter or different outer diameters. For example, the distal stand 242 can have a larger or smaller outer diameter than the central stand 243 and/or proximal stand 244 and can also have special features for rotational alignment as well as control chip placement and connection.
To improve acoustic performance, the cavity between the transducer array 212 and the surface of the support member 230 may be filled with an acoustic backing material 246. The liquid backing material 246 can be introduced between the flexible substrate 214 and the support member 230 via passageway 235 in the stand 242, or through additional recesses as will be discussed in more detail hereafter. The backing material 246 may serve to attenuate ultrasound energy emitted by the transducer array 212 that propagates in the undesired, inward direction.
The cavity between the circuit controller chips 206 and the surface of the support member 230 may be filled with an underfill material 247. The underfill material 247 may be an adhesive material (e.g. an epoxy) which provides structural support for the circuit controller chips 206 and/or the flexible substrate 214. The underfill 247 may additionally be any suitable material.
In some embodiments, the central body portion of the support member can include recesses allowing fluid communication between the lumen of the unibody and the cavities between the flexible substrate 214 and the support member 230. Acoustic backing material 246 and/or underfill material 247 can be introduced via the cavities during an assembly process, prior to the inner member 256 extending through the lumen of the unibody. In some embodiments, suction can be applied via the passageways 235 of one of the stands 242, 244, or to any other suitable recess, while the liquid backing material 246 is fed between the flexible substrate 214 and the support member 230 via the passageways 235 of the other of the stands 242, 244, or any other suitable recess. The backing material can be cured to allow it to solidify and set. In various embodiments, the support member 230 includes more than three stands 242, 243, and 244, only one or two of the stands 242, 243, 244, or none of the stands. In that regard, the support member 230 can have an increased diameter distal portion 262 and/or increased diameter proximal portion 264 that is sized and shaped to elevate and support the distal and/or proximal portions of the flexible substrate 214.
The support member 230 can be substantially cylindrical in some embodiments. Other shapes of the support member 230 are also contemplated, including geometrical, non-geometrical, symmetrical, and non-symmetrical cross-sectional profiles. As the term is used herein, the shape of the support member 230 may reference a cross-sectional profile of the support member 230. Different portions of the support member 230 can be variously shaped in other embodiments. For example, the proximal portion 264 can have a larger outer diameter than the outer diameters of the distal portion 262 or a central portion extending between the distal and proximal portions 262, 264. In some embodiments, an inner diameter of the support member 230 (e.g., the diameter of the lumen 236) can correspondingly increase or decrease as the outer diameter changes. In other embodiments, the inner diameter of the support member 230 remains the same despite variations in the outer diameter.
A proximal inner member 256 and a proximal outer member 254 are coupled to the proximal portion 264 of the support member 230. The proximal inner member 256 and/or the proximal outer member 254 can comprise a flexible elongate member. The proximal inner member 256 can be received within a proximal flange 234. The proximal outer member 254 abuts and is in contact with the proximal end of flexible substrate 214. A distal tip member 252 is coupled to the distal portion 262 of the support member 230. For example, the distal member 252 is positioned around the distal flange 232. The tip member 252 can abut and be in contact with the distal end of flexible substrate 214 and the stand 242. In other embodiments, the proximal end of the tip member 252 may be received within the distal end of the flexible substrate 214 in its rolled configuration. In some embodiments there may be a gap between the flexible substrate 214 and the tip member 252. The distal member 252 can be the distal-most component of the intraluminal imaging device 102. The distal tip member 252 may be a flexible, polymeric component that defines the distal-most end of the imaging device 102. The distal tip member 252 may additionally define a lumen in communication with the lumen 236 defined by support member 230. The guide wire 118 may extend through lumen 236 as well as the lumen defined by the tip member 252.
One or more adhesives can be disposed between various components at the distal portion of the intraluminal imaging device 102. For example, one or more of the flexible substrate 214, the support member 230, the distal member 252, the proximal inner member 256, the transducer array 212, and/or the proximal outer member 254 can be coupled to one another via an adhesive. Stated differently, the adhesive can be in contact with e.g. the transducer array 212, the flexible substrate 214, the support member 230, the distal member 252, the proximal inner member 256, and/or the proximal outer member 254, among other components.
The processor 560 may include a CPU, a GPU, a DSP, an application-specific integrated circuit (ASIC), a controller, an FPGA, another hardware device, a firmware device, or any combination thereof configured to perform the operations described herein. The processor 560 may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The memory 564 may include a cache memory (e.g., a cache memory of the processor 560), random access memory (RAM), magnetoresistive RAM (MRAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read only memory (EPROM), electrically erasable programmable read only memory (EEPROM), flash memory, solid state memory device, hard disk drives, other forms of volatile and non-volatile memory, or a combination of different types of memory. In an embodiment, the memory 564 includes a non-transitory computer-readable medium. The memory 564 may store instructions 566. The instructions 566 may include instructions that, when executed by the processor 560, cause the processor 560 to perform the operations described herein with reference to the probe 110 and/or the host 130 (
The communication module 568 can include any electronic circuitry and/or logic circuitry to facilitate direct or indirect communication of data between the processor circuit 510, the probe 110, and/or the display 132. In that regard, the communication module 568 can be an input/output (I/O) device. In some instances, the communication module 568 facilitates direct or indirect communication between various elements of the processor circuit 510 and/or the probe 110 (
At a high level, a coregistration procedure involves performing an intravascular procedure and an extraluminal imaging procedure simultaneously. For example, a patient anatomy may be positioned within an imaging region of an extraluminal imaging device. The extraluminal imaging device may acquire extraluminal images of the patient. While extraluminal images are acquired, a physician may position an intraluminal device, such as an IVUS catheter, within a vessel of the patient within the view of the extraluminal imaging device. As the physician moves the IVUS catheter through the vessel, a radiopaque portion of the IVUS device may be observed within the x-ray images. In that regard, in each received x-ray image, the position of the IVUS device may be different as the device moves through the vessel. As the IVUS device moves, it may acquire IVUS images. Because the IVUS images and x-ray images are acquired simultaneously, the system may associate an IVUS image with the position of the IVUS device at that time as observed in an x-ray image. The many positions of the IVUS device during this procedure may be stored as a series of coordinates and may be used to determine a pathway of the device as it moved through the vessel. Each location along the generated pathway may then correspond to an IVUS image. In many coregistration procedures, this pathway is then overlaid over an additional x-ray image with contrast. To obtain this x-ray image, a physician may administer a contrast agent to the vasculature of the patient. This contrast agent may cause the vessels within the x-ray images to appear. Without a contrast agent introduced, a typical x-ray image may not show any blood vessels. The pathway, which is generated based on the locations of the IVUS device, may be overlaid on an x-ray image with contrast or, in some cases, aligned with a centerline based on the positions of a vessel within multiple x-ray images with contrast, to ensure that the IVUS pathway matches the correct vessel and that the coregistration of IVUS data to the x-ray image with contrast (e.g., an angiogram) is accurate.
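The following Python sketch illustrates, under simplifying assumptions, how simultaneously acquired IVUS frames might be paired with device positions observed in x-ray frames. The linear interpolation between x-ray timestamps and the function name positions_at are illustrative assumptions only; the disclosure requires only that each intravascular data point be associated with a device location observed during the simultaneous acquisition.

```python
# Hypothetical pairing of IVUS acquisition times with device positions
# observed in simultaneously acquired x-ray frames.
import numpy as np

def positions_at(ivus_times, xray_times, xray_positions):
    """Interpolate the (row, col) device position for each IVUS acquisition time."""
    rows = np.interp(ivus_times, xray_times, xray_positions[:, 0])
    cols = np.interp(ivus_times, xray_times, xray_positions[:, 1])
    return np.column_stack([rows, cols])

xray_times = np.array([0.0, 0.5, 1.0, 1.5])  # x-ray frame timestamps (s)
xray_positions = np.array([[120, 80], [110, 85], [100, 90], [90, 95]], dtype=float)
ivus_times = np.array([0.1, 0.6, 1.2])       # IVUS frame timestamps (s)
print(positions_at(ivus_times, xray_times, xray_positions))
```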
Although an intravascular ultrasound device has been described in the example above, the same principles may apply to any suitable intraluminal procedure. For example, other intraluminal data, such as physiology data including blood pressure data (e.g., FFR data, iFR data, or other pressure data) or blood flow data, may also be acquired and coregistered to an extraluminal image in the same way.
Some patients may be more sensitive to the contrast agents used to make blood vessels appear in x-ray images. In particular, patients with Chronic Kidney Disease (CKD) may be more susceptible to developing complications from the use of contrast agents. For example, patients may be exposed to a risk of Contrast Induced Nephropathy (CIN) if contrast agent is introduced to their vasculature. The present invention advantageously provides a way of performing a coregistration step without the use of a contrast agent for the roadmap image, or with the use of a much lower dose of contrast agent. This dramatically decreases patients' risk of complications related to contrast agent exposure. It may also allow patients to be released from coregistration procedures sooner, may reduce procedure time, and may lead to same-day discharge for more patients, even after the most complex interventions.
The extraluminal image 600 shown in
In some embodiments, the image 600 may alternatively be a low contrast image or an ultra-low contrast image. An ultra-low contrast image 600 may be an x-ray image obtained with less than 20 cc of contrast introduced to the vasculature. A low contrast image may be an image obtained with a greater quantity of contrast agent.
In some embodiments, the processor circuit 510 may receive and store in a memory in communication with the circuit 510 an angle 690 and a zoom setting of the extraluminal imaging device. The angle 690 may correspond to an angle of a c-arm relative to the patient anatomy when the image 600 was acquired. The zoom setting may correspond to the amount of zoom the extraluminal imaging system used, if any, while acquiring the image 600. This information may be provided to a user or used by the processor circuit 510 in subsequent procedures. Once a zero-contrast angiogram is identified, the co-registration workflow, user display, and calculations will be optimized for this scenario according to elements of the invention described herein. For example, a processor circuit (e.g., the processor circuit 510 of
The extraluminal image 600 may be any suitable type of extraluminal image. For example, the image 600 may be a cine image obtained without contrast or a fluoroscopy image obtained without contrast. In some embodiments, a cine image may correspond to an x-ray image obtained with a relatively higher dose of radiation or a relatively higher frame rate, and thus may be an image with relatively higher resolution. In some embodiments, a fluoroscopy image is an x-ray image obtained with a relatively lower dose of radiation or a relatively lower frame rate and may be an image with relatively lower resolution.
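Purely as an illustrative assumption (the disclosure does not prescribe a particular detection method), the Python sketch below shows one simple way a processor circuit might verify that a candidate roadmap frame matches the acquisition geometry (c-arm angle and zoom) of the device-tracking sequence and heuristically flag whether the frame appears to have been obtained without contrast. The XrayFrame structure, the dark-pixel heuristic, and the tolerance values are hypothetical.

```python
# Illustrative-only geometry check and contrast heuristic for a roadmap frame.
from dataclasses import dataclass
import numpy as np

@dataclass
class XrayFrame:
    pixels: np.ndarray
    c_arm_angle_deg: float
    zoom: float

def geometry_matches(frame: XrayFrame, ref_angle: float, ref_zoom: float,
                     angle_tol: float = 1.0, zoom_tol: float = 0.01) -> bool:
    """True if the frame's c-arm angle and zoom match the tracking sequence."""
    return (abs(frame.c_arm_angle_deg - ref_angle) <= angle_tol
            and abs(frame.zoom - ref_zoom) <= zoom_tol)

def appears_contrast_free(frame: XrayFrame, darkness_threshold: float = 0.35) -> bool:
    # Contrast-filled vessels attenuate x-rays and darken part of the field of
    # view; a small fraction of dark pixels is taken here to suggest no contrast.
    dark_fraction = float(np.mean(frame.pixels < darkness_threshold))
    return dark_fraction < 0.05

frame = XrayFrame(pixels=np.full((64, 64), 0.8), c_arm_angle_deg=30.0, zoom=1.0)
print(geometry_matches(frame, ref_angle=30.0, ref_zoom=1.0))  # True
print(appears_contrast_free(frame))                           # True
```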
It is noted that the pathway 610 shown in
All of the coordinates of the positions of the device 720 in all of the received images 710 may create a set of locations 740. The image 700 shown in
As shown by the arrow 762, positions observed within the images 710 may be identified or associated with the set of locations 740 in the image 700. In some embodiments, the image 700 is a composite image showing the locations of the radiopaque portions from a plurality (e.g., some, all, or substantially all) of the images 710. In this way, data collected at various locations of the device 720 by the device 720, such as IVUS images or physiology data, may be associated with corresponding locations in the set of locations 740. As an example, the location 730 shown in the image 710 may also be identified in the image 700. Aspects of determining and displaying the pathway that the device travels through the vessel are described in U.S. application Ser. No. 15/630,482, filed Jun. 22, 2017, and titled, “Estimating the endoluminal path of an endoluminal device along a Lumen,” which is hereby incorporated by reference in its entirety.
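As a minimal sketch of how the device locations observed across the images 710 could be accumulated into a set of locations such as the set 740, the following Python example marks each observed (row, column) coordinate in a composite mask. The variable names and the mask representation are illustrative assumptions rather than the particular data structures of the disclosure.

```python
# Accumulate radiopaque-device locations from many zero-contrast frames
# into a single composite mask (analogous to the set of locations 740).
import numpy as np

def composite_mask(tip_coordinates, image_shape):
    """Mark every (row, col) at which the device was observed across all frames."""
    mask = np.zeros(image_shape, dtype=bool)
    for row, col in tip_coordinates:
        mask[row, col] = True
    return mask

coords = [(10, 30), (12, 31), (15, 31), (18, 32)]  # device positions from 4 frames
mask = composite_mask(coords, image_shape=(64, 64))
print(int(mask.sum()))  # 4 marked locations
```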
In some embodiments, the processor circuit 510 may be configured to calculate a width of the shape 840 at all locations of the shape 840. For example, beginning at a distal position 950 of the shape 840, the processor circuit may determine a width in a perpendicular direction 942 at each location along the shape 840 to a proximal location 960. The calculated footprint line 940 may then be determined based on these width measurements along the length of the shape 840. For example, at each location along the shape 840, the processor circuit may determine a width and position the calculated footprint line 940 at a distance of half of this width from either of the outer edges of the shape 840. This calculated footprint line 940 may represent the pathway that the intravascular device traveled through the blood vessel, corrected for motion. In that regard, the calculated footprint line 940 may be an illustration of the movement of the intravascular device through the patient anatomy if the patient anatomy had remained stationary during the imaging procedure. Because the intravascular device is inside the blood vessel as it moves through the blood vessel, the calculated footprint line 940 may also be a representation of the shape and position of the imaged vessel. Thus, the shape and position of the imaged vessel may be calculated without using contrast in the x-ray frames. In the case of a no-contrast angiogram, the algorithm of the present disclosure may map the estimated luminal path (e.g., the calculated footprint line 940) to a non-visible vessel contour. This may require the calculated footprint line to conform to a non-visible centerline of the blood vessel imaged by the IVUS device. As a result, the presumed mapping of the calculated footprint line to a vessel centerline may introduce inaccuracies which are corrected according to principles of the disclosure described herein.
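By way of a non-limiting illustration only, the half-width construction described above could be sketched in Python roughly as follows. The helper name footprint_line_from_shape and the representation of the shape 840 as a two-dimensional boolean mask are assumptions made for the sketch, which, for simplicity, measures width along image rows rather than strictly along the perpendicular direction 942.

import numpy as np

def footprint_line_from_shape(shape_mask: np.ndarray) -> list:
    # Illustrative midline extraction: for each image row that the shape
    # crosses, place a point halfway between the shape's outer edges.
    # shape_mask is a 2-D boolean array (True where the swept device
    # footprint, e.g. the shape 840, is present).
    midpoints = []
    for row in range(shape_mask.shape[0]):
        cols = np.flatnonzero(shape_mask[row])
        if cols.size == 0:
            continue  # the shape does not cross this row
        left, right = cols[0], cols[-1]
        width = right - left  # local width of the shape at this row
        midpoints.append((left + width / 2.0, row))  # half the width from either edge
    return midpoints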
In some embodiments, the calculated footprint line 940 may be defined by multiple pixel coordinates within an image 800. For example, the image 800 may include or be made of multiple pixels. As an illustration of these pixels, the image 800 may be divided into multiple boxes 801. Each box 801 may correspond to a pixel of the image. The number of boxes 801 representing pixels of the image 800 as shown in
In some embodiments, the processor circuit 510 may be configured to receive an additional extraluminal image 1100. The extraluminal image 1100 may be any suitable extraluminal image. For example, the extraluminal image 1100 may be an x-ray image. In one example, the extraluminal image 1100 may be an x-ray image obtained without contrast, such as a fluoroscopy image or a cine image. In some embodiments, the x-ray image 1100 may be the same size as the image 800. For example, the image 1100 may contain the same number of pixels in the same arrangement and of the same resolution as the pixels of the image 800. In addition, the angle and zoom of the extraluminal imaging system used to acquire the image 1100 may match the angle and zoom used to acquire the images 710. This same angle may be denoted by the angle 690 shown adjacent to the image 1100. In some embodiments, the image 800 containing the calculated footprint line 940 may correspond to the same angle 690 and zoom settings of the images 710 from which it was derived as well as the image 1100. As a result, there may be a one-to-one correspondence of pixels 801 within the image 800 to pixels 1101 in the image 1100. In that regard, a location within a patient anatomy represented by a single pixel 801 within the image 800 may also be represented by a corresponding pixel 1101 in
As an example of the correspondence between pixels 801 of the image 800 and pixels 1101 of the image 1100, a location 1030 is shown within each image 800 and 1100. This location 1030 may be a location along the pathway 940 of image 800 and along the calculated centerline 1140 of the image 1100. In the image 800, this location 1030 may be identified by, or correspond to, the pixel 801(a). The same location 1030 may be identified by, or correspond to, the pixel 1101(a). This relationship between the image 800 and the calculated footprint line 940, and the image 1100 and the calculated centerline 1140 may be signified by the arrow 1060 shown in
The extraluminal image 1210 may be an additional extraluminal image similar to the image 1100 described with reference to
As shown in
According to some aspects of the present disclosure, a user of the system may confirm the shape of the calculated footprint line 1240 based on a number of references. For example, a user of the system may verify that the calculated footprint line 1240 accurately resembles the expected shape of the vessel by comparison to a contrast-filled angiogram of the same patient anatomy. For example, in some embodiments, during a previous procedure, a contrast agent may have been introduced into the patient vasculature. For example, a contrast agent may have been introduced into the patient vasculature in combination with the positioning of an initial guidewire, such as a workhorse guidewire, or a guidewire of an IVUS imaging device or other intraluminal device. In some embodiments, the contrast agent introduced may have been a low dose or ultra-low dose. In some embodiments, a low dose or ultra-low dose may correspond to a dose of 5 mL or 5 cc of contrast agent of any of the materials listed previously. In some embodiments, an extraluminal image acquired by the extraluminal imaging device 151 while contrast is present within the patient vasculature may be stored by the processor circuit 510 in a memory in communication with the processor circuit 510. When the processor circuit 510 prompts the user to confirm the shape and position of the calculated footprint line 1240, as shown in
In another embodiment, the user of the system 100 may confirm the shape of the calculated footprint line 1240 by comparing it to the observed path of the intravascular device during the intraluminal procedure used for coregistration described with reference to
In another embodiment, the user of the system 100 may confirm the shape and position of the calculated footprint line 1240 by referencing anatomical or other landmarks within the image 1210 that were observed previously. For example, the user may observe anatomical landmarks including various bone structures, abnormalities in bone structures or other anatomies of the patient, or any other anatomical landmarks during an initial imaging stage (e.g., the intraluminal imaging phase described with reference to
In some embodiments, the user of the system 100 may confirm the shape and position of the calculated footprint line 1240 by comparing the calculated footprint line 1240 to a no-contrast extraluminal image of the patient anatomy obtained while multiple guidewires are positioned within one or more vessels of the patient. In that regard, the radiopaque portions of the multiple guidewires highlight the vessel profile. In some embodiments, this image of the patient anatomy obtained with multiple guidewires within the anatomy may be obtained during the same imaging procedure as the procedure obtaining the multiple IVUS images and/or extraluminal images described herein. In some embodiments, the image may have been obtained during a previous procedure and may be retrieved from a memory.
In some embodiments, the processor circuit 510 may be configured to display various prompts to the user. For example, a prompt 1290 may direct a user to confirm the shape and position of the calculated footprint line 1240 by, for example, selecting a button 1280. The prompt 1290 may additionally convey that a user may edit the calculated footprint line 1240 by clicking on the calculated footprint line 1240 within the image 1210, as will be described in more detail with reference to
An indicator 1230 may be provided in the screen display 1200 identifying for the user that the x-ray image is obtained without contrast, and thus the user is confirming a zero-contrast roadmap. The co-registration results screen (e.g., the interface 1200, or other interfaces described in the figures hereafter) is labelled as zero contrast, as demonstrated by the indicator 1230. The labelling of the display 1200 as zero contrast, and the associated workflow, will be clearly evident to any observer.
According to another aspect of the present disclosure, the calculated footprint line 1240 initially displayed to the user may differ from a calculated footprint line derived from a contrast-based angiogram and may more closely match the intended roadmap, thus requiring less user editing. This is because, for example, the algorithm for calculating and displaying the calculated footprint line does not require obtaining or identifying a contrast-filled vessel, as also described in EP 3474750, incorporated by reference previously.
In the example shown in
As shown in
As shown in the graphical user interface 1300, the processor circuit 510 may be configured to provide, within the display, various user-selectable tools for editing the shape and/or location of the calculated footprint line 1340. For example, as shown in
In some embodiments, this modifying of the shape and position of the calculated footprint line 1340 may include interpolation between anchors, such as an anchor 1304 and/or other anchors along the calculated footprint line 1340, defining the calculated footprint line 1340. The anchor 1304 may or may not be displayed to a user. In some embodiments, the interpolation may include a local interpolation. For example, only anchor points or regions of the calculated footprint line 1340 close in proximity to the moved anchor 1304 may be adjusted, while regions of the calculated footprint line 1340 far from the anchor 1304 may remain unchanged. In some embodiments, the indicator 1302 may define a region of proximity around the anchor 1304. Sections of the calculated footprint line 1340 within the region defined by the indicator 1302 may be modified, while sections outside the indicator 1302 may remain unchanged. In some embodiments, the user of the system 100 may adjust various settings or aspects of the interpolation algorithm, including, for example, the size and shape of the indicator 1302.
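As a non-limiting sketch of one possible local interpolation, the following Python fragment adjusts only the portion of the pathway near a dragged anchor. The function name locally_adjust_pathway, the polyline representation of the calculated footprint line 1340, and the linear falloff weighting are assumptions made for illustration rather than the disclosed algorithm.

import numpy as np

def locally_adjust_pathway(points, anchor_idx, delta, radius):
    # points: (N, 2) polyline for the calculated footprint line 1340
    # anchor_idx: index of the dragged anchor (e.g., the anchor 1304)
    # delta: the user's drag vector; radius: region of proximity (indicator 1302)
    adjusted = np.array(points, dtype=float)
    dists = np.linalg.norm(adjusted - adjusted[anchor_idx], axis=1)
    inside = dists < radius
    # Linear falloff: the anchor moves by the full delta, nearby points move
    # proportionally less, and points outside the radius do not move at all.
    weights = 1.0 - dists[inside] / radius
    adjusted[inside] += weights[:, None] * np.asarray(delta, dtype=float)
    return adjusted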
Some aspects of modifying the shape and position of the calculated footprint line 1340 may include features similar to those described in U.S. Provisional Application No. 63/187,964, titled, "PATHWAY MODIFICATION FOR COREGISTRATION OF EXTRALUMINAL IMAGE AND INTRALUMINAL DATA" and filed May 13, 2021 (International Publication No. WO 2022/238276), which is hereby incorporated by reference in its entirety.
In some embodiments, after the calculated footprint line 1340 has been modified to a user's satisfaction, the processor circuit 510 may receive an input indicating that the pathway is confirmed and the system may exit a pathway modification mode.
After a pathway, including any of those previously mentioned and described, is confirmed and/or modified by the user, the processor circuit 510 may be configured to coregister any intraluminal data to the pathway. For example, as explained with reference to
As an example, the graphical user interface 1400 provides an x-ray image 1410, an IVUS image 1430, physiology data 1490, and a longitudinal view 1450 of the imaged vessel. The x-ray image 1410 may include a depiction of a calculated footprint line 1440. The calculated footprint line 1440 may be similar to the centerline 1140 of
As an example, iFR data 1490 may be coregistered to the calculated footprint line 1440. For example, iFR data may be received by the processor circuit 510 during an iFR pullback while also receiving extraluminal images (e.g., the image 710 of
Also shown within the graphical user interface 1400 is the IVUS image 1430. In that regard, a plurality of IVUS images (including image 1430) can be co-registered to the calculated footprint line 1440. The IVUS image 1430 may be an IVUS image obtained at the location identified by the indicator 1422. In some aspects, the indicator 1422 may also be referred to as a marking. The IVUS image 1430 may alternatively be an IVUS image obtained at the location identified by the indicator 1494. In some embodiments, the IVUS image 1430 may include a border 1432. This border may be identified automatically by the processor circuit 510 or may be identified by a user of the system. In some embodiments, the border 1432 may be a lumen border, a vessel border, a stent border, or any other border within the image.
Examples of border detection, image processing, image analysis, and/or pattern recognition include U.S. Pat. No. 6,200,268 entitled “VASCULAR PLAQUE CHARACTERIZATION” issued Mar. 13, 2001 with D. Geoffrey Vince, Barry D. Kuban and Anuja Nair as inventors, U.S. Pat. No. 6,381,350 entitled “INTRAVASCULAR ULTRASONIC ANALYSIS USING ACTIVE CONTOUR METHOD AND SYSTEM” issued Apr. 30, 2002 with Jon D. Klingensmith, D. Geoffrey Vince and Raj Shekhar as inventors, U.S. Pat. No. 7,074,188 entitled “SYSTEM AND METHOD OF CHARACTERIZING VASCULAR TISSUE” issued Jul. 11, 2006 with Anuja Nair, D. Geoffrey Vince, Jon D. Klingensmith and Barry D. Kuban as inventors, U.S. Pat. No. 7,175,597 entitled “NON-INVASIVE TISSUE CHARACTERIZATION SYSTEM AND METHOD” issued Feb. 13, 2007 with D. Geoffrey Vince, Anuja Nair and Jon D. Klingensmith as inventors, U.S. Pat. No. 7,215,802 entitled “SYSTEM AND METHOD FOR VASCULAR BORDER DETECTION” issued May 8, 2007 with Jon D. Klingensmith, Anuja Nair, Barry D. Kuban and D. Geoffrey Vince as inventors, U.S. Pat. No. 7,359,554 entitled “SYSTEM AND METHOD FOR IDENTIFYING A VASCULAR BORDER” issued Apr. 15, 2008 with Jon D. Klingensmith, D. Geoffrey Vince, Anuja Nair and Barry D. Kuban as inventors and U.S. Pat. No. 7,463,759 entitled “SYSTEM AND METHOD FOR VASCULAR BORDER DETECTION” issued Dec. 9, 2008 with Jon D. Klingensmith, Anuja Nair, Barry D. Kuban and D. Geoffrey Vince, as inventors, the teachings of which are hereby incorporated by reference herein in their entirety.
Additionally depicted in the interface 1400 are metrics 1434. The metrics 1434 may relate to the IVUS image 1430 shown and specifically the border 1432. For example, the processor circuit 510 may automatically calculate various metrics 1434 related to the border 1432. For example, the processor circuit 510 may identify a cross-sectional area of the border 1432. The circuit may also identify a minimum diameter of the border, a maximum diameter of the border, or any other measurements or metrics related to the border 1432, or other aspects of the image 1430.
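For illustration only, metrics of this kind could be computed from a border represented as a closed contour of points, as in the following Python sketch. The function name border_metrics, the contour representation, and the projection-based diameter estimate are assumptions of the sketch and not the specific measurements of the disclosure.

import numpy as np

def border_metrics(border, pixel_size_mm=1.0):
    # border: (N, 2) array of contour points (pixel coordinates) for a
    # closed border such as the border 1432.
    pts = np.asarray(border, dtype=float) * pixel_size_mm
    x, y = pts[:, 0], pts[:, 1]
    # Shoelace formula gives the cross-sectional area of the closed contour.
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    # Approximate minimum/maximum diameters as the smallest and largest
    # extent of the contour projected onto a set of directions.
    angles = np.linspace(0.0, np.pi, 180, endpoint=False)
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    proj = pts @ dirs.T
    extents = proj.max(axis=0) - proj.min(axis=0)
    return {"area_mm2": float(area),
            "min_diameter_mm": float(extents.min()),
            "max_diameter_mm": float(extents.max())}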
In some embodiments, the longitudinal view 1450 may also be displayed. The longitudinal image 1450 may be referred to as an in-line digital (ILD) display or intravascular longitudinal display (ILD) 1450. The IVUS images acquired during an intravascular ultrasound imaging procedure, such as during an IVUS pullback, may be used to create the ILD 1450. In that regard, an IVUS image is a tomographic or radial cross-sectional view of the blood vessel. The ILD 1450 provides a longitudinal cross-sectional view of the blood vessel. The ILD 1450 can be a stack of the IVUS images acquired at various positions along the vessel, such that the longitudinal view of the ILD 1450 is perpendicular to the radial cross-sectional view of the IVUS images. In such an embodiment, the ILD 1450 may show the length of the vessel, whereas an individual IVUS image is a single radial cross-sectional image at a given location along the length. In some embodiments, the ILD 1450 may illustrate a time at which IVUS images were obtained, and the position of aspects of the ILD 1450 may correspond to time-stamps of the IVUS images. In another embodiment, the ILD 1450 may be a stack of the IVUS images acquired over time during the imaging procedure, and the length of the ILD 1450 may represent the time or duration of the imaging procedure. The ILD 1450 may be generated and displayed in real time or near real time during the pullback procedure. As each additional IVUS image is acquired, it may be added to the ILD 1450. For example, at a point in time during the pullback procedure, the ILD 1450 shown in
The ILD 1450 may include a depiction of iFR data 1492, various length measurements 1462, indicators 1452 and 1456 identifying the beginning and ending of a length measurement, and bookmark identifiers 1454. Aspects of providing physiology data (e.g., pressure ratio data such as the iFR data 1492) on the ILD 1450 are described in U.S. Provisional Application No. 63/288,553, filed Dec. 11, 2021, and titled "REGISTRATION OF INTRALUMINAL PHYSIOLOGICAL DATA TO LONGITUDINAL IMAGE OF BODY LUMEN USING EXTRALUMINAL IMAGING DATA", which is incorporated by reference herein in its entirety.
In some embodiments, the iFR data 1492 may be the same iFR data used to populate the physiology data 1490 described previously. Because the ILD 1450 is generated based on IVUS data, if two intraluminal data sets (e.g., IVUS data and physiology data) are acquired and coregistered to the same pathway (e.g., the pathway 1440), the IVUS data and physiology data are also coregistered to each other, as shown by the iFR data 1492 displayed at locations along the ILD 1450.
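Purely as an illustration of this relationship, two data sets that are each mapped to locations along the same pathway can be joined on those locations, as in the short Python sketch below; the dictionary representation and the function name coregister_datasets are hypothetical choices assumed for the example.

def coregister_datasets(ivus_frames, physiology_samples):
    # Both inputs are dicts mapping a location index along the shared
    # pathway (e.g., the pathway 1440) to a measurement.  Because each data
    # set is already coregistered to the same pathway, matching the location
    # indices coregisters the data sets to each other.
    shared = sorted(set(ivus_frames) & set(physiology_samples))
    return [(loc, ivus_frames[loc], physiology_samples[loc]) for loc in shared]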
The length measurements along the ILD 1450 may be generated by a user of the system 100 and/or automatically by the processor circuit 510. For example, a user may select various locations along the ILD 1450 and the processor circuit may calculate length measurements corresponding to the selected locations. These various length measurements may also be displayed as metrics 1460 near the ILD 1450. In some embodiments, length measurements may be distinguished from one another by labels, colors, patterns, highlights, or other visual characteristics.
The indicators 1452 and 1456 may be user-selected locations along the ILD 1450. In some embodiments, they may be automatically selected. As an example, the indicators 1452 and 1456 may identify the beginning and ending locations of a length measurement. In some embodiments, the indicators 1452 and 1456 correspond to distal and proximal landing zones for a stent that is being considered by a physician. The iFR estimate value in the physiology data 1490 may be a predicted iFR value with a proposed stent positioned within the vessel based on the indicators 1452 and 1456. In some embodiments, corresponding indicators may be displayed at corresponding locations along the calculated footprint line 1440 of the image 1410.
In some embodiments, one or more bookmarks 1454 may also be included along the ILD 1450. These bookmarks 1454 may correspond to similar bookmarks at corresponding locations along the calculated footprint line 1440 of the image 1410.
An indicator 1470 is provided in the screen display 1400, overlaid on the x-ray image 1410. The indicator 1470 identifies for the user that the x-ray image is a zero contrast image frame.
In some embodiments, the processor circuit 510 may initiate the steps of coregistering intraluminal data to an extraluminal image without contrast, as has been described, in response to a user input selecting an extraluminal image without contrast or in response to automatically detecting an extraluminal image without contrast. For example, as shown in
In an embodiment in which the processor circuit 510 automatically determines whether a contrast-filled angiogram or contrast-free fluoroscopy image or cine image is presented, the processor circuit 510 may receive an extraluminal image either from an extraluminal imaging system during a procedure or from a memory in communication with the processor circuit 510. In such an embodiment, the processor circuit 510 may employ any suitable image processing and/or machine learning techniques, including any of those listed in the present disclosure, to determine whether the received image is an angiogram image or a contrast-free image. If an angiogram was received, the steps of coregistration to an angiogram image may be commenced. If a contrast-free extraluminal image was received, the steps described herein may be initiated.
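The disclosure contemplates image processing and/or machine learning techniques for this determination. As a deliberately simplistic, hypothetical stand-in for such a technique, the following Python sketch flags a frame as contrast-filled when an unusually large fraction of its pixels is very dark; the function name and the threshold values are illustrative assumptions rather than values from the disclosure.

import numpy as np

def appears_contrast_filled(xray, dark_level=0.25, dark_fraction_threshold=0.08):
    # xray: 2-D grayscale image normalized to [0, 1].  A contrast-filled
    # angiogram tends to contain a larger fraction of very dark (radiodense)
    # pixels than a contrast-free fluoroscopy or cine frame.
    dark_fraction = float(np.mean(np.asarray(xray) < dark_level))
    return dark_fraction > dark_fraction_threshold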
In some embodiments, the processor circuit 510 may be configured to display prompts, such as the prompts 1530 to guide a user at this stage of the procedure. For example, by displaying the prompts 1530, the processor circuit 510 may guide a user to select an existing angiogram image, fluoroscopy image, or cine image, and/or acquire an additional image by following the prompts 1530.
At step 1605, the method 1600 includes receiving a first plurality of extraluminal images obtained by an extraluminal imaging device. The extraluminal imaging device may be a device of the extraluminal imaging system 151 shown and described with reference to
At step 1610, the method 1600 includes receiving a second plurality of extraluminal images obtained by the extraluminal imaging device during movement of an intraluminal catheter or guidewire within a body lumen of the patient. In some aspects, the intraluminal catheter may be the intraluminal device 102 shown and described with reference to
At step 1615, the method 1600 includes receiving intraluminal data points obtained by the intraluminal catheter or guidewire during movement. The intraluminal data points may be of any suitable type, including IVUS data, OCT data, intravascular pressure data, intravascular flow data, or any other data. In addition, the intraluminal data points are acquired simultaneously with the second plurality of extraluminal images.
At step 1620, the method 1600 includes determining a curve representative of at least one of a shape or a location of the body lumen based on the second plurality of extraluminal images. In some aspects, this curve may be referred to as a footprint line (FPL) and may be an approximation of the path of the intraluminal device through the body lumen if there were no motion in the patient anatomy. The calculated footprint line may be a coarse, smoothed representation of the body lumen, or an average location of the body lumen, which is subject to periodic motion as described above. It is based, for example, on analysis of pullback images (e.g., the second plurality of extraluminal images) and the detection of the intraluminal device (e.g., the radio-opaque markers, such as a guidewire (GW) opaque tip or guiding catheter (GC)). The curve may also be referred to as a line, a pathway, a centerline, a roadmap, or any other term.
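As a non-limiting sketch of how such a curve could be derived from the detected radiopaque marker positions, the following Python fragment orders the detections along the pullback direction and smooths them with a moving average. The helper name calculated_footprint_line, the SVD-based ordering, and the smoothing window are assumptions made for illustration.

import numpy as np

def calculated_footprint_line(marker_positions, window=9):
    # marker_positions: (N, 2) array of radiopaque-marker coordinates
    # detected across the pullback frames (e.g., the set of locations 740).
    pts = np.asarray(marker_positions, dtype=float)
    # Order the detections along the dominant direction of the pullback.
    centered = pts - pts.mean(axis=0)
    principal = np.linalg.svd(centered, full_matrices=False)[2][0]
    ordered = pts[np.argsort(centered @ principal)]
    # Moving-average smoothing suppresses periodic cardiac/respiratory motion,
    # yielding a coarse, motion-averaged curve.
    kernel = np.ones(window) / window
    return np.stack([np.convolve(ordered[:, i], kernel, mode="valid")
                     for i in range(2)], axis=1)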
At step 1625, the method 1600 includes determining whether the first plurality of extraluminal images were obtained with contrast. In some aspects, the processor circuit of the system 100 may analyze one or more extraluminal images of the first plurality of extraluminal images to determine whether they were obtained with or without contrast. For example, a machine learning algorithm, such as a neural network or any other deep learning network, may be implemented to automatically identify whether an extraluminal image was obtained with or without contrast. As shown in
At step 1630, the method 1600 includes identifying an extraluminal image of the first plurality of extraluminal images based on the curve. This extraluminal image may be selected automatically. For example, a processor circuit may extract the centerline of the imaged body lumen and compare it to the curve. In some aspects, the processor circuit may compare multiple positions of the curve with corresponding positions of the centerlines identified in each extraluminal image of the first plurality of extraluminal images. For example, for one extraluminal image, a proximal position of the centerline may be compared with the proximal position of the curve. This comparison may result in a distance between the two positions, for example, in units of pixels, or any other unit. This comparison may be performed for each point along the centerline and corresponding curve (e.g., the centerline and curve may be compared at a regular interval of distance, or the centerline and curve may be divided into an equal number of sections and comparisons may be made for each section). After position comparisons are performed, resulting in a number of distance values, these values may be averaged, summed, or otherwise combined to determine an overall comparison value for the extraluminal image analyzed. In that regard, the processor circuit may select an extraluminal image which has the ideal comparison value (e.g., lowest, closest to a reference value, highest, etc.) indicating that the shape of the body lumen within that extraluminal image aligns most closely with the curve generated at step 1620.
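For illustration, one possible form of this comparison is sketched below in Python: both the curve and each candidate centerline are resampled to the same number of sections, the point-to-point distances are averaged, and the frame with the lowest overall comparison value is selected. The helper names and the equal-arc-length resampling are assumptions of the sketch rather than the disclosed implementation.

import numpy as np

def resample_polyline(poly, n=100):
    # Resample an (M, 2) polyline to n points at equal arc-length intervals.
    poly = np.asarray(poly, dtype=float)
    seg = np.linalg.norm(np.diff(poly, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0.0, s[-1], n)
    return np.stack([np.interp(t, s, poly[:, i]) for i in range(2)], axis=1)

def select_best_angiogram(curve, centerlines):
    # centerlines: one extracted vessel centerline per candidate angiogram.
    curve_rs = resample_polyline(curve)
    scores = [float(np.mean(np.linalg.norm(resample_polyline(c) - curve_rs, axis=1)))
              for c in centerlines]
    return int(np.argmin(scores))  # index of the best-matching angiogram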
In some aspects, after the processor circuit selects an extraluminal image of the first plurality of extraluminal images, the user may verify that the selected extraluminal image best matches the curve. The user may then correct the selection or select a new image. In some aspects, a user may manually correct the result of the automatically derived centerline and/or re-draw a new centerline altogether.
At step 1635, the method 1600 includes co-registering the intraluminal data points to the centerline of the body lumen within the extraluminal image. Because the intraluminal data points are associated with corresponding locations along the curve (e.g., the positions at which the intraluminal data points were acquired, as observed in the second plurality of extraluminal images), the curve and its corresponding location information for the intraluminal data points may be overlaid on the selected extraluminal image. As a result, the locations at which the intraluminal data points were acquired may be observed within the extraluminal image.
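Purely by way of illustration, and assuming each intraluminal data point and each extraluminal frame carries an acquisition timestamp (an assumption of the sketch, not a requirement stated here), a nearest-timestamp assignment of data points to observed device positions along the curve could look as follows in Python.

import numpy as np

def coregister_to_curve(data_timestamps, frame_timestamps, frame_positions):
    # frame_timestamps: sorted acquisition times of the extraluminal frames.
    # frame_positions: (F, 2) device (radiopaque marker) position in each
    # frame, expressed as a location along the overlaid curve.
    # Returns one curve location per intraluminal data point.
    data_t = np.asarray(data_timestamps, dtype=float)
    frame_t = np.asarray(frame_timestamps, dtype=float)
    idx = np.clip(np.searchsorted(frame_t, data_t), 1, len(frame_t) - 1)
    # Snap each data point to whichever neighboring frame is closer in time.
    prev_closer = (data_t - frame_t[idx - 1]) < (frame_t[idx] - data_t)
    idx = idx - prev_closer.astype(int)
    return np.asarray(frame_positions, dtype=float)[idx]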
At step 1640, the method 1600 includes outputting the extraluminal image and coregistered intraluminal data points. This may include any suitable graphical user interface including the extraluminal image with an indication of a location at which an intraluminal data point was acquired along with the intraluminal data point.
As previously described, the processor circuit may alternatively perform steps 1645-1670 if it determines at step 1625 that the first plurality of extraluminal images were obtained without contrast. At step 1645, the method 1600 includes identifying an extraluminal image of the first plurality of extraluminal images. This extraluminal image may be selected based on the orientation of the extraluminal imaging device and patient. For example, the extraluminal image selected should be an image obtained from the same angle, and with the same imaging settings, as the second plurality of extraluminal images. Because the first plurality of extraluminal images were obtained without contrast, the body lumen and the centerline of the body lumen are not visible in the identified extraluminal image. In some aspects, the extraluminal image identified at step 1645 may alternatively be an extraluminal image of the second plurality of extraluminal images received at step 1610.
At step 1650, the method 1600 includes overlaying the curve on the selected extraluminal image. In that regard, step 1650 includes setting the lumen centerline to be the calculated FPL, or curve, in the selected extraluminal image without contrast. In that regard, the processor circuit does not identify the extraluminal image at step 1645 based on the centerline of the body lumen as in step 1630. In some aspects, the processor circuit assigns the curve to be the centerline, without regard for the actual location and shape of the body lumen and its centerline. The processor circuit does this because the curve is a sufficiently accurate representation of the actual location and shape of the body lumen and the centerline.
At step 1655, the method 1600 includes outputting the extraluminal image and overlaid curve. This may include displaying the selected extraluminal image with the curve (e.g., calculated FPL) overlaid. A user may then review the curve within the extraluminal image and determine whether the curve accurately depicts the expected location of the body lumen based on observing the acquisition of the second plurality of extraluminal images.
At step 1660, the method 1600 includes receiving a user input modifying or confirming the curve. For example, if the user determines that a section of the curve should be modified, the user may use an input device, such as a touch screen, a mouse, a keyboard, various buttons of a graphical user interface, or any other means, to adjust the curve as needed. In some aspects, the curve may not need to be modified; in that case, the user may provide a user input confirming that the shape of the curve looks accurate. In some aspects, the system workflow may make it mandatory for the user to review, correct, and/or redraw the vessel centerline altogether.
At step 1665, the method 1600 includes coregistering the intraluminal data points to locations within the extraluminal image. For example, as described above, the intraluminal data points may be associated with various locations along the curve. These intraluminal data points may similarly be associated with corresponding locations within the selected extraluminal image.
At step 1670, the method 1600 includes outputting the extraluminal image and the coregistered intraluminal data points to a display. The step 1670 may be similar to the step 1640 previously described. For example, the display may provide any suitable graphical user interface including the extraluminal image with an indication of a location at which an intraluminal data point was acquired along with the intraluminal data point.
Persons skilled in the art will recognize that the apparatus, systems, and methods described above can be modified in various ways. Accordingly, persons of ordinary skill in the art will appreciate that the embodiments encompassed by the present disclosure are not limited to the particular exemplary embodiments described above. In that regard, although illustrative embodiments have been shown and described, a wide range of modification, change, and substitution is contemplated in the foregoing disclosure. It is understood that such variations may be made to the foregoing without departing from the scope of the present disclosure. Accordingly, it is appropriate that the appended claims be construed broadly and in a manner consistent with the present disclosure.
This application claims priority to and the benefit of U.S. Provisional Application No. 63/292,529, filed Dec. 22, 2021, which is incorporated by reference herein in its entirety.
Provisional Applications:
Number | Date | Country
63/292,529 | Dec. 2021 | US