Methods, apparatus, and system for synchronization between a three-dimensional vascular model and an imaging device

Information

  • Patent Grant
  • Patent Number
    12,039,685
  • Date Filed
    Wednesday, September 23, 2020
  • Date Issued
    Tuesday, July 16, 2024
Abstract
An apparatus for synchronizing a three-dimensional model of a patient's coronary arteries with an orientation of a medical imaging device is disclosed. In an example, an apparatus is configured to receive an indication that a three-dimensional model has been rotated. The indication includes a number of degrees of rotation of the three-dimensional model along a roll and/or pitch axis. The processor uses a correlation or transfer function to determine a rotational angulation position of the medical imaging device based on the number of degrees of rotation of the three-dimensional model along the roll and/or pitch axis. The processor transmits a command or instruction to the medical imaging device that is indicative of the desired rotational angulation position, thereby causing the medical imaging device to rotate to the desired position.
Description
TECHNICAL FIELD

The present invention, in some embodiments thereof, relates to vascular imaging and assessment, and more particularly, but not exclusively, to synchronization between a three-dimensional vascular model and an imaging device during a patient-catheterized imaging procedure.


BACKGROUND

Coronary arterial stenosis is one of the most serious forms of arterial disease. In clinical practice, stenosis severity is estimated by using either simple geometrical parameters, such as determining the percent diameter of a stenosis, or by measuring hemodynamically based parameters, such as pressure-based myocardial Fractional Flow Reserve (“FFR”). FFR is a measurement of the functional significance of coronary stenoses. In the past, known FFR measurement techniques included the insertion of a 0.014″ guidewire equipped with a miniature pressure transducer positioned across an arterial stenosis. Pressure in the artery would be measured at the stenosis and upstream from the stenosis. The ratio of these pressures represents the ratio of the maximal blood flow in the area of stenosis to the maximal blood flow that would occur in approximately the same location without the stenosis.
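For illustration, the pressure-wire FFR computation described above can be sketched as follows; the function name and the sample pressure values are illustrative only and are not drawn from the patent.

```python
def fractional_flow_reserve(p_distal_mmhg: float, p_proximal_mmhg: float) -> float:
    """Return FFR: the ratio of mean pressure measured distal to the
    stenosis to mean pressure measured proximal to (upstream of) the
    stenosis, which represents the ratio of maximal blood flow with and
    without the stenosis."""
    if p_proximal_mmhg <= 0:
        raise ValueError("proximal pressure must be positive")
    return p_distal_mmhg / p_proximal_mmhg

# Example: 68 mmHg distal and 92 mmHg proximal yield an FFR of about 0.739
print(round(fractional_flow_reserve(68.0, 92.0), 3))
```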


Currently, FFR may be determined non-invasively through computational fluid dynamic modeling of the blood flow through the arteries. Alternatively, FFR may be determined by calculating vascular resistances from vascular dimensions and identifying how the vascular resistances change when the coronary arteries are normalized. This normalization smooths vascular locations and virtually removes potential stenoses; resistances calculated from the normalized vasculature then provide an indication of maximal blood flow.


Early studies have demonstrated that FFR values less than 0.75 provide an accurate predictor of ischemia. Further, FFR values greater than 0.75 provide an indication that deferral of percutaneous coronary intervention for corresponding lesions should be safe. An FFR cut-off value of 0.8 is typically used in clinical practice to guide revascularization, supported by long-term outcome data. Typically, an FFR value in a range of 0.75-0.8 is considered a ‘grey zone’ having uncertain clinical significance.
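The clinical thresholds described above can be summarized in a short sketch; the function name and the inclusive handling of the grey-zone boundaries are assumptions for illustration.

```python
def classify_ffr(ffr: float) -> str:
    """Classify an FFR value using the thresholds described above:
    values below 0.75 accurately predict ischemia, values in the
    0.75-0.8 range fall in a clinical 'grey zone', and values above
    0.8 indicate that intervention may safely be deferred."""
    if ffr < 0.75:
        return "ischemia likely"
    if ffr <= 0.80:
        return "grey zone"
    return "defer intervention"

print(classify_ffr(0.70))  # ischemia likely
print(classify_ffr(0.78))  # grey zone
print(classify_ffr(0.90))  # defer intervention
```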


To determine vascular dimensions for calculating FFR, a three-dimensional model of a patient's coronary arteries is constructed. Oftentimes, the three-dimensional model is created from two or more angiographic x-ray images that are recorded at different viewing angles with respect to the patient. The angiographic x-ray images are usually recorded by a C-arm in a catheterization laboratory.


The C-arm enables an operator to rotate the radiation source and detector around a patient's heart and acquire angiographic image(s) of different viewpoints of a patient's coronary arteries and/or lesions of interest. Generally, the C-arm is heavy to maneuver, and every image requires X-ray radiation and an injection of contrast material. Navigating a device, such as a stent, through a patient's coronary arteries is a task that requires the C-arm to track the placement of the stent along a coronary path to a target location. Currently, this tracking increases the radiation dose and contrast dye needed for imaging because images from different angles and over different times are needed as the stent is moved through the arteries.


SUMMARY

The example methods, apparatus, and system disclosed herein are configured to synchronize a three-dimensional model of a patient's coronary arteries with a medical imaging device, such as a C-arm. Specifically, the methods, apparatus, and system disclosed herein register a model of a patient's coronary arteries with a coordinate system of a C-arm medical imaging device. After a model is created, the example system, apparatus, and methods register the model with the C-arm by referencing or assigning coordinates of the C-arm to corresponding rotational orientations of the model. The registration corresponds to a direction an image intensifier of the C-arm is facing, where the image intensifier faces one (two-dimensional) side of the three-dimensional model.


Rotation of the C-arm corresponds to changing a direction the image intensifier faces. The system, methods, and apparatus detect this rotation of the C-arm and accordingly update which side of the patient's coronary arteries, as represented by the three-dimensional model, is presently facing the image intensifier. The view from the image intensifier of the C-arm, as depicted by the orientation of the three-dimensional model, is shown within a user interface of a computer system. A clinician may move the C-arm to desired positions, for example, as a stent is being placed. The clinician may then use the corresponding view of the three-dimensional model to guide stent placement and determine when an updated image is necessary. This synchronization accordingly enables a clinician to position the C-arm to quickly obtain a clearer angle of certain vascular features to update the three-dimensional model.


The example system, methods, and apparatus also enable a user to rotate or otherwise maneuver the three-dimensional model to investigate a geometry of the patient's coronary vessel tree. In these embodiments, movement of the three-dimensional model reduces the number of x-ray images needed since two or three images may be used to construct a complete model, thereby providing views that were not imaged using the C-arm. This reduces the patient's radiation exposure and the amount of contrast used. Moreover, the three-dimensional model, being a virtual object, can be rotated to viewpoints impossible for the C-arm to reach, enabling an operator to study the tree structure in greater detail. The system, methods, and apparatus are configured to determine how a three-dimensional model is rotated and provide corresponding instructions for the C-arm to move or rotate in a similar manner. If a clinician identifies an area of interest within the three-dimensional model that may need further imaging based on updated patient conditions, such as placement of a stent, the clinician may select an input that causes the C-arm to acquire an image at its present location. The synchronization of the C-arm saves time compared to having the clinician manually position the C-arm to approximate the view shown of the three-dimensional model.


As disclosed herein, the example system, methods, and apparatus correlate coordinates of a three-dimensional model to a lateral angular axis (RAO angulation and LAO angulation) and a vertical angular axis (cranial angulation and caudal angulation) of a C-arm. In some embodiments, the C-arm may be moved along a track to move closer or away from a patient, which provides movement along at least one of an x-axis, a y-axis, and a z-axis. In these embodiments, the example system, methods, and apparatus additionally provide correlation of the three-dimensional model to the x, y, and/or z-axes.


In light of the disclosure herein and without limiting the disclosure in any way, in a first aspect of the present disclosure, which may be combined with any other aspect listed herein unless specified otherwise, an apparatus for synchronizing a three-dimensional model of a patient's coronary arteries with an orientation of a medical imaging device includes a memory device storing a three-dimensional model of a patient's coronary arteries. The three-dimensional model includes a centerline through each of the coronary arteries. Each sample point along the respective centerline is defined in a three-dimensional coordinate system and is associated with vascular geometric information. The apparatus also includes a processor communicatively coupled to the memory device. The processor is configured to receive an instruction to register the three-dimensional model to a medical imaging device, and determine an orientation of the three-dimensional model that corresponds to a zero-degree starting position of the medical imaging device. The processor is also configured to receive potential rotational angulation positions of the medical imaging device, determine angular coordinates for the three-dimensional model that correspond to the potential rotational angulation positions of the medical imaging device, and store to the memory device a correlation between the determined angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device. 
The processor is further configured to determine a current view angle orientation of the medical imaging device, use the correlation between the determined angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device to rotate the three-dimensional model using the current view angle orientation of the medical imaging device, and display the rotated three-dimensional model in a user interface in a viewpoint orientation that matches the current view angle orientation of the medical imaging device.
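The registration step of this first aspect, in which a correlation between potential C-arm angulation positions and angular coordinates of the model is built and stored, can be sketched as follows. The identity transfer function, the angle ranges, and the dictionary data structure are illustrative assumptions rather than the actual mapping used by the apparatus.

```python
def build_correlation(rao_lao_range=(-90, 90), cran_caud_range=(-45, 45), step=5):
    """Map each potential (lateral, vertical) angulation position of the
    medical imaging device to (roll, pitch) angular coordinates of the
    three-dimensional model. Negative values denote RAO/caudal angulation;
    positive values denote LAO/cranial angulation."""
    correlation = {}
    for lat in range(rao_lao_range[0], rao_lao_range[1] + 1, step):
        for vert in range(cran_caud_range[0], cran_caud_range[1] + 1, step):
            # Identity transfer function: the model's roll tracks the
            # lateral angulation and its pitch tracks the vertical angulation.
            correlation[(lat, vert)] = (float(lat), float(vert))
    return correlation

table = build_correlation()
print(table[(30, -15)])  # model (roll, pitch) for LAO 30, caudal 15
```

In a real system, the stored correlation would be derived from the zero-degree registration of the model against the device rather than assumed to be the identity.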


In accordance with a second aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is further configured to determine the orientation of the three-dimensional model by identifying a two-dimensional face or a plane of the three-dimensional model that aligns with a view angle at the zero-degree starting position of an image intensifier of the medical imaging device.


In accordance with a third aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the identified two-dimensional face or the plane of the three-dimensional model corresponds to a top-down view of the patient's coronary arteries when the patient is lying supine.


In accordance with a fourth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the vascular geometric information includes at least one of a vascular diameter, a vascular radius, a cross sectional area, a cross sectional profile, a vascular wall curvature, or vascular branching.


In accordance with a fifth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the medical imaging device includes a C-arm configured to record x-ray angiographic images.


In accordance with a sixth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the potential rotational angulation positions of the medical imaging device include RAO angulation positions, LAO angulation positions, cranial angulation positions, and caudal angulation positions.


In accordance with a seventh aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the angular coordinates for the three-dimensional model include coordinates along a roll axis and a pitch axis.


In accordance with an eighth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the angular coordinates correspond to an amount the three-dimensional model is rotated along the roll axis and the pitch axis.


In accordance with a ninth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the potential rotational angulation positions of the medical imaging device are at least one of stored in the memory device, received from the medical imaging device, or received via user input via an interface.


In accordance with a tenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is further configured to receive from the medical imaging device a message that is indicative of at least one of (i) a relative position change from the zero-degree starting position of the medical imaging device provided in a rotational angulation position, or (ii) an absolute position of the medical imaging device provided in a rotational angulation position, determine a new viewpoint orientation for the three-dimensional model based on the at least one of (i) or (ii) and the correlation between the determined angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device, rotate the three-dimensional model to the new viewpoint orientation, and display in the user interface the rotated three-dimensional model.
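The relative-versus-absolute message handling of this tenth aspect can be sketched as follows; the message format, field names, and function name are hypothetical and chosen only to illustrate the two cases.

```python
def resolve_angulation(message: dict, current: tuple) -> tuple:
    """Return the new (lateral, vertical) angulation of the imaging device,
    given either a relative position change from the current angulation or
    an absolute angulation position."""
    if message["kind"] == "relative":
        return (current[0] + message["lateral"], current[1] + message["vertical"])
    if message["kind"] == "absolute":
        return (message["lateral"], message["vertical"])
    raise ValueError(f"unknown message kind: {message['kind']}")

current = (10.0, 0.0)  # LAO 10, no cranial/caudal angulation
print(resolve_angulation({"kind": "relative", "lateral": 20.0, "vertical": -5.0}, current))
print(resolve_angulation({"kind": "absolute", "lateral": -30.0, "vertical": 15.0}, current))
```

The resolved angulation would then be looked up in the stored correlation to determine the new viewpoint orientation of the three-dimensional model.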


In accordance with an eleventh aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is further configured to receive an imaging message indicative that a medical image is to be acquired, transmit an imaging instruction message to the medical imaging device, and receive the medical image, the medical image acquired by the medical imaging device in the new viewpoint orientation.


In accordance with a twelfth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is further configured to identify coronary arteries in the medical image, determine centerlines through the identified coronary arteries, determine sample points along the centerlines in the three-dimensional coordinate system, determine vascular geometric information for the sample points along the centerline, determine a correspondence between the coronary arteries in the medical image and the three-dimensional model using at least the centerlines of the medical image and the centerlines of the three-dimensional model, and update the three-dimensional model with the determined vascular geometric information from the medical image.


In accordance with a thirteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the imaging instruction message includes at least one of an indication to record the medical image, a rotation instruction, a lateral movement instruction, or a zoom-magnification instruction.


In accordance with a fourteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is configured to display the medical image in conjunction with the three-dimensional model.


In accordance with a fifteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is configured to calculate and display, in the user interface, fractional flow reserve (“FFR”) values for the three-dimensional model.


In accordance with a sixteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, a method for synchronizing a three-dimensional model of a patient's coronary arteries with an orientation of a medical imaging device includes storing, in a memory device, a three-dimensional model of a patient's coronary arteries, the three-dimensional model including a centerline through each of the coronary arteries, each sample point along the respective centerline being defined in a three-dimensional coordinate system and being associated with vascular geometric information. The example method also includes determining, via a processor communicatively coupled to the memory device, an orientation of the three-dimensional model that corresponds to a zero-degree starting position of a medical imaging device. The method further includes receiving, in the processor, potential rotational angulation positions of the medical imaging device, determining, via the processor, angular coordinates for the three-dimensional model that correspond to the potential rotational angulation positions of the medical imaging device, and storing, to the memory device via the processor, a correlation between the determined angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device. Moreover, the method includes determining, via the processor, a current viewpoint of the three-dimensional model displayed in a user interface, and using, via the processor, the current viewpoint of the three-dimensional model and the correlation between the angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device to cause the medical imaging device to rotate to a corresponding view angle orientation.


In accordance with a seventeenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the medical imaging device includes a C-arm and the method is performed in a catheterization laboratory during at least one of a stent placement, a percutaneous coronary intervention, or an FFR determination.


In accordance with an eighteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the method further includes receiving, in the processor, for the medical imaging device, at least two medical images recorded at different view angles with respect to the patient, the at least two medical images including depictions of the patient's coronary arteries, identifying, via the processor, the coronary arteries in the at least two medical images, determining, via the processor, centerlines through the identified coronary arteries, determining, via the processor, sample points along the centerlines in the three-dimensional coordinate system, determining, via the processor, vascular geometric information for the sample points along the centerlines, creating the three-dimensional model using the centerlines, the sample points along the centerlines in the three-dimensional coordinate system, and the vascular geometric information, and storing the three-dimensional model to the memory device.


In accordance with a nineteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor causes the medical imaging device to rotate by transmitting at least one instruction message to the medical imaging device, the at least one instruction message including at least one of (i) a relative position change from the zero-degree starting position of the medical imaging device provided in a rotational angulation position, or (ii) an absolute position of the medical imaging device provided in a rotational angulation position.


In accordance with a twentieth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the potential rotational angulation positions of the medical imaging device include RAO angulation positions, LAO angulation positions, cranial angulation positions, and caudal angulation positions, and the angular coordinates for the three-dimensional model include coordinates along a roll axis and a pitch axis.


In a twenty-first aspect of the present disclosure, any of the structure, functionality, and alternatives disclosed in connection with any one or more of FIGS. 1 to 17 may be combined with any other structure, functionality, and alternatives disclosed in connection with any other one or more of FIGS. 1 to 17.


In light of the present disclosure and the above aspects, it is therefore an advantage of the present disclosure to provide synchronization between a three-dimensional model of a patient's coronary arteries and an orientation/position of a medical imaging device, such as a C-arm.


It is another advantage of the present disclosure to enable a clinician to update a three-dimensional model of a patient's coronary arteries in real-time during a catheterization laboratory procedure, such as placement of a stent or determining FFR values.


It is yet another advantage of the present disclosure to change an orientation and/or position of a C-arm by rotating a three-dimensional model of a patient's coronary arteries.


Additional features and advantages are described in, and will be apparent from, the following Detailed Description and the Figures. The features and advantages described herein are not all-inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the figures and description. Also, any particular embodiment does not have to have all of the advantages listed herein and it is expressly contemplated to claim individual advantageous embodiments separately. Moreover, it should be noted that the language used in the specification has been selected principally for readability and instructional purposes, and not to limit the scope of the inventive subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings and images. With specific reference now to the drawings and images in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings and images makes apparent to those skilled in the art how embodiments of the system, methods, and apparatus disclosed herein may be practiced.



FIG. 1 is a diagram of an example vascular imaging system including a medical imaging device and a workstation, according to an example embodiment of the present disclosure.



FIG. 2 shows a first angular axis along which a support structure of the medical imaging device of FIG. 1 may be rotated, according to an example embodiment of the present disclosure.



FIG. 3 shows a second angular axis along which the support structure of the medical imaging device of FIG. 1 may be rotated, according to an example embodiment of the present disclosure.



FIG. 4 is a diagram of an example correlation file to synchronize a three-dimensional model of a patient's coronary arteries with a medical imaging device, according to an example embodiment of the present disclosure.



FIG. 5 is a flow diagram of an example procedure that uses the correlation file of FIG. 4 to synchronize a three-dimensional model of a patient's coronary arteries with a medical imaging device, according to an example embodiment of the present disclosure.



FIG. 6 is a flow diagram of an example procedure for causing a medical imaging device to rotate based on detected rotation of a three-dimensional model, according to an example embodiment of the present disclosure.



FIG. 7 is a diagram that is illustrative of a synchronization between a three-dimensional model and the medical imaging device, according to an example embodiment of the present disclosure.



FIG. 8 is a flow diagram of an example procedure for causing a three-dimensional model to rotate based on detected rotation of a medical imaging device, according to an example embodiment of the present disclosure.



FIG. 9 is a flow diagram of an example procedure that uses synchronization between a medical imaging device and a three-dimensional model to generate a physiological index that is indicative of a functional significance of coronary lesions, according to an example embodiment of the present disclosure.



FIG. 10 shows an example image recorded by a medical imaging device and a Frangi-filter processed image, according to an example embodiment of the present disclosure.



FIG. 11 shows an image after centerlines have been determined, according to an example embodiment of the present disclosure.



FIG. 12 shows a schematic of an exemplary arrangement of imaging coordinates for an imaging system, according to an example embodiment of the present disclosure.



FIGS. 13 to 15 show diagrams illustrative of how centerline homologies are determined, according to an example embodiment of the present disclosure.



FIG. 16 shows a schematic representation of epipolar determination of three-dimensional target locations from two-dimensional image locations and their geometrical relationships in space, according to an example embodiment of the present disclosure.



FIG. 17 shows a diagram of a resistance array of a three-dimensional vascular tree model, according to an example embodiment of the present disclosure.





DETAILED DESCRIPTION

The present disclosure relates to a system, methods, and apparatus for synchronizing a three-dimensional model with a medical imaging device. Reference is made herein to a three-dimensional model of a patient's coronary arteries. However, it should be appreciated that the system, methods, and apparatus disclosed herein may provide for synchronization between three-dimensional models of other vascular networks.


The example system, methods, and apparatus determine or otherwise create a correlation between angular coordinates of a three-dimensional model and potential rotational angular positions of a medical imaging device. The correlation is used to ensure that a three-dimensional model, as shown in a display interface, is shown in the same viewpoint as, for example, an image intensifier of the medical imaging device (e.g., a C-arm). Movement of the C-arm is detected by the system, methods, and apparatus, which update the viewpoint of the three-dimensional model. Likewise, rotation of the three-dimensional model is detected by the system, methods, and apparatus, which cause the medical imaging device to rotate in a corresponding manner.


In some embodiments, the rotational angular positions of a medical imaging device include RAO angulation positions and LAO angulation positions, which are provided along a lateral (side-to-side) rotational axis with respect to a patient. Additionally or alternatively, the rotational angular positions include cranial angulation positions and caudal angulation positions, which are provided along a vertical (head-to-toe) rotational axis with respect to a patient. It should be appreciated that other rotational or linear positions of a medical imaging device may also be used, such as movement along x, y, or z axes.


As disclosed herein, a three-dimensional model may be rotated by an amount specified by angular coordinates. The angular coordinates may include coordinates along a roll axis and a pitch axis. The roll axis may correspond to RAO and LAO angular positions of the medical imaging device, while the pitch axis may correspond to cranial and caudal angulation positions. Coordinates along the roll axis and pitch axis specify how the three-dimensional model is to be virtually rotated to a desired viewpoint for display on a user interface. The viewpoint may correspond to a two-dimensional projected view of the three-dimensional model, which is synchronized to the two-dimensional view angle orientation of the medical imaging device, as described herein.
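The virtual rotation and projection described above can be sketched with elementary rotation matrices applied to a model sample point; the axis conventions, rotation order, and function names here are assumptions for illustration and are not specified by the disclosure.

```python
import math

def rotate_point(p, roll_deg, pitch_deg):
    """Apply pitch (about the x-axis) then roll (about the y-axis)
    to a three-dimensional sample point p = (x, y, z)."""
    x, y, z = p
    cp, sp = math.cos(math.radians(pitch_deg)), math.sin(math.radians(pitch_deg))
    y, z = cp * y - sp * z, sp * y + cp * z          # pitch about x
    cr, sr = math.cos(math.radians(roll_deg)), math.sin(math.radians(roll_deg))
    x, z = cr * x + sr * z, -sr * x + cr * z         # roll about y
    return x, y, z

def project(p, roll_deg, pitch_deg):
    """Rotate the point and drop the depth (z) coordinate to obtain
    the two-dimensional projected view shown in the user interface."""
    x, y, _ = rotate_point(p, roll_deg, pitch_deg)
    return round(x, 6), round(y, 6)

print(project((0.0, 0.0, 1.0), 90.0, 0.0))  # a 90-degree roll maps depth onto +x
```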


In some embodiments, the three-dimensional model may also be rotated along a yaw axis. If a medical imaging device is not able to move along a corresponding rotational angular position, the example methods, apparatus, or system may display an alert indicative that the movement of the three-dimensional model is outside the movement capability of the medical imaging device. Such an alert indication may also be provided if the three-dimensional model is moved along a roll and/or pitch axis to a degree that exceeds a travel limit of the medical imaging device. In some embodiments, the system, methods, and apparatus disclosed herein may lock the three-dimensional model to travel ranges of a medical imaging device to prevent a clinician from exceeding a travel limit.
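The travel-limit safeguard described above can be sketched as follows; the limit values, function name, and clamping behavior are illustrative assumptions, and a real system would query the device's actual travel ranges.

```python
LIMITS = {"roll": (-90.0, 90.0), "pitch": (-45.0, 45.0)}  # assumed device ranges

def check_rotation(roll: float, pitch: float):
    """Return (clamped_roll, clamped_pitch, alerts). An alert is produced
    for each axis on which the requested model rotation exceeds the
    travel limit of the medical imaging device, and the model rotation
    is locked to the device's travel range."""
    alerts = []
    clamped = []
    for name, value in (("roll", roll), ("pitch", pitch)):
        lo, hi = LIMITS[name]
        if not lo <= value <= hi:
            alerts.append(f"{name} {value} deg exceeds device travel limit {lo}..{hi}")
        clamped.append(max(lo, min(hi, value)))
    return clamped[0], clamped[1], alerts

print(check_rotation(120.0, -10.0))  # roll exceeds the assumed 90-degree limit
```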


The example synchronization provided by the system, methods, and apparatus disclosed herein reduces an amount of contrast agent or radiation dosage received by a patient during treatment in a catheterization laboratory. For example, after acquiring enough two-dimensional images, the methods, apparatus, and system disclosed herein create a three-dimensional model. While the patient is still catheterized, a clinician may review the three-dimensional model to identify any areas that appear unclear or have potential modeling errors. For example, some vessels may be designated in the model as being connected where in fact the vessels cross at different heights. In other instances, vessel curvature in a model may not be smooth, instead being shown as abrupt changes in direction or width. Such artifacts may be due to extrapolation errors from having too few two-dimensional images. In these instances, the clinician positions the three-dimensional model such that an area of concern is shown in a current viewpoint. The example methods, apparatus, and system determine corresponding angular positions of the medical imaging device, and transmit instructions to the medical imaging device to rotate to the identified position/orientation. The example methods, apparatus, and system then cause the medical imaging device to acquire at least one additional two-dimensional image, which is then used to update the three-dimensional model. Once the clinician is satisfied that the three-dimensional model is accurate, the example methods, apparatus, and system determine an index indicative of vascular function, such as FFR values. The computational efficiency and quick imaging provided by synchronization to the three-dimensional model enable all of the above operations to be carried out while the patient is still catheterized.


In another example, the methods, apparatus, and system provide synchronization during stent placement or any other catheterization laboratory procedure to inflate diseased locations of a patient's coronary arteries. In these examples, a clinician may track stent placement using the three-dimensional model. As the model is being rotated, the example methods, system, and apparatus cause a medical imaging device to rotate in a corresponding manner. If an updated view is needed, the clinician provides an instruction, which causes the example methods, system, and apparatus to record an image using the medical imaging device. The image may be displayed in conjunction with the model and/or may be used to update the model. This configuration enables a clinician to obtain additional two-dimensional angiographic images only as needed during treatment, where the medical imaging device is already in position to acquire an image when a clinician determines that an image is needed.


The example methods, apparatus, and system also detect movement or rotation of a medical imaging device. In response to the detected movement or rotation, the methods, apparatus, and system update an orientation of a displayed three-dimensional model of a patient's coronary arteries. Rotation of the three-dimensional model based on manual movement of the medical imaging device shows a clinician the side of the patient's coronary arteries that faces the imaging device. Such a configuration may enable a clinician to determine a desired orientation for the model using external locations (provided by the medical imaging device) relative to a patient.


Example Vascular Imaging System


FIG. 1 is a diagram of an example vascular imaging system 1 including a medical imaging device 10 and a workstation 20, according to an example embodiment of the present disclosure. The medical imaging device 10 includes, for example, a C-arm. The example medical imaging device 10 in the illustrated example includes a radiation source 12 and an image intensifier 14. The example radiation source 12 is configured to emit ionizing radiation, which travels through a patient. For analysis of a patient's coronary arteries, the radiation source 12 is directed towards a patient's heart. The image intensifier 14 detects the radiation after the radiation has passed through a patient and records two-dimensional images that are indicative of the detected radiation intensity and/or ionization as the radiation contacts the patient's tissues. The image intensifier 14 is located directly across from the radiation source 12.


The radiation source 12 and the image intensifier 14 are positioned directly across from each other via a coupling to a support structure 16. As shown in FIG. 1, the support structure 16 may have a c-shape, with the image intensifier 14 and radiation source 12 being connected to opposing ends of the ‘c’. The support structure 16 is mechanically coupled to a base 18, which is connected to a floor or ceiling. The base 18 includes a pivot section 19, which is configured to rotate the support structure 16 in one or more angular axes. The example pivot section 19 includes motors that cause the support structure to rotate and/or otherwise move along a track to provide rotation of the image intensifier 14 and the radiation source 12.



FIG. 2 shows a first angular axis along which the support structure 16 may be rotated, according to an example embodiment of the present disclosure. The first angular axis may be referred to herein as a lateral (side-to-side) angular or rotational axis. As shown in FIG. 2, the axis includes a zero-degree starting position. Clockwise rotation is referred to herein as LAO angulation, and counter-clockwise rotation is referred to herein as RAO angulation. In some embodiments, the support structure 16 may be rotated 90° along the RAO and LAO directions. In other embodiments, the support structure 16 may be rotated greater than 90° or less than 90°.



FIG. 3 shows a second angular axis along which the support structure 16 may be rotated, according to an example embodiment of the present disclosure. The second angular axis may be referred to herein as a vertical (head-to-toe) angular or rotational axis. As shown in FIG. 3, the axis includes a zero-degree starting position. Clockwise rotation is referred to herein as caudal angulation, and counter-clockwise rotation is referred to herein as cranial angulation. In some embodiments, the support structure 16 may be rotated 90° along the caudal and cranial directions. In other embodiments, the support structure 16 may be rotated greater than 90° or less than 90°.



FIGS. 2 and 3 show that for each rotation along the first and second angular axes, the radiation source 12 is pointed toward the patient's heart. Likewise, the image intensifier 14 is also pointed towards the patient's heart. Rotation of the support structure 16 enables the patient's heart to be imaged from different viewpoints. In some embodiments, the support structure 16 may also be moved along x, y, and/or z axes with respect to the patient.


Returning to FIG. 1, the medical imaging device 10 is communicatively coupled to the workstation 20 via a link 21, which may include any wireless or wired link. The link 21 may include a network, such as a Wi-Fi network, an Ethernet network, a local area network, or a wide area network, such as the Internet. The link 21 enables the medical imaging device 10 to transmit acquired images 22 recorded by the image intensifier 14 to the workstation 20. The link 21 also enables the medical imaging device 10 to transmit position information, such as rotational coordinates 24 of the support structure 16 or image intensifier 14.


Additionally or alternatively, the link 21 enables the workstation 20 to transmit command messages or instructions 26 to the medical imaging device 10. The command messages or instructions 26 may include an instruction to record an image. The command messages or instructions 26 may also include an instruction to rotate the support structure 16 to a specified orientation. As described herein, the instructions 26 may include relative position data that specifies a relative position change from a zero-degree starting position (or current position) of the support structure 16 of the medical imaging device 10. The instructions 26 may alternatively include an absolute position along the lateral and vertical rotational axes. The instructions 26 may be specified in rotational angular positions, degrees, and/or coordinates.
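The structure of a command message 26 can be sketched as follows. This is a minimal illustration only; the field names (`record_image`, `lao_deg`, `cranial_deg`, `relative`) are hypothetical and not specified by the disclosure, which leaves the wire format open.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImagingCommand:
    """One command message 26 sent from the workstation 20 to the device 10."""
    record_image: bool = False           # instruct the device to record an image
    lao_deg: Optional[float] = None      # lateral angulation (positive = LAO, negative = RAO)
    cranial_deg: Optional[float] = None  # vertical angulation (positive = cranial, negative = caudal)
    relative: bool = False               # True: degrees are offsets from the current position

# Example: rotate to an absolute LAO 29 degree / cranial 21 degree position.
cmd = ImagingCommand(lao_deg=29.0, cranial_deg=21.0, relative=False)
```

A separate message with `record_image=True` would then request image acquisition at that orientation.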


The example workstation 20 includes a monitor or display screen 28, a processor 30, and a memory device 32. In some embodiments, the display screen 28 may include touchscreen sensors to enable clinicians to enter inputs. The example display screen 28 is configured to display one or more user interfaces transmitted by the processor 30. The user interfaces may display the images 22 recorded by the medical imaging device 10. The user interfaces may also display a three-dimensional model of a patient's coronary arteries. The user interfaces may include one or more controls to enable a clinician to manipulate the three-dimensional model. For instance, a user interface may enable a user to rotate a three-dimensional model along a pitch and/or roll angular axis. Further, a user interface may enable a user to rotate a three-dimensional model of a patient's coronary arteries along a yaw angular axis. In some embodiments, a user interface may enable a clinician to input an instruction to cause the medical imaging device 10 to record one or more images. Further, a user interface may include a control, which when selected, causes a three-dimensional model to lock or otherwise synchronize to the medical imaging device 10, as described herein.


The example processor 30 is communicatively coupled to the display screen 28 and the memory device 32. The example processor 30 may include a controller, a microcontroller, a control unit, a server, an application specific integrated circuit (“ASIC”), a field-programmable gate array (“FPGA”), a microprocessor, etc. The example processor 30 is configured to perform the operations described herein. Aspects of the present disclosure are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to certain embodiments. It should be understood that the operations described herein may be defined by computer program instructions. These computer program instructions may be provided to the processor 30, such that the instructions, when executed via the processor 30, perform the operations described herein. The instructions may comprise a software application, an algorithm, and/or a routine.


The computer program instructions may be stored in the memory device 32. The example memory device 32 may include an electronic memory, a magnetic memory, an optical memory, an electromagnetic memory, an infrared memory, and/or a semiconductor memory. For example, the memory device 32 may include a portable computer diskette, a hard disk, a random access memory (“RAM”), a read-only memory (“ROM”), an erasable programmable read-only memory (“EPROM” or Flash memory), an optical fiber, a portable compact disc read-only memory (“CD-ROM”), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.


The example processor 30 in cooperation with the memory device 32 is configured to create a three-dimensional model of a patient's coronary arteries using, for example, at least two images 22 from the medical imaging device 10. The images 22 may be recorded at different angles with respect to the patient's coronary arteries. As described below in more detail, the processor 30 identifies vessel centerlines and/or boundaries in the images. The processor 30 uses homologies or similarities between features of the vessel to correlate the same vessel location shown in the different images. The processor 30 may then determine sample points along the centerlines. Further, the processor 30 determines vascular geometric information for the sample points along the centerlines. Using the homologies and/or similarities, the processor 30 combines the same centerlines shown in the different images. The processor 30 also combines the vascular geometric information for the same location (as determined in the different images) to determine a three-dimensional geometry. The vascular geometric information and/or the three-dimensional geometry may include at least one of a vascular diameter, a vascular radius, a cross sectional area, a cross sectional profile, a vascular wall curvature, or vascular branching. As described herein, combining may include determining which vascular geometric information for a particular location is most likely representative of the vessel geometry using, for example, a projection degree from which a corresponding image was recorded by the medical imaging device 10, image information indicative that a substantially side-view of a vessel is shown rather than a front or rear view, and/or information indicative that visual interference or obstruction is not present. The visual interference may result from a crossing vessel or other tissue. 
After the three-dimensional geometries are determined, vessel wall boundaries are formed with respect to the combined centerlines, thereby forming a three-dimensional model. The processor 30 may also apply a three-dimensional coordinate system to the three-dimensional model, where each point along a centerline is assigned a coordinate. In some instances, the coordinate assignment may occur after the centerlines are identified in the images, which enables the processor 30 to assign similar locations in different images to the same coordinate, which is beneficial for identifying homologies and combining vascular data from multiple images together. FIG. 1 shows that the memory device 32 is configured to store a three-dimensional model 34 of a patient's coronary arteries.
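The "combining" step described above — selecting, for each centerline sample point, the vascular geometric information most likely representative of the true vessel geometry — can be sketched as a quality-weighted selection. The scoring inputs (projection degree, side-view likelihood, absence of obstruction) come from the disclosure, but the particular scalar `quality` score and the function name are assumptions for illustration.

```python
def combine_diameters(samples):
    """Pick the diameter most likely representative of the vessel geometry.

    samples: list of (diameter_mm, quality) pairs, one per two-dimensional
    image showing the same centerline sample point. The quality value is a
    hypothetical score reflecting how side-on and unobstructed the view was
    (a crossing vessel or a near front/rear view lowers it).
    """
    best_diameter, _ = max(samples, key=lambda s: s[1])
    return best_diameter

# Two images observe the same point; the more side-on, unobstructed view wins.
chosen = combine_diameters([(3.1, 0.4), (2.9, 0.9)])
```

Here `chosen` is 2.9 mm, taken from the higher-quality projection.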


In addition to creating three-dimensional models from two-dimensional images, the example processor 30 in cooperation with the memory device 32 is configured to determine values indicative of vascular health. As described below in more detail, the example processor 30 is configured to determine vascular resistances using the three-dimensional model. The vascular resistances are used to approximate blood flows and/or pressures within, for example, a patient's coronary arteries. Further, the processor 30 may normalize or otherwise smooth the three-dimensional model. This normalization reduces abrupt narrowing of the coronary arteries that may be attributed to a lesion or stenosis. The processor 30 determines vascular resistances in the normalized model, which may be indicative of vascular bed resistances. The processor 30 may further determine blood flows and/or pressures in the normalized model. The processor 30 calculates a ratio of the blood flows, pressures, and/or resistances for the same location throughout the three-dimensional model pre and post normalization. The calculated ratios correspond to FFR values.
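The final ratio computation described above — comparing flow, pressure, or resistance at the same location before and after normalization — can be sketched as below. This assumes pressure values are already available per location from the upstream resistance calculations; the function name and list-based interface are illustrative only.

```python
def ffr_values(pressures, normalized_pressures):
    """Ratio of modeled pressure to pressure at the same location in the
    normalized (lesion-smoothed) model; values near 1.0 suggest the location
    is hemodynamically unaffected, lower values suggest a significant lesion."""
    return [p / p_norm for p, p_norm in zip(pressures, normalized_pressures)]

# A location distal to a stenosis (72 vs. 90) and a healthy location (90 vs. 90).
ratios = ffr_values([72.0, 90.0], [90.0, 90.0])
```

Here `ratios` is `[0.8, 1.0]`, mirroring how a pressure drop across a lesion lowers the FFR value at downstream locations.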


The example processor 30 is also configured to provide for synchronization between the medical imaging device 10 and one or more three-dimensional models 34. The synchronization may be provided to improve a quality of a three-dimensional model where a first set of medical images may not have captured certain vascular features correctly. The synchronization may also be used during a medical treatment, such as stent placement, where updated medical images are needed to assess or track placement of the stent.


To provide synchronization, the example processor 30 is configured to create a correlation between a three-dimensional model and the medical imaging device 10. The correlation may be stored in a file or other data structure within the memory device 32. The correlation relates coordinates or rotational angular degrees of a three-dimensional model to the lateral and vertical angular axes of the medical imaging device 10.



FIG. 4 is a diagram of an example correlation file 40 used by the processor 30 to synchronize a three-dimensional model of a patient's coronary arteries with the medical imaging device 10, according to an example embodiment of the present disclosure. In this embodiment, RAO angulation of the medical imaging device 10 corresponds to positive roll of a three-dimensional model while LAO angulation of the medical imaging device 10 corresponds to negative roll of the three-dimensional model. Further, cranial angulation of the medical imaging device 10 corresponds to negative pitch of the three-dimensional model and caudal angulation of the medical imaging device 10 corresponds to positive pitch of the three-dimensional model. Further, a zero-point starting position of the medical imaging device 10 (e.g., RAO 0°, LAO 0°, cranial 0°, and caudal 0°), corresponds to a viewpoint of a three-dimensional model of a front face or plane when a patient is in a supine position. In other words, the viewpoint of the three-dimensional model for an origin point of the roll and pitch angular axes corresponds to a plane that is facing the image intensifier 14 when the medical imaging device is placed at the zero-point starting position.


The correlation file 40 shown in FIG. 4 relates a degree of angulation to a degree of roll or pitch of the three-dimensional model. In some instances, there is a one-to-one correspondence. In other instances, the relationship in degrees may be non-linear. In an example, the processor 30 may receive rotational coordinates 24 from the medical imaging device 10 indicative of rotation of the support structure 16. Specifically, the rotational coordinates 24 may specify cranial 10° and RAO 10°. The processor 30 uses the correlation file 40 to determine that a selected three-dimensional model displayed in a user interface on the display screen 28 is to be rotated −11° along a pitch axis and 12° along the roll axis. In a similar manner, the processor 30 may receive roll/pitch degrees based on a clinician rotating a three-dimensional model. In response, the processor 30 determines the lateral and vertical angulation degrees for the medical imaging device 10. The processor 30 then transmits instructions 26 to the medical imaging device 10 indicative of the determined lateral and vertical angulation degrees.
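A lookup through the correlation file 40 can be sketched as follows, using the cranial 10° → pitch −11° and RAO 10° → roll 12° example above to illustrate a non-linear (not one-to-one) mapping. The dictionary contents beyond those two entries are illustrative assumptions, as is the function name.

```python
# Illustrative correlation tables relating device angulation (degrees) to
# model rotation; non-linear entries are permitted, per the 10 -> -11 example.
CRANIAL_TO_PITCH = {0: 0.0, 10: -11.0, 20: -22.5}
RAO_TO_ROLL = {0: 0.0, 10: 12.0, 20: 24.5}

def model_rotation(cranial_deg, rao_deg):
    """Map rotational coordinates 24 from the device to model pitch/roll,
    per FIG. 4 (cranial -> negative pitch, RAO -> positive roll)."""
    return CRANIAL_TO_PITCH[cranial_deg], RAO_TO_ROLL[rao_deg]

pitch, roll = model_rotation(10, 10)
```

Here `pitch, roll` is `(-11.0, 12.0)`. The inverse lookup, model roll/pitch to device angulation, would use the same table read in the opposite direction.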


It should be appreciated that the degrees may be absolute or relative. For example, the medical imaging device 10 may transmit an absolute position of the support structure 16. For absolute degrees, the processor 30 uses the correlated degrees as a corresponding absolute position for the three-dimensional model. Alternatively, the medical imaging device 10 may transmit relative rotation of the support structure 16. For relative degrees, the processor 30 rotates the three-dimensional model from a current position based on the determined relative roll and pitch degree changes.


In some embodiments, the correlation file 40 of FIG. 4 may be replaced by a transfer function. In these instances, there is a linear, proportional, quadratic, etc. relationship between the roll/pitch of a three-dimensional model and the angulation of the medical imaging device 10. Further, it should be appreciated that the memory device 32 may store a separate transfer function and/or correlation file 40 for each different type of medical imaging device 10.
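A transfer function replacing the table could be as simple as a linear map. The sign convention (positive roll → RAO) and the `gain`/`offset` parameters are assumptions for illustration; a per-device-type gain is one way the separate transfer functions mentioned above might differ.

```python
def angulation_from_roll(roll_deg, gain=1.0, offset=0.0):
    """Illustrative linear transfer function: model roll (degrees) to lateral
    angulation, with positive results read as RAO and negative as LAO."""
    return gain * roll_deg + offset

# A one-to-one device (gain = 1) needs RAO 15 degrees for a +15 degree roll;
# a device with a 1.2 gain would need RAO 18 degrees for the same roll.
one_to_one = angulation_from_roll(15.0)
scaled = angulation_from_roll(15.0, gain=1.2)
```

A quadratic or other relationship would swap in a different function body while keeping the same interface.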


The example correlation file 40 and/or transfer function may be determined based on known relationships between angulation of the medical imaging device 10 and a three-dimensional model. In other examples, the processor 30 may perform an iterative method where a three-dimensional model is rotated by +/− one degree along the roll and pitch axes, and the medical imaging device 10 is rotated until an acquired image matches the viewpoint of the three-dimensional model. The processor 30 may determine centerlines of vessels in an acquired image and determine if they align with vessels in the three-dimensional model as viewed in a current viewpoint. In these examples, the processor 30 determines correlations through all paths of potential travel of the support structure 16 of the medical imaging device 10.


In yet other examples, the processor 30 may receive or acquire a template three-dimensional model from the memory device 32. The three-dimensional model may correspond to a template three-dimensional object that is placed for imaging by the medical imaging device 10. The model may include fiducials or other markers at predefined locations. The three-dimensional object may include fiducials or other markers that are placed at known distances from each other. The processor 30 causes the medical imaging device 10 to acquire an image when the model is at a defined orientation. The processor 30 may compare the fiducials shown in the image with the fiducials in the model to determine if alignment is synchronized. If not, the processor 30 may compute deltas between the fiducials to determine how the support structure 16 is to be rotated. The processor 30 then causes the support structure 16 to rotate, and causes the medical imaging device 10 to acquire another image to confirm synchronization. Such calibration may be performed prior to treatment on a patient and/or when the processor 30 is provisioned for operation with the medical imaging device 10.
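The fiducial comparison step in this calibration can be sketched as follows. The disclosure specifies comparing fiducials and computing deltas but not a particular metric; the mean 2-D offset, the tolerance value, and the function names here are illustrative assumptions.

```python
def fiducial_delta(image_pts, model_pts):
    """Mean 2-D offset between fiducials detected in an acquired image and
    the same fiducials as projected from the template model (pixel units)."""
    n = len(image_pts)
    dx = sum(i[0] - m[0] for i, m in zip(image_pts, model_pts)) / n
    dy = sum(i[1] - m[1] for i, m in zip(image_pts, model_pts)) / n
    return dx, dy

def synchronized(image_pts, model_pts, tol=0.5):
    """Alignment is confirmed when the mean offset is within tolerance;
    otherwise the deltas indicate how the support structure 16 should move."""
    dx, dy = fiducial_delta(image_pts, model_pts)
    return abs(dx) <= tol and abs(dy) <= tol
```

In use, a failed check would feed `fiducial_delta` into a rotation correction, followed by another acquisition to confirm.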


In some embodiments, the correlation file 40 may also include rotational limits of the medical imaging device 10. The rotational limits correspond to an end of a track of the support structure 16 with respect to the pivot section 19. These limits may also correspond to known or estimated locations of a patient, such as their head or legs. In the illustrated example of FIGS. 2 and 3, the rotational limits may correspond to 90° RAO, LAO, cranial, and/or caudal.



FIG. 5 is a flow diagram of an example procedure 50 that uses the correlation file 40 of FIG. 4 to synchronize a three-dimensional model of a patient's coronary arteries with the medical imaging device 10, according to an example embodiment of the present disclosure. Although the procedure 50 is described with reference to the flow diagram illustrated in FIG. 5, it should be appreciated that many other methods of performing the steps associated with the procedure 50 may be used. For example, the order of many of the blocks may be changed, certain blocks may be combined with other blocks, and many of the blocks described may be optional. In an embodiment, the number of blocks may be changed based on how the correlation is determined. The actions described in the procedure 50 are specified by one or more instructions that are stored in the memory device 32, and may be performed among multiple devices including, for example, the processor 30 and/or the medical imaging device 10.


The example procedure 50 begins by receiving an instruction 51 from a clinician via a user interface or control device to register a three-dimensional model with a medical imaging device (block 52). The instruction 51 may include an identification of the three-dimensional model and/or the medical imaging device 10. In other examples, the processor 30 may identify which three-dimensional model is opened in a user interface, and a device model and/or type of communicatively coupled medical imaging device 10. The instruction 51 may be received for a specific three-dimensional model of a patient's coronary arteries that is constructed using two-dimensional angiographic images previously recorded by the medical imaging device 10.


The procedure 50 continues by the processor 30 determining an orientation of the selected three-dimensional model that corresponds to a zero-degree starting position of the medical imaging device 10 (block 54). The orientation may be determined by identifying a front of the three-dimensional model based on a known orientation of coronary arteries. The orientation may also be known based on an image that was recorded by the medical imaging device 10 (in a zero-point starting position) during creation of the three-dimensional model, and by confirming that the image aligns with the model. To provide a confirmation, the processor 30 may compare centerlines in the image to the centerlines of the three-dimensional model. If the difference between the centerlines is less than a threshold, the processor 30 confirms the three-dimensional model is aligned to the zero-point starting position of the medical imaging device 10.
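The centerline comparison of block 54 can be sketched as a point-wise distance check against a threshold. The disclosure does not fix a distance metric or threshold value; the mean Euclidean distance, the 2.0-unit threshold, and the function name below are illustrative assumptions, with centerlines represented as matched lists of 2-D sample points.

```python
def alignment_confirmed(image_centerline, model_centerline, threshold=2.0):
    """Confirm zero-point registration (block 54): the mean point-wise
    distance between corresponding centerline samples from the acquired
    image and from the projected model must fall under the threshold."""
    distances = [((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
                 for a, b in zip(image_centerline, model_centerline)]
    return sum(distances) / len(distances) < threshold

aligned = alignment_confirmed([(0, 0), (1, 1)], [(0.5, 0), (1, 1.5)])
```

Here `aligned` is `True` (mean distance 0.5); a grossly shifted centerline would fail and indicate the model is not at the zero-point starting position.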


Next, the processor 30 determines or receives potential rotational angulation positions for the medical imaging device 10 (block 56). The potential rotational angulation positions for the medical imaging device 10 are specified in the correlation file 40. In other embodiments, the potential rotational angulation positions for the medical imaging device 10 may be expressed within a transfer function. In some instances, the potential rotational angulation positions for the medical imaging device 10 may be input by a clinician or other operator.


The example processor 30 then determines or identifies angular coordinates for the three-dimensional model that correspond to the potential rotational angulation positions for the medical imaging device 10 (block 58). The angular coordinates for the three-dimensional model that correspond to the potential rotational angulation positions may also be specified in the correlation file 40 and/or expressed within a transfer function. In some instances, the angular coordinates for the three-dimensional model may be input by a clinician and manually correlated to the potential rotational angulation positions for the medical imaging device 10.


At this point, the three-dimensional model and the medical imaging device 10 are oriented in a zero-point starting position and the processor 30 has rotational correlations between the model and the imaging device. The processor 30 is now enabled to synchronize rotation or movement between the three-dimensional model and the medical imaging device 10, as discussed below.


When the three-dimensional model is synchronized with the medical imaging device 10, the processor 30 may cause an alert or other message to be displayed when a clinician rotates the three-dimensional model to a position or orientation that is outside the range of rotation of the support structure 16. In some instances (e.g., if a lock feature is enabled via a user interface), the processor 30 may prevent the three-dimensional model from being rotated past the rotational limits of the medical imaging device 10.
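The limit check driving this alert can be sketched as a range test against the rotational limits stored in the correlation file 40. The 90° limits come from FIGS. 2 and 3; the sign convention (positive lateral = LAO, positive vertical = cranial) and the function interface are assumptions for illustration.

```python
# Rotational limits per FIGS. 2 and 3 (could also be track- or patient-derived).
RAO_LIMIT = LAO_LIMIT = CRANIAL_LIMIT = CAUDAL_LIMIT = 90.0

def check_rotation(lateral_deg, vertical_deg):
    """Return an alert string when a requested model orientation maps to an
    angulation outside the travel of the support structure 16; None when the
    orientation is reachable. Positive lateral = LAO, negative = RAO;
    positive vertical = cranial, negative = caudal (assumed convention)."""
    if not -RAO_LIMIT <= lateral_deg <= LAO_LIMIT:
        return "alert: lateral angulation outside device limits"
    if not -CAUDAL_LIMIT <= vertical_deg <= CRANIAL_LIMIT:
        return "alert: vertical angulation outside device limits"
    return None
```

With the lock feature enabled, the same check could clamp the model rotation instead of merely alerting.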


Returning to FIG. 1, the example workstation 20 may additionally include a mouse, touchscreen, joystick, etc. that is communicatively coupled to the processor 30 to provide input commands, such as to rotate the three-dimensional model. The mouse, touchscreen, joystick, etc. may further be used to control rotation of the medical imaging device 10. In some embodiments, a smartphone or tablet computer may be communicatively coupled to the processor 30. In these embodiments, the smartphone or tablet may include an application that displays a control interface for rotating the three-dimensional model and/or rotating the medical imaging device 10.


Example Synchronization Embodiments


FIG. 6 is a flow diagram of an example procedure 60 for causing the medical imaging device 10 to rotate based on detected rotation of a three-dimensional model, according to an example embodiment of the present disclosure. Although the procedure 60 is described with reference to the flow diagram illustrated in FIG. 6, it should be appreciated that many other methods of performing the steps associated with the procedure 60 may be used. For example, the order of many of the blocks may be changed, certain blocks may be combined with other blocks, and many of the blocks described may be optional. In an embodiment, the number of blocks may be changed based on how the synchronization is configured. The actions described in the procedure 60 are specified by one or more instructions that are stored in the memory device 32, and may be performed among multiple devices including, for example, the processor 30 and/or the medical imaging device 10.


The example procedure 60 begins when the processor 30 of FIG. 1 receives an indication of a movement of a three-dimensional model (block 62). The indication is received after the processor 30 has synchronized the three-dimensional model with the medical imaging device 10, as discussed above in connection with FIG. 5. In some embodiments, the indication may be received from a user interface that is displaying the three-dimensional model (such as model 34) on a display screen 28. The indication may be received via user interface controls or from a physical device, such as a mouse. The indication includes, for example, a number of degrees along a pitch axis and/or a roll axis the three-dimensional model was rotated with respect to a starting-point or origin.


The example processor 30 next determines if a lock feature is enabled (block 64). The lock feature may be displayed in a user interface with the three-dimensional model, and when selected, causes real-time rotation of the model to be synchronized with the medical imaging device 10. If the lock feature is enabled, the processor 30 uses the correlation file 40 (or transfer function) to determine corresponding rotation for the medical imaging device 10 based on the degrees of roll/pitch rotation of the three-dimensional model (block 66). The processor 30 determines, for example, rotational angulation positions for the support structure 16 of the medical imaging device 10 (e.g., a RAO angulation, a LAO angulation, a cranial angulation, and/or a caudal angulation).


The processor 30 then determines machine instructions or a command for the determined rotational angulation positions (block 68). The command includes a digital message or an analog signal that is indicative of the determined rotational angulation positions. For instance, the command may specify a number of degrees certain motors are to rotate. In other embodiments, the command may include just the determined rotational angulation positions. The processor 30 transmits the command to the medical imaging device 10 (block 70), causing the device to rotate as instructed.
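Blocks 62 through 70 of the procedure 60 can be summarized in one handler. The dictionary command format, the function name, and the injected `to_angulation`/`send` callables are illustrative assumptions standing in for the correlation file lookup and the transport over the link 21.

```python
def on_model_rotated(roll_deg, pitch_deg, lock_enabled, to_angulation, send):
    """Blocks 62-70: when the lock feature is enabled, translate model
    roll/pitch into device angulation (block 66), build a command (block 68),
    and transmit it to the device (block 70). Returns the command, or None
    when the lock feature is disabled and no real-time rotation occurs."""
    if not lock_enabled:
        return None
    lateral_deg, vertical_deg = to_angulation(roll_deg, pitch_deg)
    command = {"lateral_deg": lateral_deg, "vertical_deg": vertical_deg}
    send(command)
    return command

# Example with a one-to-one correlation (roll -> lateral, -pitch -> vertical).
sent = []
cmd = on_model_rotated(10.0, -5.0, True,
                       to_angulation=lambda roll, pitch: (roll, -pitch),
                       send=sent.append)
```

With the lock disabled, the handler returns immediately and the device stays put until an image is actually requested, matching the branch at block 64.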



FIG. 7 is a diagram that is illustrative of a synchronization between a three-dimensional model 34 and the medical imaging device 10, according to an example embodiment of the present disclosure. The synchronization may be provided by the processor 30 using the procedure 60 described above. In a first instance, a clinician rotates the three-dimensional model 34 to a first orientation 90. The processor 30 detects this rotation and determines a corresponding rotational angulation position for the medical imaging device 10 (LAO 29°, Cranial 21°). The processor 30 then transmits instructions causing the medical imaging device 10 to rotate accordingly. The commands or instructions cause the medical imaging device 10 to move such that the image intensifier 14 faces or otherwise projects to a region of the patient's coronary arteries that corresponds to the orientation 90 of the three-dimensional model 34 shown in the user interface.


The example processor 30 enables the clinician to rotate or otherwise maneuver the three-dimensional model 34 to investigate a geometry of the vessel tree. Movement of the three-dimensional model 34 reduces the number of x-ray images needed since two or three images could be used to construct a complete model, providing views that were not imaged using the medical imaging device. This reduces a radiation exposure of the patient and the amount of contrast used. Moreover, the three-dimensional model 34, being a virtual object, can be rotated to viewpoints that are not possible for the medical imaging device to reach, enabling a clinician to investigate the vascular tree structure in greater detail.


In a second instance, a clinician rotates the three-dimensional model 34 to a second orientation 92. In response, the processor 30 causes the medical imaging device 10 to rotate in a synchronized manner (RAO 16°, Cranial 25°). In a third instance, a clinician rotates the three-dimensional model 34 to a third orientation 94. In response, the processor 30 causes the medical imaging device 10 to rotate in a synchronized manner (RAO 17°, Caudal 22°).


Returning to FIG. 6, the example procedure 60 continues when the processor 30 determines if an indication to record an image is received from a user interface or control device (block 72). If an indication has not been received, the procedure 60 returns to block 62 for detecting further rotation of the three-dimensional model. However, if an indication to record an image has been received, the processor 30 transmits an instruction or a command message to the medical imaging device 10 to record an image (block 74). The processor 30 then receives the medical image after it is acquired (block 76). The processor 30 may then display the medical image in a user interface in conjunction with the three-dimensional model and/or update the three-dimensional model using the new image (block 78). In some instances, the processor 30 may update the model and calculate new FFR values, which may be used to determine whether a treatment, such as placement of a stent, has been successfully performed to remove or mitigate a vessel lesion or stenosis. The example procedure 60 then returns to block 62 for detection of further rotation of the three-dimensional model. The example procedure 60 accordingly enables a real-time synchronization of an actual viewpoint of a model with a current position of a medical imaging device.


The synchronization feature also enables a clinician to obtain a desired viewpoint more accurately for the imaging device based on a desired viewpoint of the model, which avoids unnecessary irradiation and dye injection when the information within the model is sufficient for the clinician. Unlike an imported CT model, the example three-dimensional model can also be updated on the fly with every new navigation image acquired. Whenever contrast material is injected during the acquisition of the navigation verification images, the actual diameter data present in those images may be used by a clinician to update the model to maintain a live updated road map of the coronary arteries.


Returning to block 64 of the procedure 60, if the lock feature is not enabled, movement of the three-dimensional model does not cause real-time movement of the medical imaging device 10. Instead, the processor 30 determines if an indication to record an image has been received (block 80). If an indication has been received, the processor 30 proceeds to cause the medical imaging device 10 to rotate for synchronization with the model. This includes using the correlation file 40 (or transfer function) to determine corresponding rotation for the medical imaging device 10 based on the degrees of roll/pitch rotation of the three-dimensional model (block 82). The processor 30 then determines machine instructions or a command for the determined rotational angulation positions (block 84). The processor 30 transmits the command to the medical imaging device 10 (block 86), causing the device to rotate as instructed. The processor 30 then receives the medical image after it is acquired (block 88). The processor 30 may then display the medical image in a user interface in conjunction with the three-dimensional model and/or update the three-dimensional model using the new image (block 78). In some instances, the processor 30 may update the model and calculate new FFR values, which may be used to determine whether a treatment, such as placement of a stent, has been successfully performed to remove or mitigate a vessel lesion or stenosis. The example procedure 60 then returns to block 62 for detection of further rotation of the three-dimensional model.


In some embodiments, the processor 30 is configured to register new images, acquired during device navigation, with the three-dimensional model. The example processor 30 uses the known viewpoint (angulation) of the medical imaging device 10 when the image is acquired to associate the image with the corresponding location on the three-dimensional model. In other words, based on the known viewpoint of each new image, rays cast through features of a device (e.g., a stent placed within the arteries) in the image provide two-dimensional x-ray data that is projected onto the three-dimensional model. The missing depth data for each ray can then be recovered by the processor 30 from the intersection with the three-dimensional model. Registration errors are eliminated by constraining the rays to intersect with the vessels in the three-dimensional model, similar to the constraining of GPS-derived coordinates onto preliminarily-mapped streets. Such a registration provides anchor points on the three-dimensional vessel map, enabling a clinician to trace the path of a device (such as a stent) in the arteries, again while reducing the radiation dose to the necessary minimum, e.g., to only where critical bifurcations may lead to wrong navigation. The use of devices with wire push control enables even further reduction of dose and dye by adding information about the running length of the wire used to navigate the device at hand.


In some embodiments, the processor 30 registers a new image with a three-dimensional model by first identifying coronary arteries in the new image. The processor 30 then determines centerlines through the identified coronary arteries and determines sample points along the centerlines in the three-dimensional coordinate system of the model. The processor 30 next determines vascular geometric information for the sample points along the centerline. The processor 30 is configured to then determine a correspondence between the coronary arteries in the medical image and the three-dimensional model using at least the centerlines of the medical image and the centerlines of the three-dimensional model. The processor 30 may identify homologies between the centerlines of the model and the image, and determine if similar locations along the centerlines have the same coordinates. If there is at least some correspondence between the model and the image, the processor 30 is configured to update the three-dimensional model with the determined vascular geometric information from the medical image.
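The final model-update step described above can be sketched as follows. This is a minimal illustration in Python, not the patented implementation; the point layout, the nearest-neighbor matching, and the distance tolerance are assumptions introduced for the example:

```python
import math

def update_model_diameters(model_pts, image_pts, tol=1.0):
    """Match new-image centerline samples to model samples by nearest
    neighbor; where a match falls within `tol` (i.e., the same location
    along the centerline), overwrite the model diameter with the newly
    measured one.  model_pts / image_pts: lists of ((x, y, z), diameter),
    both already in the model's three-dimensional coordinate system."""
    updated = []
    for mp, md in model_pts:
        best, best_d = None, tol
        for ip, idia in image_pts:
            d = math.dist(mp, ip)
            if d < best_d:
                best, best_d = idia, d
        updated.append((mp, best if best is not None else md))
    return updated
```

Model samples with no sufficiently close counterpart in the new image keep their existing diameter, so a partial view only updates the portion of the model it actually covers.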



FIG. 8 is a flow diagram of an example procedure 100 for causing a three-dimensional model (such as the three-dimensional model 34 of FIG. 1) to rotate based on detected rotation of the medical imaging device 10, according to an example embodiment of the present disclosure. Although the procedure 100 is described with reference to the flow diagram illustrated in FIG. 8, it should be appreciated that many other methods of performing the steps associated with the procedure 100 may be used. For example, the order of many of the blocks may be changed, certain blocks may be combined with other blocks, and many of the blocks described may be optional. In an embodiment, the number of blocks may be changed based on how the synchronization is configured. The actions described in the procedure 100 are specified by one or more instructions that are stored in the memory device 32, and may be performed among multiple devices including, for example the processor 30 and/or the medical imaging device 10.


The example procedure 100 begins when the processor 30 receives an indication of movement of the medical imaging device 10 (block 102). The indication may include a digital message and/or analog signal indicative of relative or absolute angulation of the support structure 16. In some embodiments, the three-dimensional model may not be locked, such that movement of the medical imaging device 10 is disregarded or not used for synchronization by the processor 30. This enables the medical imaging device 10 to be rotated without affecting the display of the model on the display screen 28.


The processor 30 next determines, using the correlation file 40 (or transfer function), corresponding rotation for the synchronized three-dimensional model (block 104). The processor 30 then applies the determined rotation to the three-dimensional model such that it is rotated along its pitch and/or roll axes (block 106). The processor 30 also determines if an indication to record an image has been received (block 108). If an indication has not been received, the example processor 30 returns to block 102 for detecting additional rotation and/or movement of the medical imaging device 10.


However, if an indication to record an image is received, the processor 30 transmits a command or an instruction message to record an image (block 110). The processor 30 then receives the medical image after it is acquired (block 112). The processor 30 may then display the medical image in a user interface in conjunction with the three-dimensional model and/or update the three-dimensional model using the new image (block 114). The procedure 100 then returns to block 102 for detecting additional rotation and/or movement of the medical imaging device 10.


Physiological Index Calculation



FIG. 9 is a flow diagram of an example procedure 120 that uses synchronization between a medical imaging device and a three-dimensional model to generate a physiological index that is indicative of a functional significance of coronary lesions, according to an example embodiment of the present disclosure. Although the procedure 120 is described with reference to the flow diagram illustrated in FIG. 9, it should be appreciated that many other methods of performing the steps associated with the procedure 120 may be used. For example, the order of many of the blocks may be changed, certain blocks may be combined with other blocks, and many of the blocks described may be optional. In an embodiment, the number of blocks may be changed based on how the synchronization is configured or the physiological index is determined. The actions described in the procedure 120 are specified by one or more instructions that are stored in the memory device 32, and may be performed among multiple devices including, for example the processor 30 and/or the medical imaging device 10.


The example procedure 120 may be carried out during a medical diagnosis of a patient's coronary arteries. The example procedure 120 may also be carried out during a medical treatment, such as placement of a stent or an inflation of a section of a patient's coronary arteries. The example procedure 120 begins when the processor 30 receives medical images from the medical imaging device 10 (block 122). In some embodiments, data images are simultaneously acquired from a plurality of vantage points, for example, 2, 3, 4 or more imaging vantage points. In some embodiments, images are acquired at a frame rate of, for example, 15 Hz, 30 Hz, or another lesser, greater, or intermediate frame rate. In some embodiments, the number of frames acquired per imaging vantage point is about 50 frames (200 frames total for 4 imaging vantage points). In some embodiments, the number of frames per imaging vantage point is, for example, 10, 20, 40, 50, 60, 100, or another larger, smaller, or intermediate number. In some embodiments, the number of heartbeat cycles comprised in an imaging period is about 3-4 heartbeat cycles. In some embodiments, the number of cardiac cycles is, for example, 3-4, 3-5, 4-6, 5-10, or another range of heartbeat cycles having the same, lesser, greater, or intermediate range boundaries. FIG. 10 shows an example image 150 recorded by the medical imaging device 10 and a Frangi-filter processed image 152, according to an example embodiment of the present disclosure. It should be noted that when using two or more 2-D projections of a patient's vessels, for example heart vessels, it is a potential advantage for two or more two-dimensional projections to be taken at the same time, or at least at a same phase during a heartbeat cycle, so that the two-dimensional projections correspond to a similar vessel shape.
Deviations between the two-dimensional projections might arise from cardiac, respiratory, and/or patient motions between the two-dimensional projection frames. In some embodiments, to reduce deviations that might arise from lack of cardiac phase synchronization, an ECG output is used by the processor 30 to select a same cardiac phase in the two-dimensional projections frames. In some embodiments, two-dimensional projection frames are selected to be at an end of the diastole phase of the cardiac cycle. In some embodiments, the temporal and/or phase order in which two-dimensional projection frames are acquired is used to perform registrations among images taken during adjacent movement cycle phases. In some embodiments, images registered from a first phase to a second phase are then re-registered into a third and/or further phase, such that images taken at widely separated heartbeat cycle phases can be registered to one another.
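The ECG-gated frame selection described above can be sketched as follows. This is an illustrative Python fragment, not the disclosed implementation; the fixed offset between the end-diastole target and the R peak is an assumption made for the example:

```python
def select_end_diastole_frames(frame_times, r_peaks, offset=0.05):
    """For each R-R interval, pick the index of the frame closest to a
    target time just before the next R peak (approximating the end of
    diastole).  `offset` is the assumed gap (in seconds) between the
    target phase and the R peak."""
    selected = []
    for r in r_peaks[1:]:
        target = r - offset
        selected.append(min(range(len(frame_times)),
                            key=lambda i: abs(frame_times[i] - target)))
    return selected
```

Selecting one frame per cardiac cycle at the same phase yields projections that correspond to a similar vessel shape, as the paragraph above requires.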


In some embodiments, the heart is imaged under the influence of intravenous adenosine, which potentially exaggerates a difference between normal and abnormal segments. Optionally imaging with and without adenosine potentially allows for the determination of the (vascular expanding) effects of adenosine itself, which in turn potentially provides information about vascular compliance and/or autoregulatory state.


After the images are received, in some embodiments, the processor 30 calculates a Hessian transform from every pixel of a Gaussian smoothed input image (the Hessian is related to the 2nd derivative of the image data, and is a form of edge detection). The Hessian-transformed image is smoothed, for example, by a Gaussian filter. The eigenvectors and eigenvalues of the smoothed Hessian-transformed image are calculated by the processor 30. The resulting eigenvalues are in general larger where the original image comprises edges, while eigenvectors corresponding to large eigenvalues describe the direction in which the edge runs.


In some embodiments, the processor 30 calculates a diffusion image. A finite difference scheme is used to perform the diffusion, using, in some embodiments, the eigenvectors as diffusion tensor directions. Additionally or alternatively, a Frangi filter is applied by the processor 30, based on the Hessian eigenvectors, which comprises computation of the likeliness of an image region to be within a vessel. By way of a non-limiting example, the Frangi-filter processed image 152 of FIG. 10 depicts the original image 150 after image enhancement using a Frangi filter. In some embodiments, another filter is used, for example a threshold filter, or a hysteresis threshold filter, whereby pixels of an image are identified as belonging within an image region of a vessel.
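The Hessian eigenvalue computation and the Frangi vesselness measure of the two preceding paragraphs can be sketched as follows. This is a single-scale, single-pixel illustration in Python without the Gaussian smoothing stage (the full Frangi filter is applied over a range of Gaussian scales); the `beta` and `c` sensitivity parameters are conventional defaults, assumed for the example:

```python
import math

def hessian_eigvals(img, y, x):
    """Central-difference Hessian of a 2-D image at (y, x), returning
    its eigenvalues sorted by absolute value (|l1| <= |l2|)."""
    dyy = img[y + 1][x] - 2 * img[y][x] + img[y - 1][x]
    dxx = img[y][x + 1] - 2 * img[y][x] + img[y][x - 1]
    dxy = (img[y + 1][x + 1] - img[y + 1][x - 1]
           - img[y - 1][x + 1] + img[y - 1][x - 1]) / 4.0
    tr, det = dxx + dyy, dxx * dyy - dxy * dxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return sorted((tr / 2 - disc, tr / 2 + disc), key=abs)

def frangi_vesselness(l1, l2, beta=0.5, c=15.0):
    """2-D Frangi vesselness: large when one eigenvalue is strongly
    negative (a bright ridge) and the other is near zero."""
    if l2 >= 0:                        # not a bright tubular structure
        return 0.0
    rb = abs(l1) / abs(l2)             # blob-vs-line measure
    s = math.hypot(l1, l2)             # second-order "structureness"
    return (math.exp(-rb * rb / (2 * beta * beta))
            * (1 - math.exp(-s * s / (2 * c * c))))
```

On a bright one-pixel-wide line, the eigenvalue along the line is near zero while the eigenvalue across it is strongly negative, so the vesselness approaches one; on flat background both eigenvalues vanish and the vesselness is zero.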


Further, in some instances, the processor 30 processes the images to generate a black-white figure representing vascular locations in an angiographic projection image. In some embodiments, a hysteresis threshold filter is performed on the Frangi filter output with high and low thresholds. First, an algorithm performed by the processor 30 detects the pixels which are (for example, with reference to image 152) brighter than the high threshold, which are labeled as vascular pixels. In a second step, the algorithm also labels as vascular those pixels with brightness higher than the low threshold that are also connected across the image to pixels already labeled as vascular pixels. Square dilation may also be performed on the black-white image, and the result is subtracted from the original black-white image. The subtraction image comprises a one pixel-wide frame along which the growth of the region of vascular-labeled pixels is then examined by the processor 30. The values (brightnesses) of the pixels in this frame are compared locally to those of existing vascular pixels, and to the surroundings. A high relative result leads to expansion. Optionally, the process repeats until no more vascular pixels are found. Further, in some embodiments, a thinning convolution is applied by the processor 30 to thin the black-white image segments down to lines which represent the vascular centerlines. FIG. 11 shows the image 150 after centerlines 210 have been determined, according to an example embodiment of the present disclosure.
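The two-step hysteresis labeling described above can be sketched as follows. This is an illustrative Python fragment using an 8-connected flood fill, not the disclosed implementation; the grid layout and threshold values are assumptions for the example:

```python
from collections import deque

def hysteresis_threshold(img, low, high):
    """Label as vascular every pixel above `high`, plus any pixel above
    `low` that is 8-connected (directly or transitively) to an
    already-labeled pixel."""
    h, w = len(img), len(img[0])
    out = [[img[y][x] > high for x in range(w)] for y in range(h)]
    queue = deque((y, x) for y in range(h) for x in range(w) if out[y][x])
    while queue:
        y, x = queue.popleft()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (0 <= ny < h and 0 <= nx < w and not out[ny][nx]
                        and img[ny][nx] > low):
                    out[ny][nx] = True      # grow from confirmed pixels
                    queue.append((ny, nx))
    return out
```

Bright isolated noise below the high threshold is never labeled, while dimmer vessel pixels connected to a strong response are retained, which is the point of using two thresholds.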


Returning to FIG. 9, the processor 30 uses the images to create a three-dimensional model (block 124). To create a three-dimensional model, the processor 30 identifies vessels in the images and extracts centerlines of those vessels. Vascular centerlines have several properties which make them a useful reference for other phases of vascular tree reconstruction. For instance, centerlines are features determinable from two-dimensional images, enabling their use to relate individual images to one another in three dimensions. Further, vascular centerlines are, by definition, distributed throughout an imaging region of interest when the target is to reconstruct a three-dimensional vascular model. Moreover, vascular centerlines are extended features which preserve sufficient similarity among images, even images taken from different views, that their homologies are readily identifiable, for example in the two-dimensional images themselves, and/or by back-projection along rays into a 3-D space, where ray intersections (and/or intersections among dilated volumes based on back-projected rays) identify homologous targets found in different image projections. Additionally, centerlines comprise a convenient frame of reference for organizing and/or analyzing features related to position along a blood vessel. For example, using distance along the centerline as a reference, morphological features such as diameter, and/or functional features such as flow resistance, can be expressed as functions in a simplified, one-dimensional space. Also, intersections of centerlines provide a convenient way for the processor 30 to identify vascular branch points, and/or divide a vascular tree into distinct segments which are optionally treated separately from one another, and/or further simplified, for example, for purposes of the functional analysis of flow properties.
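The idea of using distance along the centerline as a one-dimensional frame of reference can be sketched as follows. This is an illustrative Python fragment; the point format and linear interpolation are assumptions for the example:

```python
import math

def arc_lengths(pts):
    """Cumulative distance along a 3-D polyline of centerline samples."""
    s = [0.0]
    for a, b in zip(pts, pts[1:]):
        s.append(s[-1] + math.dist(a, b))
    return s

def diameter_at(pts, diams, distance):
    """Linearly interpolate the vessel diameter at a given arc length,
    treating diameter as a function in a one-dimensional space."""
    s = arc_lengths(pts)
    for i in range(len(s) - 1):
        if s[i] <= distance <= s[i + 1]:
            t = (distance - s[i]) / (s[i + 1] - s[i])
            return diams[i] + t * (diams[i + 1] - diams[i])
    raise ValueError("distance outside centerline")
```

Any per-position quantity (diameter, flow resistance, stenosis markers) can be parameterized the same way, which is what makes the centerline a convenient organizing structure.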


Additionally or alternatively, in some embodiments, another type of image feature is identified by the processor 30. Optionally, an image feature includes, for example, a furcation of a vessel, or a location of minimal radius (a locally reduced radius compared to surrounding vascular regions) in a stenosed vessel. Optionally, an image feature includes any configuration of image pixels having a pattern of intensities generally lacking in self-identity (below a predetermined threshold of self-identity, for example, always above a threshold of intensity difference, or always within a criterion for statistical insignificance of self-identity) over translation in any direction, for example, a corner-like bend or furcation.



FIG. 12 shows a schematic of an exemplary arrangement 1200 of imaging coordinates for an imaging system, according to an example embodiment of the present disclosure. Several different spatial relationships of an imaging arrangement are used in determining the 3-D relationship of image data in a set of 2-D images. In some embodiments, the image coordinate systems 1210 and 1220 and associated imaging planes 1225 and 1230 describe how images taken of the same subject at different positions relate to one another, information which is used to reconstruct three-dimensional information about a patient. In some embodiments of the invention, these coordinates reflect the axes of the C-arm rotations of the medical imaging device 10. In some embodiments, the coordinate plane 1215 of the patient (for example, lying on bed 1205) is also used as part of three-dimensional reconstruction.


The example processor 30 extracts centerlines of a vascular tree from the acquired two-dimensional images. In some embodiments, image filtering by anisotropic diffusion comprises a portion of the processing operations which precede centerline extraction. Anisotropic diffusion of two-dimensional gray-scale images reduces image noise while preserving region edges—smoothing along the image edges, and removing gaps due to noise.


After centerlines are extracted from the images and projected to a three-dimensional coordinate system, the example processor 30 determines centerline correspondences between the images. A goal of finding centerline correspondences is to find correspondences between different two-dimensional images (points that image the same region of space, though potentially from different angles), such that a three-dimensional reconstruction of the target vasculature can be made. As described below, the centerline correspondence determination may include motion compensation and/or imaging position artifact compensation.


With ideal calibration information (each image plane perfectly identified relative to a common coordinate axis, for example), and no artifacts due to motion or other positioning errors, back-projecting a large enough number of two-dimensional images to three-dimensional space potentially yields intersecting rays uniquely defining the extents of the vascular centerline in three dimensions. In practice, deviations among images originate, for example, from breathing, voluntary movements of the patient, and inaccurate and/or imprecise phase-locking of imaging exposures to the cardiac cycle. Calibration errors potentially introduce other forms of image position artifacts.


For motion compensation, a subset of images (comprising a plurality) with identified two-dimensional centerlines is selected by the processor 30. The centerlines are optionally dilated, and a centerline projection back into a three-dimensional coordinate system may be performed based on the current best-known projection parameters for each image (initially these are, for example, the parameters expected based on the known calibrations of the imaging device). The resulting projected volume is skeletonized, in some embodiments, to form a "consensus centerline". The consensus centerline is projected back into the coordinate systems of two-dimensional images, including those which were not used in forming the consensus centerline. An optimization procedure performed by the processor 30 adjusts projection parameters for the three-dimensional centerline into each two-dimensional image to more closely fit the centerlines found within the image itself. This adjustment is used to adjust the projection parameters associated with each image.


In another embodiment for motion compensation, the processor 30 identifies features in a reference image R based on a feature detection method. Such an image feature is, for example, a furcation of a vessel, an origin of the coronary vessel tree, a location of minimal radius in a stenosed vessel, and/or any configuration of image pixels having a pattern of intensities generally lacking in self-identity over translation in any direction, for example, a corner-like bend or furcation. Similar features (putatively homologous to those of the reference image) are identified in remaining images F. In some embodiments, images in F are then registered to image R. For example, the best-known projection parameters of image F are used to transform into the best-known projection plane of image R, and then optimized to obtain an improved fit, for example using epipolar geometry to calculate parameters of shift, rotation, and/or scaling. Optionally, registration comprises application of a geometric distortion function. The distortion function is, for example, a first, second, or other-order function of the two image plane coordinates. Additionally or alternatively, the distortion function comprises parameters describing the adjustment of nodal points defined within the image coordinate planes to bring them into registration.
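The shift/rotation/scaling fit mentioned above can be sketched with a standard closed-form least-squares similarity transform between corresponding feature points (here using the complex-number formulation). This is an illustrative Python fragment, not the disclosed optimization; the 2-D point format is an assumption for the example:

```python
import math

def fit_similarity(src, dst):
    """Least-squares 2-D similarity transform (scale s, rotation angle,
    translation t) mapping src feature points onto their putative
    homologues dst: dst ~ s * R(angle) * src + t."""
    n = len(src)
    a = [complex(x, y) for x, y in src]
    b = [complex(x, y) for x, y in dst]
    ma, mb = sum(a) / n, sum(b) / n
    ac = [z - ma for z in a]                 # center both point sets
    bc = [z - mb for z in b]
    # Complex ratio m = s * exp(i*angle) minimizing sum |b - m*a|^2.
    m = (sum(bz * az.conjugate() for az, bz in zip(ac, bc))
         / sum(abs(az) ** 2 for az in ac))
    scale, angle = abs(m), math.atan2(m.imag, m.real)
    t = mb - m * ma
    return scale, angle, (t.real, t.imag)
```

With noisy correspondences the same formula returns the least-squares best fit, which is what the registration step needs before any higher-order distortion function is applied.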


In some embodiments, the processor 30 calculates image correction parameters based on identified corresponding image features. Correction parameters typically describe, for example, translation and/or rotation of a system of coordinates of a particular image. Based on the calculated parameters, the angiographic images are registered to provide mutual geometric correspondence among them. In some embodiments, several images are registered relative to one of the images. For example, when corresponding image features are identified in N images (e.g., N=2, 3, 4 or more), one of the images can be selected as a reference, while the registration is applied for the remaining N−1 angiographic images such that each of those remaining images geometrically corresponds to the single angiographic image that was selected as a reference. In some embodiments, another registration scenario, for example, pairwise registration, is performed.


In some instances, the processor 30 may use a heart surface constraint for discarding bad ray intersections from calculated correspondences among images. In some imaging procedures, a “large enough” number of projections is potentially unavailable, such that the error in the determined position of ray intersections potentially prevents convergence to a correct output. In some embodiments, operations to reduce the effect of this source (and/or other sources) of positional error are performed by the processor 30, based on a heart surface constraint.


The processor 30 selects an image which comprises features expected to be within the projected outline of the heart. The features are representations of the coronary arteries, which course over the heart surface. In some embodiments, previously determined vascular centerlines comprise the identified features. A convex hull may be determined based on the vascular centerlines. This hull grossly represents the shape of the heart (where it is covered by identified artery centerlines) as projected into the plane of the selected two-dimensional image. The processor 30 may then determine the three-dimensional hull position (heart shell) from the various available two-dimensional hull projections, for example by using the best-known projection parameters for each two-dimensional image plane, and/or the intersections of the three-dimensional polyhedra. Such a surface can be defined using any technique known in the art, including, without limitation, polyhedra stitching, based on the descriptions provided herein. The processor 30 may dilate the heart shell to a volume, the amount of dilation being determined, for example, as corresponding to an error limit, within which "true" vascular regions are expected to fall. The processor 30 may then exclude candidate 3-D positions of vascular centerline points which fall outside the heart shell.
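The hull-plus-dilation constraint can be sketched in two dimensions as follows (the disclosure's heart shell is three-dimensional; this Python fragment reduces the idea to a 2-D projection plane for illustration, and the margin value stands in for the dilation error limit):

```python
import math

def convex_hull(pts):
    """Andrew's monotone-chain 2-D convex hull (counter-clockwise)."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def inside_hull(p, hull, margin=0.0):
    """True if p lies in the hull, or within `margin` of its boundary
    (the "dilated" shell); candidates failing this test are excluded."""
    n = len(hull)
    inside = all(
        (hull[(i+1) % n][0]-hull[i][0]) * (p[1]-hull[i][1])
        - (hull[(i+1) % n][1]-hull[i][1]) * (p[0]-hull[i][0]) >= 0
        for i in range(n))
    if inside:
        return True
    def seg_dist(p, a, b):
        ax, ay = b[0]-a[0], b[1]-a[1]
        t = max(0.0, min(1.0, ((p[0]-a[0])*ax + (p[1]-a[1])*ay)
                         / (ax*ax + ay*ay)))
        return math.dist(p, (a[0]+t*ax, a[1]+t*ay))
    return min(seg_dist(p, hull[i], hull[(i+1) % n])
               for i in range(n)) <= margin
```

Candidate centerline positions that fall outside the dilated shell can then be discarded as bad ray intersections.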


After providing image compensation, the processor 30 identifies homologies of the different image centerlines for creating a three-dimensional model. In an embodiment, the processor selects a base two-dimensional image for homology determination. The vascular centerlines in one of the remaining images are projected by the processor 30 into the plane of the base image. For example, exemplary vascular centerline 503 of FIG. 13 is from a base image having a base coordinate system 504. Vascular centerline 501, taken from another image having a different coordinate system 502, is shown transformed into coordinate system 504 (translated in one direction for clarity in FIG. 13) as centerline 501B. In FIG. 14, the two centerlines are shown overlaid, illustrating their general similarity, and some artificial differences where they diverge.


In some instances, the projected vascular centerline 501B is dynamically dilated by the processor 30 to create centerline 501C, noting where intersections with the base image vascular centerline first occur, and continuing, for example, until all homologies have been identified. Dynamic dilation comprises, for example, gradual expansion of the centerline, for example by application of a morphological operator to pixel values of the image. In some embodiments, another method, for example, a nearest-neighbor algorithm, is used (additionally or alternatively) to determine correspondences. FIG. 15 shows examples of correspondence between vascular centerline points at either end of minimal distance lines 515 and 510.
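The dynamic dilation idea can be sketched as follows: grow one centerline's pixel set one morphological step at a time, and record the step at which each pixel of the base centerline is first covered. This is an illustrative Python fragment on a pixel grid, not the disclosed morphological operator:

```python
def dilate(pixels):
    """One step of 8-connected morphological dilation on a pixel set."""
    return {(y + dy, x + dx) for (y, x) in pixels
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)}

def dilation_correspondences(base, other, max_steps=5):
    """Grow `other` one dilation step at a time; the step at which it
    first covers a base pixel gives that pixel's correspondence
    distance (in pixels), i.e. where an intersection first occurs."""
    found = {}
    grown = set(other)
    for step in range(max_steps + 1):
        for p in base:
            if p not in found and p in grown:
                found[p] = step
        grown = dilate(grown)
    return found
```

Pixels still unmatched after `max_steps` dilations have no nearby homologue and can be left unpaired.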


After homologies are identified between the centerlines, the processor 30 performs a three-dimensional mapping of two-dimensional centerlines. In some embodiments, mapping begins with identification of optimal projection pairs. Where several different images have been acquired, there are potentially several different (although homologous) projections of each region of a vascular centerline into three-dimensional space, each based on a different pair of two-dimensional images. Initially, the processor 30 selects a segment comprising a vascular centerline, in addition to an initial homologous group of centerline points along it (for example, a point from an end) in the different two-dimensional images.


As shown in FIG. 16, a point P1 on a vascular centerline (corresponding to some homologous group of centerline points P) is selected from a first base image. Other points P2 . . . PN are then selected from the homologous group P to be paired with P1 to find a position in three-dimensional space. FIG. 16 shows a schematic representation of epipolar determination of three-dimensional target locations from two-dimensional image locations and their geometrical relationships in space, according to an example embodiment of the present disclosure. A point P1, associated with image plane 410 is matched to a point P2 to determine a location P1,2 in three-dimensional space, using principles of epipolar geometry. In brief: the ray passing from a source S1 through a target region to point P1 is on a plane 417 which is determined also by rays passing from S2 to intersect it. The continuations of these rays intersect plane 412 along an epipolar line 415.


The processor 30 then evaluates points for their relative suitability as the optimal available projection pair to extend a vascular centerline in three-dimensional space. In some embodiments, a criterion for the optimal choice of a projection point is a distance of a projected point from its associated epipolar line. Ideally, each point P2 . . . PN lies on the epipolar line corresponding to the epipolar plane defined by S2 . . . SN. However, due to imaging position artifacts—for example, those described in relation to calibration and/or motion—there may remain some error, such that a point Pi, previously determined to be homologous to P1, lies off of its associated epipolar plane 418, and therefore a distance 420 away from its associated epipolar line 419. Optionally, the projection point closest to its associated epipolar line for a given homology group is scored as most suited as the projection point for extending the vascular centerline.
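The epipolar-distance scoring just described can be sketched as follows. This is an illustrative Python fragment; the fundamental matrix used in the test corresponds to an idealized rectified image pair (epipolar lines of constant row), an assumption made so the example stays self-contained:

```python
import math

def epipolar_line(F, p1):
    """Epipolar line l = F . p1 (homogeneous a*x + b*y + c = 0) that the
    homologue of p1 should lie on in the second image."""
    x, y = p1
    return tuple(F[i][0] * x + F[i][1] * y + F[i][2] for i in range(3))

def epipolar_distance(F, p1, p2):
    """Perpendicular distance of a candidate point p2 from the epipolar
    line of p1: the suitability score for a projection pair."""
    a, b, c = epipolar_line(F, p1)
    return abs(a * p2[0] + b * p2[1] + c) / math.hypot(a, b)

def best_candidate(F, p1, candidates):
    """The candidate closest to its epipolar line scores as the most
    suitable projection point for extending the centerline."""
    return min(candidates, key=lambda p2: epipolar_distance(F, p1, p2))
```

In the presence of residual calibration or motion error, no candidate lies exactly on its epipolar line, and the smallest distance 420 identifies the preferred pairing.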


In some embodiments, one or more criteria for the optimal choice of a projection point relate to the continuity of extension that a projected point provides from projected points already determined. For example, a set of points along vascular centerline 421A, 421B can be used to determine a current direction of extension 423, and/or expected distance interval to the next extending point in three-dimensional space. In some embodiments, projection points which more closely match one or more of these, or another geometrical criterion, are scored as correspondingly more suitable choices.


The processor 30 next determines whether a different base point in the homology group should be chosen. If yes, the next base point is chosen, and further projections and evaluations continue. If no, the point having the optimal (most suited from among the available choices) score for inclusion in the three-dimensional vascular centerline is chosen. In some embodiments, the current vascular segment centerline is extended according to the point specified by the identified optimal pair of projections. Vascular centerline determination continues, in some embodiments, where it is determined whether or not another sample (homology group) should be assessed for the current vascular centerline. If so, operations continue by selection of the next sample. If not, determination is made whether the last vascular segment centerline has been determined. If not, the next segment is selected, and processing continues at the first sample of that segment. If yes, the processor 30 is then able to determine vascular geometric information.


The example processor 30 determines vascular geometric information, such as vessel diameter across sample points of a selected two-dimensional projection, and extrapolated to the whole circumference of the blood vessel. In some embodiments, diameters across a plurality of projection angles are determined. In some embodiments, the projected view is selected from a single acquired image, optionally an image where the vessel is seen at its longest, and/or visible free of crossings. Optionally, the projected view is synthesized from two or more 2-D images.


To determine a diameter, the processor 30 determines an edge graph. For instance, a 2-D centerline projection is chosen which is mapped to locations relative to the locations of intensity values of the two-dimensional imaging data. Optionally, the projection selected by the processor 30 is one in which the blood vessel is projected at a maximum length. Optionally, the projection selected is one in which the blood vessel does not cross another vessel. In some embodiments, projections are chosen according to sub-regions of a two-dimensional centerline of a vascular segment, for example, to have maximum length and/or non-crossing properties within the sub-region. In some embodiments, images from orthogonal projections (and/or projections having another defined angular relationship) are selected.


The processor 30 may next estimate a starting vascular width (a radius, for example). The starting width is determined, for example, by generating an orthogonal profile to the centerline and identifying a peak of a weighted sum of the first and second derivatives of the image intensity along the profile. The processor 30 builds an orthogonal profile for points along the centerline; for example, for points sampled at intervals approximately equivalent to the vascular starting width. The precise choice of interval is not critical: using the radius as the interval is generally appropriate to provide a sufficient resolution for diameter estimation. Orthogonal profiles for sampled points are assembled by the processor 30 in a rectangular frame, somewhat as though the convolutions of the three-dimensional centerline were straightened, bringing the orthogonal profiles through the centerline into parallel alignment.
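The edge estimate along an orthogonal profile, described above as a peak of a weighted sum of the first and second derivatives of image intensity, can be sketched as follows. This is an illustrative Python fragment; the derivative weights and the use of absolute values are assumptions for the example:

```python
def edge_offset(profile, w1=1.0, w2=1.0):
    """Index of the vessel edge along an orthogonal intensity profile:
    the peak of a weighted sum of the (absolute) first and second
    central differences of the intensity values."""
    best_i, best_v = None, float("-inf")
    for i in range(1, len(profile) - 1):
        d1 = (profile[i + 1] - profile[i - 1]) / 2.0        # 1st derivative
        d2 = profile[i + 1] - 2 * profile[i] + profile[i - 1]  # 2nd derivative
        v = w1 * abs(d1) + w2 * abs(d2)
        if v > best_v:
            best_i, best_v = i, v
    return best_i
```

Repeating this for profiles sampled at roughly radius-sized intervals along the centerline gives the per-sample starting width estimates the paragraph describes.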


The example processor 30 may then determine connected routes along vascular edges. For instance, a first side (vascular edge) is chosen for route tracing. In some embodiments, a route is found by the processor 30 along the edge at approximately the distance of the initial radius, for example, by minimizing the energy that corresponds to a weighted sum of the first and second horizontal derivatives, optionally with the aid of an algorithm of the Dijkstra algorithm family. The processor 30 may reset the centerline to the middle of the two vascular walls just determined. At this point, a three-dimensional model of a vascular tree includes centerlines specified by three-dimensional coordinates and diameters or other vascular geometric information.
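The minimal-energy route tracing can be sketched with a Dijkstra search over the straightened frame of orthogonal profiles. This is an illustrative Python fragment: the energy values are taken as a precomputed cost grid, and the route is constrained to step one column at a time to an adjacent row, both assumptions made for the example:

```python
import heapq

def min_cost_route(cost):
    """Dijkstra over a straightened edge-energy image: the route runs
    left to right, stepping to one of the three adjacent rows in the
    next column; returns (total energy, chosen row per column)."""
    rows, cols = len(cost), len(cost[0])
    pq = [(cost[r][0], r, 0, (r,)) for r in range(rows)]
    heapq.heapify(pq)
    seen = set()
    while pq:
        c, r, col, path = heapq.heappop(pq)
        if (r, col) in seen:
            continue
        seen.add((r, col))
        if col == cols - 1:           # reached the right edge: done
            return c, list(path)
        for nr in (r - 1, r, r + 1):
            if 0 <= nr < rows and (nr, col + 1) not in seen:
                heapq.heappush(pq, (c + cost[nr][col + 1], nr,
                                    col + 1, path + (nr,)))
    raise ValueError("no route")
```

Running this once per vascular wall, then averaging the two routes, corresponds to resetting the centerline to the middle of the two walls.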


In some embodiments, the processor 30 may divide the three-dimensional model into segments or branches, where a branch is defined as a section of a vessel (along the frame of reference established by the vascular centerline, for example) between bifurcations. The branches are numbered, for example, according to their generation in the tree. Branch points (nodes), in some embodiments, are determinable from points of the skeletal centerline representation which connect in more than two directions. In some embodiments, branch topology comprises division of a vascular tree model into distinct branches along the branch structure. In some embodiments, branch topology comprises recombination of branches, for example, due to collateral and/or shunting blood vessels.


In some embodiments, during a reconstruction process of the centerlines into a three-dimensional model, spatial location and radius of segments on each branch are sampled every small distance, for example every 1 mm, or every 0.1 mm to every 5 mm. In some embodiments, tree branches, corresponding to vessel segments between modeled bifurcations, correspond to vessel segments of a length of 1 mm, 5 mm, 10 mm, 20 mm, 50 mm, and even longer. In some embodiments, sampling every small distance increases the accuracy of a geometric model of the vessels, thereby increasing accuracy of flow characteristics calculated based on the geometric measurements. In some embodiments, the tree model is a reduced tree, limited to a single segment of a vessel, between two consecutive bifurcations of the vascular system. In some embodiments, the reduction is to a region of a bifurcation, optionally comprising a stenosis.
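Sampling spatial location every small distance along a branch amounts to resampling the centerline at a fixed arc-length interval. The sketch below assumes numpy and per-axis linear interpolation; the function name is illustrative.

```python
import numpy as np

def resample_centerline(points, step=1.0):
    """Resample a 3-D centerline at a fixed arc-length interval (e.g. 1 mm).

    points: (N, 3) array of ordered centerline coordinates.
    step: sampling distance along the curve, in the same units as points.
    Returns an (M, 3) array of points spaced `step` apart along the curve.
    """
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])        # cumulative arc length
    new_s = np.arange(0.0, s[-1] + 1e-9, step)         # 0, step, 2*step, ...
    return np.column_stack(
        [np.interp(new_s, s, points[:, k]) for k in range(3)])
```

Radii (or other geometric data) associated with the original samples can be interpolated onto the same arc-length grid, giving the regularly sampled geometry used by the flow calculations.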


Returning to FIG. 9, after the three-dimensional model is created, the processor 30 synchronizes the model with the medical imaging device, as described above in connection with the procedure 50 of FIG. 5. After synchronization, the displayed three-dimensional model of a patient's coronary arteries is configured to rotate in correspondence with rotation of the medical imaging device, and vice versa, as described in connection with the procedures 60 and 100 respectively of FIGS. 6 and 8. While the model and imaging device are synchronized, the processor 30 determines if an indication to record an image is received (block 126). If an indication is received, the processor 30 receives the recorded image. The processor 30 may then display the medical image in a user interface in conjunction with the three-dimensional model and/or update the three-dimensional model using the new image (block 128).


The processor 30 next determines if a request has been received to calculate FFR values or another physiological index indicative of vascular function (block 130). If a request has not been received, the procedure 120 returns to rotate the three-dimensional model in correspondence with rotation of the medical imaging device, and vice versa, as described in connection with the procedures 60 and 100 respectively of FIGS. 6 and 8. However, if the request has been received, the example processor 30 calculates vascular parameters from the three-dimensional model (block 132). The processor 30 then generates and displays a physiological index that is indicative of functional significance of coronary lesions (e.g., FFR values). The FFR values, as shown in FIG. 7, may be color coded and displayed in conjunction with the three-dimensional model. The procedure 120 returns to rotate the three-dimensional model in correspondence with rotation of the medical imaging device, and vice versa, as described in connection with the procedures 60 and 100 respectively of FIGS. 6 and 8.


FFR indicates differences in flow between a vascular model of a potentially stenotic vasculature, and a different vascular model derived from and/or homologous to the stenotic vasculature model. In some embodiments, changes to create an astenotic version of the vascular model comprise determinations of wall opening (widening) in stenotic regions based on reference width measurements obtained in one or more other portions of the vasculature. In some embodiments, reference width measurements are obtained from vascular portions on either side of a stenosis. In some embodiments, reference width measurements are obtained from a naturally astenotic vessel at a similar branch order to a stenotic segment.


In some embodiments, the FFR index comprises a ratio of flow in a model comprising a potentially stenotic vascular segment to a model wherein said segment is replaced by a lower flow-resistance segment, and/or the resistance to flow due to said segment is removed. A potential advantage of this ratio determination is that the index comprises an expression of the effect of a potential therapeutic treatment to a vasculature, for example, an opening of a vascular region by percutaneous coronary intervention (“PCI”) such as stent implantation. Another potential advantage of this ratio is that it measures a parameter (fractional flow reserve) which, though well-accepted as providing an indication of a need for revascularization, is commonly determined in the art by invasive pressure measurements requiring direct access to both sides of a stenotic lesion.


In some embodiments, a second model is constructed from the three-dimensional model. The second model optionally describes an at least partially healthier vascular system corresponding to the three-dimensional model. In some embodiments, the second model is constructed by changing a stenosis in the first model to be more open, as it would be if a stent were to open the stenosis; and in some embodiments the second model is constructed by choosing a section of a patient's vascular system which includes a healthy vessel similar to the problem vessel of the first model, and using it to replace a stenosed vessel. In some instances, a smoothing or normalization of the entire three-dimensional model is carried out such that specific targeting of a stenosis region is not necessary.


In some embodiments, an index indicative of the need for revascularization is calculated. This can be done based on the three-dimensional model or based on a comparison between the three-dimensional model and the second model of the vascular flow. The index is optionally used similarly to the pressure measurement-derived FFR index, to assess whether a stenosed vessel affects flow in the vascular system to such an extent that the prognosis for improvement in the subject's condition following inflation of the stenosed vessel is higher than the likelihood for complications resulting from the inflation itself.


The terms “FFR” and “FFR index” in all their grammatical forms are used throughout the present specification and claims to stand for the above-mentioned index, and not only for the FFR index mentioned in the Background section as an invasive measurement involving insertion of a guidewire equipped with a miniature pressure transducer across the stenosis. In some instances—in particular where distinctions among specific types of FFR and FFR-like indices are discussed—a subscript is used to distinguish them, for example FFRpressure for FFR derived from pressure measurements, and/or FFRflow where FFR is expressed in terms of flow determinations.


In some embodiments, the index is calculated based on a volume or other vascular parameter of a crown in the stenotic model and on a contribution of a stenosed vessel to the resistance to blood flow in the crown. In some embodiments, the FFR index is calculated as a ratio of flow resistance of a stenosed vessel in a vascular model which includes the stenosed vessel to flow resistance of an inflated version of the same vessel in a similar vascular model where the stenosed vessel was mathematically inflated. In some embodiments, the index is calculated as a ratio of flow resistance of a stenosed vessel in a vascular model to flow resistance of a neighboring similar healthy vessel in the vascular model. In some embodiments, the ratio is multiplied by a constant which accounts for different geometries of the stenosed vessel and the neighboring vessel.


In some embodiments, to evaluate an FFR index in a stenosed branch of a vessel tree, a one-dimensional model of the vessel tree is created by the processor 30 to estimate a flow in the stenosed branch before and optionally also after (virtual) stent implantation. In some embodiments, in order to evaluate an FFR index in a stenosed branch of a vessel tree, a one-dimensional model of the vessel tree is used to estimate the flow in the stenosed branch before and optionally also after stenosis inflation. Based on a maximal peak flow of 500 mL/min and an artery diameter of 5 mm, a maximal Reynolds number of the flow is:

Re_peak_flow = 4·Q_peak_flow/(π·d_max·ν) = (4·500 mL/min)/(π·5 mm·3.5 cP) ≈ 600   Equation 5.1
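The Reynolds-number check of Equation 5.1 can be reproduced numerically in CGS units. The blood density used below (about 1.05 g/cm³) is an assumed typical value not stated in the text; it is needed to convert the dynamic viscosity of 3.5 cP into the dimensionless Reynolds number.

```python
import math

# Peak-flow Reynolds number check for Equation 5.1 (CGS units).
Q_peak = 500.0 / 60.0   # peak flow: 500 mL/min -> cm^3/s
d_max = 0.5             # maximal artery diameter: 5 mm -> cm
mu = 0.035              # dynamic viscosity: 3.5 cP -> g/(cm*s)
rho = 1.05              # assumed blood density, g/cm^3

# Re = rho*V*d/mu with mean velocity V = 4Q/(pi*d^2), so Re = 4*rho*Q/(pi*d*mu)
re_peak = 4.0 * rho * Q_peak / (math.pi * d_max * mu)
print(round(re_peak))   # on the order of 600, well below the ~2300 transition,
                        # supporting the laminar-flow assumption below
```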

The above calculation assumes laminar flow. In laminar flow it is assumed, for example, that blood is a Newtonian and homogeneous fluid. Another assumption which is optionally made is that flow in vessel branches is 1-D and fully developed across the cross section of the vessel.


Based on the assumptions, a pressure drop in each segment of a vessel tree is approximated according to the Poiseuille formulation in straight tubes:

ΔP_i = (128·μ·L_i)/(π·d_i⁴)·Q_i = ℛ_i·Q_i   Equation 5.2
where ℛ_i is a viscous resistance to flow of a segment of the vessel. Minor losses, due to bifurcations, constrictions and curvatures of the vessels, are optionally added as additional resistances in series, according to the Darcy-Weisbach formulation:

Δp = (ρ·V²/2)·ΣK_i = (8·ρ·Q²)/(π²·d⁴)·ΣK_i   Equation 5.3

ℛ(Q) = Δp/Q = ((8·ρ)/(π²·d⁴)·ΣK_i)·Q   Equation 5.4
where K_i are the corresponding loss coefficients. The resistance of a branch to flow is calculated as the sum of the individual resistances of segments along the branch:

ℛ_branch = ∫_L (8·μ)/(π·r⁴) dl = (8·μ/π)·∫_L dl/r(l)⁴   Equation 5.5

or

ℛ_branch = (8×0.035 g/(cm·s))/π · Σ_i dl_i/r_i⁴   Equation 5.6
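The cumulative Poiseuille sum of Equation 5.6 can be sketched directly from sampled radii. The function name and sample values are illustrative; the result is in CGS units (dyn·s/cm⁵), which divide by 1333.22 to give mmHg·s/mL.

```python
import math

def branch_resistance(radii, dl=0.1, mu=0.035):
    """Viscous resistance of a branch as a cumulative Poiseuille sum
    (Equation 5.6): R = (8*mu/pi) * sum(dl_i / r_i^4).

    radii: sampled lumen radii along the branch [cm]
    dl: sampling interval along the centerline [cm] (e.g. 1 mm = 0.1 cm)
    mu: blood viscosity [g/(cm*s)]
    Returns resistance in dyn*s/cm^5 (divide by 1333.22 for mmHg*s/mL).
    """
    return (8.0 * mu / math.pi) * sum(dl / r ** 4 for r in radii)
```

The r⁻⁴ dependence is why a stenosis dominates a branch's resistance: halving the radius over part of the branch multiplies that part's contribution by sixteen.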
In an example, a resistance array corresponding to the example depicted in FIG. 17 is ℛ_s=[808 1923 1646 1569 53394 10543 55341 91454 58225], where the resistance to flow is in units of mmHg*s/mL. The above resistance array is for a vessel with stenosis, as evidenced by a peak of 91454 [mmHg*s/mL] in the resistance array. A resistance array for a tree model without stenosis is optionally calculated, based on Quantitative Coronary Angiography (“QCA”) methods for removing stenoses greater than 50% in area. In some embodiments, a tree model without stenosis is optionally calculated by replacing a stenosed vessel by an inflated vessel, that is, geometric measurements of a stenosed vessel section are replaced by measurements appropriate for an inflated vessel. In some embodiments, geometric data (diameter and/or cross-sectional area) which is used for the inflated vessel is a maximum of the geometric data of the unstenosed vessel at a location just proximal to the stenosed location and at a location just distal to the stenosed location. In some embodiments, geometric data (diameter and/or cross-sectional area) which is used for the inflated vessel is a minimum of the geometric data of the unstenosed vessel at a location just proximal to the stenosed location and at a location just distal to the stenosed location.


In some embodiments geometric data (diameter and/or cross-sectional area) which is used for the inflated vessel is an average of the geometric data of the unstenosed vessel at a location just proximal to the stenosed location and at a location just distal to the stenosed location. In some embodiments geometric data (diameter and/or cross-sectional area) which is used for the inflated vessel is calculated as a linear function of the geometric data of the unstenosed vessel between a location proximal to the stenosed location and a location distal to the stenosed location, that is, the inflated value is calculated taking into account distances of the stenosed location from the proximal location and from the distal location. A stented, also termed inflated, resistance array for the example is shown below:



ℛ_n=[808 1923 1646 1569 53394 10543 55341 80454 51225]. The peak resistance, which was 91454 [mmHg*s/mL], is replaced in the inflated, or stented, model by 80454 [mmHg*s/mL].
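The inflation strategies described above (maximum, minimum, average, or linear function of the proximal and distal reference values) can be sketched over sampled diameters. The function name, indexing convention, and numeric values are illustrative assumptions.

```python
def inflate_stenosis(diams, start, stop, mode="linear"):
    """Replace stenosed samples diams[start:stop] with reference values taken
    from the unstenosed vessel just proximal (diams[start-1]) and just distal
    (diams[stop]) to the stenosis, per one of the strategies in the text.
    """
    d_prox, d_dist = diams[start - 1], diams[stop]
    out = list(diams)
    n = stop - start
    for k in range(n):
        if mode == "max":
            out[start + k] = max(d_prox, d_dist)
        elif mode == "min":
            out[start + k] = min(d_prox, d_dist)
        elif mode == "average":
            out[start + k] = (d_prox + d_dist) / 2.0
        else:  # "linear": weight by distance from the proximal/distal ends
            t = (k + 1) / (n + 1)
            out[start + k] = (1 - t) * d_prox + t * d_dist
    return out
```

Recomputing the cumulative Poiseuille sum over the inflated diameters yields the reduced branch resistance used in the stented array ℛ_n.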



FIG. 17 also depicts a coronary tree model 1010, a combination matrix 1020 depicting tree branch tags, and a combination matrix 1030 depicting tree branch resistances, all produced according to an example embodiment. The tree model is an example tree model with nine branches, tagged with branch numbers 0, 1a, 1b, 2a, 2b, 3a, 3b, 4a and 4b. The combination matrix 1020 includes nine rows, which contain data about nine stream lines, that is, nine paths through which fluid can flow through the tree model. Five of the rows include data for five full stream lines, in darker text, for five paths which go a full way to outlets of the tree model. Four of the rows include data for partial streamlines, in lighter text, for four paths which are not fully developed in the tree model, and do not go a full way to outlets of the tree model. The combination matrix 1030 depicts rows for the same tree model as depicted in the combination matrix 1020, with branch resistances located in matrix cells corresponding to branch tags in the combination matrix 1020.
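Enumerating the full stream lines that populate the combination matrix can be sketched as a root-to-outlet traversal of the branch tree. The dictionary representation and function name below are illustrative; the branch tags follow the nine-branch example.

```python
def stream_lines(children, root="0"):
    """Enumerate full stream lines (root-to-outlet paths) of a vessel tree.

    children: dict mapping a branch tag to its daughter branch tags;
    branches with no daughters are outlets.
    """
    paths, stack = [], [(root, [root])]
    while stack:
        branch, path = stack.pop()
        kids = children.get(branch, [])
        if not kids:                      # outlet reached: full stream line
            paths.append(path)
        for kid in kids:
            stack.append((kid, path + [kid]))
    return sorted(paths)

# Nine-branch example tree: 0 splits into 1a/1b, 1a into 2a/2b,
# 1b into 3a/3b, and 2a into 4a/4b.
tree = {"0": ["1a", "1b"], "1a": ["2a", "2b"], "1b": ["3a", "3b"],
        "2a": ["4a", "4b"]}
lines = stream_lines(tree)
# five full stream lines: 0-1a-2a-4a, 0-1a-2a-4b, 0-1a-2b, 0-1b-3a, 0-1b-3b
```

Each returned path lists the branches of one stream line, i.e. one row of the combination matrix; partial streamlines correspond to prefixes of these paths.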


After calculating a resistance of each branch, stream lines are defined from the tree origin, branch 0 to each outlet. To keep track of the stream lines, branches which constitute each stream line are listed in a combination matrix. In some embodiments, defined stream lines are also numbered, as shown in FIG. 17. As shown in FIG. 17, a tree model 1100 of a vascular system is provided with tags 1101 to 1105 numbering outlets of the tree model 1100, produced according to an example embodiment of the invention, the tags corresponding to stream lines. A pressure drop along a stream line j is calculated as a sum of pressure drops at each of its component branches (i), according to:










dp_j = Σ_i ℛ_i·Q_i   Equation 5.7

when each branch i has a different flow Q_i.


Based on a principle of mass conservation at each bifurcation, the flow rate in a mother branch is the sum of flow rates of daughter branches. For example:

Q1a=Q2a+Q2b=Q4a+Q4b+Q2b   Equation 5.8


Thus, for example, a pressure drop along a stream line which ends at branch 4a is:

dp_4a = ℛ_0·Q_0 + ℛ_1a·Q_1a + ℛ_2a·Q_2a + ℛ_4a·Q_4a
  = ℛ_0·(Q_4a + Q_4b + Q_2b + Q_3a + Q_3b) + ℛ_1a·(Q_4a + Q_4b + Q_2b) + ℛ_2a·(Q_4a + Q_4b) + ℛ_4a·Q_4a
  = Q_4a·(ℛ_0 + ℛ_1a + ℛ_2a + ℛ_4a) + Q_4b·(ℛ_0 + ℛ_1a + ℛ_2a) + Q_2b·(ℛ_0 + ℛ_1a) + Q_3a·ℛ_0 + Q_3b·ℛ_0
  = Q_4·ER_4,4 + Q_5·ER_4,5 + Q_1·ER_4,1 + Q_2·ER_4,2 + Q_3·ER_4,3   Equation 5.9
where Qj is a flow rate along stream line j, and ER4,j is a sum of common resistances of stream line j and stream line 4. A global expression is optionally formulated for the pressure drop along stream line j:

dp_j = Σ_i Q_i·ER_i,j   Equation 5.10


For a tree with k outlet branches, that is, for k full stream lines, a set of k linear equations is optionally used:

[ER_11  ER_12  …  ER_1k]   [Q_1]   [dp_1]
[ER_21  ER_22  …  ER_2k] × [Q_2] = [dp_2]
[  ⋮      ⋮    ⋱    ⋮  ]   [ ⋮ ]   [ ⋮ ]
[ER_k1  ER_k2  …  ER_kk]   [Q_k]   [dp_k]

A × Q = DP   Equation 5.11
where indices 1 . . . k represent stream lines in the tree, and Q_1 . . . Q_k represent flow rates at corresponding outlet branches. The k×k matrix A consists of elements ER and is calculated from the combination matrix. For example, for the 5-stream-line tree shown in FIG. 17, the ER matrix is:

ER = [ℛ_0+ℛ_1a+ℛ_2b   ℛ_0              ℛ_0              ℛ_0+ℛ_1a            ℛ_0+ℛ_1a
      ℛ_0             ℛ_0+ℛ_1b+ℛ_3a    ℛ_0+ℛ_1b         ℛ_0                 ℛ_0
      ℛ_0             ℛ_0+ℛ_1b         ℛ_0+ℛ_1b+ℛ_3b    ℛ_0                 ℛ_0
      ℛ_0+ℛ_1a        ℛ_0              ℛ_0              ℛ_0+ℛ_1a+ℛ_2a+ℛ_4a  ℛ_0+ℛ_1a+ℛ_2a
      ℛ_0+ℛ_1a        ℛ_0              ℛ_0              ℛ_0+ℛ_1a+ℛ_2a       ℛ_0+ℛ_1a+ℛ_2a+ℛ_4b]   Equation 5.12

ER = [56125   808     808     2731    2731
      808     12997   2454    808     808
      808     2454    57795   808     808
      2731    808     808     95754   4300
      2731    808     808     4300    62525]   Equation 5.13

In some embodiments, fluid pressure measurements are made, for example blood pressure measurements. Based on provided fluid pressure boundary conditions (Pin and Pout_i), a vector DP is defined, and Qi is calculated:

Q=A−1×DP   Equation 5.14


For example, for a constant pressure drop of 70 mmHg between the origin and all the outlets, the following flow distribution between the outlets is calculated:


Q=[1.4356, 6.6946, 1.2754, 0.7999, 1.4282], where the units of flow are mL/s.
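Building the ER matrix from the stream lines and solving Equation 5.14 can be sketched as follows, using the example stenotic resistance array. This is an illustrative sketch (numpy assumed): the solution is verified against the imposed pressure drops rather than against the worked flow numbers above, since the unit conventions of that numeric example are not fully specified in the text.

```python
import numpy as np

# Per-branch resistances from the stenotic example array (mmHg*s/mL),
# keyed by branch tag.
R = {"0": 808, "1a": 1923, "1b": 1646, "2a": 1569, "2b": 53394,
     "3a": 10543, "3b": 55341, "4a": 91454, "4b": 58225}

# Full stream lines (combination-matrix rows) of the example tree.
lines = [["0", "1a", "2b"],
         ["0", "1b", "3a"],
         ["0", "1b", "3b"],
         ["0", "1a", "2a", "4a"],
         ["0", "1a", "2a", "4b"]]

k = len(lines)
# ER[i][j] is the sum of resistances of branches common to lines i and j
A = np.array([[sum(R[b] for b in set(li) & set(lj)) for lj in lines]
              for li in lines], dtype=float)

DP = np.full(k, 70.0)        # equal pressure drop to every outlet
Q = np.linalg.solve(A, DP)   # Equation 5.14: Q = A^-1 x DP

assert np.allclose(A, A.T)   # shared resistance is symmetric
assert np.allclose(A @ Q, DP)
```

The diagonal entries of A are the total resistances of the individual stream lines (for instance, the first stream line 0-1a-2b sums to 56125), matching Equation 5.13.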


In some embodiments, two models of a tree are calculated—a first model with stenoses, optionally as measured for a specific patient, and a second model without stenoses. FFR is calculated for each branch using the formula:

FFR = Q_S/Q_N   Equation 5.15
For example, for the tree described above, the FFR calculated for each one of the 9 branches is:


FFR=[1.00 1.00 1.00 1.00 1.00 1.00 1.00 0.8846 0.8874]


It should be emphasized that the above-calculated FFR, expressed directly in terms of flows QS, QN (FFRflow=QS/QN), is distinct in its determination from the pressure measurement-derived FFR (FFRpressure=Pd/Pa), calculated based on pressure differences distal Pd and proximal Pa to a stenosis. Furthermore, rather than being a comparison of two variables of a fixed system state, it is a comparison of two distinct states of the system.


For FFRpressure, a finding of a large difference in pressure measurements across a stenotic lesion (for example, FFRpressure≤0.75) suggests that removing the lesion would remove a substantial resistance ℛ to flow, whereby blood flow, in turn, would substantially increase. “Substantially”, in this case, means “enough to be medically worthwhile”. This chain of reasoning relies on simplifying assumptions about remaining pressures and resistances in the vascular system different in detail from those recited hereinabove in relation to FFRflow.


Nevertheless, the two indices are closely related in what they describe. FFR as such—although it is commonly measured by pressure differences in a fixed system state—is defined as the ratio of maximum blood flow in a stenotic artery to maximum blood flow if the same artery were normal. Thus, FFRflow and FFRpressure may be characterized as differently arrived-at indexes of the same desired information: what fraction of flow can be restored, at least in principle, by intervention at a particular region of the cardiac vasculature.


Also, as for FFRpressure, a goal of determining FFRflow, in some embodiments, is the guidance of medical decision making by providing a rapidly calculable, easily interpreted, index. It is potentially sufficient for a medical professional seeking diagnostic assistance to establish by a vascular index such as FFRflow that intervention will make a medically meaningful change in perfusion. A ratio index is exemplary of an index that compactly expresses such change. It should also be noted that by describing an index that expresses a potential for change, FFRflow, like FFRpressure itself, potentially reduces the effects of errors and/or distraction in the absolute determination of vascular perfusion characteristics.


In embodiments in which the vascular function index is calculated based only on the stenotic model, the resistance ℛ_S contributed by a stenosis to the total resistance of the lesion's crown is evaluated. The volume V_crown of the crown distal to the stenosis is also calculated. An FFR index (FFRresistance) can then be calculated as a function which decreases with ℛ_S and V_crown. A representative example of such a function includes, without limitation,









FFR = (1 + (ℛ_S·k·V_crown^(3/4))/(P_a − P_0))⁻¹   Equation 5.15a

where P_a is the aortic pressure, P_0 is the pre-capillary pressure, and k is a scaling law coefficient which can be adapted to the aortic pressure. A flow analysis of blood flow, and optionally of arterial pressure, along a segment of interest is performed based on the tree model and optionally on other available hemodynamic measurements, such as aortic pressure and/or the amount of injected contrast.
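Equation 5.15a translates directly into a one-line calculation. The function name and all numeric values in the usage check are illustrative assumptions, not values from the text.

```python
def ffr_resistance(Rs, Vcrown, Pa, P0, k):
    """FFR index from stenosis resistance and crown volume (Equation 5.15a):
    FFR = (1 + Rs * k * Vcrown**(3/4) / (Pa - P0))**-1

    Rs: resistance contributed by the stenosis to the crown resistance
    Vcrown: volume of the crown distal to the stenosis [mL]
    Pa, P0: aortic and pre-capillary pressures [mmHg]
    k: scaling-law coefficient (adaptable to the aortic pressure)
    """
    return 1.0 / (1.0 + Rs * k * Vcrown ** 0.75 / (Pa - P0))
```

With no stenosis resistance (Rs = 0) the index is exactly 1; increasing Rs or crown volume monotonically lowers it, consistent with the stated decreasing behaviour.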


The example embodiment just described potentially provides a minimally-invasive physiological index indicative of functional significance of coronary lesions. The example method is optionally performed during a coronary angiography procedure, and calculations are optionally performed during the coronary angiography procedure, such that the minimally-invasive physiological index is provided in real-time.


In an example implementation, given a proximal arterial pressure, Pa, [mmHg], flow rate through a segment of interest Qs, [mL/s] is optionally derived from a concentration of iodine contrast material, based on an analysis of concentration-distance-time curves, and a geometric description of the segment of interest, including diameter d(l) [cm], and/or volume V(l) [ml] as a function of segment length.


In some embodiments, especially in the case of large vessels such as the Left Anterior Descending coronary artery (LAD), blood flow can be measured for obtaining a flow model using a transthoracic echo Doppler, or other modalities such as MRI or SPECT. For a given segment, a total resistance of the segment (Rt, [mmHg*s/mL]) is optionally calculated by dividing arterial pressure by flow rate:

R_t = P_a/Q_s   Equation 5.16
where Rt corresponds to total resistance, Pa corresponds to arterial pressure, and Qs corresponds to flow rate through the vessel segment. From the geometric description of the segment, a local resistance of the stenosis in the segment, Rs, [mmHg*s/mL], is estimated. Estimation of Rs may be made by any one or more of the following methods: using an empirical lookup table; and/or using a function such as described in the above mentioned Kirkeeide reference; and/or by a cumulative summation of Poiseuille resistances:

R_s = (128·μ)/π · ∫ dl/d⁴   Equation 5.17
where the integration is over samples of the segment (dl), d is the arterial diameter of each sample, and μ is the blood viscosity, optionally 0.035 g·cm−1·s−1. The segment's downstream resistance, Rn, [mmHg*s/mL], is calculated as follows:

Rn=Rt−Rs   Equation 5.18


A normal flow through the segment without stenosis, Qn, [mL/s], is calculated for example as follows:

Q_n = P_a/R_n   Equation 5.19
where Qn is an input flow to the segment, Pa is pressure proximal to the segment, and Rn is resistance to flow by vessels distal to the segment.


Another form of Fractional Flow Reserve (FFRcontrast-flow) is optionally derived as a ratio between the measured flow rate through the stenosed segment and the normal flow rate through the segment without stenosis:

FFR = Q_s/Q_n   Equation 5.20

In some embodiments, an index indicative of the potential effect of revascularization, such as an FFR index (for example, FFRcontrast-flow), is calculated using the data described below:


proximal arterial pressure Pa, [mmHg] is measured;


a total inlet flow through a vessel origin, such as the coronary origin Qtotal, [mL/s], is derived from a concentration of contrast material (such as iodine), optionally based on the analysis of concentration-distance-time curves. In some embodiments, especially for large vessels such as the Left Anterior Descending (LAD) coronary artery, flow is optionally recorded using a transthoracic echo Doppler and/or other modalities such as MRI and SPECT;


a subject's specific anatomy, including one or more of the following:

    • a geometric description of arterial diameters along vessel tree segments, for example up to 3-4 generations as a function of segment length d(l) [cm];
    • a geometric description of arterial lengths along the vessel tree segments (Li [cm]), for example up to 1-2 generations downstream of the segment of interest, and an accumulative crown length (Lcrown [cm]) downstream to the segment of interest: Lcrown=ΣLi;
    • a geometric description of arterial volumes along the vessel tree segments Vi [ml], for example up to 1-2 generations downstream of the segment of interest, and the accumulative crown volume (Vcrown [ml]) downstream to the segment of interest: Vcrown=ΣVi;
    • a myocardial mass (LV mass) distribution for the arterial segment of interest M [ml] (in some embodiments LV mass is optionally calculated using, for example, a transthoracic echo Doppler); and
    • a reference parameter K or function F which correlates anatomic parameters such as described above with normal flow through the segment (without stenosis) Qn, [mL/s], for example:

      Qn=K·M or Qn=F(M)   Equation 5.21


Using the above data, the index indicative of the potential effect of revascularization, such as the FFR index, is optionally calculated by performing the following calculations for each vessel segment under consideration:

    • from the geometric parameter of the tree, such as length, volume, mass and/or diameter, a normal flow Qn in the segment is obtained;
    • from arterial pressure a resistance distal to the segment (Rn, [mmHg*s/mL]) is calculated, for example as follows: Rn=Pa/Qn;
    • from geometry a local resistance of the stenosis in the segment Rs, [mmHg*s/mL] is estimated, for example using one of the following methods:
    • a lookup table;
    • an empirical function such as described in the above mentioned Kirkeeide reference; and/or
    • a cumulative summation of Poiseuille resistances Rs=(128μ)/π∫(dl)/(d4), where the integration is over samples of the segment (dl), d is the arterial diameter of each sample, and μ is the blood viscosity, optionally 0.035 g·cm−1·s−1;
    • the total resistance for the segment Rt [mmHg*s/mL] is optionally calculated as: Rt=Rn+Rs
    • the flow through the stenosis segment Qs [mL/s] is optionally calculated as: Qs=Pa/Rt; and
    • the index, such as the fractional flow reserve (FFR), for the segment is optionally calculated as: FFR=Qs/Qn.
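The per-segment steps listed above can be sketched end to end. This is a simplified, self-contained sketch: the linear mass-to-flow relation Qn = K·M, the function name, and all numeric values in the checks are assumptions for illustration only.

```python
import math

def ffr_contrast_flow(Pa, M, radii_mm, dl_mm=1.0, K=1.0, mu=0.035):
    """Sketch of the per-segment calculation steps listed above.

    Pa: proximal arterial pressure [mmHg]
    M: myocardial mass distribution for the segment of interest [ml]
    radii_mm: sampled lumen radii across the stenosed segment [mm]
    dl_mm: sampling interval [mm]
    K: reference parameter correlating mass with normal flow (Qn = K*M)
    mu: blood viscosity [g/(cm*s)]
    """
    MMHG = 1333.22                       # dyn/cm^2 per mmHg
    Qn = K * M                           # normal (unstenosed) flow [mL/s]
    Rn = Pa / Qn                         # distal resistance: Rn = Pa/Qn
    # stenosis resistance: cumulative Poiseuille sum (note 128/d^4 = 8/r^4),
    # evaluated in CGS units and converted to mmHg*s/mL
    Rs = (8.0 * mu / math.pi) * sum(
        (dl_mm / 10.0) / (r / 10.0) ** 4 for r in radii_mm) / MMHG
    Rt = Rn + Rs                         # total segment resistance
    Qs = Pa / Rt                         # flow through the stenosed segment
    return Qs / Qn                       # FFR = Qs / Qn
```

In the limit of no stenosis (Rs = 0) the index is exactly 1, and narrower sampled radii drive it lower, mirroring the behaviour of the pressure-derived FFR.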


In some embodiments, the extent of the first model is such that it includes a stenosis, and extends distally as far as resolution of the imaging modality which produced the vessel model allows, and/or several bifurcations, for example 3 or 4 bifurcations distally to the stenosis. In some embodiments, the number of bifurcations is limited by the resolution to which vascular width can be determined from an image. For example, cutoff of the bifurcation order is set where the vascular width is no longer determinable to within a precision of 5%, 10%, 15%, 20%, or another larger, smaller, or intermediate precision. In some embodiments, sufficient precision is unavailable due, for example, to insufficient imaging resolution in the source images. Availability of a larger number of measurable bifurcations is a potential advantage for fuller reconstruction of the detailed vascular resistance in the crown vessels of a stenosis. It should be noted that in the current state of the art, CT scans generally provide a lower resolution than X-ray angiographic imaging, leading to a lowered availability of blood vessels from which vascular resistances can be determined.


CONCLUSION

It should be understood that various changes and modifications to the presently preferred embodiments described herein will be apparent to those skilled in the art. Such changes and modifications can be made without departing from the spirit and scope of the present subject matter and without diminishing its intended advantages. It is therefore intended that such changes and modifications be covered by the appended claims.

Claims
  • 1. An apparatus for synchronizing a three-dimensional model of a patient's coronary arteries with an orientation of a medical imaging device, the apparatus comprising: a memory device storing a three-dimensional model of a patient's coronary arteries, the three-dimensional model including a centerline through each of the coronary arteries, each centerline having sample points, each sample point along the respective centerline being defined in a three-dimensional coordinate system and being associated with vascular geometric information; anda processor communicatively coupled to the memory device, the processor configured to: receive an instruction to register the three-dimensional model to a medical imaging device,determine an orientation of the three-dimensional model that corresponds to a zero-degree starting position of the medical imaging device,receive potential rotational angulation positions of the medical imaging device,determine angular coordinates for the three-dimensional model that correspond to the potential rotational angulation positions of the medical imaging device,store to the memory device a correlation between the determined angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device,determine a current view angle orientation of the medical imaging device,use the correlation between the determined angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device to rotate the three-dimensional model using the current view angle orientation of the medical imaging device, anddisplay the rotated three-dimensional model in a user interface in a viewpoint orientation that matches the current view angle orientation of the medical imaging device,wherein the user interface is configured to: respond to user input to record a medical image and cause transmission of instructions to the medical imaging device, wherein 
the recorded medical image is displayed in conjunction with the three-dimensional model, and wherein based on a lock control being enabled, cause the medical imaging device to rotate in real-time synchronized with the three-dimensional model, and based on the lock control not being enabled, allow the three-dimensional model to rotate without causing the medical imaging device to rotate in real-time.
  • 2. The apparatus of claim 1, wherein the processor is further configured to determine the orientation of the three-dimensional model by identifying a two-dimensional face or a plane of the three-dimensional model that aligns with a view angle at the zero-degree starting position of an image intensifier of the medical imaging device, wherein the identified two-dimensional face of the plane of the three-dimensional model corresponds to a top-down view of the patient's coronary arteries when the patient is laying supine.
  • 3. The apparatus of claim 1, wherein the vascular geometric information includes at least one of a vascular diameter, a vascular radius, a cross sectional area, a cross sectional profile, a vascular wall curvature, or vascular branching.
  • 4. The apparatus of claim 1, wherein the medical imaging device includes a C-arm configured to record x-ray angio-graphic images.
  • 5. The apparatus of claim 1, wherein the potential rotational angulation positions of the medical imaging device include RAO angulation positions, LAO angulation positions, cranial angulation positions, and caudal angulation positions.
  • 6. The apparatus of claim 5, wherein the angular coordinates for the three-dimensional model include coordinates along a roll axis and a pitch axis.
  • 7. The apparatus of claim 6, wherein the angular coordinates correspond to an amount the three-dimensional model is rotated along the roll axis and the pitch axis.
  • 8. The apparatus of claim 5, wherein the potential rotational angulation positions of the medical imaging device are at least one of stored in the memory device, received from the medical imaging device, or received as user input via an interface.
  • 9. The apparatus of claim 1, wherein the processor is further configured to: receive from the medical imaging device a message that is indicative of at least one of (i) a relative position change from the zero-degree starting position of the medical imaging device provided in a rotational angulation position, or (ii) an absolute position of the medical imaging device provided in a rotational angulation position; determine a new viewpoint orientation for the three-dimensional model based on the at least one of (i) or (ii) and the correlation between the determined angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device; rotate the three-dimensional model to the new viewpoint orientation; and display in the user interface the rotated three-dimensional model.
  • 10. The apparatus of claim 9, wherein the processor is further configured to: receive, via the user interface, an imaging message indicative that a medical image is to be acquired; transmit an imaging instruction message to the medical imaging device; and receive the medical image, the medical image acquired by the medical imaging device in the new viewpoint orientation.
  • 11. The apparatus of claim 10, wherein the processor is further configured to: identify coronary arteries in the medical image; determine centerlines through the identified coronary arteries; determine sample points along the centerlines in the three-dimensional coordinate system; determine vascular geometric information for the sample points along the centerlines; determine a correspondence between the coronary arteries in the medical image and the three-dimensional model using at least the centerlines of the medical image and the centerlines of the three-dimensional model; and update the three-dimensional model with the determined vascular geometric information from the medical image.
  • 12. The apparatus of claim 10, wherein the imaging instruction message includes at least one of an indication to record the medical image, a rotation instruction, a lateral movement instruction, or a zoom-magnification instruction.
  • 13. The apparatus of claim 10, wherein the processor is configured to calculate and display, in the user interface, fractional flow reserve (“FFR”) values for the three-dimensional model.
  • 14. The apparatus of claim 1, wherein the memory device is located remotely from the processor.
  • 15. A method for synchronizing a three-dimensional model of a patient's coronary arteries with an orientation of a medical imaging device, the method comprising: receiving, in a processor from a memory device, a three-dimensional model of a patient's coronary arteries, the three-dimensional model including a centerline through each of the coronary arteries, each centerline including sample points, each sample point along the respective centerline being defined in a three-dimensional coordinate system and being associated with vascular geometric information; determining, via the processor, an orientation of the three-dimensional model that corresponds to a zero-degree starting position of a medical imaging device; receiving, in the processor, potential rotational angulation positions of the medical imaging device; determining, via the processor, angular coordinates for the three-dimensional model that correspond to the potential rotational angulation positions of the medical imaging device; storing, to the memory device via the processor, a correlation between the determined angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device; determining, via the processor, a current viewpoint of the three-dimensional model displayed in a user interface; and using, via the processor, the current viewpoint of the three-dimensional model and the correlation between the angular coordinates for the three-dimensional model and the potential rotational angulation positions of the medical imaging device to cause the medical imaging device to rotate to a corresponding view angle orientation based on a lock control being enabled.
  • 16. The method of claim 15, wherein the medical imaging device includes a C-arm and the method is performed in a catheterization laboratory during at least one of a stent placement, a percutaneous coronary intervention, or an FFR determination.
  • 17. The method of claim 15, further comprising: receiving, in the processor, from the medical imaging device, at least two medical images recorded at different view angles with respect to the patient, the at least two medical images including depictions of the patient's coronary arteries; identifying, via the processor, the coronary arteries in the at least two medical images; determining, via the processor, centerlines through the identified coronary arteries; determining, via the processor, sample points along the centerlines in the three-dimensional coordinate system; determining, via the processor, vascular geometric information for the sample points along the centerlines; creating the three-dimensional model using the centerlines, the sample points along the centerlines in the three-dimensional coordinate system, and the vascular geometric information; and storing the three-dimensional model to the memory device.
  • 18. The method of claim 15, wherein the processor causes the medical imaging device to rotate by transmitting at least one instruction message to the medical imaging device, the at least one instruction message including at least one of (i) a relative position change from the zero-degree starting position of the medical imaging device to the corresponding view angle orientation provided in a rotational angulation position, or (ii) an absolute position of the medical imaging device for the corresponding view angle orientation provided in a rotational angulation position.
  • 19. The method of claim 15, wherein the potential rotational angulation positions of the medical imaging device include RAO angulation positions, LAO angulation positions, cranial angulation positions, and caudal angulation positions, and wherein the angular coordinates for the three-dimensional model include coordinates along a roll axis and a pitch axis.
  • 20. The method of claim 15, wherein based on the lock control not being enabled, movement of the medical imaging device is disregarded, thereby enabling the medical imaging device to be rotated without affecting display of the three-dimensional model.
  • 21. The method of claim 15, wherein the processor is configured to display an alert indicative that movement of the three-dimensional model is outside a movement capability of the medical imaging device.
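The bidirectional synchronization recited in claims 1, 9, and 15 — a stored correlation between C-arm angulation positions (RAO/LAO, cranial/caudal) and the model's roll/pitch coordinates, gated by a lock control — can be sketched in code. This is a minimal illustration under assumed conventions, not the patent's actual implementation: the identity mapping between angulation and roll/pitch, the sign conventions (+LAO/−RAO, +cranial/−caudal), and every class and function name below are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Angulation:
    """C-arm rotational angulation in degrees: +LAO / -RAO and +cranial / -caudal.
    (Hypothetical representation assumed for this sketch.)"""
    lao_rao: float
    cran_caud: float


def angulation_to_model_angles(a: Angulation) -> tuple[float, float]:
    """Correlation from a device angulation to (roll, pitch) model coordinates.
    An identity mapping is assumed here; a real system would store a
    calibrated correlation or transfer function instead."""
    return (a.lao_rao, a.cran_caud)


def model_angles_to_angulation(roll: float, pitch: float) -> Angulation:
    """Inverse correlation, used when rotating the model drives the device."""
    return Angulation(lao_rao=roll, cran_caud=pitch)


class SyncController:
    """Keeps the displayed 3-D model and the C-arm in matching orientations."""

    def __init__(self, send_command):
        self.send_command = send_command  # callable that commands the C-arm
        self.lock_enabled = False         # the claims' "lock control"
        self.roll = 0.0
        self.pitch = 0.0

    def rotate_model(self, d_roll: float, d_pitch: float) -> None:
        """Model -> device direction: command the C-arm only while locked."""
        self.roll += d_roll
        self.pitch += d_pitch
        if self.lock_enabled:
            self.send_command(model_angles_to_angulation(self.roll, self.pitch))

    def on_device_moved(self, a: Angulation) -> None:
        """Device -> model direction (claim 9): re-orient the model view
        from a reported absolute angulation position."""
        self.roll, self.pitch = angulation_to_model_angles(a)


# Example: with the lock off the model rotates freely; with it on,
# each model rotation produces one movement command for the C-arm.
commands = []
ctrl = SyncController(commands.append)
ctrl.rotate_model(30.0, -15.0)   # lock off: no command sent
ctrl.lock_enabled = True
ctrl.rotate_model(0.0, 5.0)      # lock on: C-arm commanded to LAO 30, caudal 10
```

The lock control here is the single gate for both directions, matching claim 1 (model drives device only when enabled) and claim 20 (device movement is disregarded by the model when disabled).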
PRIORITY CLAIM

The present application is a national phase filing of International Application No. PCT/IB2020/058901, filed on Sep. 23, 2020, which claims priority to U.S. Provisional Patent Application No. 62/904,147, filed on Sep. 23, 2019, the entire contents of each of which are incorporated herein by reference and relied upon.

PCT Information
Filing Document Filing Date Country Kind
PCT/IB2020/058901 9/23/2020 WO
Publishing Document Publishing Date Country Kind
WO2021/059165 4/1/2021 WO A
US Referenced Citations (420)
Number Name Date Kind
5150292 Hoffmann et al. Sep 1992 A
5638823 Akay et al. Jun 1997 A
6047080 Chen et al. Apr 2000 A
6236878 Taylor et al. May 2001 B1
6842638 Suri et al. Jan 2005 B1
7113623 Chen et al. Sep 2006 B2
7339585 Verstraelen et al. Mar 2008 B2
7369691 Kondo et al. May 2008 B2
7574026 Rasche et al. Aug 2009 B2
7657299 Huizenga et al. Feb 2010 B2
7693315 Krishnan et al. Apr 2010 B2
7738626 Weese et al. Jun 2010 B2
7808503 Duluk, Jr. et al. Oct 2010 B2
7864997 Aben Jan 2011 B2
7912260 Breeuwer et al. Mar 2011 B2
7970187 Puts et al. Jun 2011 B2
8073224 Strobel et al. Dec 2011 B2
8086000 Weijers et al. Dec 2011 B2
8090164 Bullitt et al. Jan 2012 B2
8155411 Hof et al. Apr 2012 B2
8298147 Huennekens et al. Oct 2012 B2
8311748 Taylor et al. Nov 2012 B2
8311750 Taylor Nov 2012 B2
8315812 Taylor Nov 2012 B2
8321150 Taylor Nov 2012 B2
8331314 Quiang et al. Dec 2012 B2
8496594 Taylor et al. Jul 2013 B2
8523779 Taylor et al. Sep 2013 B2
8548778 Hart et al. Oct 2013 B1
8554490 Tang et al. Oct 2013 B2
8560968 Nair Oct 2013 B1
8768669 Hart et al. Jul 2014 B1
8771195 Kim et al. Jul 2014 B2
8787641 Hof et al. Jul 2014 B2
8812246 Taylor Aug 2014 B2
8824752 Fonte et al. Sep 2014 B1
8837860 Grady et al. Sep 2014 B1
8861820 Fonte et al. Oct 2014 B2
8917925 Grady et al. Dec 2014 B1
8970578 Voros et al. Mar 2015 B2
9008405 Fonte et al. Apr 2015 B2
9042613 Spilker et al. May 2015 B2
9070214 Grady et al. Jun 2015 B1
9078564 Taylor Jul 2015 B2
9087147 Fonte Jul 2015 B1
9129418 Schormans et al. Sep 2015 B2
9138147 Schmitt et al. Sep 2015 B2
9153047 Grady et al. Oct 2015 B1
9189600 Spilker et al. Nov 2015 B2
9256936 Jacobs et al. Feb 2016 B2
9314584 Riley et al. Apr 2016 B1
9375191 Verstraelen et al. Jun 2016 B2
9406141 Kelm et al. Aug 2016 B2
9430827 Kelm et al. Aug 2016 B2
9466117 Habets et al. Oct 2016 B2
9471999 Ishii et al. Oct 2016 B2
9572495 Schmitt et al. Feb 2017 B2
9576360 Schormans et al. Feb 2017 B2
9613186 Fonte Apr 2017 B2
9615755 Riley et al. Apr 2017 B2
9633454 Lauritsch et al. Apr 2017 B2
9646361 Koo et al. May 2017 B2
9743835 Taylor Aug 2017 B2
9754082 Taylor et al. Sep 2017 B2
9786068 Ishii et al. Oct 2017 B2
9814433 Benishti et al. Nov 2017 B2
9858387 Lavi et al. Jan 2018 B2
9870634 Grady et al. Jan 2018 B2
9888896 Lauritsch et al. Feb 2018 B2
9934566 Sun et al. Apr 2018 B2
9940736 Ishii et al. Apr 2018 B2
9943233 Lavi et al. Apr 2018 B2
9965873 Grady et al. May 2018 B2
9968256 Taokowsky et al. May 2018 B2
9977869 Lavi et al. May 2018 B2
9999361 Sharma et al. Jun 2018 B2
10141074 Lavi et al. Nov 2018 B2
10143390 Ledoux et al. Dec 2018 B2
10159529 Taylor Dec 2018 B2
10176575 Isgum et al. Jan 2019 B2
10210956 Lavi et al. Feb 2019 B2
10219704 Lavi et al. Mar 2019 B2
10229516 Aben et al. Mar 2019 B2
10235796 Aben et al. Mar 2019 B2
10245001 Redel et al. Apr 2019 B2
10342442 Hattangadi et al. Jul 2019 B2
10354744 Sharma et al. Jul 2019 B2
10360674 Contini et al. Jul 2019 B2
10363018 Fukuda et al. Jul 2019 B2
10373700 Sharma et al. Aug 2019 B2
10376165 Lavi et al. Aug 2019 B2
10395366 Isgum et al. Aug 2019 B2
10395774 Lavi et al. Aug 2019 B2
10420610 Bai et al. Sep 2019 B2
10424063 Lavi et al. Sep 2019 B2
10441235 Lavi et al. Oct 2019 B2
10441239 Abe Oct 2019 B2
10456094 Fonte et al. Oct 2019 B2
10463336 Itu et al. Nov 2019 B2
10470730 Benishti et al. Nov 2019 B2
10559388 Lavi et al. Feb 2020 B2
10580526 Ma et al. Mar 2020 B2
10595807 Lavi et al. Mar 2020 B2
10631737 Lavi et al. Apr 2020 B2
10636146 Zhong et al. Apr 2020 B2
10650522 Hoi et al. May 2020 B2
10682180 Taylor Jun 2020 B2
10699407 Isgum et al. Jun 2020 B2
10733792 Aben et al. Aug 2020 B2
10740961 Reiber et al. Aug 2020 B2
10748285 Igarashi et al. Aug 2020 B2
10758200 Passerini et al. Sep 2020 B2
10776988 Grady et al. Sep 2020 B2
10803994 Lavi et al. Oct 2020 B2
10803995 Sharma et al. Oct 2020 B2
10828109 Redel Nov 2020 B2
10854329 Mohr et al. Dec 2020 B2
10964017 Pack et al. Mar 2021 B2
10964071 Grady et al. Mar 2021 B2
11004198 Isgum et al. May 2021 B2
11017531 Harish et al. May 2021 B2
11031136 Grass et al. Jun 2021 B2
11055845 Nickisch et al. Jul 2021 B2
11076770 Lavi et al. Aug 2021 B2
11081237 Lavi et al. Aug 2021 B2
11083377 Bouwman et al. Aug 2021 B2
11083524 Taylor Aug 2021 B2
11087884 Sankaran et al. Aug 2021 B2
11090118 Taylor Aug 2021 B2
11116575 Taylor Sep 2021 B2
11127503 Rabbat et al. Sep 2021 B2
11138733 Lavi et al. Oct 2021 B2
11141123 Homann et al. Oct 2021 B2
11160524 Lavi et al. Nov 2021 B2
11179043 Haase et al. Nov 2021 B2
11185368 Spilker et al. Nov 2021 B2
11195278 Nickisch et al. Dec 2021 B2
11202612 Sakaguchi Dec 2021 B2
11216944 Reiber et al. Jan 2022 B2
11272845 Cheline et al. Mar 2022 B2
11278208 Lavi et al. Mar 2022 B2
11282170 Gauriau et al. Mar 2022 B2
11288811 Tu et al. Mar 2022 B2
11288813 Grady et al. Mar 2022 B2
11295864 Benishti et al. Apr 2022 B2
11298187 Taylor Apr 2022 B2
11304665 Sharma et al. Apr 2022 B2
11308621 Tu et al. Apr 2022 B2
11328824 Fonte May 2022 B2
11341631 Song et al. May 2022 B2
11375904 Igarashi Jul 2022 B2
11382569 Grady et al. Jul 2022 B2
11389130 Itu et al. Jul 2022 B2
11398029 Grady et al. Jul 2022 B2
11406337 Lavi et al. Aug 2022 B2
11406339 Mistretta et al. Aug 2022 B2
11409422 Olivan Bescos et al. Aug 2022 B2
11410308 Gulsun et al. Aug 2022 B2
11423532 Takahashi Aug 2022 B2
11424036 Fonte et al. Aug 2022 B2
11424038 Grady et al. Aug 2022 B2
11443428 Petersen et al. Sep 2022 B2
11445923 Tu et al. Sep 2022 B2
11462326 Wang et al. Oct 2022 B2
11462329 Rabbat et al. Oct 2022 B2
11468567 Groth et al. Oct 2022 B2
11482339 Koo et al. Oct 2022 B2
11490867 Homann et al. Nov 2022 B2
11494904 Fonte et al. Nov 2022 B2
11501485 Grady et al. Nov 2022 B2
11508460 Wang et al. Nov 2022 B2
11510587 Kristanto et al. Nov 2022 B2
11521755 Taylor et al. Dec 2022 B2
11523744 Freiman et al. Dec 2022 B2
11538161 Wang et al. Dec 2022 B2
11540931 Grady et al. Jan 2023 B2
11557036 Liao et al. Jan 2023 B2
11557069 Senzig et al. Jan 2023 B2
11559274 Auvray et al. Jan 2023 B2
11564746 Spilker et al. Jan 2023 B2
11564748 Thienphrapa et al. Jan 2023 B2
11574406 Chen et al. Feb 2023 B2
11576621 Sharma et al. Feb 2023 B2
11576626 Fonte et al. Feb 2023 B2
11576637 Schmitt et al. Feb 2023 B2
11576639 Song et al. Feb 2023 B2
11583340 Taylor Feb 2023 B2
11589924 Passerini et al. Feb 2023 B2
11599996 Isgum et al. Mar 2023 B2
11607189 Tu et al. Mar 2023 B2
11610309 Kweon et al. Mar 2023 B2
11610318 Grady et al. Mar 2023 B2
11615894 Lavi et al. Mar 2023 B2
11617620 Tran et al. Apr 2023 B2
11633118 Freiman et al. Apr 2023 B2
11638609 Sankaran et al. May 2023 B2
11642171 Jaquet et al. May 2023 B2
11653833 Sanders et al. May 2023 B2
11664128 Haase et al. May 2023 B2
11688502 Anderson et al. Jun 2023 B2
11690518 Haase et al. Jul 2023 B2
11694339 Schormans et al. Jul 2023 B2
11707196 Lavi et al. Jul 2023 B2
11707242 Van Walsum et al. Jul 2023 B2
11710569 Grass et al. Jul 2023 B2
11728037 Lavi et al. Aug 2023 B2
11741574 Kweon et al. Aug 2023 B2
11741602 Reiber et al. Aug 2023 B2
11744472 Zhao et al. Sep 2023 B2
11744544 Sheehan et al. Sep 2023 B2
11748902 Bai et al. Sep 2023 B2
11756195 Kweon et al. Sep 2023 B2
11769254 Song et al. Sep 2023 B2
11776149 Wang et al. Oct 2023 B2
11779225 Adiyoso Oct 2023 B2
11779233 Huo et al. Oct 2023 B2
11779294 Liu et al. Oct 2023 B2
11786202 Yin et al. Oct 2023 B2
11793575 Taylor Oct 2023 B2
11803966 Denzinger et al. Oct 2023 B2
11810290 Flohr et al. Nov 2023 B2
11810661 Barley et al. Nov 2023 B2
11816836 Isgum et al. Nov 2023 B2
11816837 Lavi et al. Nov 2023 B2
11826106 Hart et al. Nov 2023 B2
11826175 Itu et al. Nov 2023 B2
11847547 Wang et al. Dec 2023 B2
20030105401 Jago et al. Jun 2003 A1
20040019264 Suurmond et al. Jan 2004 A1
20040066958 Chen et al. Apr 2004 A1
20050043614 Huizenga et al. Feb 2005 A1
20050249327 Wink et al. Nov 2005 A1
20050272992 O'Donnell Dec 2005 A1
20060036167 Shina Feb 2006 A1
20060084862 Suurmond et al. Apr 2006 A1
20070031019 Lesage et al. Feb 2007 A1
20070167833 Redel et al. Jul 2007 A1
20080020362 Cotin et al. Jan 2008 A1
20080187199 Gulsun et al. Aug 2008 A1
20080205722 Schaefer et al. Aug 2008 A1
20090016483 Kawasaki et al. Jan 2009 A1
20090171321 Callaghan Jul 2009 A1
20090312648 Zhang et al. Dec 2009 A1
20100010428 Yu et al. Jan 2010 A1
20100017171 Spilker et al. Jan 2010 A1
20100021025 Hof et al. Jan 2010 A1
20100067760 Zhang et al. Mar 2010 A1
20100125197 Fishel May 2010 A1
20100160764 Steinberg et al. Jun 2010 A1
20100220917 Steinberg et al. Sep 2010 A1
20100296709 Ostrovsky-Berman et al. Nov 2010 A1
20100298719 Thrysoe et al. Nov 2010 A1
20110015530 Misawa Jan 2011 A1
20110096907 Mohamed Apr 2011 A1
20110134433 Yamada Jun 2011 A1
20110135175 Ostrovsky-Berman et al. Jun 2011 A1
20110142313 Pack et al. Jun 2011 A1
20110182492 Grass et al. Jul 2011 A1
20120014574 Ferschel Jan 2012 A1
20120041318 Taylor Feb 2012 A1
20120041739 Taylor Feb 2012 A1
20120053918 Taylor Mar 2012 A1
20120053919 Taylor Mar 2012 A1
20120053921 Taylor Mar 2012 A1
20120059246 Taylor Mar 2012 A1
20120059249 Verard Mar 2012 A1
20120062841 Stetson et al. Mar 2012 A1
20120072190 Sharma et al. Mar 2012 A1
20120150048 Kang et al. Jun 2012 A1
20120177275 Suri Jul 2012 A1
20120230565 Steinberg et al. Sep 2012 A1
20120236032 Arvidsson Sep 2012 A1
20120243761 Senzig et al. Sep 2012 A1
20130060133 Kassab et al. Mar 2013 A1
20130094745 Sundar Apr 2013 A1
20130158476 Olson Jun 2013 A1
20130226003 Edic et al. Aug 2013 A1
20130229621 Stetson et al. Sep 2013 A1
20130324842 Mittal et al. Dec 2013 A1
20140005535 Edic et al. Jan 2014 A1
20140086461 Yao et al. Mar 2014 A1
20140094693 Cohen et al. Apr 2014 A1
20140100451 Tolkowsky et al. Apr 2014 A1
20140121513 Tolkowsky et al. May 2014 A1
20140142398 Patil et al. May 2014 A1
20140200867 Lavi et al. Jul 2014 A1
20140303495 Fonte et al. Oct 2014 A1
20140371578 Auvray et al. Dec 2014 A1
20150201897 Kyriakou Jul 2015 A1
20150250395 Igarashi Sep 2015 A1
20150265162 Lavi et al. Sep 2015 A1
20150302578 Grady et al. Oct 2015 A1
20150335304 Lavi et al. Nov 2015 A1
20150339847 Benishti et al. Nov 2015 A1
20150342551 Lavi et al. Dec 2015 A1
20160007945 Taylor Jan 2016 A1
20160022371 Sauer et al. Jan 2016 A1
20160073928 Soper Mar 2016 A1
20160110866 Taylor Apr 2016 A1
20160110867 Taylor Apr 2016 A1
20160128661 Taylor May 2016 A1
20160157802 Anderson Jun 2016 A1
20160228000 Spaide Aug 2016 A1
20160247279 Lavi et al. Aug 2016 A1
20160371456 Taylor et al. Dec 2016 A1
20170018116 Sun et al. Jan 2017 A1
20170039736 Aben et al. Feb 2017 A1
20170224418 Boettner et al. Aug 2017 A1
20170286628 Shim Oct 2017 A1
20170325770 Edic et al. Nov 2017 A1
20180032653 Aben et al. Feb 2018 A1
20180075221 Vergaro et al. Mar 2018 A1
20180089829 Zhong et al. Mar 2018 A1
20180182096 Grady et al. Jun 2018 A1
20180211386 Ma et al. Jul 2018 A1
20180235561 Lavi et al. Aug 2018 A1
20180243033 Tran et al. Aug 2018 A1
20180268941 Lavi et al. Sep 2018 A1
20180315193 Paschalakis et al. Nov 2018 A1
20180330507 Schormans et al. Nov 2018 A1
20180344173 Tu et al. Dec 2018 A1
20180344174 Schmitt et al. Dec 2018 A9
20190005737 Auvray et al. Jan 2019 A1
20190019347 Auvray et al. Jan 2019 A1
20190130578 Gulsun May 2019 A1
20190282199 Merritt Sep 2019 A1
20200126229 Lavi et al. Apr 2020 A1
20200138521 Aben et al. May 2020 A1
20200160509 Pack et al. May 2020 A1
20200222018 van Walsum et al. Jul 2020 A1
20200265958 Haase et al. Aug 2020 A1
20200337664 Homann et al. Oct 2020 A1
20200394795 Isgum et al. Dec 2020 A1
20210022617 Zhao et al. Jan 2021 A1
20210035290 Aben et al. Feb 2021 A1
20210244475 Taylor Aug 2021 A1
20210259559 Tu et al. Aug 2021 A1
20210267690 Taylor Sep 2021 A1
20210272030 Sankaran et al. Sep 2021 A1
20210275124 Huo et al. Sep 2021 A1
20210280318 Huo et al. Sep 2021 A1
20210282731 Vaillant et al. Sep 2021 A1
20210282860 Taylor Sep 2021 A1
20210290308 Mihalef et al. Sep 2021 A1
20210298706 Tu et al. Sep 2021 A1
20210298708 Aben et al. Sep 2021 A1
20210334963 Isgum et al. Oct 2021 A1
20210338088 Bouwman et al. Nov 2021 A1
20210345889 Tu et al. Nov 2021 A1
20210358634 Sankaran et al. Nov 2021 A1
20210361174 Lavi et al. Nov 2021 A1
20210361176 Huo et al. Nov 2021 A1
20210374950 Gao et al. Dec 2021 A1
20210375474 Lavi et al. Dec 2021 A1
20210383539 Haase et al. Dec 2021 A1
20210401400 Sheehan et al. Dec 2021 A1
20220012876 Sommer et al. Jan 2022 A1
20220015730 Haase et al. Jan 2022 A1
20220028080 Lavi et al. Jan 2022 A1
20220036646 Song et al. Feb 2022 A1
20220039769 M et al. Feb 2022 A1
20220047236 Lavi et al. Feb 2022 A1
20220054022 Van Lavieren et al. Feb 2022 A1
20220079455 Haase et al. Mar 2022 A1
20220079540 Sankaran et al. Mar 2022 A1
20220079563 Kemp Mar 2022 A1
20220087544 Schmitt et al. Mar 2022 A1
20220092775 Denzinger et al. Mar 2022 A1
20220092784 Tu et al. Mar 2022 A1
20220101535 Thamm et al. Mar 2022 A1
20220110687 Spilker et al. Apr 2022 A1
20220151580 Itu et al. May 2022 A1
20220164953 Gulsun et al. May 2022 A1
20220167938 Grass et al. Jun 2022 A1
20220172368 Lavi et al. Jun 2022 A1
20220183655 Huang et al. Jun 2022 A1
20220211280 Lavi et al. Jul 2022 A1
20220211439 Sankaran et al. Jul 2022 A1
20220230312 Choi et al. Jul 2022 A1
20220233081 Cheline et al. Jul 2022 A1
20220254028 Liu et al. Aug 2022 A1
20220261997 Liu et al. Aug 2022 A1
20220262000 Haase et al. Aug 2022 A1
20220273180 Lavi et al. Sep 2022 A1
20220277447 Wang et al. Sep 2022 A1
20220285034 Lavi et al. Sep 2022 A1
20220319004 Bruch-el et al. Oct 2022 A1
20220319116 Wang et al. Oct 2022 A1
20220335612 Bruch-El et al. Oct 2022 A1
20220351369 Haase et al. Nov 2022 A1
20220415510 Wang et al. Dec 2022 A1
20230037338 Wang et al. Feb 2023 A1
20230038364 Bhowmick et al. Feb 2023 A1
20230084748 Lavi et al. Mar 2023 A1
20230108647 Tu et al. Apr 2023 A1
20230113721 Kassel et al. Apr 2023 A1
20230144795 Wang et al. May 2023 A1
20230148977 Fonte et al. May 2023 A1
20230186472 Kweon et al. Jun 2023 A1
20230196582 Grady et al. Jun 2023 A1
20230197286 Grady et al. Jun 2023 A1
20230230235 Isgum et al. Jul 2023 A1
20230237652 Flexman et al. Jul 2023 A1
20230245301 Wang et al. Aug 2023 A1
20230252628 Haase et al. Aug 2023 A1
20230263401 Escaned-Barbosa et al. Aug 2023 A1
20230277247 Taylor et al. Sep 2023 A1
20230282365 Lavi et al. Sep 2023 A1
20230298176 Choi et al. Sep 2023 A1
20230309943 van Walsum et al. Oct 2023 A1
20230320789 Bai et al. Oct 2023 A1
20230326127 Zhong et al. Oct 2023 A1
20230334659 Kuo et al. Oct 2023 A1
20230346236 Lavi et al. Nov 2023 A1
20230352152 Grady et al. Nov 2023 A1
20230355107 Haase et al. Nov 2023 A1
20230360803 Sankaran et al. Nov 2023 A1
20230386037 Denzinger et al. Nov 2023 A1
20230404525 Sheehan et al. Dec 2023 A1
20240029529 Scalisi Jan 2024 A1
Foreign Referenced Citations (100)
Number Date Country
2010298333 Jan 2012 AU
104282009 Jan 2015 CN
113837985 Dec 2021 CN
1396274 Mar 2004 EP
2163272 Mar 2010 EP
2633815 Sep 2013 EP
2779907 Sep 2014 EP
2873371 May 2015 EP
3125764 Feb 2017 EP
2633815 Jun 2017 EP
3363350 Aug 2018 EP
3460688 Mar 2019 EP
3477551 May 2019 EP
3763285 Jan 2021 EP
3847956 Jul 2021 EP
2776960 Sep 2021 EP
3534372 Sep 2021 EP
3871184 Sep 2021 EP
3881758 Sep 2021 EP
3884868 Sep 2021 EP
3282380 Nov 2021 EP
3282381 Nov 2021 EP
3903672 Nov 2021 EP
3912139 Nov 2021 EP
3664026 Feb 2022 EP
3945469 Feb 2022 EP
3949860 Feb 2022 EP
3951705 Feb 2022 EP
3076854 Apr 2022 EP
3979259 Apr 2022 EP
3258446 May 2022 EP
4026143 Jul 2022 EP
4026491 Jul 2022 EP
4026492 Jul 2022 EP
4029438 Jul 2022 EP
3298959 Sep 2022 EP
3989828 Nov 2022 EP
3157411 Dec 2022 EP
3606437 Dec 2022 EP
4104765 Dec 2022 EP
4131150 Feb 2023 EP
4145391 Mar 2023 EP
3169237 Apr 2023 EP
4160528 Apr 2023 EP
3403582 Jun 2023 EP
3743883 Jun 2023 EP
3989832 Aug 2023 EP
3652747 Sep 2023 EP
4104766 Sep 2023 EP
3602485 Oct 2023 EP
4064181 Nov 2023 EP
3602487 Dec 2023 EP
H08-131429 May 1996 JP
2003-508152 Mar 2003 JP
2003-514600 Apr 2003 JP
2004-243117 Sep 2004 JP
2007-502644 Feb 2007 JP
2007-325920 Dec 2007 JP
4177217 Nov 2008 JP
2010-042247 Feb 2010 JP
2011-212314 Oct 2011 JP
2013-090799 May 2013 JP
2010-505493 Jul 2013 JP
2013-534154 Sep 2013 JP
2014-064915 Apr 2014 JP
2015-503416 Feb 2015 JP
2015-527901 Sep 2015 JP
2012324 Aug 2015 NL
WO 200121057 Mar 2001 WO
WO 2007066249 Jun 2007 WO
WO 2010033971 Mar 2010 WO
WO 2011038044 Mar 2011 WO
WO 2012021037 Feb 2012 WO
WO 2012021307 Feb 2012 WO
WO 2012173697 Dec 2012 WO
WO 2014027692 Feb 2014 WO
WO 2014064702 May 2014 WO
WO 2014111927 Jul 2014 WO
WO 2014111929 Jul 2014 WO
WO 2014111930 Jul 2014 WO
WO 2015059706 Apr 2015 WO
WO 2017199245 Nov 2017 WO
WO 2017199246 Nov 2017 WO
WO 2018165478 Sep 2018 WO
WO 2020053099 Mar 2020 WO
WO 2020084101 Apr 2020 WO
WO 2020201942 Oct 2020 WO
WO 2021016071 Jan 2021 WO
2021059165 Apr 2021 WO
WO 2021175039 Sep 2021 WO
WO 2021191909 Sep 2021 WO
WO 2021258835 Dec 2021 WO
WO 2022000727 Jan 2022 WO
WO 2022000729 Jan 2022 WO
WO 2022000733 Jan 2022 WO
WO 2022000734 Jan 2022 WO
WO 2022000976 Jan 2022 WO
WO 2022000977 Jan 2022 WO
WO 2022002765 Jan 2022 WO
WO 2022069208 Apr 2022 WO
Non-Patent Literature Citations (70)
Entry
Chen et al.; “3-D Reconstruction of Coronary Arterial Tree to Optimize Angiographic Visualization;” IEEE Transactions on Medical Imaging, vol. 19, No. 4, Apr. 2000; pp. 318-336 (Year: 2000).
International Search Report and Written Opinion, dated Dec. 15, 2020, for International Application Serial No. PCT/IB2020/058901 filed Sep. 23, 2020.
Abraham et al., “Alternative routes in road networks”, ACM Journal of Experimental Algorithmics, Association of Computing Machinery, vol. 18(1):1.3:2-1.3:17 (2013).
Andriotis et al., “A new method of three-dimensional coronary artery reconstruction from X-Ray angiography: Validation against a virtual phantom and multislice computed tomography”, Catheterization and Cardiovascular Interventions, vol. 71:28-43 (2008).
Barnea, “Model-based estimation of coronary vessel diameter in angiographic images”, Proceedings of the 20th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 20:513-516 (1998).
Barratt et al., “Reconstruction and quantification of the carotid artery bifurcation from 3-D ultrasound images”, IEEE Transactions on Medical Imaging, vol. 23(5):567-583 (2004).
Bullitt et al., “Determining malignancy of brain tumors by analysis of vessel shape”, Medical Image Computing and Computer-Assisted Intervention, MICCAI 2004 Conference Proceedings, Lecture notes in Computer Science, LNCS, 3217:645-653.
Caiati et al., “New noninvasive method for coronary flow reserve assessment: Contrast-enhanced transthoracic second harmonic echo doppler”, Circulation, vol. 99:771-778 (1999).
Caiati et al., “Detection, location, and severity assessment of left anterior descending coronary artery stenoses by means of contrast-enhanced transthoracic harmonic echo doppler”, European Heart Journal, vol. 30:1797-1806 (2009).
Chung, “Image segmentation methods for detecting blood vessels in angiography”, 2006 9th International Conference on Control, Automation, Robotics and Vision, Singapore, pp. 1-6 (2006).
Dickie et al., “Live-vessel: interactive vascular image segmentation with simultaneous extraction of optimal medial and boundary paths”, Technical Report TR 2009-23, School of Computing Science, Simon Fraser University, Burnaby, BC, Canada, Nov. 2009.
Frangi et al., “Multiscale vessel enhancement filtering”, Medical Image Computing and Computer-Assisted Intervention, MICCAI '98 Lecture Notes in Computer Science, vol. 1496:130-137 (1998).
Fraz, “Blood vessel segmentation methodologies, in retinal images—a survey”, Computer Methods and Programs in Biomedicine, vol. 108:407-433 (2012).
Fusejima, “Noninvasive measurement of coronary artery blood flow using combined two-dimensional and doppler echocardiography”, JACC vol. 10(5):1024-1031 (1987).
Hawkes et al., “Validation of volume blood flow measurements using three-dimensional distance-concentration functions derived from digital X-Ray angiograms”, Investigative Radiology, vol. 29(4):434-442 (1994).
Hoffmann et al., “Determination of instantaneous and average blood flow rates from digital angiograms of vessel phantoms using distance-density curves”, Investigative Radiology, vol. 26(3):207-212 (1991).
Holdsworth et al., “Quantitative angiographic blood-flow measurement using pulsed intra-arterial injection”, Medical Physics, vol. 26(10):2168-2175 (1999).
Huo et al., “Intraspecific scaling laws of vascular trees”, J.R. Soc. Interface vol. 9:190-200 (2012).
Janssen et al., “New approaches for the assessment of vessel sizes in quantitative (cardio-)vascular X-ray analysis”, Int J Cardiovasc Imaging vol. 26:259-271 (2010).
Kappetein et al., “Current percutaneous coronary intervention and coronary artery bypass grafting practices for three-vessel and left main coronary artery disease: Insights from the SYNTAX run-in phase”, European Journal of Cardio-Thoracic Surgery, vol. 29:486-491 (2010).
Kass et al., “Snakes: active contour models”, Int. J. Comput. Vis. vol. 1:321-331 (1987).
Kirkeeide, “Coronary obstructions, morphology and physiologic significance”, Quantitative Coronary Arteriography, Chap. 11:229-244 (1991).
Lethen et al., “Validation of noninvasive assessment of coronary flow velocity reserve in the right coronary artery—a comparison of transthoracic echocardiographic results with intracoronary doppler flow wire measurements”, European Heart Journal, vol. 24:1567-1575 (2003).
Li et al., “Minimization of region-scalable fitting energy for image segmentation”, in IEEE Transactions on Image Processing, vol. 17(10):1940-1949 (2008).
Meimoun et al., “Non-invasive assessment of coronary flow and coronary flow reserve by transthoracic doppler echocardiography: a magic tool for the real world”, European Journal of Echocardiography, vol. 9:449-457 (2008).
Mercer-Rosa et al., “Illustration of the additional value of real-time 3-dimensional echocardiography to conventional transthoracic and transesophageal 2-dimensional echocardiography in imaging muscular ventricular septal defects: does this have any impact on individual patient treatment”, Journal of the American Society of Echocardiography, vol. 19(12):1511-1519 (2006).
Molloi et al., “Quantification of fractional flow reserve using angiographic image data”, World Congress on Medical Physics and Biomedical Engineering, Munich, Germany, Sep. 7-12, 2009.
Molloi et al., “Estimation of coronary artery hyperemic blood flow based on arterial lumen volume using angiographic images”, Int J Cardiovasc Imaging, vol. 28:1-11 (2012).
Ng et al., “Novel QCA methodologies and angiographic scores”, Int J Cardiovasc Imaging vol. 27:157-165 (2011).
Pellot et al., “A 3D reconstruction of vascular structures from two X-Ray angiograms using an adapted simulated annealing algorithm”, IEEE Transactions of Medical Imaging, vol. 13(1):48-60 (1994).
Pinho et al., “Assessment and stenting of tracheal stenosis using deformable shape models”, Medical Image Analysis, vol. 15(2):250-266 (2010).
Polytimi et al., “Close to transplant renal artery stenosis and percutaneous transluminal treatment”, Journal of Transplantation, vol. 2011, 7 pages (2011).
Sarwal et al., “3-D reconstruction of coronary arteries”, Proceedings of the 16th Annual Intl. Conference of the IEEE Engineering in Medicine and Biology Society, Engineering Advances: New Opportunities for Biomedical Engineers, Nov. 3, 1994, pp. 504-505.
Sato et al., “A viewpoint determination system for stenosis diagnosis and quantification in coronary angiographic image acquisition”, IEEE Transactions on Medical Imaging, vol. 17(1):121-137 (1998).
Seifalian et al., “A new algorithm for deriving pulsatile blood flow waveforms tested using simulated dynamic angiographic data”, Neuroradiology, vol. 31:263-269 (1989).
Seifalian et al., “Blood flow measurements using 3D distance-concentration functions derived from digital x-ray angiograms”, Cardiovascular Imaging, Chap. 33:425-442 (1996).
Seifalian et al., “Validation of a quantitative radiographic technique to estimate pulsatile blood flow waveforms using digital subtraction angiographic data”, Journal of Biomedical Engineering, vol. 13(3):225-233 (1991).
Shang et al., “Vascular active contour for vessel tree segmentation”, in IEEE Transactions on Biomedical Engineering, vol. 58(4):1023-1032 (2011).
Shpilfoygel et al., “Comparison of methods for instantaneous angiographic blood flow measurement”, Medical Physics, vol. 26(6):862-871 (1999).
Sianos et al., “The SYNTAX score: an angiographic tool grading the complexity of coronary artery disease”, Euro Intervention, vol. 1(2):219-227 (2005).
Siogkas et al., "Quantification of the effect of percutaneous coronary angioplasty on a stenosed right coronary artery", 2010 10th IEEE Intl. Conference on Information Technology and Applications in Biomedicine, Nov. 3-5, 2010, pp. 1-4 (2010).
Slomka et al., “Fully automated wall motion and thickening scoring system for myocardial perfusion SPECT: Method development and validation in large population”, Journal of Nuclear Cardiology, vol. 19(2):291-302 (2012).
Sprague et al., “Coronary x-ray angiographic reconstruction and image orientation”, Medical Physics, vol. 33(3):707-718 (2006).
Sun et al., “Coronary CT angiography: current status and continuing challenges”, The British Journal of Radiology, vol. 85:495-510 (2012).
Takarada et al., “An angiographic technique for coronary fractional flow reserve measurement: in vivo validation”, International Journal of Cardiovascular Imaging, published online pp. 1-10, Aug. 31, 2012.
Termeer et al., “Visualization of myocardial perfusion derived from coronary anatomy”, IEEE Transactions on Visualization and Computer Graphics, vol. 14(6):1595-1602 (2008).
Tomasello et al., “Quantitative coronary angiography in the interventional cardiology”, Advances in the Diagnosis of Coronary Atherosclerosis, Chap. 14:255-272 (2011).
Tu et al., "Assessment of obstruction length and optimal viewing angle from biplane X-ray angiograms", Int J Cardiovasc Imaging, vol. 26:5-17 (2010).
Tu et al., “In vivo assessment of optimal viewing angles from X-ray coronary angiography”, EuroIntervention, vol. 7:112-120 (2011).
Tu et al., "In vivo assessment of bifurcation optimal viewing angles and bifurcation angles by three-dimensional (3D) quantitative coronary angiography", Int J Cardiovasc Imaging, published online Dec. 15, 2011, in 9 pages.
Tu et al., “The impact of acquisition angle differences on three-dimensional quantitative coronary angiography”, Catheterization and Cardiovascular Interventions, vol. 78:214-222 (2011).
Tuinenburg et al., "Dedicated bifurcation analysis: basic principles", Int J Cardiovasc Imaging, vol. 27:167-174 (2011).
Voci et al., “Coronary flow: a new asset for the echo lab?”, European Heart Journal, vol. 25:1867-1879 (2004).
Weickert et al., “A scheme for coherence-enhancing diffusion filtering with optimized rotation invariance”, Computer Vision, Graphics, and Pattern Recognition Group, Technical Report, Computer Science Series, pp. 1-20 (2000).
Weickert, "Anisotropic diffusion in image processing", ECMI Series, published by Teubner, Stuttgart, Germany, 181 pages (1998).
Weickert et al., “A scheme for coherence-enhancing diffusion filtering with optimized rotation invariance”, Journal of Visual Communication and Image Representation, vol. 13(1-2):103-118 (2002).
Wong et al., “Quantification of fractional flow reserve based on angiographic image data”, The International Journal of Cardiac Imaging, vol. 28(1):13-22 (2012).
Wong et al., “Determination of fractional flow reserve (FFR) based on scaling laws: a simulation study”, Physics in Medicine and Biology, vol. 53:3995-4011 (2008).
Wong et al., “Automated technique for angiographic determination of coronary blood flow and lumen volume”, Acad. Radiol. vol. 13:186-194 (2006).
Xu et al., “Snakes, shapes, and gradient vector flow”, IEEE Transactions on Image Processing, vol. 7:359-369 (1998).
Yang et al., “Novel approach for 3-D reconstruction of coronary arteries from two uncalibrated angiographic images”, IEEE Transactions on Image Processing, vol. 18(7):1563-1572 (2009).
Youssef et al., “Role of computed tomography coronary angiography in the detection of vulnerable plaque, where does it stand among others?”, Angiology, vol. 1(2):1000111-1-1000111-8 (2013).
Zhang et al., "Quantification of coronary microvascular resistance using angiographic images for volumetric blood flow measurement: in vivo validation", Am J Physiol Heart Circ Physiol, vol. 300(6):H2096-H2104 (2011).
Barrett et al., "Interactive live-wire boundary extraction", Medical Image Analysis, Oxford University Press, vol. 1(4):331-341 (1997).
Jiang et al., “Vascular tree reconstruction by minimizing a physiological functional cost”, 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition—workshops, San Francisco, CA, pp. 178-185, doi: 10.1109/CVPRW.2010.5543593.
Marchenko et al., "Vascular editor: from angiographic images to 3D vascular models", Journal of Digital Imaging, vol. 23:386-398 (2010).
Rabbat et al., “Interpreting results of coronary computed tomography angiography-derived fractional flow reserve in clinical practice”, Journal of Cardiovascular Computed Tomography, vol. 11(5):1-6 (2017).
Wang et al., “Optimal viewing angle determination for multiple vessel segments in coronary angiographic image”, IEEE Transactions on Nuclear Science, vol. 61(3):1290-1303 (2014).
Wang et al., “Global optimization angiographic viewing angles for coronary arteries with multiple segments”, 35th Annual International Conference of the IEEE EMBS, pp. 2640-2643, Osaka, Japan, Jul. 3-7, 2013.
Office Action in European Application No. 20781104.3, dated Sep. 4, 2023, in 5 pages.
Related Publications (1)
Number: 20220254131 A1; Date: Aug. 2022; Country: US
Provisional Applications (1)
Number: 62904147; Date: Sep. 2019; Country: US