The disclosure relates generally to systems and methods suitable for use in the field of intravascular diagnostics and imaging, more specifically to systems and methods that support identifying side branches, junctions or other sections or features of a blood vessel.
Coronary artery disease is one of the leading causes of death worldwide. The ability to better diagnose, monitor, and treat coronary artery disease can be of life-saving importance. Intravascular optical coherence tomography (OCT) is a catheter-based imaging modality that uses light to peer into coronary artery walls and generate images thereof for study. Utilizing coherent light, interferometry, and micro-optics, OCT can provide video-rate in-vivo tomography within a diseased vessel with micrometer-level resolution.
Viewing subsurface structures with high resolution using fiber-optic probes makes OCT especially useful for minimally invasive imaging of internal tissues and organs. This level of detail made possible with OCT allows a clinician to diagnose, as well as monitor, the progression of coronary artery disease. OCT images provide high-resolution visualization of coronary artery morphology and can be used alone or in combination with other information such as angiography data and other sources of subject data to aid in diagnosis and treatment planning.
OCT imaging of portions of a patient's body provides a useful diagnostic tool for doctors and others. For example, imaging of coronary arteries by intravascular OCT may reveal the location of a narrowing or stenosis, which reduces blood flow and increases the risk of ischemia. This information helps cardiologists to choose between an invasive coronary bypass surgery and a less invasive catheter-based procedure such as angioplasty or stent delivery to mitigate the stenosis and restore blood flow. The presence of arterial side branches in the stenosis region also affects blood flow through the artery, and therefore is an important factor when designing a treatment plan for the patient.
The quantitative assessment of vascular pathology and its progression involves the calculation of measures such as pressure drops, which can rely on the accurate identification of fluid flow and the geometry of the lumen, including side branch geometry. Side branches extending from a lumen in OCT images are often not easily identified. This is due in part to the fact that side branches can be obscured by the guidewire used with various OCT probes, or otherwise obscured by stent struts, blood, and shadows.
Further, shadows and other imaging data artifacts can be challenging to resolve and eliminate. As a result, important landmarks along the length of an artery such as side branches can be mistaken for tissue or simply not identified. Given that placing a stent over a side branch can be damaging and, when performed, should be done knowingly, there is a need for a reliable technique that can identify side branches.
The present disclosure addresses these challenges and others.
In part, the disclosure relates to a method of detecting one or more branches of a blood vessel. The method includes storing one or more intravascular image datasets of the blood vessel, each intravascular dataset comprising a plurality of A-lines; detecting a lumen boundary in a first A-line image generated from a set of A-lines from the plurality of A-lines, wherein the first A-line image has an r dimension and an A-line dimension; specifying a search distance T; defining a search region, the search region bounded by the detected lumen boundary and a boundary offset therefrom by distance T; detecting edges in the search region; and identifying a candidate branching region in response to the detected edges.
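As an illustrative, non-limiting sketch of defining such a search region, the ribbon of width T beyond the lumen boundary can be expressed as a mask over the polar (A-line versus r) image. The function name, the use of pixel units for T, and the array layout are assumptions for exposition, not the disclosed implementation.

```python
import numpy as np

def define_search_region(lumen_boundary, T, n_r):
    """Given a per-A-line lumen boundary (radial pixel indices) and a
    search distance T (in pixels), return a boolean mask over an
    (n_alines, n_r) polar image marking the ribbon between the lumen
    boundary and the boundary offset from it by T."""
    n_alines = len(lumen_boundary)
    mask = np.zeros((n_alines, n_r), dtype=bool)
    for a, r0 in enumerate(lumen_boundary):
        # search only from the lumen boundary outward by T pixels
        mask[a, r0:min(r0 + T, n_r)] = True
    return mask
```

Edge detection and candidate identification would then be restricted to pixels where this mask is true.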
In one embodiment, the method includes flattening the A-line image using a first image processing operator; applying median smoothing to the A-line image using a second image processing operator; and applying smoothing to the A-line image using a third image processing operator to generate a filtered image. In one embodiment, the method includes identifying a first minimum-maximum pair in the filtered image, wherein one or more distances between the first minimum-maximum pair defines a first search window. In one embodiment, the method includes identifying a second minimum-maximum pair in the filtered image, wherein one or more distances between the second minimum-maximum pair defines a second search window.
In one embodiment, the method includes searching along the r dimension in the corresponding pre-processed input image within the first search window. In one embodiment, the method includes designating pixels below a noise floor threshold located in the first search window as corresponding to the candidate branching region. In one embodiment, the noise floor threshold is less than about 2 mm. In one embodiment, the method includes splitting the candidate branching region into three bands, wherein the sum of the widths of the three bands is equal to T. In one embodiment, the method includes, for each band, accumulating pixels along each A-line that correspond to the candidate branching region.
In one embodiment, the method includes, wherein if a particular A-line has more than between about 10% and about 30% of its pixels marked as a candidate branch, marking that A-line in that band as corresponding to a branch. In one embodiment, the method includes outputting a set of A-lines for each band that correspond to a candidate branch.
In one embodiment, the method includes generating a branching matrix using frames of a pullback, the frames comprising A-lines and angular data. In one embodiment, the method includes isolating pixels corresponding to a grouping of all three bands and a grouping of the first two bands to select pixels corresponding to a side branch. In one embodiment, the method includes removing a guidewire region from the branching matrix. In one embodiment, the method includes eliminating branches that appear only in one frame. In one embodiment, the method includes replicating the branching matrix to account for overlap across zero.
In one embodiment, the first band ranges from 0 to T/3, the second band ranges from T/3 to 2T/3, and the third band ranges from 2T/3 to T. In one embodiment, the method includes displaying one or more detected side branches in a user interface. In one embodiment, the method includes validating one or more candidate side branches using a branching matrix, the branching matrix generated using pixels selected from two or more bands, wherein the sum of the bands is T.
Other features and advantages of the disclosed embodiments will be apparent from the following description and accompanying drawings.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The figures are not necessarily to scale, emphasis instead generally being placed upon illustrative principles. The figures are to be considered illustrative in all aspects and are not intended to limit the disclosure, the scope of which is defined only by the claims.
In part, the disclosure relates to an automated method of branch detection with regard to a blood vessel imaged using an intravascular modality such as OCT, IVUS, or other imaging modalities. The term branch refers to one or more branches of a blood vessel such as a side branch. In one embodiment, the disclosure relates to performing branch detection as an intermediate step in a pipeline of software modules, operators, and stages. The various stages transform intravascular data and perform feature detection such as shadow and lumen detection thereon. Branch detection can be performed after an OCT or IVUS pullback and the resulting intravascular data can be processed using a lumen detection software module to extract lumen data such as information relating to a lumen boundary.
In part, the invention relates to various methods of collecting and processing data such as frames of intravascular data. In one embodiment, a frame of intravascular data or image data includes a cross-sectional image generated from a plurality of A-lines (scan lines) obtained using a rotatable intravascular probe. A cross-sectional image of a blood vessel is formed from a collection of scan lines as the probe rotates.
In one embodiment, prior to branch detection, shadow detection is performed to identify regions of interest from the underlying intravascular data. Shadows are of interest because they can correspond to different features such as blood pools, branches such as side branches, and guidewire segments. Guidewire segments arise from the guidewire used to position the intravascular imaging probe in the artery. In one embodiment, once a guidewire (or guidewires) has been identified and validated, guidewire-generated positional or pixel markings on a given frame or scan line can be provided to other intravascular data processing modules. As an example, validated guidewire detections can be an input to a side branch detection module. The output of side branch detection can also be input into other processing stages to generate information of interest with regard to the intravascular pullback data.
In part, the disclosure describes various methods and sub-methods relating to branch detection and the evaluation of parameters relating thereto. In one embodiment, the method is an automated method that operates upon intravascular data based on a user interface input to detect side branches or as part of other image processing that uses side branch detections as inputs.
Side branches in coronary arteries can be used to model the normal diameter of the artery at each segment as the artery tapers. The location and diameter of the side branch is an important input to calculate and predict flow along the artery.
A new software algorithm has been developed that automatically detects the location of side branches in OCT images and provides an estimate of their diameters. The algorithm identifies the frames, and the scan lines within those frames, that are part of the branching region.
In one embodiment, the software-based methods work on scan lines in polar coordinate space and use a combination of image processing filters and algorithms to detect the rising and falling intensity gradients of the side branch wall. In one embodiment, the software generates a matrix from scan line data that is organized based on frames along the pullback. The matrix includes data from the pull-back that collects information beyond the lumen by an offset amount or other distance extending into the tissue or side branches. This matrix (Branching Matrix) is parsed to get information about possible branch locations and is used to measure the branch diameters.
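The assembly of the branching matrix described above can be sketched as follows. This is a non-limiting illustration: the function name, the choice of a per-band count as the matrix entry, and the input format are assumptions made for clarity rather than the disclosed data structure.

```python
import numpy as np

def build_branching_matrix(per_frame_branch_alines, n_alines, n_frames):
    """Assemble a 2-D branching matrix: rows index A-line angle, columns
    index pullback frame. Each entry records the number of bands (0-3)
    in which that A-line was marked as a candidate branch for that frame."""
    M = np.zeros((n_alines, n_frames), dtype=np.uint8)
    for frame, marks in per_frame_branch_alines.items():
        for aline, n_bands in marks:
            M[aline, frame] = n_bands
    return M
```

The matrix can then be parsed for contiguous regions of marked A-lines across neighboring frames, which correspond to possible branch locations.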
As shown in
In one embodiment, these data sets, or collections of frames of image data, can be used to identify regions of interest such as a stenosis or a deployed stent. In one embodiment, the data collection probe 7 is an OCT probe. The probe 7 can include a probe tip 17. When an OCT probe is used for probe 7, it is configured for use with a version of system 10 that includes an interferometer and a data processing system. The distance measurements collected using the data collection probe 7 can be processed to generate frames of image data such as cross-sectional views or longitudinal views (L-mode views) of the blood vessel. For clarity, a cross-sectional view can include without limitation a longitudinal view. These images can be processed using one or more image data processing modules or stages.
The data collection probe 7 is shown prior to or after insertion in a blood vessel. The data collection probe 7 is in optical communication with an OCT system 10. The OCT system 10 that connects to data collection probe 7 via an optical fiber 15 can include a light source such as a laser, an interferometer having a sample arm and a reference arm, various optical paths, a clock generator, photodiodes, and other OCT system components. The probe 7 is disposed in an artery 8 having branches B and blood pools BP.
In one embodiment, an optical receiver 31, such as a balanced photodiode based system, can receive light exiting the data collection probe 7. A computing device 40 such as a computer, processor, ASIC, or other device can be part of the OCT system 10 or can be included as a separate subsystem in electrical or optical communication with the OCT system 10. The computing device 40 can include memory device(s) 41, storage, buses and other components suitable for processing data and software components 44 such as image data processing stages configured for stent visualization, stent malapposition detection, lumen detection, offset generation, search region 151 definition, side branch detection 45, guidewire detection, branching matrix generation, pullback data collection and others. Although the branch detection module 45 is shown as a separate software module it can also be one of the software components 44. The branching matrix generation software can be part of the branch detection module 45 or be a separate software module.
In various embodiments, the computing device 40 includes or accesses software modules or programs 44, such as a side branch detection module, a guidewire detection module, a lumen detection module, a stent detection module, a median mask clearing module, an intensity averaging module, a stent malapposition detection module, a carina detection module, and other software modules. For example, the computing device 40 can access a side branch detection module 45 for detecting side branches. In particular, the module is calibrated to use certain branching characteristics as signatures to improve branching accuracy.
In one embodiment, the side branch detection module 45 generates or operates upon a two dimensional branching matrix and isolates candidate side branches using the matrix or as otherwise described herein. In one embodiment, the branching characteristics can include an arrangement of intravascularly detected features such as a noise floor and rising or falling gradients. The software modules or programs 44 can include an image data processing pipeline or component modules thereof and one or more graphical user interfaces (GUI).
An exemplary image processing pipeline is used for transforming collected intravascular data into two dimensional and three dimensional views of blood vessels and stents. The image data processing pipeline or any of the methods described herein are stored in memory and executed using one or more computing devices such as a processor, device, or other integrated circuit.
In one embodiment, the software modules 44 also include additional features relating to blood flow detection or include such features in lieu of side branch detection. In one embodiment, the software modules 44 determine the diameter of one or more side branches and predict blood flow across these side branches. The software modules 44 can also include or be in communication with user interface software components to toggle side branch blood flow views on and off and to display and toggle the various user interface display modes such as stent planning, fly through and other viewing modes described herein.
As shown in
Data collection system 10 can be used to display image data relating to blood flow associated with detected side branches for the vessel. In one embodiment, one or more steps can be performed automatically or without user input other than initial user input to navigate relative to one or more images, enter information, select or interact with an input such as a controller or user interface component, or otherwise indicate one or more system outputs. In one embodiment, a blood flow view is presented as an option to select to facilitate review of a two or three-dimensional view of a representation of the vessel and one or more side branches. Toggling between one or more viewing modes in response to user inputs can be performed relative to various steps described herein.
Representations of a stent and a lumen boundary such as OCT or IVUS images thereof can be shown to a user via display 46. Side branch detection, shadow detection and stent detection are performed prior to the display of these features and any coding or tagging with identifying indicia that may be included in the displayed image. This OCT-based information 47 can be displayed using one or more graphic user interface(s) (GUI). The images of
In addition, this information 47 can include, without limitation, cross-sectional scan data, longitudinal scans, diameter graphs, image masks, shadow regions, stents, areas of malapposition, lumen border, perpendicular distances measured relative to an automatically detected lumen border and a perpendicular distance extending from the lumen border having a distance T, and other images or representations of a blood vessel or the underlying distance measurements obtained using an OCT system and data collection probe.
The computing device 40 can also include software or programs 44, which can be stored in one or more memory devices 41, configured to identify stent struts and malapposition levels (such as based on a threshold and measured distance comparison) and other blood vessel features such as with text, arrows, color coding, highlighting, contour lines, or other suitable human or machine readable indicia.
The display 46 depicts various views of the blood vessel, in accordance with an embodiment. The display can include a menu for showing or hiding various features, such as a menu for selecting blood vessel features to display, and a menu for selecting the virtual camera angle of the display. The user can toggle between multiple view angles on the user display. In addition, the user can toggle between different side branches on the user display, such as by selecting particular side branches and/or by selecting a view associated with a particular side branch.
For example, the user can select an ostium view, which can be the default view in one embodiment or a carinal/carina view to allow them to view a carina for one or more side branches. In one embodiment, the image processing pipeline and associated software modules detect the lumen boundary, guidewires, other shadows, stents, and the side branches in the artery imaged using the data collected during a pullback.
For example, the lumen boundary can be detected using the distance measurements obtained from the optical signals collected at the probe tip 17 using a lumen detection software component or module. In lieu of a fiber, an ultrasound transducer can be used suitable for collecting IVUS signals with regard to the vessel wall and one or more stents.
The lumen detection software can include one or more steps. For example, to perform lumen detection in one embodiment a filter or other image processing device can be applied to a two dimensional image to detect edges in the images, the edges indicative of a lumen boundary. In another embodiment, a scan line based approach is used. During one or more pullbacks, optical or ultrasound signals are collected as scan lines with respect to a blood vessel and one or more stents disposed in the lumen of the vessel. In one embodiment, the lumen detection software executing on a computing device generates one or more images from the set of scan lines.
Further, lumen detection can include generating a binary mask of the vascular image using the computing device, wherein the binary mask is generated using an intensity threshold. As another step, a plurality of scan lines is defined in the binary mask. With regard to each scan line of the plurality of scan lines, in one embodiment, a region is identified as lumen boundary tissue thereon. Contour segments of the boundary are identified based on the presence of a region of lumen boundary tissue. In one embodiment, the method identifies neighboring contour segments. The lumen boundary detection method can also include interpolating missing contour data between neighboring contour segments. As a result, in one embodiment, the neighboring contour segments and the interpolated missing contour data define the lumen boundary.
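The binary-mask and interpolation steps above can be sketched in the following non-limiting example. The threshold-and-first-hit boundary rule and linear interpolation are illustrative assumptions; the disclosure describes contour segments more generally.

```python
import numpy as np

def detect_lumen_boundary(polar_img, intensity_threshold):
    """Sketch of scan-line lumen detection: threshold the polar image
    into a binary mask, take the first above-threshold pixel along each
    scan line as the boundary, and linearly interpolate missing contour
    data between neighboring contour segments."""
    mask = polar_img >= intensity_threshold
    n_alines = mask.shape[0]
    boundary = np.full(n_alines, np.nan)
    for a in range(n_alines):
        hits = np.flatnonzero(mask[a])
        if hits.size:
            boundary[a] = hits[0]
    # interpolate missing contour data between neighboring segments
    idx = np.arange(n_alines)
    good = ~np.isnan(boundary)
    boundary[~good] = np.interp(idx[~good], idx[good], boundary[good])
    return boundary
```

Scan lines with no above-threshold response (for example, inside a shadow) receive interpolated boundary values from their neighbors, so the contour remains defined on every A-line.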
Once the intravascular data, such as frames and scan lines from the pullback, is obtained with a probe and stored in memory 41, it can be processed to generate information 47 such as a cross-sectional, a longitudinal, and/or a three-dimensional view of the blood vessel along the length of the pullback region or a subset thereof. These views can be depicted as part of a user interface as shown in the figures. The images of the blood vessel generated using the distance measurements obtained from the intravascular data collection system provide information about the blood vessel and objects disposed therein.
Accordingly, in part, the disclosure relates to software-based methods and related systems and devices suitable for evaluating and depicting information regarding a blood vessel, a stent or other vascular information of interest. The intravascular data can be used to generate 2-D views such as cross-sectional and longitudinal views of a blood vessel before or after an initial stent deployment or corrective stent related procedure. The intravascular data obtained using a data collection probe and various data processing software modules can be used to identify, characterize, and visualize a stent and/or one or more properties relating to the stent and/or the lumen in which it is disposed.
Stent position relative to the wall of the blood vessel and in relation to openings for side branches in the wall of the blood vessel can be visualized such that the side branch openings are not blocked by the stent. In one embodiment, side branches are identified and visualized to aid in treatment planning and stent placement.
In one embodiment, guidewire detection is performed initially such that shadows and guidewire segments can be excluded from side branch shadows to increase detection accuracy. In various embodiments, the intravascular data collection system and associated software modules detect branching characteristics in the OCT image data within a predetermined scan depth region T. The T value can be used to define a search region 151. The T value can be input via a graphic user interface.
In one embodiment, T is specified as about 660 μm and is used as an offset from the lumen to define an offset lumen boundary demarcated by line 108. The region of interest to search for branch detection is defined, in one embodiment, by the lumen boundary 106 and the line 108, which is the lumen boundary 106 shifted by the distance T. The curved strip or ribbon 151 of width T bounded by the dotted line (lumen boundary 106) and the shifted lumen boundary 108 specifies a subset of intravascular image data to search for side branches. T can range from about 500 μm to about 800 μm.
Branch Detection Embodiment
The software-based branching method first scans for a branching signature or pattern defined by a noise floor 110 (also referred to as NF), or a signalless region, between a falling response gradient 112 and rising response gradient 114. In one embodiment, the noise floor is the zone or region in which the tissue intensity has dropped off to the same value as in the cleared lumen. The noise floor 110 can correspond to the transitional region in between rising and falling intensity gradients in a search zone or region. Next, any image frames that fit this signature or pattern are marked as candidate branching regions. Candidate branching regions are then combined across all frames. In turn, the branching regions are optionally parsed relative to a branching matrix and the diameter of each side branch is estimated. The software-based branching method is described in more detail herein.
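The falling-gradient/noise-floor/rising-gradient signature can be illustrated with the following non-limiting sketch, which scans a 1-D intensity profile for a signalless run bounded by signal on both sides. The minimum run length and the use of a simple threshold are assumptions for exposition.

```python
import numpy as np

def has_branch_signature(profile, noise_floor, min_run=3):
    """Check a 1-D intensity profile within the search ribbon for the
    branch signature: a falling response into a noise-floor run followed
    by a rising response out of it."""
    below = profile <= noise_floor
    n = len(profile)
    i = 0
    while i < n:
        if below[i]:
            j = i
            while j < n and below[j]:
                j += 1
            # signal must exist on both sides of the noise-floor run,
            # i.e. a falling edge at i and a rising edge at j
            if i > 0 and j < n and (j - i) >= min_run:
                return True
            i = j
        else:
            i += 1
    return False
```

A run that touches the start or end of the profile lacks one of the two gradients and is therefore not marked, which helps reject open-ended signal dropouts.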
Initially, after a pullback, raw A-line images are pre-processed to flatten the images, making it easier to identify side branches.
The pre-processed image 125 of
In one embodiment, a portion of the flattened image 125 of
Further, in one embodiment the filter is a smoothing filter. In one embodiment, the filter 133 is an edge finding filter. In one embodiment, the filter 133 is a combination smoothing and an edge finding filter which can be filters F1 and F2.
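A non-limiting sketch of the smoothing-then-edge-finding chain follows. The specific window sizes, the moving-average smoother standing in for the Gaussian, and the gradient operator standing in for the edge filter are illustrative assumptions, not the disclosed filters F1 and F2.

```python
import numpy as np

def preprocess_aline_image(img, median_w=3, smooth_w=5):
    """Sketch of pre-processing: median smoothing along the r dimension,
    then a moving-average smoothing, then a gradient (edge) response
    whose positive and negative lobes mark rising and falling edges."""
    out = np.empty_like(img, dtype=float)
    pad = median_w // 2
    padded = np.pad(img.astype(float), ((0, 0), (pad, pad)), mode='edge')
    for a in range(img.shape[0]):
        for r in range(img.shape[1]):
            out[a, r] = np.median(padded[a, r:r + median_w])
    kernel = np.ones(smooth_w) / smooth_w
    smoothed = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode='same'), 1, out)
    edges = np.gradient(smoothed, axis=1)  # rising/falling responses
    return smoothed, edges
```

The minima and maxima of the edge response along a scan line then provide the minimum-maximum pairs that bound the search windows described herein.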
As an additional step, an edge filtered image such as image 135 of
As shown in
With regard to
The noise floor of
Further, with regard to
In one embodiment, the occurrence of a peak and a trough corresponds to a change in gradient intensity that defines a search space. If there is a signal below the noise floor in the search space then the corresponding pixels correspond to candidate branching regions. These candidate regions are subsequently analyzed to determine if they are valid side branches.
In one embodiment, the candidate regions are split into search bands or zones as shown in
In one embodiment, as part of the selection/specification of regions to be searched, the region is split or subdivided into three bands (band 1, band 2, and band 3). In one embodiment, each band is processed separately. The bands can also be processed in parallel or differentially with features from one band being compared to features in one or more of the other bands. Although three bands are shown, one, two or more bands can be specified for candidate branch searching.
In one embodiment, for each specified search band the method accumulates marked pixels along each A-line. If a particular A-line has more than about 10-35% of its pixels marked, that A-line in that band is marked as corresponding to a branch. This approach implies at least 10-35% of the pixels in the search region were at or below the noise floor.
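The per-band accumulation above can be sketched as follows. The band boundaries, the 20% fraction (chosen from within the stated 10-35% range), and the function name are illustrative assumptions.

```python
import numpy as np

def mark_branch_alines(candidate_mask, band_slices, frac=0.2):
    """For each band (a radial slice of the search ribbon), count the
    candidate pixels along every A-line and mark the A-line when more
    than `frac` of its pixels in that band are candidates."""
    marked = []
    for sl in band_slices:
        band = candidate_mask[:, sl]
        counts = band.sum(axis=1)
        marked.append(counts > frac * band.shape[1])
    return marked  # one boolean vector of A-lines per band
```

With T split into three equal bands, the three returned vectors are the per-band sets of A-lines that correspond to a candidate branch.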
In one embodiment, the software modules are used to parse a branching matrix to isolate candidate branch regions that are most likely to be branches. This step is performed based on one or more rules for all three branching regions. The guidewire is removed in some embodiments.
In one embodiment, blood pooling and other sources of false branch positives are addressed. Thrombus, blood pooling and other artifacts represent major causes of false positives. The blood attenuates the signal, which can mimic the branching region. An intensity threshold that can identify the blood pooling pixels is calculated for each frame.
A blocking index is calculated based on the number of blood pooling pixels detected within the lumen for each detected branch. This index correlates to blood pooling and thrombus inside the lumen and provides a score for each detected branch. The index is high when there is a large amount of signal-attenuating blood or thrombus inside the lumen. Branches with a high blocking index are rejected as false positives. Those within an acceptable range are preserved as true positives. Branches with a mid-range blocking index can be flagged for further review.
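One non-limiting way to realize such a blocking index is as the fraction of lumen pixels whose intensity exceeds the per-frame blood-pooling threshold; this exact formula is an illustrative assumption rather than the disclosed calculation.

```python
import numpy as np

def blocking_index(frame_img, lumen_mask, blood_threshold):
    """Fraction of pixels inside the lumen whose intensity exceeds a
    blood-pooling threshold; a high value suggests signal-attenuating
    blood or thrombus, so branches detected on such frames can be
    rejected as false positives."""
    lumen_pixels = frame_img[lumen_mask]
    if lumen_pixels.size == 0:
        return 0.0
    return float(np.count_nonzero(lumen_pixels >= blood_threshold)
                 / lumen_pixels.size)
```

A branch could then be accepted, flagged, or rejected by comparing its frames' indices against low and high cut-off values.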
The branching matrices 250 and 300 (
Branching Matrix Legend (distance from lumen/band groupings for example T=660 μm):
With respect to
The method may also include eliminating branches that appear only in one frame. In one embodiment, given that the angles span 360 degrees, sections of the matrix can be replicated to cover the upper and lower horizontal axes of the matrix, depending on the orientation and overlap of the ends of the matrix (which is based on a cylindrical arrangement). In one embodiment, the morphological operators can include the application of a 1D image opening operation (7 pixels) along the A-line dimension to eliminate A-lines with a negligible amount of data coverage. In addition, filters to emphasize connections can be applied to perform a connected component analysis to identify an individual component as a single branch. Cross-frame data can also be used to connect blobs in the matrix that are part of one branch.
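A non-limiting sketch of this cleanup stage follows: a 1-D opening of length 7 along the A-line axis (implemented here by run-length filtering, which is equivalent for a flat structuring element), replication across zero degrees to respect the cylindrical wrap, and removal of branches supported by only a single isolated frame. The replication length and the single-frame test are illustrative assumptions.

```python
import numpy as np

def clean_branching_matrix(M, open_len=7, wrap=True):
    """Sketch of branching-matrix cleanup: keep only vertical runs of
    marked A-lines at least `open_len` long (a 1-D opening), handle the
    0/360-degree wrap by replication, and drop single-frame branches."""
    B = M.astype(bool)
    if wrap:  # replicate so runs crossing zero degrees stay connected
        B = np.vstack([B, B[:open_len]])

    def runs_kept(col):
        # flat-element 1-D opening: discard runs shorter than open_len
        out = np.zeros_like(col)
        i = 0
        while i < len(col):
            if col[i]:
                j = i
                while j < len(col) and col[j]:
                    j += 1
                if j - i >= open_len:
                    out[i:j] = True
                i = j
            else:
                i += 1
        return out

    B = np.apply_along_axis(runs_kept, 0, B)
    B = B[:M.shape[0]]
    # eliminate branches that appear only in one isolated frame
    frame_support = B.any(axis=0)
    for f in range(B.shape[1]):
        left = frame_support[f - 1] if f > 0 else False
        right = frame_support[f + 1] if f + 1 < B.shape[1] else False
        if frame_support[f] and not (left or right):
            B[:, f] = False
    return B
```

A connected component analysis could then be run on the cleaned matrix to label each remaining blob as a single branch.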
In part, the disclosure relates to an automated method of branch detection that includes the step of detecting branching characteristics within a region having a scan depth T. In one embodiment, T is a distance measured from the lumen and can be used to define a boundary offset by a distance T from the lumen boundary. In one embodiment, T ranges from about 400 μm to about 700 μm. In one embodiment, T is about 660 μm. In one embodiment, T is an approximation of a vessel wall thickness or a scan depth thickness selected to specify a search zone for finding side branches. In one embodiment, the branching characteristics include one or more of the following: a noise floor or a signalless region between falling and rising gradients. The use of rising and falling segments relative to a noise floor as a detection signature advantageously improves detection accuracy with regard to large branches.
In one embodiment, the automated method of branch detection includes the step of combining candidate branching regions across all frames, substantially all frames or M frames, wherein M is 2 or more. In one embodiment, the automated method of branch detection includes the step of parsing the branching regions to identify candidate branches and estimating branch diameter of such candidate branches. In one embodiment, side branches having a diameter D greater than about 1 mm are tracked. In one embodiment, a large branch is a branch having a diameter D greater than or equal to about 1 mm.
Large branches make an increased contribution to flow and thus can significantly affect FFR, VFR, and other blood flow based measurements. As a result, once detected, branches having diameters of interest, such as those greater than or equal to about 1 mm, are tracked and evaluated to verify their characterization as a branch as opposed to a false detection such as a shadow. Other diameters of interest can include a branch diameter D that ranges from greater than about 0.4 mm to less than about 2 mm.
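As a non-limiting geometric illustration of estimating an ostium diameter from the angular span of the marked A-lines, the chord subtended on the lumen circle can be used: d = 2R·sin(θ/2). The disclosure does not specify this exact formula; it is an assumption for exposition.

```python
import math

def estimate_branch_diameter(n_branch_alines, n_alines_per_frame,
                             lumen_radius_mm):
    """Rough estimate of a side-branch ostium diameter from the angular
    span of its marked A-lines: the chord d = 2*R*sin(theta/2) subtended
    on the lumen circle of radius R by the span theta."""
    theta = 2 * math.pi * n_branch_alines / n_alines_per_frame
    return 2 * lumen_radius_mm * math.sin(min(theta, math.pi) / 2)
```

For example, a branch spanning a quarter of the A-lines in a frame of a 1.5 mm radius lumen subtends 90 degrees and yields a chord of about 2.1 mm, above the 1 mm threshold for a large branch.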
In one embodiment, the automated method of branch detection includes the step of generating a representative two-dimensional matrix of A-lines (also referred to as scan lines) versus frames to define a branching matrix. In one embodiment, the vertical axis is used to represent the A-lines with units of angles such as degrees ranging from 0 to 360 degrees and the horizontal axis has units corresponding to frame numbers. The angular range can be shown as greater than 360 degrees; however, the additional angle section of the matrix typically overlaps with earlier angular values in the matrix.
Accordingly, in one embodiment, the frame numbers can start at 0 or 1 and continue through J, wherein J is the number of frames in the pullback. The automated method of branch detection includes the step of parsing the branching matrix to isolate branch candidates. In one embodiment, guidewire detection and lumen detection are performed prior to performing branch detection. The guidewire visible in frames is removed in some embodiments.
A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus, such as a computing device, to perform the actions.
One general aspect includes a method of automatically detecting one or more side branches that includes generating a branching matrix comprising scan line data and frame designators. The method also includes storing, using an intravascular imaging system, one or more intravascular image datasets of the blood vessel; each intravascular dataset including a plurality of A-lines. The method also includes generating an A-line image with a detected lumen boundary of the blood vessel. An offset T can also be used to shift a representation of a detected lumen in a direction away from the imaging probe into a tissue or branch region.
In one embodiment, the method also includes increasing the intensity of edges in the A-line image. The method also includes suppressing smooth regions in the A-line image. The method also includes specifying a search distance, the offset T, relative to the lumen. The offset T can define a region or band in which to search for candidate side branch regions. The method also includes identifying local min-max pairs in a filtered image. In one embodiment, the method also includes searching along the radial dimension r in the corresponding pre-processed input image.
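The edge-enhancement and min-max pairing steps can be sketched per A-line as follows; the first-difference operator, the function name, and the specific band parameters are assumptions for illustration, not the disclosure's required filter.

```python
import numpy as np

def find_edge_pair(aline, lumen_idx, offset, depth):
    """Within a search band starting `offset` samples beyond the detected
    lumen boundary and extending `depth` samples radially, enhance edges
    with a first difference (smooth regions map to ~0) and return the
    indices of the strongest rising and falling edges -- a local min-max
    pair marking a candidate tissue/branch transition."""
    band = aline[lumen_idx + offset : lumen_idx + offset + depth].astype(float)
    grad = np.diff(band)  # edge enhancement; suppresses smooth regions
    base = lumen_idx + offset
    return base + int(np.argmax(grad)), base + int(np.argmin(grad))
```

For example, an A-line that is dark, then bright between samples 40 and 59, then dark again yields the pair (39, 59) when searched with `lumen_idx=30`, `offset=5`, `depth=40`.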
In one embodiment, the method also includes marking pixels below a noise floor threshold in the pre-processed input image as candidate branching regions. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
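The noise-floor marking step can be sketched as follows; the function name and the example threshold are hypothetical, and the per-row loop simply applies the test inside the search band defined by the offset T.

```python
import numpy as np

def mark_branch_candidates(img, lumen, offset, depth, noise_floor):
    """For each A-line (row of the pre-processed image), mark pixels in
    the search band [lumen+offset, lumen+offset+depth) whose intensity
    falls below the noise-floor threshold as candidate branching pixels;
    where the beam exits into a side branch, the band is dark."""
    marked = np.zeros(img.shape, dtype=bool)
    for i, r0 in enumerate(lumen):
        lo = r0 + offset
        marked[i, lo : lo + depth] = img[i, lo : lo + depth] < noise_floor
    return marked
```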
Implementations may include one or more of the following features. The method further includes flattening the A-line image using a first image-processing operator. The method may also include applying median smoothing to the A-line image using a second image-processing operator. The method may also include applying Gaussian smoothing to the A-line image using a third image-processing operator.
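A minimal sketch of the three operators, assuming SciPy's median and Gaussian filters as stand-ins for whichever smoothing operators a given implementation selects; the kernel size, sigma, and function name are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def preprocess_aline_image(img, lumen):
    """Flatten the A-line image by shifting each row so the detected
    lumen boundary sits at column 0 (first operator), then apply median
    smoothing for speckle suppression (second operator) followed by
    Gaussian smoothing (third operator)."""
    flat = np.zeros_like(img, dtype=float)
    for i, r0 in enumerate(lumen):
        flat[i, : img.shape[1] - r0] = img[i, r0:]
    return gaussian_filter(median_filter(flat, size=3), sigma=1.0)
```

Flattening first means the subsequent smoothing kernels operate on a lumen-aligned image, so edges of interest sit at comparable columns across A-lines.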
In one embodiment, the method further includes dividing the candidate branching regions into N bands, such as 1, 2, 3, or more bands, and processing each band separately. In one embodiment, the bands have the same thickness. In one embodiment, the thickness or width of a band is T/N for N bands. The method further includes accumulating marked pixels along each A-line. The pixels can be marked or otherwise tracked using software to identify a given pixel as corresponding to a shadow, a guidewire pixel, a branch pixel such as a side branch pixel, a lumen pixel, a blood pixel, and other pixels corresponding to imaged intravascular objects or shadows or reflections thereof.
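The band division and accumulation steps can be sketched as follows; the function name is hypothetical, and the sketch assumes the search band depth divides evenly into the N sub-bands of width T/N.

```python
import numpy as np

def accumulate_marked(marked, num_bands):
    """Split the radial search band of marked candidate pixels into
    num_bands equal sub-bands of width T/N and accumulate the number of
    marked pixels along each A-line within every sub-band."""
    n_alines, depth = marked.shape
    width = depth // num_bands
    counts = np.zeros((n_alines, num_bands), dtype=int)
    for b in range(num_bands):
        counts[:, b] = marked[:, b * width : (b + 1) * width].sum(axis=1)
    return counts
```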
In one embodiment, if a particular A-line has more than a threshold fraction of its pixels, such as between about 10% and about 30%, marked as branch pixels, the method marks the A-line in that band as a branch or as an A-line containing a branch. The method further includes generating a branching matrix during frame-by-frame processing. The method further includes isolating white pixels (pixels marked in all three bands) and yellow pixels (pixels marked in the first two bands) that neighbor white pixels. The method further includes removing the guidewire region. The method further includes eliminating branches that appear in only one frame. Thus, failure of a branch to appear in multiple frames can be used to exclude candidate branches. The method further includes replicating the branching matrix to account for angular overlap across zero degrees. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
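A sketch of the frame-by-frame matrix logic described above, assuming a 20% threshold within the disclosed 10-30% range; the function name, the per-A-line persistence test standing in for the single-frame elimination, and the quarter-turn replication amount are all illustrative assumptions.

```python
import numpy as np

def finalize_branching_matrix(counts_per_frame, band_width, frac=0.2):
    """Build a branching matrix (A-lines x frames): an A-line is marked
    in a frame when more than `frac` of the pixels in some sub-band are
    marked as branch pixels. A-lines marked in only a single frame are
    then discarded, and the matrix is replicated in the angular
    direction so branches straddling 0/360 degrees remain contiguous."""
    # counts_per_frame: list of (n_alines, n_bands) count arrays, one per frame
    bm = np.stack(
        [(c.max(axis=1) > frac * band_width) for c in counts_per_frame],
        axis=1,
    ).astype(np.uint8)
    # eliminate branch marks that appear in only one frame
    persistent = bm.sum(axis=1) > 1
    bm *= persistent[:, None].astype(np.uint8)
    # replicate a quarter turn of rows to handle overlap across zero
    return np.vstack([bm, bm[: bm.shape[0] // 4]])
```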
Although the invention relates to different aspects and embodiments, it is understood that the different aspects and embodiments disclosed herein can be integrated together as a whole or in part, as appropriate. Thus, each embodiment disclosed herein can be incorporated in each of the aspects to varying degrees as appropriate for a given implementation and steps from various methods can be combined without limitation. Notwithstanding the foregoing and the other disclosure herein, embodiments disclosed herein may also be applied in the context of bi-polar based systems and methods as applicable.
Non-limiting Software Features and Embodiments for Implementing Branch Detection
The following description is intended to provide an overview of device hardware and other operating components suitable for performing the methods of the disclosure described herein. This description is not intended to limit the applicable environments or the scope of the disclosure. Similarly, the hardware and other operating components may be suitable as part of the apparatuses described above. The disclosure can be practiced with other system configurations, including personal computers, multiprocessor systems, microprocessor-based or programmable electronic devices, network PCs, minicomputers, mainframe computers, and the like.
Some portions of the detailed description are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations can be used by those skilled in the computer and software related fields. In one embodiment, an algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations performed as method steps or otherwise described herein are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, transformed, compared, and otherwise manipulated.
Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “angling” or “selecting” or “toggling” or “calculating” or “comparing” or “arc length measuring” or “detecting” or “tracing” or “masking” or “sampling” or “operating” or “generating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
The present disclosure, in some embodiments, also relates to the apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below.
Embodiments of the disclosure may be implemented in many different forms, including, but in no way limited to, computer program logic for use with a processor (e.g., a microprocessor, microcontroller, digital signal processor, or general purpose computer), programmable logic for use with a programmable logic device (e.g., a Field Programmable Gate Array (FPGA) or other PLD), discrete components, integrated circuitry (e.g., an Application Specific Integrated Circuit (ASIC)), or any other means including any combination thereof. In a typical embodiment of the present disclosure, some or all of the processing of the data collected using an OCT probe, an FFR probe, an angiography system, and other imaging and subject monitoring devices and the processor-based system is implemented as a set of computer program instructions that is converted into a computer executable form, stored as such in a computer readable medium, and executed by a microprocessor under the control of an operating system. Thus, user interface instructions and triggers based upon the completion of a pullback or a co-registration request, for example, are transformed into processor understandable instructions suitable for generating OCT data and performing image processing using the various features and embodiments described above.
Computer program logic implementing all or part of the functionality previously described herein may be embodied in various forms, including, but in no way limited to, a source code form, a computer executable form, and various intermediate forms (e.g., forms generated by an assembler, compiler, linker, or locator). Source code may include a series of computer program instructions implemented in any of various programming languages (e.g., object code, an assembly language, or a high-level language such as Fortran, C, C++, JAVA, or HTML) for use with various operating systems or operating environments. The source code may define and use various data structures and communication messages. The source code may be in a computer executable form (e.g., via an interpreter), or the source code may be converted (e.g., via a translator, assembler, or compiler) into a computer executable form.
The computer program may be fixed in any form (e.g., source code form, computer executable form, or an intermediate form) either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), a PC card (e.g., PCMCIA card), or other memory device. The computer program may be fixed in any form in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies (e.g., Bluetooth), networking technologies, and internetworking technologies. The computer program may be distributed in any form as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink-wrapped software), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the internet or World Wide Web).
Hardware logic (including programmable logic for use with a programmable logic device) implementing all or part of the functionality previously described herein may be designed using traditional manual methods, or may be designed, captured, simulated, or documented electronically using various tools, such as Computer Aided Design (CAD), a hardware description language (e.g., VHDL or AHDL), or a PLD programming language (e.g., PALASM, ABEL, or CUPL).
Programmable logic may be fixed either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), or other memory device. The programmable logic may be fixed in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies (e.g., Bluetooth), networking technologies, and internetworking technologies. The programmable logic may be distributed as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink-wrapped software), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the internet or World Wide Web).
Various examples of suitable processing modules are discussed below in more detail. As used herein, a module refers to software, hardware, or firmware suitable for performing a specific data processing or data transmission task. In one embodiment, a module refers to a software routine, program, or other memory resident application suitable for receiving, transforming, routing and processing instructions, or various types of data such as angiography data, OCT data, FFR data, IVUS data, co-registration data, pixels, branching matrices, orientation and coordinates, user interface signals, and various graphical display elements and other information of interest as described herein.
Computers and computer systems described herein may include operatively associated computer-readable media such as memory for storing software applications used in obtaining, processing, storing and/or communicating data. It can be appreciated that such memory can be internal, external, remote or local with respect to its operatively associated computer or computer system.
Memory may also include any means for storing software or other instructions including, for example and without limitation, a hard disk, an optical disk, floppy disk, DVD (digital versatile disc), CD (compact disc), memory stick, flash memory, ROM (read only memory), RAM (random access memory), DRAM (dynamic random access memory), PROM (programmable ROM), EEPROM (electrically erasable PROM), and/or other like computer-readable media.
In general, computer-readable memory media applied in association with embodiments of the disclosure described herein may include any memory medium capable of storing instructions executed by a programmable apparatus. Where applicable, method steps described herein may be embodied or executed as instructions stored on a computer-readable memory medium or memory media. These instructions may be software embodied in various programming languages such as C++, C, Java, and/or a variety of other kinds of software programming languages that may be applied to create instructions in accordance with embodiments of the disclosure.
The aspects, embodiments, features, and examples of the disclosure are to be considered illustrative in all respects and are not intended to limit the disclosure, the scope of which is defined only by the claims. Other embodiments, modifications, and usages will be apparent to those skilled in the art without departing from the spirit and scope of the claimed disclosure.
The use of headings and sections in the application is not meant to limit the disclosure; each section can apply to any aspect, embodiment, or feature of the disclosure.
Throughout the application, where compositions are described as having, including, or comprising specific components, or where processes are described as having, including or comprising specific process steps, it is contemplated that compositions of the present teachings also consist essentially of, or consist of, the recited components, and that the processes of the present teachings also consist essentially of, or consist of, the recited process steps.
In the application, where an element or component is said to be included in and/or selected from a list of recited elements or components, it should be understood that the element or component can be any one of the recited elements or components and can be selected from a group consisting of two or more of the recited elements or components. Further, it should be understood that elements and/or features of a composition, an apparatus, or a method described herein can be combined in a variety of ways without departing from the spirit and scope of the present teachings, whether explicit or implicit herein.
The use of the terms “include,” “includes,” “including,” “have,” “has,” or “having” should be generally understood as open-ended and non-limiting unless specifically stated otherwise.
The use of the singular herein includes the plural (and vice versa) unless specifically stated otherwise. Moreover, the singular forms “a,” “an,” and “the” include plural forms unless the context clearly dictates otherwise. In addition, where the use of the term “about” is before a quantitative value, the present teachings also include the specific quantitative value itself, unless specifically stated otherwise. As used herein, the term “about” refers to a ±10% variation from the nominal value.
It should be understood that the order of steps or order for performing certain actions is immaterial so long as the present teachings remain operable. Moreover, two or more steps or actions may be conducted simultaneously.
It should be appreciated that various aspects of the claimed disclosure are directed to subsets and substeps of the techniques disclosed herein. Further, the terms and expressions employed herein are used as terms of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding any equivalents of the features shown and described or portions thereof, but it is recognized that various modifications are possible within the scope of the disclosure claimed. Accordingly, what is desired to be secured by Letters Patent is the disclosure as defined and differentiated in the following claims, including all equivalents.
The term “machine-readable medium” includes any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. While the machine-readable medium is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a database, one or more centralized or distributed databases and/or associated caches and servers) that store the one or more sets of instructions.
It can be appreciated that, in certain aspects of the disclosure, a single component may be replaced by multiple components, and multiple components may be replaced by a single component to provide an element or structure or to perform a given function or functions. Except where such substitution would not be operative to practice certain embodiments of the disclosure, such substitution is considered within the scope of the disclosure.
The examples presented herein are intended to illustrate potential and specific implementations of the disclosure. It can be appreciated that the examples are intended primarily for purposes of illustration of the disclosure for those skilled in the art. There may be variations to these diagrams or the operations described herein without departing from the spirit of the disclosure. For instance, in certain cases, method steps or operations may be performed or executed in differing order, or operations may be added, deleted or modified.
Furthermore, whereas particular embodiments of the disclosure have been described herein for the purpose of illustrating the disclosure and not for the purpose of limiting the same, it will be appreciated by those of ordinary skill in the art that numerous variations of the details, materials and arrangement of elements, steps, structures, and/or parts may be made within the principle and scope of the disclosure without departing from the disclosure as described in the claims.
This application is a continuation of U.S. patent application Ser. No. 15/488,005 filed Apr. 14, 2017, which claims priority to and the benefit of U.S. Provisional Patent Application No. 62/322,771 filed on Apr. 14, 2016, the disclosures of which are herein incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
4548473 | Lo et al. | Oct 1985 | A |
5054492 | Scribner et al. | Oct 1991 | A |
5321501 | Swanson et al. | Jun 1994 | A |
5459570 | Swanson et al. | Oct 1995 | A |
5465147 | Swanson | Nov 1995 | A |
5477858 | Norris et al. | Dec 1995 | A |
5488674 | Burt et al. | Jan 1996 | A |
5509093 | Miller et al. | Apr 1996 | A |
5518810 | Nishihara et al. | May 1996 | A |
5531227 | Schneider | Jul 1996 | A |
5586201 | Whiting et al. | Dec 1996 | A |
5619368 | Swanson | Apr 1997 | A |
5632767 | Sinofsky | May 1997 | A |
5643253 | Baxter et al. | Jul 1997 | A |
5662109 | Hutson | Sep 1997 | A |
5715827 | Corl et al. | Feb 1998 | A |
5748598 | Swanson et al. | May 1998 | A |
5771895 | Slager | Jun 1998 | A |
5784352 | Swanson et al. | Jul 1998 | A |
5797849 | Vesely et al. | Aug 1998 | A |
5822391 | Whitting | Oct 1998 | A |
5908415 | Sinofsky | Jun 1999 | A |
5947959 | Sinofsky | Sep 1999 | A |
5956355 | Swanson et al. | Sep 1999 | A |
5989189 | LeBlanc et al. | Nov 1999 | A |
6111645 | Tearney et al. | Aug 2000 | A |
6134003 | Tearney et al. | Oct 2000 | A |
6148095 | Prause et al. | Nov 2000 | A |
6160826 | Swanson et al. | Dec 2000 | A |
6191862 | Swanson et al. | Feb 2001 | B1 |
6195445 | Jolly et al. | Feb 2001 | B1 |
6208883 | Holupka et al. | Mar 2001 | B1 |
6270492 | Sinofsky | Aug 2001 | B1 |
6282011 | Tearney et al. | Aug 2001 | B1 |
6302875 | Makower et al. | Oct 2001 | B1 |
6348960 | Etori et al. | Feb 2002 | B1 |
6381350 | Klingensmith et al. | Apr 2002 | B1 |
6385332 | Zahalka et al. | May 2002 | B1 |
6421164 | Tearney et al. | Jul 2002 | B2 |
6445939 | Swanson et al. | Sep 2002 | B1 |
6471656 | Shalman et al. | Oct 2002 | B1 |
6485413 | Boppart et al. | Nov 2002 | B1 |
6501551 | Tearney et al. | Dec 2002 | B1 |
6552796 | Magnin et al. | Apr 2003 | B2 |
6564087 | Pitris et al. | May 2003 | B1 |
6565514 | Svanerudh et al. | May 2003 | B2 |
6570659 | Schmitt | May 2003 | B2 |
6585660 | Dorando et al. | Jul 2003 | B2 |
6692824 | Benz et al. | Feb 2004 | B2 |
6697667 | Lee et al. | Feb 2004 | B1 |
6706004 | Tearney et al. | Mar 2004 | B2 |
6716178 | Kilpatrick et al. | Apr 2004 | B1 |
6718089 | James et al. | Apr 2004 | B2 |
6728566 | Subramanyan et al. | Apr 2004 | B1 |
6731973 | Voith | May 2004 | B2 |
6760112 | Reed et al. | Jul 2004 | B2 |
6785409 | Suri | Aug 2004 | B1 |
6868736 | Sawatari et al. | Mar 2005 | B2 |
6879851 | McNamara et al. | Apr 2005 | B2 |
6891984 | Petersen et al. | May 2005 | B2 |
6932809 | Sinofsky | Aug 2005 | B2 |
6937696 | Mostafavi | Aug 2005 | B1 |
6942657 | Sinofsky et al. | Sep 2005 | B2 |
6947040 | Tek et al. | Sep 2005 | B2 |
6973202 | Mostafavi | Dec 2005 | B2 |
6974557 | Webler et al. | Dec 2005 | B1 |
7068831 | Florent et al. | Jun 2006 | B2 |
7134994 | Alpert et al. | Nov 2006 | B2 |
7191100 | Mostafavi | Mar 2007 | B2 |
7208333 | Flanders et al. | Apr 2007 | B2 |
7231243 | Tearney et al. | Jun 2007 | B2 |
7241286 | Atlas | Jul 2007 | B2 |
7298478 | Gilbert et al. | Nov 2007 | B2 |
7301644 | Knighton et al. | Nov 2007 | B2 |
7321677 | Evron et al. | Jan 2008 | B2 |
7329223 | Ainsworth et al. | Feb 2008 | B1 |
7355699 | Gilbert et al. | Apr 2008 | B2 |
7359554 | Klingensmith et al. | Apr 2008 | B2 |
7397935 | Kimmel et al. | Jul 2008 | B2 |
7408648 | Kleen et al. | Aug 2008 | B2 |
7412141 | Gowda et al. | Aug 2008 | B2 |
7414779 | Huber et al. | Aug 2008 | B2 |
7415049 | Flanders et al. | Aug 2008 | B2 |
7450241 | Zuluaga | Nov 2008 | B2 |
RE40608 | Glover et al. | Dec 2008 | E |
7492522 | Gilbert et al. | Feb 2009 | B2 |
7532920 | Ainsworth et al. | May 2009 | B1 |
7576861 | Gilbert et al. | Aug 2009 | B2 |
7593559 | Toth et al. | Sep 2009 | B2 |
7610081 | Redel | Oct 2009 | B2 |
7619646 | Freifeld et al. | Nov 2009 | B2 |
7625366 | Atlas | Dec 2009 | B2 |
7627156 | Margolis et al. | Dec 2009 | B2 |
7650179 | Redel et al. | Jan 2010 | B2 |
7679754 | Zuluaga | Mar 2010 | B2 |
7697972 | Verard et al. | Apr 2010 | B2 |
7706585 | Kleen | Apr 2010 | B2 |
7711413 | Feldman et al. | May 2010 | B2 |
7729746 | Redel et al. | Jun 2010 | B2 |
7733497 | Yun et al. | Jun 2010 | B2 |
7742797 | Redel et al. | Jun 2010 | B2 |
7783337 | Feldman et al. | Aug 2010 | B2 |
7783338 | Ainsworth et al. | Aug 2010 | B2 |
7785286 | Magnin et al. | Aug 2010 | B2 |
7792342 | Barbu et al. | Sep 2010 | B2 |
7801343 | Unal et al. | Sep 2010 | B2 |
7813609 | Petersen et al. | Oct 2010 | B2 |
7831078 | Unal et al. | Nov 2010 | B2 |
7843976 | Cable et al. | Nov 2010 | B2 |
7848791 | Schmitt et al. | Dec 2010 | B2 |
7853316 | Milner et al. | Dec 2010 | B2 |
7869663 | Buckland et al. | Jan 2011 | B2 |
7872759 | Tearney et al. | Jan 2011 | B2 |
7916387 | Schmitt | Mar 2011 | B2 |
7918793 | Altmann et al. | Apr 2011 | B2 |
7925327 | Weese | Apr 2011 | B2 |
7930014 | Huennekens et al. | Apr 2011 | B2 |
7935060 | Schmitt et al. | May 2011 | B2 |
7967743 | Ishihara | Jun 2011 | B2 |
7988633 | Hossack et al. | Aug 2011 | B2 |
7991105 | Mielekamp et al. | Aug 2011 | B2 |
8029447 | Kanz et al. | Oct 2011 | B2 |
8116605 | Petersen et al. | Feb 2012 | B2 |
8206374 | Duane et al. | Jun 2012 | B2 |
8206377 | Petroff | Jun 2012 | B2 |
8208995 | Tearney et al. | Jun 2012 | B2 |
8223143 | Dastmalchi et al. | Jul 2012 | B2 |
8259303 | Johnson et al. | Sep 2012 | B2 |
8290228 | Cohen et al. | Oct 2012 | B2 |
8298147 | Huennekens et al. | Oct 2012 | B2 |
8315282 | Huber et al. | Nov 2012 | B2 |
8325419 | Schmitt | Dec 2012 | B2 |
8351665 | Tearney et al. | Jan 2013 | B2 |
8358461 | Huber et al. | Jan 2013 | B2 |
8423121 | Wang et al. | Apr 2013 | B2 |
8449468 | Petersen et al. | May 2013 | B2 |
8457375 | Rieber et al. | Jun 2013 | B2 |
8457440 | Johnson | Jun 2013 | B1 |
8463007 | Steinberg et al. | Jun 2013 | B2 |
8478384 | Schmitt et al. | Jul 2013 | B2 |
8478387 | Xu | Jul 2013 | B2 |
8503844 | Petersen et al. | Aug 2013 | B2 |
8542900 | Tolkowsky et al. | Sep 2013 | B2 |
8556820 | Alpert et al. | Oct 2013 | B2 |
8562537 | Alpert et al. | Oct 2013 | B2 |
8571639 | Mostafavi | Oct 2013 | B2 |
8581643 | Schmitt | Nov 2013 | B1 |
8582109 | Schmitt | Nov 2013 | B1 |
8582619 | Adler | Nov 2013 | B2 |
8582934 | Adler et al. | Nov 2013 | B2 |
8670603 | Tolkowsky et al. | Mar 2014 | B2 |
8687201 | Adler | Apr 2014 | B2 |
8693756 | Tolkowsky et al. | Apr 2014 | B2 |
8700130 | Iddan et al. | Apr 2014 | B2 |
8781193 | Steinberg et al. | Jul 2014 | B2 |
8786336 | Schmitt | Jul 2014 | B1 |
8831321 | Elbasiony | Sep 2014 | B1 |
8855744 | Tolkowsky et al. | Oct 2014 | B2 |
8909323 | Baumgart | Dec 2014 | B2 |
8913084 | Chen et al. | Dec 2014 | B2 |
8948228 | Adler | Feb 2015 | B2 |
8953911 | Xu et al. | Feb 2015 | B1 |
8983580 | Boppart et al. | Mar 2015 | B2 |
9069396 | Adler et al. | Jun 2015 | B2 |
9173591 | Elbasiony | Nov 2015 | B2 |
9308052 | Tolkowsky et al. | Apr 2016 | B2 |
9351698 | Dascal et al. | May 2016 | B2 |
9404731 | Adler et al. | Aug 2016 | B2 |
9435956 | Xu et al. | Sep 2016 | B1 |
9488464 | Schmitt | Nov 2016 | B1 |
9629571 | Tolkowsky et al. | Apr 2017 | B2 |
20020115931 | Strauss et al. | Aug 2002 | A1 |
20020161351 | Samson et al. | Oct 2002 | A1 |
20040006277 | Langenhove et al. | Jan 2004 | A1 |
20050043614 | Huizenga et al. | Feb 2005 | A1 |
20050201662 | Petersen et al. | Sep 2005 | A1 |
20050238067 | Choi | Oct 2005 | A1 |
20050249391 | Kimmel et al. | Nov 2005 | A1 |
20060095065 | Tanimura et al. | May 2006 | A1 |
20060135870 | Webler | Jun 2006 | A1 |
20060165270 | Borgert et al. | Jul 2006 | A1 |
20060187537 | Huber et al. | Aug 2006 | A1 |
20060203859 | Cable et al. | Sep 2006 | A1 |
20060241465 | Huennekens et al. | Oct 2006 | A1 |
20060241503 | Schmitt et al. | Oct 2006 | A1 |
20060244973 | Yun et al. | Nov 2006 | A1 |
20070024617 | Poole | Feb 2007 | A1 |
20070060822 | Alpert et al. | Mar 2007 | A1 |
20070066890 | Maschke | Mar 2007 | A1 |
20070115481 | Toth et al. | May 2007 | A1 |
20070123771 | Redel et al. | May 2007 | A1 |
20070135803 | Belson | Jun 2007 | A1 |
20070165916 | Cloutier et al. | Jul 2007 | A1 |
20070167710 | Unal et al. | Jul 2007 | A1 |
20070232933 | Gille et al. | Oct 2007 | A1 |
20070260198 | Atlas | Nov 2007 | A1 |
20070293932 | Zilla et al. | Dec 2007 | A1 |
20080100612 | Dastmalchi et al. | May 2008 | A1 |
20080161696 | Schmitt et al. | Jul 2008 | A1 |
20080165366 | Schmitt et al. | Jul 2008 | A1 |
20080221439 | Iddan et al. | Sep 2008 | A1 |
20080221440 | Iddan et al. | Sep 2008 | A1 |
20080221442 | Tolkowsky et al. | Sep 2008 | A1 |
20080228086 | Ilegbusi et al. | Sep 2008 | A1 |
20080281205 | Naghavi et al. | Nov 2008 | A1 |
20090027051 | Stuber et al. | Jan 2009 | A1 |
20090174931 | Huber et al. | Jul 2009 | A1 |
20090204134 | Kassab | Aug 2009 | A1 |
20090306520 | Schmitt et al. | Dec 2009 | A1 |
20100076320 | Petersen et al. | Mar 2010 | A1 |
20100094127 | Xu | Apr 2010 | A1 |
20100157041 | Klaiman et al. | Jun 2010 | A1 |
20100160764 | Steinberg et al. | Jun 2010 | A1 |
20100160773 | Cohen et al. | Jun 2010 | A1 |
20100161023 | Cohen et al. | Jun 2010 | A1 |
20100172556 | Cohen et al. | Jul 2010 | A1 |
20100191102 | Steinberg et al. | Jul 2010 | A1 |
20100222671 | Cohen et al. | Sep 2010 | A1 |
20100228076 | Blank et al. | Sep 2010 | A1 |
20100253949 | Adler et al. | Oct 2010 | A1 |
20110007315 | Petersen et al. | Jan 2011 | A1 |
20110071404 | Schmitt et al. | Mar 2011 | A1 |
20110071405 | Judell et al. | Mar 2011 | A1 |
20110101207 | Schmitt | May 2011 | A1 |
20110151980 | Petroff | Jun 2011 | A1 |
20110157686 | Huber et al. | Jun 2011 | A1 |
20110172511 | Schmitt et al. | Jul 2011 | A1 |
20110178413 | Schmitt et al. | Jul 2011 | A1 |
20110190586 | Kemp | Aug 2011 | A1 |
20110216325 | Schmitt | Sep 2011 | A1 |
20110228280 | Schmitt et al. | Sep 2011 | A1 |
20110230758 | Eichler | Sep 2011 | A1 |
20110257545 | Suri | Oct 2011 | A1 |
20110319752 | Steinberg et al. | Dec 2011 | A1 |
20120004529 | Tolkowsky et al. | Jan 2012 | A1 |
20120029339 | Cohen et al. | Feb 2012 | A1 |
20120057157 | Petersen et al. | Mar 2012 | A1 |
20120075638 | Rollins et al. | Mar 2012 | A1 |
20120162660 | Kemp | Jun 2012 | A1 |
20120224751 | Kemp et al. | Sep 2012 | A1 |
20120236883 | Adler | Sep 2012 | A1 |
20120238869 | Schmitt et al. | Sep 2012 | A1 |
20120250028 | Schmitt et al. | Oct 2012 | A1 |
20120300215 | Johnson et al. | Nov 2012 | A1 |
20120300216 | Johnson et al. | Nov 2012 | A1 |
20120310081 | Adler et al. | Dec 2012 | A1 |
20130006105 | Furuichi | Jan 2013 | A1 |
20130010303 | Petersen et al. | Jan 2013 | A1 |
20130012811 | Schmitt et al. | Jan 2013 | A1 |
20130023761 | Petroff | Jan 2013 | A1 |
20130051728 | Petroff | Feb 2013 | A1 |
20130072805 | Schmitt et al. | Mar 2013 | A1 |
20130123616 | Merritt et al. | May 2013 | A1 |
20130303910 | Hubbard et al. | Nov 2013 | A1 |
20130310698 | Judell et al. | Nov 2013 | A1 |
20140018669 | Xu | Jan 2014 | A1 |
20140024931 | Winston et al. | Jan 2014 | A1 |
20140094660 | Tolkowsky et al. | Apr 2014 | A1 |
20140094689 | Cohen et al. | Apr 2014 | A1 |
20140094691 | Steinberg et al. | Apr 2014 | A1 |
20140094692 | Tolkowsky et al. | Apr 2014 | A1 |
20140094693 | Cohen et al. | Apr 2014 | A1 |
20140094697 | Petroff et al. | Apr 2014 | A1 |
20140114182 | Petersen et al. | Apr 2014 | A1 |
20140114184 | Klaiman et al. | Apr 2014 | A1 |
20140114185 | Tolkowsky et al. | Apr 2014 | A1 |
20140142427 | Petroff | May 2014 | A1 |
20140142432 | Hutchins et al. | May 2014 | A1 |
20140142436 | Hutchins et al. | May 2014 | A1 |
20140187929 | Schmitt et al. | Jul 2014 | A1 |
20140218742 | Adler | Aug 2014 | A1 |
20140249407 | Adler et al. | Sep 2014 | A1 |
20140268167 | Friedman et al. | Sep 2014 | A1 |
20140270445 | Kemp | Sep 2014 | A1 |
20140276011 | Schmitt et al. | Sep 2014 | A1 |
20140276020 | Hutchins et al. | Sep 2014 | A1 |
20140309536 | Douk et al. | Oct 2014 | A1 |
20140379269 | Schmitt | Dec 2014 | A1 |
20150153157 | Schmitt et al. | Jun 2015 | A1 |
20150119707 | Schmitt | Jul 2015 | A1 |
20150192405 | Schmitt | Jul 2015 | A1 |
20150297373 | Schmitt et al. | Oct 2015 | A1 |
20150370229 | Adler et al. | Dec 2015 | A1 |
20160000406 | Petroff | Jan 2016 | A1 |
20160022208 | Gopinath | Jan 2016 | A1 |
20160058307 | Svanerudh | Mar 2016 | A1 |
20160070066 | Schmitt et al. | Mar 2016 | A1 |
20160073885 | Adler | Mar 2016 | A1 |
20160174925 | Dascal et al. | Jun 2016 | A1 |
20160313507 | Adler et al. | Oct 2016 | A1 |
20160335763 | Ambwani et al. | Nov 2016 | A1 |
20160335766 | Ambwani et al. | Nov 2016 | A1 |
20170024910 | Griffin | Jan 2017 | A1 |
20170140243 | Ambwani | May 2017 | A1 |
Number | Date | Country |
---|---|---|
2062526 | May 2009 | EP |
63-127201 | May 1988 | JP |
2016508750 | Mar 2016 | JP |
2006076409 | Jul 2006 | WO |
2007002685 | Jan 2007 | WO |
2011038044 | Mar 2011 | WO |
2012176191 | Dec 2012 | WO |
2013175472 | Nov 2013 | WO |
2014002095 | Mar 2014 | WO |
2014092755 | Jun 2014 | WO |
Entry |
---|
Unal G et al: “Shape-Driven Segmentation of the Arterial Wall in Intravascular Ultrasound Images”, IEEE Transactions on Information Technology in Biomedicine, IEEE Service Center, Los Alamitos, CA, US, May 1, 2008, pp. 335-347, vol. 12, No. 3, XP11345473. |
Wang et al: “3D assessment of stent cell size and side branch access in intravascular optical coherence tomographic pullback runs,” Computerized Medical Imaging and Graphics: The Official Journal of the Computerized Medical Imaging Society, Sep. 7, 2013, pp. 113-122, XP55364587, abstract only. |
Briguori et al., “Intravascular ultrasound criteria for the assessment of the functional significance of intermediate coronary artery stenoses and comparison with fractional flow reserve,” Am J. Cardiol 87:136-141, 2001. |
Kassab et al., “The pattern of coronary arteriolar bifurcations and the uniform shear hypothesis,” Annals of Biomedical Engineering 23 (1): 13-20, 1995. |
Hariri et al., “An automatic image processing algorithm for initiating and terminating intracoronary OFDI pullback” Biomedical Optics Express 1:2 566-573 (Sep. 1, 2010). |
Harrison et al., “The value of lesion cross-sectional area determined by quantitative coronary angiography in assessing the physiologic significance of proximal left anterior descending coronary arterial stenoses,” Circulation 69:6 1111-1119, 1984. |
Kirkeeide, “Coronary obstructions, morphology, and physiological significance,” in Reiber JHC and Serruys PW (eds.), Quantitative Coronary Arteriography, Kluwer Academic Publishers, the Netherlands, 1991, pp. 229-244. |
Kolyva et al., “Increased diastolic time fraction as beneficial adjunct of α1-adrenergic receptor blockade after percutaneous coronary intervention,” Am J Physiol Heart Circ Physiol 295: H2054-H2060, 2008. |
Kolyva et al., “‘Windkesselness’ of coronary arteries hampers assessment of human coronary wave speed by single-point technique,” Am J Physiol Heart Circ Physiol, 295: H482-H490, 2008. |
Laslett, “Normal left main coronary artery diameter can be predicted from diameters of its branch vessels,” Clinical Cardiology 18 (10): 580-582, 1995. |
Ofili et al., “Differential characterization of blood flow, velocity, and vascular resistance between proximal and distal normal epicardial human coronary arteries: analysis by intracoronary Doppler spectral flow velocity,” Am Heart J. 130:1 37-46, 1995. |
Ohta et al., “Rheological Changes After Stenting of a Cerebral Aneurysm: A Finite Element Modeling Approach,” Cardiovascular and Interventional Radiology (2005) 28:768-772. |
Pijls et al., “Fractional Flow Reserve (FFR) Post-Stent Registry Investigators Coronary pressure measurement after stenting predicts adverse events at follow-up: a multicenter registry”, Circulation 2002; 105:2950-2954. |
Seiler et al., “Basic structure-function relations of the epicardial coronary vascular tree, Basis of quantitative coronary arteriography for diffuse coronary artery disease,” Circulation 85 (6): 1987-2003, 1992. |
Siebes et al., “Single-wire pressure and flow velocity measurement to quantify coronary stenosis hemodynamics and effects of percutaneous interventions,” Circulation 109:756-762, 2004. |
Sihan et al., “A Novel Approach to Quantitative Analysis of Intravascular Optical Coherence Tomography Imaging,” Computers in Cardiology 2008; 35:1089-1092. |
Sihan et al., “Fully Automatic Three-Dimensional Quantitative Analysis of Intracoronary Optical Coherence Tomography: Method and Validation,” Catheterization and Cardiovascular Interventions 74:1058-1065 (2009). |
Spaan, “Coronary Blood Flow,” Ch 12. Dordrecht, The Netherlands: Kluwer Academic Publishers, Boston; 1991: pp. 333-361. |
Takagi et al., “Clinical potential of intravascular ultrasound for physiological assessment of coronary stenosis,” Circulation 100: 250-255, 1999. |
Verhoeff et al., “Influence of percutaneous coronary intervention on coronary microvascular resistance index,” Circulation 111:76-82, 2005. |
White et al., “Does visual interpretation of the coronary angiogram predict the physiologic importance of coronary stenoses?,” N. Engl J Med 310:13 819-824, 1984. |
Wilson et al., “Prediction of the physiologic significance of coronary arterial lesions by quantitative lesion geometry in patients with limited coronary artery disease,” Circulation 75: 723-732, 1987. |
Perez-Rovira et al., “Deformable Registration of Retinal Fluorescein Angiogram Sequences Using Vasculature Structures”, 32nd Annual Conf. of IEEE EMBS, 2010, pp. 4383-4386. |
Herrington et al., “Semi-automated boundary detection for intravascular ultrasound,” Computers in Cardiology 1992 Proceedings., pp. 103-106, Oct. 1992. |
Sonka et al., “Segmentation of intravascular ultrasound images: a knowledge-based approach,” IEEE Transactions on Medical Imaging, 14(4):719-732, Dec. 1995. |
Mojsilovic et al., “Automatic segmentation of intravascular ultrasound images: A texture-based approach,” Annals of Biomedical Engineering, 25:1059-1071, Nov. 1997. |
Gil et al., “Automatic segmentation of artery wall in coronary IVUS images: a probabilistic approach,” Computers in Cardiology 2000; 27:687-690. |
Haas et al., “Segmentation of 3D intravascular ultrasonic images based on a random field model,” Ultrasound in Medicine & Biology, 26:2, 297-306, 2000. |
Kovalski et al., “Three-dimensional automatic quantitative analysis of intravascular ultrasound images,” Ultrasound in Medicine & Biology, 26(4):527-537, 2000. |
Pujol et al., “Intravascular Ultrasound Images Vessel Characterization using AdaBoost,” Functional Imaging and Modeling of the Heart: Lecture Notes in Computer Science, pp. 242-251, 2003. |
Taki et al., “Automatic segmentation of calcified plaques and vessel borders in IVUS images,” International Journal of Computer Assisted Radiology and Surgery, 3(3-4):347-354, Sep. 2008. |
Van den Berg et al., “Using three-dimensional rotational angiography for sizing of covered stents,” Am. J. Roentgenology, 178:149-152 (2002). |
Wong et al., “A novel method of coronary stent sizing using intravascular ultrasound: safety and clinical outcomes,” Int. J. Angiol., 18(1): 22-24, 2009. |
Bonnema et al., “An automatic algorithm for detecting stent endothelialization from volumetric optical coherence tomography datasets”, Physics in Medicine and Biology, 53:12, Jun. 21, 2008, pp. 3083-3098. |
Unal et al., “Stent implant follow-up in intravascular optical coherence tomography images,” Int J Cardiovasc Imaging, DOI 10.1007/s10554-009-9508-4, published online Sep. 24, 2009, 8 pgs. |
Xu et al., “Characterization of atherosclerosis plaques by measuring both backscattering and attenuation coefficients in optical coherence tomography,” Journal of Biomedical Optics, 13:3, May/Jun. 2008, 8 pgs. |
Takano et al., “Evaluation by Optical Coherence Tomography of Neointimal Coverage of Sirolimus-Eluting Stent Three Months After Implantation,” American Journal of Cardiology, vol. 99, No. 8, Apr. 14, 2007, pp. 1033-1038. |
Tung et al., “Automatic Detection of Coronary Stent Struts in Intravascular OCT Imaging,” Proceedings of SPIE, vol. 8315, Feb. 22, 2012 (8 pgs.). |
Tu et al., “In vivo comparison of arterial lumen dimensions assessed by co-registered three-dimensional (3D) quantitative coronary angiography, intravascular ultrasound and optical coherence tomography”, Int. J. Cardiovasc Imaging (2012) 28:1315-1327. |
Palti-Wasserman et al., “Identifying and Tracking a Guide Wire in the Coronary Arteries During Angioplasty from X-Ray Images”, IEEE transactions on biomedical engineering, 44:2, Feb. 1997, pp. 152-164. |
Dave Fornell, “The Advantages and Disadvantages of OCT vs. IVUS”, Diagnostic and Interventional Cardiology, May 18, 2011, pp. 1-4. |
International Search Report and Written Opinion of the International Searching Authority for International application No. PCT/US2017/027680 mailed from the ISA dated Aug. 22, 2017 (20 pages). |
Related Publications

Number | Date | Country
---|---|---|
20200167923 A1 | May 2020 | US |
Provisional Applications

Number | Date | Country
---|---|---|
62322771 | Apr 2016 | US |
Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---|
Parent | 15488005 | Apr 2017 | US |
Child | 16778252 | US |