Motion tracking system for real time adaptive imaging and spectroscopy

Information

  • Patent Grant
  • Patent Number
    10,869,611
  • Date Filed
    Monday, December 11, 2017
  • Date Issued
    Tuesday, December 22, 2020
Abstract
This invention relates to a system that adaptively compensates for subject motion in real-time in an imaging system. An object orientation marker (30), preferably a retro-grate reflector (RGR), is placed on the head or other body organ of interest of a patient (P) during a scan, such as an MRI scan. The marker (30) makes it possible to measure the six degrees of freedom (x, y, and z-translations, and pitch, yaw, and roll), or “pose”, required to track motion of the organ of interest. A detector, preferably a camera (40), observes the marker (30) and continuously extracts its pose. The pose from the camera (40) is sent to the scanner (120) via an RGR processing computer (50) and a scanner control and processing computer (100), allowing for continuous correction of scan planes and position (in real-time) for motion of the patient (P). This invention also provides for internal calibration and for co-registration over time of the scanner's and tracking system's reference frames to compensate for drift and other inaccuracies that may arise over time.
Description
FIELD

This invention relates generally to the field of medical imaging, and more specifically to a system for correcting defects in medical images that are caused by a patient's movement during long duration in vivo (in the living body) scans, such as magnetic resonance scans.


BACKGROUND

“Tomographic” imaging techniques make images of multiple slices of an object. Multiple tomographic images can then be aligned and assembled using a computer to provide a three dimensional view. Some commonly used tomographic imaging techniques include magnetic resonance imaging (MRI) and magnetic resonance spectroscopy (MRS) techniques, which are ideal for assessing the structure, physiology, chemistry and function of the human brain and other organs, in vivo. Because the object of interest is often imaged in many slices and scanning steps in order to build a complete three dimensional view, scans are of long duration, usually lasting several minutes. To increase resolution (detail) of a tomographic scan, more slices and more scanning steps must be used, which further increases the duration of a scan. Magnetic resonance and other long duration imaging techniques (including tomographic techniques), now known or hereafter invented (hereinafter collectively referred to as “MR” or “MRI”), can also afford relatively high spatial and temporal resolution, are non-invasive and repeatable, and may be performed in children and infants.


In addition to MR, other types of scans require multiple repeated exposures, separated in time, of an entire (not slices) object (such as an organ), such as angiograms, in which a dye is injected into a blood vessel and then scans separated in time are taken to determine how and where the dye spreads. These types of scans that detect motion inside a patient or other object over time (“digital angiography systems”) can also have a long duration, and be subject to the problem of patient or object motion.


Many tomographic imaging techniques rely on detecting very small percentage changes in a particular type of signal, which makes these techniques even more susceptible to movements. In functional magnetic resonance imaging, for example, changes in the properties of blood in brain areas activated while subjects are performing tasks causes small signal changes (on the order of a few percent) that can be detected with MR. However, these small signal changes may easily be obscured by signal changes of similar or even greater size that occur during unintentional subject movements.


Because tomographic techniques require that so many images be taken (because so many slices and scanning steps are necessary), the scan has a long duration, so that motion of the subject is a substantial problem for acquiring accurate data. Consequently, subjects commonly are required to lie still to within one millimeter and one degree over extended time periods. Similar requirements exist for other modern imaging techniques, such as Positron Emission Tomography (PET), Single Photon Emission Computerized Tomography (SPECT) and computed tomography (CT). These strict requirements cannot be met by many subjects in special populations, such as children and infants, very sick patients, subjects who are agitated perhaps due to anxiety or drug use, or patients with movement disorders, resulting in data with motion artifacts. Similarly, it is exceedingly difficult to perform scans in awake animals.


The basic problem is that it may take several minutes for a scan to be completed, but the patient or other object being scanned cannot remain still for several minutes. Further, the space for a patient or other object being scanned (the “scanning volume”) in an MR machine is very limited—there is very little space in an MR machine once a patient has been positioned inside for a scan.


Several techniques have been developed over the past decades to reduce the sensitivity of scans to motion of the patient or other object being scanned.


Early techniques utilized specially designed scan sequences (“first-order flow/motion compensation”) to minimize the effects of motion. While these approaches are particularly useful for reducing artifacts (or imaging errors) due to flowing blood, swallowing or eye movements, they afford little improvement during movements of entire organs, such as head movements.


Articles entitled “Self-navigated spiral fMRI: interleaved versus single-shot” by Glover G H, et al, in Magnetic Resonance in Medicine 39: 361-368 (1998), and “PROPELLER MRI: clinical testing of a novel technique for quantification and compensation of head motion” by Forbes K, et al, in the Journal of Magnetic Resonance Imaging 14(3): 215-222 (2001), both incorporated herein by reference, disclose how improved sampling schemes for the MRI data can reduce sensitivity to motion. These techniques can reduce motion sensitivity of MR scans under certain conditions, but cannot eliminate errors from motion under all conditions or for very quick movements.


With certain modern ultra-fast “single-shot” imaging techniques (such as “echoplanar imaging”), the entire head (or other organ of interest) is scanned continuously every few seconds (over the course of minutes), for instance, for “functional MRI”. This makes it possible to determine the “pose”, defined as position and rotation, of the head at each instant relative to the initial pose, using image registration (alignment of images). Once the pose for a given instant is known (relative to the initial image), the scanner's image for that instant can be re-aligned to the initial image. For example, the article entitled “Processing strategies for time-course data sets in functional MRI of the human brain” by Bandettini P A, et al, in Magnetic Resonance Medicine 30: 161-173 (1993), incorporated herein by reference, disclosed how realignment of MRI volumes (consisting of multiple slices) can be used to correct for head motion in functional MRI time series. However, these methods are inherently slow because they use MRI, i.e. they correct movements only every few seconds, and are unable to correct for motion in certain directions (orthogonal to the scan planes; in other words, towards or away from the planes in which the scans are being taken).


While all of these techniques reduce sensitivity to subject motion, several problems remain. One major problem is related to the manner in which typical tomographic imaging methods acquire data. Specifically, the data for each cross section (slice) is acquired by moving step by step along “lines” in a mathematical space (“k-space”). The data acquisition step is typically repeated hundreds of times, until all lines in the k-space have been filled. For all methods described above, even if motion sensitivity for each individual acquisition (defining a line in k-space) is reduced, these methods typically do not account for variations in head pose amongst the different k-space lines. Second, the methods poorly tolerate fast movements within individual acquisition steps. Finally, one of the most significant issues is that none of these techniques can be applied universally across all the various scanning methods (pulse sequences—the order and manner in which slices are imaged) used in MRI or other tomographic scanning techniques.


One of the most promising approaches to motion correction is to track the pose of the head, brain or other organ of interest (or other object) in real time, during a scan, and to use this pose information to compensate for the detected motion in data acquisitions for subsequent slices within the same scan. This is called adaptive imaging, because the image is adapted during the scan to compensate for the detected motion.


One important aspect of adaptive imaging is the accuracy (or “resolution”) of the motion tracking system. Because of the high resolution needed for medical imaging, the motion tracking system must also have a high resolution, because the motion tracking system's information will be used to align the images of each slice. If the motion tracking system's resolution is high enough, each of the scan images can be accurately aligned (registered) despite a patient's motion.


An article entitled “Prospective multiaxial motion correction for fMRI” by Ward H A, et al, in Magnetic Resonance in Medicine 43:459-469 (2000), incorporated herein by reference, discloses the use of “navigator” signals to estimate the pose of the head and to dynamically correct for head motion.


An article entitled “Spherical navigator echoes for full 3D rigid body motion measurement in MRI” by Welch E B, et al, in Magnetic Resonance in Medicine 47:32-41 (2002), incorporated herein by reference, discloses the use of an MR-based navigator for adaptive motion correction in MRI.


Similarly, an article entitled “Endovascular interventional magnetic resonance imaging.” by Bartels L W, et al, in Physics in Medicine and Biology 48(14): R37-R64 (2003), and another article entitled “Real-time, Interactive MRI for cardiovascular interventions” by McVeigh E R, et al, in Academic Radiology 12(9): 1121-1127 (2005), both of which are incorporated herein by reference, disclose the use of small radio frequency (RF) coils for tracking catheters during interventional MRI.


While these MR-based “adaptive MRI” techniques provide good results in many situations, they intrinsically interfere with MR acquisitions, work only for a limited number of MR sequences, and are limited to measuring the position or pose only a few times per second.


In order to overcome these shortcomings, recent approaches to real time (“on the fly”) motion correction utilize optical techniques to track subject motion, rather than MR-based methods. The pose information from the tracking system is sent to the scanner and used by the scanner to compensate for the motion in real time. Optical systems are very suitable among alternative tracking technologies because they provide accurate, non-contact sensing with a passive and non-magnetic target. In particular, stereovision (SV) systems have been used for motion tracking for medical imaging.


Stereovision systems employ a target with 3 or more visible landmarks, and at least 2 tracking cameras. By detecting the landmarks in images captured by the cameras and comparing their measured positions and shapes to the known shape of the target, the target position and orientation can be determined. SV systems offer important features including sub-millimeter accuracy when fully calibrated, and update rates limited only by the camera and computing hardware.


However, SV systems have three limitations for adaptive MR imaging: (1) measurement accuracy decreases as the distance between the cameras becomes smaller; (2) the accuracy of orientation measurement decreases as the target becomes smaller; and (3) SV systems have high sensitivity to errors in internal calibration, i.e. small errors in the relative position or rotation of the cameras may cause large errors in the measured target pose. Therefore, SV systems require periodic recalibration. However, accurate calibration has to be performed manually, using a specialized calibration tool or target, is time consuming, and cannot be done while patients are being scanned.


Furthermore, stereovision systems achieve their best accuracy when the separation distance between the cameras is comparable to the distance between the cameras and the target. However, this ideal separation is not possible in an MR scanner because the opening to the scanning volume (the volume which can be scanned by the scanner) is relatively narrow, making it impossible to move the cameras sufficiently far apart and still view into the scanning volume. Additionally, tracking with SV cameras works optimally with larger tracking targets; however, the space in the MR or other scanner environment is very limited.


As noted above, slight errors in the internal calibration of SV systems can produce large measurement errors. For example, an article entitled “Prospective Real-Time Slice-by-Slice 3D Motion Correction for EPI Using an External Optical Motion Tracking System” by Zaitsev, M. C., et al, ISMRM 12, Kyoto (2004), which is incorporated herein by reference, tested the use of an SV system for adaptive functional MRI. The system was able to provide 0.4 mm accuracy when ideally calibrated. However, the study contains information showing that a tiny 1/100th degree change in the camera alignments can produce a 2.0 mm error in the position measurement, and the study co-authors privately communicated to the present inventors that maintaining calibration was impracticably difficult. Even with extremely careful and rigid engineering of the camera module of an SV system, a measurement drift on the order of 1 mm can be observed while the SV motion tracker warms up, and recommended warm-up periods are 1 to 1.5 hours to avoid drift. Tremblay M, Tam F, Graham S J. Retrospective Coregistration of Functional Magnetic Resonance Imaging Data Using External Monitoring. Magnetic Resonance in Medicine 2005; 53:141-149, incorporated herein by reference.


The prior art has no means to track or correct for these slow changes while the medical imaging system is in service, imaging patients. The error which accumulates in the co-registration, because of loss of camera calibration, is a severe problem for motion compensation in medical imaging using an external tracking system.


As a result, an SV tracking system requires frequent recalibration to accurately determine its position relative to the imaging system. The recalibration procedure involves scanning a specialized calibration tool or sample (“phantom”) at multiple, manually-adjusted positions, both with the medical imaging system and the SV system.


An article entitled “Closed-form solution of absolute orientation using unit quaternions” by Horn, B K P, J. Opt. Soc. Am. 1987; 4:629-642, which is incorporated herein by reference, describes the commonly used “absolute orientation” method. However, since time on a medical imaging system is limited and expensive, removing patients and conducting repeated recalibration with a specialized calibration tool is prohibitively expensive.
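Horn's closed-form solution recovers the rotation and translation between two sets of corresponding points measured in two coordinate frames. As a rough numerical illustration only: Horn's method uses unit quaternions, while the sketch below, a hypothetical `absolute_orientation` helper, uses the equivalent SVD-based (Kabsch) solution for brevity, and assumes at least three non-collinear corresponding points.

```python
import numpy as np

def absolute_orientation(A, B):
    """Find R, t such that B ~= R @ A + t (least squares).

    A, B: 3xN arrays of corresponding points expressed in the two
    coordinate frames (e.g., tracking system and imaging system).
    Uses the SVD-based solution, equivalent to Horn's quaternion method.
    """
    ca = A.mean(axis=1, keepdims=True)           # centroid of frame-A points
    cb = B.mean(axis=1, keepdims=True)           # centroid of frame-B points
    H = (A - ca) @ (B - cb).T                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```

Given noiseless correspondences the recovered transform is exact; with noisy measurements it is the least-squares optimum, which is why the method is standard for co-registering a tracking system with a scanner.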


Furthermore, Zaitsev et al utilized a relatively large reflective marker approximately 10 cm (4 inches) in size, which was affixed to the subjects' head in the scanner by means of a bite bar. While a bite bar may be tolerated by healthy and cooperative volunteers, it is an impractical solution for sick or demented patients, or young children.


Therefore, while stereovision systems are able to track subject motion for use with adaptive imaging techniques when conditions are ideal, the use of SV systems for routine clinical scans proves impractical due to cumbersome recalibration procedures, instabilities over time, and awkward size and attachment of tracking markers (i.e. large marker requiring use of a bite bar).


Motion tracking can be improved using prediction means to predict motion, including (without limitation) motion filter and prediction methods. For adaptive MR imaging, the scanner controller requires values of the subject pose at the exact instant adjustments to the scan are applied (Scanning Timing Information). The determination of the subject pose based on actual measurements is an estimation problem. The simplest estimator takes the most recent measurement as the current pose. This simple estimator has been used frequently, for example in an article entitled “Prospective Real-Time Slice-by-Slice 3D Motion Correction for EPI Using an External Optical Motion Tracking System” by Zaitsev, M. C., et al, ISMRM 12, Kyoto (2004), incorporated herein by reference.


However, this simple estimator neglects three types of information that can improve the accuracy of the estimate of subject pose: (1) measurements prior to the most recent measurement may add information (reduce the covariance of the estimate) if those prior measurements disclose a velocity of the subject's motion; (2) a biomechanical model, in conjunction with the measurement statistics, can be used to constrain the estimated motion (the subject's body only moves in certain ways); and (3) information about the lag time between the pose measurement and the time of the MR scans. By utilizing these additional sources of information, the accuracy of motion tracking and thus of adaptive imaging will be enhanced.


Extended Kalman filtering, which is essentially model-based filtering with simultaneous estimation of the signals and their statistics, is statistically optimal in certain cases and is the most effective framework for incorporating information of types (1), (2) and (3). Kalman filtering has a long history of use in aerospace applications, such as target tracking, aircraft guidance and formation flying of spacecraft, for example in U.S. Pat. No. 5,886,257 “Autonomous Local Vertical Determination Apparatus and Methods for a Ballistic Body,” incorporated herein by reference, which teaches the use of Kalman filtering applied to inertial signals. Kalman filtering has also been previously demonstrated for head motion tracking, for example in “Predictive Head Movement Tracking Using a Kalman Filter”, IEEE Trans. on Systems, Man, and Cybernetics Part B: Cybernetics 1997; 27:326-331, by Kiruluta A, Eizenman M, and Pasupathy S, incorporated herein by reference.
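As a rough illustration of how a model-based filter exploits prior measurements (type 1) and the known lag time (type 3), the sketch below runs a constant-velocity Kalman filter over a single pose coordinate and extrapolates the filtered state forward by the lag. The function name and the noise variances `q` and `r` are illustrative assumptions, not values from this disclosure; a full implementation would filter all six degrees of freedom and incorporate a biomechanical model (type 2).

```python
import numpy as np

def kalman_predict_pose(measurements, dt, lag, q=1e-3, r=1e-2):
    """Constant-velocity Kalman filter over one pose coordinate.

    measurements: noisy pose samples, taken every dt seconds
    lag: extra time to extrapolate ahead of the latest sample
    q, r: assumed process / measurement noise variances (illustrative)
    Returns the pose predicted at (time of last sample + lag).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])                   # only position is observed
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])          # process noise covariance
    R = np.array([[r]])                          # measurement noise covariance
    x = np.array([[measurements[0]], [0.0]])     # initial state estimate
    P = np.eye(2)                                # initial state covariance

    for z in measurements[1:]:
        x = F @ x                                # predict state forward
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x)    # update with measurement
        P = (np.eye(2) - K @ H) @ P

    # extrapolate to the instant the scanner applies the adjustment
    F_lag = np.array([[1.0, lag], [0.0, 1.0]])
    return float((F_lag @ x)[0, 0])
```

Because the filter carries a velocity estimate, it can predict the pose at the scan instant rather than merely reporting the (lagged) most recent measurement, which is exactly the shortcoming of the simple estimator described above.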


Of course, persons of ordinary skill in the art are aware that the prediction means can be implemented in hardware, software, or by other means, and that there are equivalent processes and algorithms to perform the prediction function of the motion filtering and prediction means disclosed above.


U.S. Pat. Nos. 5,936,722, 5,936,723 and 6,384,908 by Brian S. R. Armstrong and Karl B. Schmidt, et al, which are incorporated herein by reference, disclose “Retro-Grate Reflectors”, or RGRs, which allow accurate and fast position measurements with a single camera and a single, relatively small and light orientation marker. The RGR allows the visual determination of orientation with respect to the six degrees of freedom (the three linear directions of left and right, up and down, and forward and back, plus the three rotational directions of roll (rotation around a horizontal axis that points straight ahead), pitch (rotation around a horizontal axis that points side to side) and yaw (rotation around a vertical axis that points up and down)) by viewing a single marker. Pose (position and rotation) is orientation with respect to the six degrees of freedom. As used herein, an object orientation marker is any marker, such as an RGR marker, from which at least three degrees of freedom can be determined by viewing or otherwise remotely detecting the marker.
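A 6-degree-of-freedom pose as described above is conveniently represented as a 4x4 homogeneous transform. The helper below is a minimal sketch under an assumed axis convention (roll about x, pitch about y, yaw about z, applied in that order); the function name and convention are illustrative and need not match any particular scanner's coordinate frame.

```python
import numpy as np

def pose_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a 6-DOF pose.

    (x, y, z): translations; (roll, pitch, yaw): rotations in radians
    about the x, y and z axes respectively (assumed convention).
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx                                # combined rotation
    T[:3, 3] = [x, y, z]                                    # translation
    return T
```

Tracking then amounts to reporting such a transform for the marker at each detection, from which the scanner can re-orient its scan planes.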


SUMMARY

Conceptually, the present invention generally includes a motion tracking system for an object in the scanning volume of a scanner, comprising: an object orientation marker attached to the object; a detector that repeatedly detects poses of the object orientation marker; a motion tracking computer that analyzes the poses of the object orientation marker to determine motion of the object between the repeated detections and to send tracking information to the scanner to dynamically adjust scans to compensate for motion of the object.


More specifically, the invention comprises: an object orientation marker attached to the object; a camera that records repeated images; a mirror in a fixed position with respect to the scanner positioned so that the camera records repeated reflected images of the orientation marker in the mirror; a motion tracking computer that analyzes the repeated reflected images of the object orientation marker to determine motion of the object between the repeated images and to send tracking information to the scanner to dynamically adjust scans to compensate for motion of said object.


Another aspect of the invention is a process for compensating for patient motion in the scanning volume of a scanner that has a motion tracking system, without a specialized calibration tool, even if the motion tracking system is out of alignment with the scanner, comprising: recording the patient motion both in scans of the patient by the scanner and in the motion tracking system, whereby the patient motion is simultaneously recorded in the coordinate frame of the scanner and in the coordinate frame of the motion tracking system; updating the measurement coordinate transformation from the motion tracking system coordinate frame to the scanner coordinate frame to compensate for drift and other calibration inaccuracies; transforming patient motion recorded in the coordinate frame of the motion tracking system into patient motion in the coordinate frame of the scanner using the updated measurement coordinate transformation.
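Once the measurement coordinate transformation has been updated, motion recorded in the tracking system's frame is carried into the scanner's frame by conjugation. A minimal sketch, assuming 4x4 homogeneous transforms and a hypothetical co-registration matrix `X` mapping tracker coordinates to scanner coordinates:

```python
import numpy as np

def to_scanner_frame(motion_tracker, X):
    """Re-express a rigid motion, measured in the tracker frame,
    in the scanner frame.

    motion_tracker: 4x4 homogeneous transform of the subject's motion
                    as seen by the tracking system.
    X: 4x4 co-registration transform (tracker -> scanner coordinates),
       the matrix the update step keeps current against drift.
    """
    return X @ motion_tracker @ np.linalg.inv(X)
```

If `X` drifts, every transformed motion is systematically wrong, which is why the process above continuously re-estimates `X` from motion observed simultaneously in both frames.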


A general embodiment of this invention comprises an object orientation marker attached to an object; a camera that views the object orientation marker directly; a first mirror in a fixed position with respect to the scanner positioned so that the camera can view a reflected image of the object orientation marker in the first mirror, so that the camera simultaneously records repeated direct images and repeated reflected images of the object orientation marker; and a motion tracking computer that analyzes both the repeated direct images and the repeated reflected images of the object orientation marker to determine motion of the object between the repeated images and to send tracking information to the scanner to dynamically adjust scans to compensate for motion of said object; a mirror orientation marker in a fixed position with respect to the first mirror positioned so that the camera can view a direct image of the mirror orientation marker simultaneously with a reflected image in the first mirror of the object orientation marker; a motion tracking computer that analyzes repeated reflected images of the object orientation marker in the first mirror and repeated direct repeated images of the mirror orientation marker to determine motion of the object between the repeated images and to send tracking information to the scanner to dynamically adjust scans to compensate for motion of said object.


Still another preferred embodiment of the invention comprises: a camera that records repeated images; an object orientation marker attached to the object; a first mirror in a fixed position with respect to the scanner positioned so that the camera can view the object orientation marker in the first mirror; a second mirror in a fixed position with respect to the first mirror positioned so that the camera can view reflected images of the object orientation marker in the second mirror simultaneously with reflected images of the object orientation marker in the first mirror; a mirror orientation marker in a fixed position with respect to the first mirror positioned so that the camera can view direct images of the mirror orientation marker simultaneously with reflected images of the object orientation marker in both the first mirror and the second mirror; a motion tracking computer that analyzes repeated reflected images of the object in the second mirror and repeated direct images of the mirror orientation marker, to determine motion of the object between the repeated images and to send tracking information to the scanner to dynamically adjust scans to compensate for motion of said object.


An additional feature of the present invention is that the mirrors and camera can be internally calibrated by analyzing the repeated direct images and the repeated reflected images.


Optionally, patient motion can be recorded both by scans of the object by the scanner and by repeated images of the object orientation marker, so that such patient motion is recorded in coordinate frames of both the scanner and of the detector and mirrors, whereby patient motion recorded in the coordinate frame of the detector and mirrors can be transformed into patient motion in the coordinate frame of the scanner.


An additional optional feature of the invention includes prediction means to predict orientation of the object at times when scans will be taken by the scanner, including motion filtering and prediction.


Of course, the scanner can be selected from the group consisting of MR scanners, PET scanners, SPECT scanners, CT scanners and digital angiography systems.


Operably the object orientation marker indicates pose in at least 3 degrees of freedom, but preferably the object orientation marker indicates pose in 5 degrees of freedom, and optimally in 6 degrees of freedom.


Preferably, the object orientation marker is an RGR. In general terms, the invention comprises: an adaptive imaging system; a motion tracking system; and a motion filtering and prediction system; wherein the motion tracking system provides tracking information to the adaptive imaging system to dynamically adjust scans to compensate for motion of said object; and wherein the motion filtering and prediction system provides predicted pose of the object when the imaging system takes scans.


Briefly, and in general terms, the present invention provides for a system for automatic real-time correction of subject motion during long duration scans, including (but not limited to) “tomographic” (or cross-sectional) imaging, specifically MRI scans. The present invention is a motion tracking system that is MRI-compatible, highly accurate, robust, self-calibrating, has a potential time resolution in the millisecond range, and can be integrated with any existing MR technique. The adaptive MR system has 3 main components, as shown in FIG. 1: (1) RGR-based tracking system, (2) interface between tracking system and MR scanner, and (3) MR scanner providing scanning sequences that allow dynamic adjustment of geometric scanning parameters (such as slice locations and orientations). The camera-based system relies on Retro-Grate Reflectors, or RGRs, which allow accurate and fast pose measurements with a single camera and a single, relatively small marker (approximately 1 cm size). Pose updates from the tracking system are sent to the MRI scanner via the interface. Tomographic scanning methods make it possible to image multiple cross-sections (“slices”) of the body; each slice is defined by a position and rotation in space. The MR scanning sequences continuously read the pose information from the tracking system, and the slice locations and rotations are updated dynamically, such that scanning planes or volumes track the poses of the object (such as an organ) to which the target is attached. This results in scans that are virtually free of motion artifacts. Very fast movements with velocities of 100 mm/sec or greater can be corrected, which represents an approximate 10 to 100-fold improvement over current techniques.


One important component of the presently preferred embodiment of this invention is the Retro-Grate Reflector (RGR), a new tool that makes it possible to accurately determine the 3 locations and 3 rotations (“6 degrees of freedom” or “pose”) of a target from a single image. An RGR target is illustrated in FIG. 13. It is constructed by applying artwork on the front and back of a transparent substrate, such as a glass or plastic plate. The artwork includes a StarBurst landmark, shown in the center of FIG. 13, and circular landmarks. Also included are front and back gratings to produce a series of banded patterns (“moire” patterns), which are shown as light and dark fringes in FIG. 13.


The moire patterns of the RGR target are designed to be exquisitely sensitive to changes in orientation. As a result, the RGR system is able to accurately determine all 6 degrees of freedom (3 translations and 3 rotations) from a single camera image. Of course, an RGR can be used to extract less than 6 degrees of freedom.


In the context of adaptive imaging to correct for subject motion, RGR motion tracking addresses the shortcomings of stereovision by: (1) incorporating only one camera, thus removing the requirement for a significant separation between cameras; (2) interpreting moire patterns so that high accuracy can be achieved even if the object orientation marker (also referred to as a target or tag) is small; and (3) providing redundant information for use in detecting and correcting drift and other calibration inaccuracies by internal calibration.


If desired, further innovations (described below) allow for 3) simultaneous motion tracking and determination of the internal calibration, 4) use of two or more “visual paths” to avoid loss of sight during large movements, 5) a 10-fold increase in tracking accuracy compared to stereovision, and 6) continuous automatic calibration (or “auto-tuning”) of the system in order to eliminate the effect of drift and other calibration inaccuracies, such as those due to temperature changes, vibration, etc.


One innovation is to use a mirror to detect an object orientation marker. A mirror shall include any device to allow an object orientation marker to be viewed along an indirect line of sight, including, without limitation, a prism, a beam splitter, a half silvered mirror, fiber optics, and a small camera.


Another innovation is to incorporate motion filtering and prediction to improve performance of a limited-quality motion sensing means. Motion filtering refers to using information about an object's prior positions to infer its motion and thereby improve accuracy in determining pose (over methods which look only at the most recent position and ignore prior positions).





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a conceptual side elevational view of a system for RGR-based motion tracking for real-time adaptive MR imaging and spectroscopy.



FIG. 2 is a flow chart of steps for adaptive MR imaging in an alternative embodiment, incorporating RGR-based motion sensing for adaptive MR imaging.



FIG. 3 is a flow chart of steps for RGR-based adaptive MR imaging in the preferred embodiment, incorporating RGR-based adaptive MR imaging and optional motion filtering and prediction.



FIG. 4 is a flow chart of steps for adaptive MR imaging in an alternative embodiment, incorporating motion sensing by any suitable means such as MR scan analysis and optional motion filtering and prediction.



FIG. 5 is a flow chart of steps for adaptive MR imaging in an alternative embodiment in which the motion filtering is performed separately.



FIG. 6 is a side elevational view of the physical layout of a preferred embodiment of adaptive RGR-MRI configuration.



FIG. 7 is a top plan view of the embodiment of FIG. 6.



FIG. 8 is a back elevational view of the embodiment of FIG. 6.



FIG. 9 is a camera view, showing the mirrors and object orientation markers in the camera in the embodiment of FIG. 6, and also showing placement of optional RGRs on mirrors.



FIG. 10 is a conceptual diagram illustrating that motion of the subject can be determined in both the coordinate frames of the motion tracking system and of the MR machine.



FIG. 11 is a conceptual flow chart illustrating a system for continuous tuning (“Auto-tuning”) of the co-registration transformation between a Motion Tracking system and a Medical Imaging system.



FIG. 12 is a flow chart of steps for Auto-tuning for automatic and continuous adjustment of the Co-registration Transformation between a Motion Tracking system and a Medical Imaging system.



FIG. 13 is a drawing of an RGR target.





DETAILED DESCRIPTION


FIGS. 1 and 2 illustrate the essential elements of the presently preferred embodiments of a system for motion tracking for real-time adaptive imaging and spectroscopy. The best modes are illustrated by way of example using a patient in an MR scanner and RGR object orientation marker, but of course, other objects can be scanned besides patients, other scanners can be used besides MR scanners, and other object orientation markers can be used besides RGRs.


As shown in FIG. 1, a patient P is imaged in a scanning volume V inside an MR scanner magnet 20. An RGR tag or target 30 is affixed to the patient P near the organ of interest being scanned (e.g., the head). A detector, such as a camera 40 (the “RGR Camera”) outside the scanner magnet 20 observes the RGR target 30, either directly or optionally via one or more mirrors on the wall of the scanner bore or in some other convenient location (not shown). As also shown in FIG. 2, the RGR Camera 40 is connected to the RGR Processing Computer 50. The RGR Processing Computer 50 performs several functions, including analyzing images 60 of the RGR to produce RGR Motion Information. Additionally, an accurate clock in the RGR Processing Computer 50 produces Timing Information related to the RGR Motion Information to provide Motion and Timing Information 70.


A Scanner Control and Processing Computer 100 is connected to the MR Scanner 120 and also to the RGR Processing Computer 50. RGR Motion and Timing Information 70 is passed from the RGR Processing Computer 50 to the Scanner Control and Processing Computer 100. In one embodiment, Timing Information related to the MR scan (Scanner Timing Information) is produced by the Scanner Control and Processing Computer 100 and passed to the RGR Processing Computer 50 with a request for RGR Motion Information. The RGR Processing Computer 50 uses the Scanner Timing Information in conjunction with the RGR Motion Information and RGR Timing Information to produce Motion Information at time instants determined by the Scanner Control and Processing Computer 100. Both the scanner and the motion tracking system have inherent lag times between acquiring an image and completing the image, due to computation delays and other factors. The motion tracking system's lag time in acquiring images may be on the order of milliseconds, but the scanner's lag time in acquiring images may be on the order of seconds to minutes.


The Scanner Control and Processing Computer 100 utilizes RGR Motion Information from the RGR Processing Computer 50 and makes calculations to adapt the MR Pulse Sequence (the sequence of pulses used to acquire tomographic images) to the motion information. The adapted MR Pulse Sequence parameters are used to drive the MR Scanner 120.



FIG. 3 provides a flow chart of the steps of the preferred embodiment of RGR based adaptive MR imaging and spectroscopy using optional motion filtering and prediction. System elements of RGR Camera, RGR Lighting and RGR target are used to obtain RGR Motion Tracking Images. The RGR Images are passed to the RGR Processing Computer where they are analyzed, which produces RGR Motion and RGR Timing Information. This information is optionally passed to a Motion Filtering and Prediction routine, which also receives Scanner Timing Information in the form of time values for future instants at which the Scanner Control and Processing Computer will apply Motion Information. The Motion Filtering and Prediction element analyzes a plurality of recent RGR Motion and Timing Information as well as Scanner Timing Information to produce Adjusted Motion Information, which is the best estimate of the subject's pose at the future time indicated in the Scanner Timing Information. The Adjusted Motion Information corresponding to the Scanner Timing Information is passed to the Scan Control and MR Pulse Sequence Generation element.


The Scan Control and MR Pulse Sequence Generation element receives Adjusted Motion Information for corresponding Scanner Timing Information and generates Adapted Pulse Sequence Parameters, which are executed on the MR Scanner, thus realizing RGR-based adaptive MR imaging and spectroscopy.


Essentially, the motion tracking information is used to predict the change in pose of the patient due to movement, and the predicted pose is sent to the scanner, which then dynamically adjusts the pose of each scan plane or volume to compensate for the patient's movement.


Comparing the flow chart of FIG. 2 with the flow chart of FIG. 3, in the preferred embodiment, the "Motion Filtering and Prediction" routines run on the RGR Processing Computer 50, and there is no separate computer for the optional motion filtering and prediction calculations, which impose relatively little computational burden. In alternative embodiments, the Motion Filtering and Prediction routines could run on a separate computer (or hardware or software), or on the Scanner Control and Processing Computer.



FIG. 4 illustrates an alternative embodiment of the invention. In this embodiment, any suitable motion and timing sensing means is used, including, but not limited to, motion sensing by image analysis, as is known in the prior art, such as commercially available Stereo Vision systems. The innovation in this embodiment is to employ a Motion Filtering and Prediction element to analyze a plurality of recent RGR Motion and Timing Information as well as Scanner Timing information to produce Adjusted Motion Information, which is the best estimate of the subject pose at the time indicated in the Scanner Timing Information. The Adjusted Motion Information is passed to the Scan Control and MR Pulse Sequence Generation element.


The Scan Control and MR Pulse Sequence Generation element receives Adjusted Motion Information and generates Adapted Pulse Sequence Parameters, which are sent to the MR Scanner and executed, thus realizing RGR-based adaptive MR imaging and spectroscopy.


Yet another embodiment is illustrated in FIG. 5. In this alternative embodiment the Motion Filtering calculations are executed by a Motion Tracking system computer, and the Motion Filter State and Timing Information are transferred to the Scanner Control and Processing Computer. The Prediction portion of the Motion Filtering and Prediction algorithm utilizes the Motion Filter State and Timing Information, as well as Scanner Timing Information that is internal to the Scanner Control and Processing Computer, to predict the subject pose at the time indicated in the Scanner Timing Information.



FIGS. 6 to 8 show various views of the presently preferred embodiment of the RGR-based adaptive MR imaging and spectroscopy system. Each view illustrates the relationship of the scanning volume V (here, the bore of an MR Scanner magnet), detector (here, a camera 40) and object orientation marker 30 (preferably an RGR tag, target or marker). The camera 40 is preferably outside and behind the scanner magnet 20.


Also seen in the figures are optional mirrors M1 and M2, each with or without a separate optional RGR, which are used to allow the camera 40 to be placed outside a direct line of sight with the object orientation marker 30, to avoid blockage and for other reasons. Considering the openings that are typically available in the coil surrounding the subject's head during MR scans, the top-position point-of-view offers superior measurement accuracy. FIG. 6 also shows the position of the origin O of the medical imaging coordinate frame.


In one preferred embodiment of the invention, if the patient requires a brain or head scan, one RGR target 30 (the “mobile RGR tag”) is affixed to the side of the nose of the patient. This particular location has the advantage of being relatively immobile during head movements. However, a person knowledgeable in the art will recognize that the mobile RGR tag may also be affixed to other parts of the body.


In one preferred embodiment of the invention, a single mirror is used to observe the mobile RGR target from the camera. In another preferred embodiment of the invention, a mirror orientation marker (a “stationary marker”), preferably an RGR tag, is mounted on the single mirror. This mirror RGR tag is directly visible from the camera, and is being analyzed continuously in addition to the mobile RGR on the organ of interest. Analyzing the pose of the mirror RGR makes it possible to ensure the “internal calibration” of the RGR tracking system, i.e. to ensure the relative position of the camera and mirror are known accurately.


In yet another embodiment of the invention, two or more mirrors are used to observe the mobile RGR from the camera. The mirrors are arranged such that the reflected image of the mobile RGR is visible to the camera in all of them. Having two or more mirrors makes it possible to observe the mobile RGR on the patient, and determine the patient pose, even if one of the views is obstructed.


In another preferred embodiment of the invention, a single camera observes the mobile RGR on the subject directly as well as indirectly, creating two lines of sight. The camera is pointed towards a semi-transparent mirror (or prism) that splits the optical path into two. The direct, non-reflective optical path is pointed towards the mobile RGR, allowing a direct line of sight. The reflective optical path leads towards a second mirror or prism (fully reflective), and is redirected towards the RGR. One or both of the two mirrors or prisms can be equipped with RGRs, to enable internal calibration. This configuration allows mounting of the camera inside the MRI scanner bore, and provides the same advantages as the two-mirror/stationary RGR system disclosed herein.


In yet another embodiment of the invention, a single-camera is pointing directly towards the mobile RGR. However, half the field-of-view of the camera is obstructed by a mirror or prism. The reflected optical path leads towards a second mirror or prism that redirects the optical path towards the RGR. One or both of the two mirrors or prisms can be equipped with RGRs, to enable internal calibration. This configuration allows mounting of the camera inside the MRI scanner bore, and provides the same advantages as the two-mirror/stationary RGR system disclosed herein.


In another preferred embodiment of the invention, additional mirror orientation markers, preferably stationary RGR tags, are mounted on each of two or more mirrors, or on brackets holding one or more of the mirrors. The mirrors and stationary RGR tags are arranged such that the mobile RGR tag and all the stationary RGR tags are visible from the camera. All stationary RGR tags, as well as the mobile RGR tag on the patient, are analyzed continuously. It would be expected that the accuracy of optical measurements would suffer as more optical elements are introduced into the measurement system, because of the need to maintain more elements in alignment. However, by analyzing all the information from all RGRs simultaneously, this particular embodiment of the invention results in a dramatic and unexpected improvement in accuracy of the tracking system, such that the tracking accuracy is unexpectedly approximately 10-fold greater than that of a conventional stereo-vision system.


In another embodiment of this RGR-based adaptive MR imaging and spectroscopy system, the tracking camera is installed inside the MR magnet and observes the mobile RGR target either directly or via one or more mirrors (each with or without its own stationary RGR). In this instance, the camera needs to be shielded to avoid interference with the MR measurement system.



FIG. 9 exemplifies an RGR camera view which would be typical in the preferred embodiment with two mirrors M1 and M2. Optionally, mirror orientation markers 200A and 200B can be attached to the mirrors M1 and M2. The RGR Camera is arranged to produce an image of the mirrors, and the mirrors are arranged so that the mobile RGR tag is reflected in both of the mirrors and two reflected images of the mobile RGR tag 30R1 and 30R2 are visible to the camera. Two (or more) mirrors are used to obtain multiple views of the RGR target in a single image. Optionally, the mirror orientation markers 200A and 200B also can be viewed directly by the camera.


While the use of two or more mirrors, each with its optional associated stationary mirror RGR, may seem more cumbersome and error-prone than a single-mirror configuration, it provides several important and unexpected advantages. First, the multiple views of the mobile RGR target provide multiple lines of sight. One advantage of obtaining multiple views of the RGR target is that at least one view will remain clear and available for motion tracking, even if another view is obscured. A view can be obscured by, for example, a portion of the head coil that surrounds the head of the subject during functional MR scanning. A second advantage of obtaining multiple views of the mobile RGR target is an unexpected and dramatic improvement in the accuracy of the motion tracking system, such that the 2-mirror system is approximately 10 times more accurate than a stereovision tracking system. Therefore, a multi-mirror multi-RGR system provides substantial advantages that cannot be reproduced with other typical motion tracking systems, such as a stereovision system.


Yet another preferred embodiment of the invention involves a combination of any of the embodiments of the RGR-based tracking system described above, with a system that makes it possible to automatically and continuously calibrate the RGR-tracking system (“auto-tuning”), in order to eliminate the effect of drift and other calibration inaccuracies in the camera system. As noted above, because the required co-registration accuracy (between the Medical imaging system and the tracking system) is very high (on the order of 0.1 mm and 0.1 degree for Medical Imaging) and because the elements of prior art measurement systems can be widely separated (for example, by several meters for Magnetic Resonance imaging), thermal drift, vibration and other phenomena can cause the alignment (“co-registration”) between the motion tracking system coordinate frame c and scanning system coordinate frame M to change over time. The prior art has no means to track or correct for these slow changes while the medical imaging system is in service, imaging patients. The error which accumulates in the co-registration is a severe problem for motion compensation in medical imaging using an external motion tracking system. Time on a medical imaging system is limited and expensive, and removing patients and conducting periodic recalibration with a specialized calibration tool or target is prohibitively expensive.



FIG. 10 illustrates the coordinate frames of a system for real-time adaptive Medical Imaging. The system comprises a Motion Tracking System (preferably tracking motion in real time), such as the RGR tracking system, which produces timely measurements of the subject pose within a motion tracking coordinate frame ‘c’.


Simultaneously, the subject is imaged by a Medical Imaging system, such as an MR Scanner, which operates within a medical imaging coordinate frame ‘M’. Improved medical images are obtained if (real-time) Motion Information is available to the Medical Imaging system, but the Motion Information must be accurately translated (or transformed) from the real-time motion tracking system (coordinate frame ‘c,’) to the coordinate frame ‘M’ of the Medical Imaging system. The motion tracking system is considered “calibrated” with respect to the MR system if the mathematical transformation leading from one coordinate system to the other coordinate system is known. However, the calibration (or alignment) of the two coordinate systems can be lost, introducing inaccuracies, due to drift over time because of various factors, including heat and vibration.


Motion Information is transformed from frame ‘c’ to frame ‘M’ by a “coordinate transformation matrix”, or “Co-registration transformation Tc←M.” The “coordinate transformation matrix” converts or transforms motion information from one coordinate frame to another, such as from the motion tracking coordinate frame c to the medical imaging coordinate frame M. Loss of calibration due to drift, as well as other calibration inaccuracies, will result in a change over time of the coordinate transformation matrix, which in turn will lead to errors in the tracking information.
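As an illustrative sketch of this bookkeeping (the function names below are hypothetical, and the patent's Tc←M is written here as `T_cM`), the co-registration transformation can be represented as a 4×4 homogeneous matrix, with points mapped directly and rigid motions mapped by conjugation:

```python
import numpy as np

def point_c_to_M(T_cM, p_c):
    """Map a 3-D point from tracking frame c to imaging frame M."""
    return (T_cM @ np.append(p_c, 1.0))[:3]

def motion_c_to_M(T_cM, P_c):
    """A rigid motion P observed in frame c is the conjugated motion
    T P T^-1 when expressed in frame M."""
    return T_cM @ P_c @ np.linalg.inv(T_cM)
```

Any error in `T_cM` propagates directly into the transformed motion, which is why drift in the co-registration degrades the motion compensation.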


U.S. Pat. No. 6,044,308, incorporated herein by reference, describes the AX=XB method of coordinate transformations. This patent teaches the use of the AX=XB method for determining the transformation from a tool coordinate frame to a robot coordinate frame, where the tool moves with the end effector of the robot. In the present system, the co-registration between the Motion Tracking system and the Medical Imaging system can vary (over the course of many hours or days) due to temperature changes, vibrations and other effects. This variation introduces error into the Transformed Real-time Motion Information for real-time adaptive Medical Imaging.



FIG. 11 illustrates the elements of an embodiment of the system for Auto-tuning for automatic and continuous determination of the co-registration transformation between a Motion Tracking system and a Medical Imaging system. A patient P is imaged inside a Medical Imaging system comprising a medical imaging device 220 and a Medical Imaging and Control & Processing Element 240. Simultaneously, a Motion Tracking system comprising a motion tracking detector 250 and a motion tracking processing element, such as any embodiment of the RGR-tracking system, makes real-time motion measurements. Using the co-registration transformation Tc←M, the real-time Motion Information is transformed from the Motion Tracking system coordinate frame to the Medical Imaging system coordinate frame. Concurrent with the processes described above, Delayed Medical Image Motion Information 260 and Delayed Motion Tracking Motion Information 270 are supplied to the Co-registration Auto-tuning Element 280. This information is delayed because the Medical Image Motion Information is only available in delayed form, and typically much less frequently than the information from the tracking system. For instance, ultra-fast MRI scanning sequences, such as echo planar imaging (EPI), make it possible to scan the entire head, or other organs of interest, every few seconds. From each of these volumetric data sets, it is possible to determine head position and rotation with a time resolution of a few seconds. Alternatively, navigator scans can provide position information a few times each second. Displacements of the subject are recorded from both sources of Motion Information, i.e. from the RGR motion tracking system, as well as from the MRI scanner, e.g. by registration of EPI volumes or navigator scans.
By comparing these measured displacements, the Co-registration Auto-tuning Element adjusts the coordinate transformation matrix Tc←M to compensate for changes in the co-registration of the Motion Tracking system and the Medical Imaging system. The updated value 290 of the coordinate transformation matrix Tc←M is repeatedly generated and supplied to the Motion Tracking system for use in transforming the Real-time Motion Information to Medical Imaging system coordinates 300.


In the preferred embodiment of the auto-tuning system, each of the three processing elements is implemented as computer software running on a separate computer. Those skilled in the art of real-time computer systems will see that other configurations are possible, such as all processing elements running on a single computer, or two or more computers working in coordination to realize one of the processing elements.


With automatic and continuous tuning of the co-registration transformation, the real-time Motion Information produced by the Motion Tracking System is accurately transformed into Medical Imaging system coordinates, so as to be usable by the Medical Imaging system for real-time adaptive Medical Imaging, even in the presence of inevitable drift and other calibration inaccuracies arising from variations over time of the relative position and orientation of the Motion Tracking and Medical Imaging coordinate frames.



FIG. 12 provides a flow chart of the steps for Auto-tuning for automatic and continuous co-registration of a Motion Tracking system (for instance any embodiment of the RGR-tracking system described above), with a Medical Imaging system. The Medical Imaging system obtains Medical Images. These are analyzed by post processing using prior art methods to produce Delayed Medical Image Motion Information in the form of the measured displacement of the imaging subject (e.g., the patient's head) between two times, tk1 and tk2. This displacement is measured in the Medical Imaging system coordinate frame.


Concurrently, the Motion Tracking system is used to obtain real-time Motion Information, which may be transformed into the Medical Imaging system coordinates to provide for real-time adaptive Medical Imaging. The Motion Tracking Motion Information is also stored in a buffer. Past values of the Motion Tracking Motion Information from the buffer are used to determine a second displacement of the imaging subject as detected by the Motion Tracking system, between the two previously mentioned times, tk1 and tk2. This second displacement is measured in the Motion Tracking system coordinate frame.
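A minimal sketch of this step (the buffer layout and nearest-sample lookup below are assumptions; a real system would interpolate buffered poses to the exact times tk1 and tk2):

```python
import numpy as np

def displacement_between(pose_buffer, t1, t2):
    """Relative rigid motion of the subject between times t1 and t2,
    from a buffer mapping timestamps to 4x4 homogeneous poses."""
    times = np.array(sorted(pose_buffer))
    def nearest(t):
        # Pick the buffered sample closest in time to t.
        return pose_buffer[times[np.argmin(np.abs(times - t))]]
    P1, P2 = nearest(t1), nearest(t2)
    return P2 @ np.linalg.inv(P1)   # carries the pose at t1 to the pose at t2
```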


The displacement determined by post processing of the Medical Images and the displacement determined from the buffered Motion Tracking Motion Information are passed to the registration routine based on the "AX=XB" methodology, which is known in the prior art. See, for example: Park, F. C. and B. J. Martin, "Robot Sensor Calibration: Solving AX=XB on the Euclidean Group", IEEE Transactions on Robotics and Automation, 1994, 10(5): 717-721; Angeles, J., G. Soucy, and F. P. Ferrie, "The online solution of the hand-eye problem", IEEE Transactions on Robotics and Automation, 2000, 16(6): 720-731; Chou, J. C. K. and M. Kamel, "Finding the Position and Orientation of a Sensor on a Robot Manipulator Using Quaternions", The International Journal of Robotics Research, 1991, 10: 240-254; Shiu, Y. C. and S. Ahmad, "Calibration of Wrist-Mounted Robotic Sensors by Solving Homogeneous Transform Equations of the Form AX=XB", IEEE Transactions on Robotics and Automation, 1989, 5: 16-29; Tsai, R. Y. and R. K. Lenz, "A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration", IEEE Journal of Robotics and Automation, 1989, 3: 345-358; Wang, C. C., "Extrinsic Calibration of a Vision Sensor Mounted on a Robot", IEEE Transactions on Robotics and Automation, 1992, 8: 161-175; all of which are incorporated herein by reference.
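The AX=XB step can be sketched as follows. This is an illustrative least-squares implementation, not the exact closed forms of the cited papers: the rotation is fitted by orthogonal Procrustes over matrix logarithms, and at least two motion pairs with non-parallel rotation axes are required for a unique solution.

```python
import numpy as np

def rot_log(R):
    """Axis-angle vector (matrix logarithm) of a 3x3 rotation matrix."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return (theta / (2.0 * np.sin(theta))) * w

def solve_ax_xb(A_list, B_list):
    """Least-squares X solving A_i X = X B_i for 4x4 homogeneous transforms.

    A_i: subject displacements as seen by the Medical Imaging system;
    B_i: the same displacements as seen by the Motion Tracking system;
    X:   the co-registration transform relating the two frames.
    """
    # Rotation part: log(R_A) = R_X log(R_B) R_X^T, i.e. alpha_i = R_X beta_i,
    # solved as an orthogonal Procrustes problem via the SVD.
    C = sum(np.outer(rot_log(A[:3, :3]), rot_log(B[:3, :3]))
            for A, B in zip(A_list, B_list))
    U, _, Vt = np.linalg.svd(C)
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    # Translation part: (R_A - I) t_X = R_X t_B - t_A, stacked over all pairs.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    b = np.concatenate([R @ B[:3, 3] - A[:3, 3] for A, B in zip(A_list, B_list)])
    t, *_ = np.linalg.lstsq(M, b, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R, t
    return X
```

Repeatedly re-solving with fresh displacement pairs, and folding the result into the current Tc←M, is the essence of the auto-tuning loop described above.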


Using this method, the co-registration Tc←M is updated. Therefore, by continuously updating the co-registration information, gradual and inevitable drifts and other calibration inaccuracies in the alignment of the Motion Tracking system and the Medical Imaging system coordinate frames are corrected and accurate adaptive compensation for subject motion is achieved even in the presence of drift and other calibration inaccuracies in the equipment.


Persons knowledgeable in the art will recognize that the auto-tuning technique described in this disclosure may also utilize motion information from multiple (more than 2) time points, for instance in the form of filtering, which will generally increase the accuracy of the auto-tuning procedure.


Persons knowledgeable in the art will recognize that the techniques described in this disclosure may also be applied to medical imaging techniques other than MRI, such as PET, SPECT, CT, or angiographic scanning.


The optimal embodiment of the RGR-based adaptive motion compensation system involves (1) the RGR system shown in FIGS. 6-9, (2) two or more observation mirrors, each optionally with its own stationary RGR, and (3) the auto-tuning system.


While the present invention has been disclosed in connection with the presently preferred best modes described herein, it should be understood that there are other embodiments which a person of ordinary skill in the art to which this invention relates would readily understand are within the scope of this invention. For example, the present invention shall not be limited by software, specified scanning methods, target tissues, or objects. For a further example, instead of using a camera or other optical imaging device to determine an object's pose, alternative detectors of pose can be used, including non-imaging detectors and non-optical detectors, such as magnetic detectors or polarized light detectors. Accordingly, no limitations are to be implied or inferred in this invention except as specifically and explicitly set forth in the attached claims.


INDUSTRIAL APPLICABILITY

This invention can be used whenever it is desired to compensate for motion of a subject, especially while taking a long duration scan.

Claims
  • 1. A motion tracking system for an object in a magnetic resonance imaging scanner, the system comprising: an optical landmark on the object;at least two detectors configured to periodically image the optical landmark, the at least two detectors each configured to view the optical landmark along a different line of sight; anda tracking system configured to analyze images generated by the at least two detectors to determine changes in position of the optical landmark, to generate tracking data for use by the magnetic resonance imaging scanner to dynamically adjust scans to compensate for the changes in position of the optical landmark, and to store the tracking data for processing;wherein the tracking system comprises a computer processor and an electronic memory.
  • 2. The motion tracking system of claim 1, wherein the tracking system is further configured to predict changes in position of the optical landmark by analyzing a plurality of previously determined changes in position of the optical landmark.
  • 3. The motion tracking system of claim 1, wherein each of the at least two detectors comprises a camera.
  • 4. The motion tracking system of claim 1, wherein the optical landmark indicates orientation in at least three degrees of freedom.
  • 5. The motion tracking system of claim 1, wherein the optical landmark is an RGR tag.
  • 6. The motion tracking system of claim 1, wherein the optical landmark is affixed to the object.
  • 7. The motion tracking system of claim 1, wherein at least one of the at least two detectors is configured to be positioned within the magnetic resonance imaging scanner.
  • 8. The motion tracking system of claim 1, wherein each of the at least two detectors is configured to be positioned within the magnetic resonance imaging scanner.
  • 9. The motion tracking system of claim 1, further comprising at least one mirror configured to provide an indirect line of sight between the optical landmark and at least one of the at least two detectors.
  • 10. The motion tracking system of claim 1, wherein the at least two detectors are configured to be positioned to enable at least one detector to view the optical landmark when the sight line from another detector to the optical landmark is obstructed.
  • 11. A motion tracking system for an object in a medical imaging scanner, the system comprising: an optical landmark on the object;a detector configured to periodically image the optical landmark, the detector configured to be positioned within the medical imaging scanner; anda tracking system configured to analyze images generated by the detector to determine changes in position of the optical landmark, and to generate tracking data for use by the medical imaging scanner to dynamically adjust scans to compensate for the changes in position of the optical landmark, and to store the tracking data for processing,wherein the tracking system comprises a computer processor and an electronic memory.
  • 12. The motion tracking system of claim 11, wherein the medical imaging scanner comprises a computer tomography (CT) scanner, an MR scanner, a PET scanner, a SPECT scanner, or a digital angiographic scanner.
  • 13. The motion tracking system of claim 11, wherein the tracking system is configured to analyze images generated by the detector to determine changes in position of the optical landmark and to automatically calibrate the tracking system.
  • 14. The motion tracking system of claim 11, wherein the detector is a camera.
  • 15. The motion tracking system of claim 11, wherein the optical landmark is an RGR tag.
  • 16. The motion tracking system of claim 11, wherein the optical landmark indicates orientation in at least three degrees of freedom.
  • 17. The motion tracking system of claim 11, wherein the optical landmark is affixed to the object.
  • 18. The motion tracking system of claim 11, wherein the detector is configured to be positioned within the medical imaging scanner.
  • 19. The motion tracking system of claim 18, wherein the medical imaging scanner comprises a computer tomography (CT) scanner, an MR scanner, a PET scanner, a SPECT scanner, or a digital angiographic scanner.
  • 20. The motion tracking system of claim 11, further comprising a first mirror configured to split a sight line of the detector into a first path and a second path; wherein the first path is directed directly at the optical landmark;wherein the second path is directed toward a second mirror; andwherein the second mirror is configured to deflect the second path toward the optical landmark.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/828,299, titled MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE IMAGING AND SPECTROSCOPY and filed Aug. 17, 2015, which is a continuation of U.S. patent application Ser. No. 14/698,350, titled MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE IMAGING AND SPECTROSCOPY and filed Apr. 28, 2015 and now U.S. Pat. No. 9,138,175, which is a continuation of U.S. patent application Ser. No. 14/034,252, titled MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE IMAGING AND SPECTROSCOPY and filed Sep. 23, 2013 and now U.S. Pat. No. 9,076,212, which is a continuation of U.S. patent application Ser. No. 13/735,907, titled MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE IMAGING AND SPECTROSCOPY and filed Jan. 7, 2013 and now U.S. Pat. No. 8,571,293, which is a continuation of U.S. patent application Ser. No. 13/338,166, titled MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE IMAGING AND SPECTROSCOPY and filed Dec. 27, 2011, and now U.S. Pat. No. 8,374,411, which is a continuation of U.S. patent application Ser. No. 11/804,417, titled MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE IMAGING AND SPECTROSCOPY and filed May 18, 2007, and now U.S. Pat. No. 8,121,361, which claims priority to U.S. Provisional Application No. 60/802,216, titled MRI MOTION ACCOMMODATION and filed May 19, 2006. The foregoing applications are hereby incorporated herein by reference in their entirety.

STATEMENT REGARDING FEDERALLY SPONSORED R&D

This invention was made with government support under Grant numbers 5K02 DA016991 and 5R01 DA021146 awarded by the National Institutes of Health. The government has certain rights in the invention.

US Referenced Citations (809)
Number Name Date Kind
3811213 Eaves May 1974 A
4689999 Shkedi Sep 1987 A
4724386 Haacke et al. Feb 1988 A
4894129 Leiponen et al. Jan 1990 A
4923295 Sireul et al. May 1990 A
4953554 Zerhouni et al. Sep 1990 A
4988886 Palum et al. Jan 1991 A
5075562 Greivenkamp et al. Dec 1991 A
5318026 Pelc Jun 1994 A
5515711 Hinkle May 1996 A
5545993 Taguchi et al. Aug 1996 A
5615677 Pelc et al. Apr 1997 A
5687725 Wendt Nov 1997 A
5728935 Czompo Mar 1998 A
5802202 Yamada et al. Sep 1998 A
5808376 Gordon et al. Sep 1998 A
5835223 Zawemer et al. Nov 1998 A
5877732 Ziarati Mar 1999 A
5886257 Gustafson et al. Mar 1999 A
5889505 Toyama Mar 1999 A
5891060 McGregor Apr 1999 A
5936722 Armstrong et al. Aug 1999 A
5936723 Schmidt et al. Aug 1999 A
5947900 Derbyshire et al. Sep 1999 A
5987349 Schulz Nov 1999 A
6016439 Acker Jan 2000 A
6031888 Ivan et al. Feb 2000 A
6044308 Huissoon Mar 2000 A
6057680 Foo et al. May 2000 A
6057685 Zhou May 2000 A
6061644 Leis May 2000 A
6088482 He Jul 2000 A
6144875 Schweikard et al. Nov 2000 A
6175756 Ferre Jan 2001 B1
6236737 Gregson et al. May 2001 B1
6246900 Cosman et al. Jun 2001 B1
6279579 Riaziat et al. Aug 2001 B1
6285902 Kienzle, III et al. Sep 2001 B1
6289235 Webber Sep 2001 B1
6292683 Gupta et al. Sep 2001 B1
6298262 Franck et al. Oct 2001 B1
6381485 Hunter et al. Apr 2002 B1
6384908 Schmidt et al. May 2002 B1
6390982 Bova et al. May 2002 B1
6402762 Hunter et al. Jun 2002 B2
6405072 Cosman Jun 2002 B1
6421551 Kuth et al. Jul 2002 B1
6467905 Stahl et al. Oct 2002 B1
6474159 Foxlin et al. Nov 2002 B1
6484131 Amoral-Moriya et al. Nov 2002 B1
6490475 Seeley et al. Dec 2002 B1
6501981 Schweikard et al. Dec 2002 B1
6587707 Nehrke et al. Jul 2003 B2
6621889 Mostafavi Sep 2003 B1
6650920 Schaldach et al. Nov 2003 B2
6662036 Cosman Dec 2003 B2
6687528 Gupta et al. Feb 2004 B2
6690965 Riaziat et al. Feb 2004 B1
6711431 Sarin et al. Mar 2004 B2
6731970 Schlossbauer et al. May 2004 B2
6758218 Anthony Jul 2004 B2
6771997 Schaffer Aug 2004 B2
6794869 Brittain Sep 2004 B2
6856827 Seeley et al. Feb 2005 B2
6856828 Cossette et al. Feb 2005 B2
6876198 Watanabe et al. Apr 2005 B2
6888924 Claus et al. May 2005 B2
6891374 Brittain May 2005 B2
6892089 Prince et al. May 2005 B1
6897655 Brittain et al. May 2005 B2
6913603 Knopp et al. Jul 2005 B2
6937696 Mostafavi Aug 2005 B1
6959266 Mostafavi Oct 2005 B1
6973202 Mostafavi Dec 2005 B2
6980679 Jeung et al. Dec 2005 B2
7007699 Martinelli et al. Mar 2006 B2
7024237 Bova et al. Apr 2006 B1
7107091 Jutras et al. Sep 2006 B2
7110805 Machida Sep 2006 B2
7123758 Jeung et al. Oct 2006 B2
7171257 Thomson Jan 2007 B2
7173426 Bulumulla et al. Feb 2007 B1
7176440 Cofer et al. Feb 2007 B2
7191100 Mostafavi Mar 2007 B2
7204254 Riaziat et al. Apr 2007 B2
7209777 Saranathan et al. Apr 2007 B2
7209977 Acharya et al. Apr 2007 B2
7260253 Rahn et al. Aug 2007 B2
7260426 Schweikard et al. Aug 2007 B2
7295007 Dold Nov 2007 B2
7313430 Urquhart et al. Dec 2007 B2
7327865 Fu et al. Feb 2008 B2
7348776 Aksoy et al. Mar 2008 B1
7403638 Jeung et al. Jul 2008 B2
7494277 Setala Feb 2009 B2
7498811 Macfarlane et al. Mar 2009 B2
7502413 Guillaume Mar 2009 B2
7505805 Kuroda Mar 2009 B2
7535411 Falco May 2009 B2
7551089 Sawyer Jun 2009 B2
7561909 Pai et al. Jul 2009 B1
7567697 Mostafavi Jul 2009 B2
7573269 Yao Aug 2009 B2
7602301 Stirling et al. Oct 2009 B1
7603155 Jensen Oct 2009 B2
7623623 Raanes et al. Nov 2009 B2
7657300 Hunter et al. Feb 2010 B2
7657301 Mate et al. Feb 2010 B2
7659521 Pedroni Feb 2010 B2
7660623 Hunter et al. Feb 2010 B2
7668288 Conwell et al. Feb 2010 B2
7689263 Fung et al. Mar 2010 B1
7702380 Dean Apr 2010 B1
7715604 Sun et al. May 2010 B2
7742077 Sablak et al. Jun 2010 B2
7742621 Hammoud et al. Jun 2010 B2
7742804 Faul et al. Jun 2010 B2
7744528 Wallace et al. Jun 2010 B2
7760908 Curtner et al. Jul 2010 B2
7766837 Pedrizzetti et al. Aug 2010 B2
7769430 Mostafavi Aug 2010 B2
7772569 Bewersdorf et al. Aug 2010 B2
7787011 Zhou et al. Aug 2010 B2
7787935 Dumoulin et al. Aug 2010 B2
7791808 French et al. Sep 2010 B2
7792249 Gertner et al. Sep 2010 B2
7796154 Senior et al. Sep 2010 B2
7798730 Westerweck Sep 2010 B2
7801330 Zhang et al. Sep 2010 B2
7805987 Smith Oct 2010 B1
7806604 Bazakos et al. Oct 2010 B2
7817046 Coveley et al. Oct 2010 B2
7817824 Liang et al. Oct 2010 B2
7819818 Ghajar Oct 2010 B2
7825660 Yui et al. Nov 2010 B2
7833221 Voegele Nov 2010 B2
7834846 Bell Nov 2010 B1
7835783 Aletras Nov 2010 B1
7839551 Lee et al. Nov 2010 B2
7840253 Tremblay et al. Nov 2010 B2
7844094 Jeung et al. Nov 2010 B2
7844320 Shahidi Nov 2010 B2
7850526 Zalewski et al. Dec 2010 B2
7860301 Se et al. Dec 2010 B2
7866818 Schroeder et al. Jan 2011 B2
7868282 Lee et al. Jan 2011 B2
7878652 Chen et al. Feb 2011 B2
7883415 Larsen et al. Feb 2011 B2
7889907 Engelbart et al. Feb 2011 B2
7894877 Lewin et al. Feb 2011 B2
7902825 Bammer et al. Mar 2011 B2
7907987 Dempsey Mar 2011 B2
7908060 Basson et al. Mar 2011 B2
7908233 Angell et al. Mar 2011 B2
7911207 Macfarlane et al. Mar 2011 B2
7912532 Schmidt et al. Mar 2011 B2
7920250 Robert et al. Apr 2011 B2
7920911 Hoshino et al. Apr 2011 B2
7925066 Ruohonen et al. Apr 2011 B2
7925549 Looney et al. Apr 2011 B2
7931370 Prat Bartomeu Apr 2011 B2
7944354 Kangas et al. May 2011 B2
7944454 Zhou et al. May 2011 B2
7945304 Feinberg May 2011 B2
7946921 Ofek et al. May 2011 B2
7962197 Rioux et al. Jun 2011 B2
7971999 Zinser Jul 2011 B2
7977942 White Jul 2011 B2
7978925 Souchard Jul 2011 B1
7988288 Donaldson Aug 2011 B2
7990365 Marvit et al. Aug 2011 B2
8005571 Sutherland et al. Aug 2011 B2
8009198 Alhadef Aug 2011 B2
8019170 Wang et al. Sep 2011 B2
8021231 Walker et al. Sep 2011 B2
8022982 Thorn Sep 2011 B2
8024026 Groszmann Sep 2011 B2
8031909 Se et al. Oct 2011 B2
8031933 Se et al. Oct 2011 B2
8036425 Hou Oct 2011 B2
8041077 Bell Oct 2011 B2
8041412 Glossop et al. Oct 2011 B2
8048002 Ghajar Nov 2011 B2
8049867 Bridges et al. Nov 2011 B2
8055020 Meuter et al. Nov 2011 B2
8055049 Stayman et al. Nov 2011 B2
8060185 Hunter et al. Nov 2011 B2
8063929 Kurtz et al. Nov 2011 B2
8073197 Xu et al. Dec 2011 B2
8077914 Kaplan Dec 2011 B1
8085302 Zhang et al. Dec 2011 B2
8086026 Schulz Dec 2011 B2
8086299 Adler et al. Dec 2011 B2
RE43147 Aviv Jan 2012 E
8094193 Peterson Jan 2012 B2
8095203 Wright et al. Jan 2012 B2
8095209 Flaherty Jan 2012 B2
8098889 Zhu et al. Jan 2012 B2
8113991 Kutliroff Feb 2012 B2
8116527 Sabol Feb 2012 B2
8121356 Friedman Feb 2012 B2
8121361 Ernst et al. Feb 2012 B2
8134597 Thorn Mar 2012 B2
8135201 Smith et al. Mar 2012 B2
8139029 Boillot Mar 2012 B2
8139896 Ahiska Mar 2012 B1
8144118 Hildreth Mar 2012 B2
8144148 El Dokor Mar 2012 B2
8150063 Chen Apr 2012 B2
8150498 Gielen et al. Apr 2012 B2
8160304 Rhoads Apr 2012 B2
8165844 Luinge et al. Apr 2012 B2
8167802 Baba et al. May 2012 B2
8172573 Sonenfeld et al. May 2012 B2
8175332 Herrington May 2012 B2
8179604 Prada Gomez et al. May 2012 B1
8180428 Kaiser et al. May 2012 B2
8180432 Sayeh May 2012 B2
8187097 Zhang May 2012 B1
8189869 Bell May 2012 B2
8189889 Pearlstein et al. May 2012 B2
8189926 Sharma May 2012 B2
8190233 Dempsey May 2012 B2
8191359 White et al. Jun 2012 B2
8194134 Furukawa Jun 2012 B2
8195084 Xiao Jun 2012 B2
8199983 Qureshi Jun 2012 B2
8206219 Shum Jun 2012 B2
8207967 El Dokor Jun 2012 B1
8208758 Wang Jun 2012 B2
8213693 Li Jul 2012 B1
8214012 Zuccolotto et al. Jul 2012 B2
8214016 Lavallee et al. Jul 2012 B2
8216016 Yamagishi et al. Jul 2012 B2
8218818 Cobb Jul 2012 B2
8218819 Cobb Jul 2012 B2
8218825 Gordon Jul 2012 B2
8221399 Amano Jul 2012 B2
8223147 El Dokor Jul 2012 B1
8224423 Faul Jul 2012 B2
8226574 Whillock Jul 2012 B2
8229163 Coleman Jul 2012 B2
8229166 Teng Jul 2012 B2
8229184 Benkley Jul 2012 B2
8232872 Zeng Jul 2012 B2
8235529 Raffle Aug 2012 B1
8235530 Maad Aug 2012 B2
8241125 Hughes Aug 2012 B2
8243136 Aota Aug 2012 B2
8243269 Matousek Aug 2012 B2
8243996 Steinberg Aug 2012 B2
8248372 Saila Aug 2012 B2
8249691 Chase et al. Aug 2012 B2
8253770 Kurtz Aug 2012 B2
8253774 Huitema Aug 2012 B2
8253778 Atsushi Aug 2012 B2
8259109 El Dokor Sep 2012 B2
8260036 Hamza et al. Sep 2012 B2
8279288 Son Oct 2012 B2
8284157 Markovic Oct 2012 B2
8284847 Adermann Oct 2012 B2
8287373 Marks et al. Oct 2012 B2
8289390 Aggarwal Oct 2012 B2
8289392 Senior et al. Oct 2012 B2
8290208 Kurtz Oct 2012 B2
8290229 Qureshi Oct 2012 B2
8295573 Bredno et al. Oct 2012 B2
8301226 Csavoy et al. Oct 2012 B2
8306260 Zhu Nov 2012 B2
8306267 Gossweiler, III Nov 2012 B1
8306274 Grycewicz Nov 2012 B2
8306663 Wickham Nov 2012 B2
8310656 Zalewski Nov 2012 B2
8310662 Mehr Nov 2012 B2
8311611 Csavoy et al. Nov 2012 B2
8314854 Yoon Nov 2012 B2
8315691 Sumanaweera et al. Nov 2012 B2
8316324 Boillot Nov 2012 B2
8320621 McEldowney Nov 2012 B2
8320709 Arartani et al. Nov 2012 B2
8323106 Zalewski Dec 2012 B2
8325228 Mariadoss Dec 2012 B2
8330811 Francis J Maguire, Jr. Dec 2012 B2
8330812 Maguire, Jr. Dec 2012 B2
8331019 Cheong Dec 2012 B2
8334900 Qu et al. Dec 2012 B2
8339282 Noble Dec 2012 B2
8351651 Lee Jan 2013 B2
8368586 Mohamadi Feb 2013 B2
8369574 Hu Feb 2013 B2
8374393 Cobb Feb 2013 B2
8374411 Ernst Feb 2013 B2
8374674 Gertner Feb 2013 B2
8376226 Dennard Feb 2013 B2
8376827 Cammegh Feb 2013 B2
8379927 Taylor Feb 2013 B2
8380284 Saranathan et al. Feb 2013 B2
8386011 Wieczorek Feb 2013 B2
8390291 Macfarlane et al. Mar 2013 B2
8390729 Long Mar 2013 B2
8395620 El Dokor Mar 2013 B2
8396654 Simmons et al. Mar 2013 B1
8400398 Schoen Mar 2013 B2
8400490 Apostolopoulos Mar 2013 B2
8405491 Fong Mar 2013 B2
8405656 El Dokor Mar 2013 B2
8405717 Kim Mar 2013 B2
8406845 Komistek et al. Mar 2013 B2
8411931 Zhou Apr 2013 B2
8427538 Ahiska Apr 2013 B2
8428319 Tsin et al. Apr 2013 B2
8571293 Ernst Oct 2013 B2
8600213 Mestha et al. Dec 2013 B2
8615127 Fitzpatrick Dec 2013 B2
8617081 Mestha et al. Dec 2013 B2
8744154 Van Den Brink Jun 2014 B2
8747382 D'Souza Jun 2014 B2
8768438 Mestha et al. Jul 2014 B2
8788020 Mostafavi et al. Jul 2014 B2
8790269 Xu et al. Jul 2014 B2
8792969 Bernal et al. Jul 2014 B2
8805019 Jeanne et al. Aug 2014 B2
8848977 Bammer et al. Sep 2014 B2
8855384 Kyal et al. Oct 2014 B2
8862420 Ferran et al. Oct 2014 B2
8873812 Larlus-Larrondo et al. Oct 2014 B2
8953847 Moden Feb 2015 B2
8971985 Bernal et al. Mar 2015 B2
8977347 Mestha et al. Mar 2015 B2
8995754 Wu et al. Mar 2015 B2
8996094 Schouenborg et al. Mar 2015 B2
9020185 Mestha et al. Apr 2015 B2
9036877 Kyal et al. May 2015 B2
9076212 Ernst et al. Jul 2015 B2
9082177 Sebok Jul 2015 B2
9084629 Rosa Jul 2015 B1
9103897 Herbst et al. Aug 2015 B2
9138175 Ernst Sep 2015 B2
9173715 Baumgartner Nov 2015 B2
9176932 Baggen et al. Nov 2015 B2
9194929 Siegert et al. Nov 2015 B2
9226691 Bernal et al. Jan 2016 B2
9305365 Lovberg et al. Apr 2016 B2
9318012 Johnson Apr 2016 B2
9336594 Kyal et al. May 2016 B2
9395386 Corder et al. Jul 2016 B2
9433386 Mestha et al. Sep 2016 B2
9436277 Furst et al. Sep 2016 B2
9443289 Xu et al. Sep 2016 B2
9451926 Kinahan et al. Sep 2016 B2
9453898 Nielsen et al. Sep 2016 B2
9504426 Kyal et al. Nov 2016 B2
9606209 Ernst et al. Mar 2017 B2
9607377 Lovberg et al. Mar 2017 B2
9629595 Walker et al. Apr 2017 B2
9693710 Mestha et al. Jul 2017 B2
9734589 Yu et al. Aug 2017 B2
9779502 Lovberg et al. Oct 2017 B1
9785247 Horowitz et al. Oct 2017 B1
9943247 Ernst et al. Apr 2018 B2
10327708 Yu et al. Jun 2019 B2
10339654 Lovberg et al. Jul 2019 B2
10438349 Yu et al. Oct 2019 B2
10542913 Olesen Jan 2020 B2
10716515 Gustafsson et al. Jul 2020 B2
20020082496 Kuth Jun 2002 A1
20020087101 Barrick et al. Jul 2002 A1
20020091422 Greenberg et al. Jul 2002 A1
20020115931 Strauss et al. Aug 2002 A1
20020118373 Eviatar et al. Aug 2002 A1
20020180436 Dale et al. Dec 2002 A1
20020188194 Cosman Dec 2002 A1
20030063292 Mostafavi Apr 2003 A1
20030088177 Totterman et al. May 2003 A1
20030116166 Anthony Jun 2003 A1
20030130574 Stoyle Jul 2003 A1
20030195526 Vilsmeier Oct 2003 A1
20040071324 Norris et al. Apr 2004 A1
20040116804 Mostafavi Jun 2004 A1
20040140804 Polzin et al. Jul 2004 A1
20040171927 Lowen et al. Sep 2004 A1
20050027194 Adler et al. Feb 2005 A1
20050054910 Tremblay et al. Mar 2005 A1
20050070784 Komura et al. Mar 2005 A1
20050105772 Voronka et al. May 2005 A1
20050107685 Seeber May 2005 A1
20050137475 Dold et al. Jun 2005 A1
20050148845 Dean et al. Jul 2005 A1
20050148854 Ito et al. Jul 2005 A1
20050265516 Haider Dec 2005 A1
20050283068 Zuccoloto et al. Dec 2005 A1
20060004281 Saracen Jan 2006 A1
20060045310 Tu et al. Mar 2006 A1
20060074292 Thomson et al. Apr 2006 A1
20060241405 Leitner et al. Oct 2006 A1
20070049794 Glassenberg et al. Mar 2007 A1
20070093709 Abernathie Apr 2007 A1
20070189386 Imagawa et al. Aug 2007 A1
20070206836 Yoon Sep 2007 A1
20070239169 Plaskos et al. Oct 2007 A1
20070276224 Lang Nov 2007 A1
20070280508 Ernst et al. Dec 2007 A1
20080039713 Thomson et al. Feb 2008 A1
20080129290 Yao Jun 2008 A1
20080181358 Van Kampen et al. Jul 2008 A1
20080183074 Carls et al. Jul 2008 A1
20080208012 Ali Aug 2008 A1
20080212835 Tavor Sep 2008 A1
20080221442 Tolowsky et al. Sep 2008 A1
20080221520 Nagel et al. Sep 2008 A1
20080273754 Hick et al. Nov 2008 A1
20080287728 Mostafavi Nov 2008 A1
20080287780 Chase et al. Nov 2008 A1
20080317313 Goddard et al. Dec 2008 A1
20090028411 Pfeuffer Jan 2009 A1
20090041200 Lu et al. Feb 2009 A1
20090052760 Smith et al. Feb 2009 A1
20090116719 Jaffray et al. May 2009 A1
20090185663 Gaines, Jr. et al. Jul 2009 A1
20090187112 Meir et al. Jul 2009 A1
20090209846 Bammer Aug 2009 A1
20090253985 Shachar et al. Oct 2009 A1
20090304297 Adabala et al. Dec 2009 A1
20090306499 Van Vorhis et al. Dec 2009 A1
20100054579 Okutomi Mar 2010 A1
20100057059 Makino Mar 2010 A1
20100059679 Albrecht Mar 2010 A1
20100069742 Partain et al. Mar 2010 A1
20100091089 Cromwell et al. Apr 2010 A1
20100099981 Fishel Apr 2010 A1
20100125191 Sahin May 2010 A1
20100137709 Gardner et al. Jun 2010 A1
20100148774 Kamata Jun 2010 A1
20100149099 Elias Jun 2010 A1
20100149315 Qu Jun 2010 A1
20100160775 Pankratov Jun 2010 A1
20100164862 Sullivan Jul 2010 A1
20100165293 Tanassi et al. Jul 2010 A1
20100167246 Ghajar Jul 2010 A1
20100172567 Prokoski Jul 2010 A1
20100177929 Kurtz Jul 2010 A1
20100178966 Suydoux Jul 2010 A1
20100179390 Davis Jul 2010 A1
20100179413 Kadour et al. Jul 2010 A1
20100183196 Fu et al. Jul 2010 A1
20100191631 Weidmann Jul 2010 A1
20100194879 Pasveer Aug 2010 A1
20100198067 Mahfouz Aug 2010 A1
20100198101 Song Aug 2010 A1
20100198112 Maad Aug 2010 A1
20100199232 Mistry Aug 2010 A1
20100210350 Walker Aug 2010 A9
20100214267 Radivojevic Aug 2010 A1
20100231511 Henty Sep 2010 A1
20100231692 Perlman Sep 2010 A1
20100245536 Huitema Sep 2010 A1
20100245593 Kim Sep 2010 A1
20100251924 Taylor Oct 2010 A1
20100253762 Cheong Oct 2010 A1
20100268072 Hall et al. Oct 2010 A1
20100277571 Xu Nov 2010 A1
20100282902 Rajasingham Nov 2010 A1
20100283833 Yeh Nov 2010 A1
20100284119 Coakley Nov 2010 A1
20100289899 Hendron Nov 2010 A1
20100290668 Friedman Nov 2010 A1
20100292841 Wickham Nov 2010 A1
20100295718 Mohamadi Nov 2010 A1
20100296701 Hu Nov 2010 A1
20100302142 French Dec 2010 A1
20100303289 Polzin Dec 2010 A1
20100311512 Lock Dec 2010 A1
20100321505 Kokubun Dec 2010 A1
20100328055 Fong Dec 2010 A1
20100328201 Marbit Dec 2010 A1
20100328267 Chen Dec 2010 A1
20100330912 Saila Dec 2010 A1
20110001699 Jacobsen Jan 2011 A1
20110006991 Elias Jan 2011 A1
20110007939 Teng Jan 2011 A1
20110007946 Liang Jan 2011 A1
20110008759 Usui Jan 2011 A1
20110015521 Faul Jan 2011 A1
20110019001 Rhoads Jan 2011 A1
20110025853 Richardson Feb 2011 A1
20110038520 Yui Feb 2011 A1
20110043631 Marman Feb 2011 A1
20110043759 Bushinsky Feb 2011 A1
20110050562 Schoen Mar 2011 A1
20110050569 Marvit Mar 2011 A1
20110050947 Marman Mar 2011 A1
20110052002 Cobb Mar 2011 A1
20110052003 Cobb Mar 2011 A1
20110052015 Saund Mar 2011 A1
20110054870 Dariush Mar 2011 A1
20110057816 Noble Mar 2011 A1
20110058020 Dieckmann Mar 2011 A1
20110064290 Punithakaumar Mar 2011 A1
20110069207 Steinberg Mar 2011 A1
20110074675 Shiming Mar 2011 A1
20110081000 Gertner Apr 2011 A1
20110081043 Sabol Apr 2011 A1
20110085704 Han Apr 2011 A1
20110087091 Olson Apr 2011 A1
20110092781 Gertner Apr 2011 A1
20110102549 Takahashi May 2011 A1
20110105883 Lake et al. May 2011 A1
20110105893 Akins et al. May 2011 A1
20110115793 Grycewicz May 2011 A1
20110115892 Fan May 2011 A1
20110116683 Kramer et al. May 2011 A1
20110117528 Marciello et al. May 2011 A1
20110118032 Zalewski May 2011 A1
20110133917 Zeng Jun 2011 A1
20110142411 Camp Jun 2011 A1
20110150271 Lee Jun 2011 A1
20110157168 Bennett Jun 2011 A1
20110157358 Bell Jun 2011 A1
20110157370 Livesey Jun 2011 A1
20110160569 Cohen et al. Jun 2011 A1
20110172060 Morales Jul 2011 A1
20110172521 Zdeblick et al. Jul 2011 A1
20110175801 Markovic Jul 2011 A1
20110175809 Markovic Jul 2011 A1
20110175810 Markovic Jul 2011 A1
20110176723 Ali et al. Jul 2011 A1
20110180695 Li Jul 2011 A1
20110181893 MacFarlane Jul 2011 A1
20110182472 Hansen Jul 2011 A1
20110187640 Jacobsen Aug 2011 A1
20110193939 Vassigh Aug 2011 A1
20110199461 Horio Aug 2011 A1
20110201916 Duyn et al. Aug 2011 A1
20110201939 Hubschman et al. Aug 2011 A1
20110202306 Eng Aug 2011 A1
20110205358 Aota Aug 2011 A1
20110207089 Lagettie Aug 2011 A1
20110208437 Teicher Aug 2011 A1
20110216002 Weising Sep 2011 A1
20110216180 Pasini Sep 2011 A1
20110221770 Kruglick Sep 2011 A1
20110229862 Parikh Sep 2011 A1
20110230755 MacFarlane et al. Sep 2011 A1
20110234807 Jones Sep 2011 A1
20110234834 Sugimoto Sep 2011 A1
20110235855 Smith Sep 2011 A1
20110237933 Cohen Sep 2011 A1
20110242134 Miller Oct 2011 A1
20110244939 Cammegh Oct 2011 A1
20110250929 Lin Oct 2011 A1
20110251478 Wieczorek Oct 2011 A1
20110255845 Kikuchi Oct 2011 A1
20110257566 Burdea Oct 2011 A1
20110260965 Kim Oct 2011 A1
20110262002 Lee Oct 2011 A1
20110267427 Goh Nov 2011 A1
20110267456 Adermann Nov 2011 A1
20110275957 Bhandari Nov 2011 A1
20110276396 Rathod Nov 2011 A1
20110279663 Fan Nov 2011 A1
20110285622 Marti Nov 2011 A1
20110286010 Kusik et al. Nov 2011 A1
20110291925 Isarel Dec 2011 A1
20110293143 Narayanan et al. Dec 2011 A1
20110293146 Grycewicz Dec 2011 A1
20110298708 Hsu Dec 2011 A1
20110298824 Lee Dec 2011 A1
20110300994 Verkaaik Dec 2011 A1
20110301449 Maurer, Jr. Dec 2011 A1
20110301934 Tardis Dec 2011 A1
20110303214 Welle Dec 2011 A1
20110304541 Dalal Dec 2011 A1
20110304650 Canpillo Dec 2011 A1
20110304706 Border et al. Dec 2011 A1
20110306867 Gopinadhan Dec 2011 A1
20110310220 McEldowney Dec 2011 A1
20110310226 McEldowney Dec 2011 A1
20110316994 Lemchen Dec 2011 A1
20110317877 Bell Dec 2011 A1
20120002112 Huang Jan 2012 A1
20120004791 Buelthoff Jan 2012 A1
20120007839 Tsao et al. Jan 2012 A1
20120019645 Maltz Jan 2012 A1
20120020524 Ishikawa Jan 2012 A1
20120021806 Maltz Jan 2012 A1
20120027226 Desenberg Feb 2012 A1
20120029345 Mahfouz et al. Feb 2012 A1
20120032882 Schlachta Feb 2012 A1
20120033083 Horvinger Feb 2012 A1
20120035462 Maurer, Jr. et al. Feb 2012 A1
20120039505 Bastide et al. Feb 2012 A1
20120044363 Lu Feb 2012 A1
20120045091 Kaganovich Feb 2012 A1
20120049453 Morichau-Beauchant et al. Mar 2012 A1
20120051588 McEldowney Mar 2012 A1
20120051664 Gopalakrishnan et al. Mar 2012 A1
20120052949 Weitzner Mar 2012 A1
20120056982 Katz Mar 2012 A1
20120057640 Shi Mar 2012 A1
20120065492 Gertner et al. Mar 2012 A1
20120065494 Gertner et al. Mar 2012 A1
20120072041 Miller Mar 2012 A1
20120075166 Marti Mar 2012 A1
20120075177 Jacobsen Mar 2012 A1
20120076369 Abramovich Mar 2012 A1
20120081504 Ng Apr 2012 A1
20120083314 Ng Apr 2012 A1
20120083960 Zhu Apr 2012 A1
20120086778 Lee Apr 2012 A1
20120086809 Lee Apr 2012 A1
20120092445 McDowell Apr 2012 A1
20120092502 Knasel Apr 2012 A1
20120093481 McDowell Apr 2012 A1
20120098938 Jin Apr 2012 A1
20120101388 Tripathi Apr 2012 A1
20120105573 Apostolopoulos May 2012 A1
20120106814 Gleason et al. May 2012 A1
20120108909 Slobounov et al. May 2012 A1
20120113140 Hilliges May 2012 A1
20120113223 Hilliges May 2012 A1
20120116202 Bangera May 2012 A1
20120119999 Harris May 2012 A1
20120120072 Se May 2012 A1
20120120237 Trepess May 2012 A1
20120120243 Chien May 2012 A1
20120120277 Tsai May 2012 A1
20120121124 Bammer May 2012 A1
20120124604 Small May 2012 A1
20120127319 Rao May 2012 A1
20120133616 Nishihara May 2012 A1
20120133889 Bergt May 2012 A1
20120143029 Silverstein Jun 2012 A1
20120143212 Madhani Jun 2012 A1
20120147167 Mason Jun 2012 A1
20120154272 Hildreth Jun 2012 A1
20120154511 Hsu Jun 2012 A1
20120154536 Stoker Jun 2012 A1
20120154579 Hanpapur Jun 2012 A1
20120156661 Smith Jun 2012 A1
20120158197 Hinman Jun 2012 A1
20120162378 El Dokor et al. Jun 2012 A1
20120165964 Flaks Jun 2012 A1
20120167143 Longet Jun 2012 A1
20120169841 Chemali Jul 2012 A1
20120176314 Jeon Jul 2012 A1
20120184371 Shum Jul 2012 A1
20120188237 Han Jul 2012 A1
20120188371 Chen Jul 2012 A1
20120194422 El Dokor Aug 2012 A1
20120194517 Izadi et al. Aug 2012 A1
20120194561 Grossinger Aug 2012 A1
20120195466 Teng Aug 2012 A1
20120196660 El Dokor et al. Aug 2012 A1
20120197135 Slatkine Aug 2012 A1
20120200676 Huitema Aug 2012 A1
20120201428 Joshi et al. Aug 2012 A1
20120206604 Jones Aug 2012 A1
20120212594 Barns Aug 2012 A1
20120218407 Chien Aug 2012 A1
20120218421 Chien Aug 2012 A1
20120220233 Teague Aug 2012 A1
20120224666 Speller Sep 2012 A1
20120224743 Rodriguez Sep 2012 A1
20120225718 Zhang Sep 2012 A1
20120229643 Chidanand Sep 2012 A1
20120229651 Takizawa Sep 2012 A1
20120230561 Qureshi Sep 2012 A1
20120235896 Jacobsen Sep 2012 A1
20120238337 French Sep 2012 A1
20120238864 Piferi et al. Sep 2012 A1
20120242816 Cruz Sep 2012 A1
20120249741 Maciocci Oct 2012 A1
20120253201 Reinhold Oct 2012 A1
20120253241 Levital et al. Oct 2012 A1
20120262540 Rondinelli Oct 2012 A1
20120262558 Boger Oct 2012 A1
20120262583 Bernal Oct 2012 A1
20120268124 Herbst et al. Oct 2012 A1
20120275649 Cobb Nov 2012 A1
20120276995 Lansdale Nov 2012 A1
20120277001 Lansdale Nov 2012 A1
20120281093 Fong Nov 2012 A1
20120281873 Brown Nov 2012 A1
20120288142 Gossweiler, III Nov 2012 A1
20120288852 Willson Nov 2012 A1
20120289334 Mikhailov Nov 2012 A9
20120289822 Shachar et al. Nov 2012 A1
20120293412 El Dokor Nov 2012 A1
20120293506 Vertucci Nov 2012 A1
20120293663 Liu Nov 2012 A1
20120294511 Datta Nov 2012 A1
20120300961 Moeller Nov 2012 A1
20120303839 Jackson Nov 2012 A1
20120304126 Lavigne Nov 2012 A1
20120307075 Margalit Dec 2012 A1
20120307207 Abraham Dec 2012 A1
20120314066 Lee Dec 2012 A1
20120315016 Fung Dec 2012 A1
20120319946 El Dokor Dec 2012 A1
20120319989 Argiro Dec 2012 A1
20120320178 Siegert et al. Dec 2012 A1
20120320219 David Dec 2012 A1
20120326966 Rauber Dec 2012 A1
20120326976 Markovic Dec 2012 A1
20120326979 Geisert Dec 2012 A1
20120327241 Howe Dec 2012 A1
20120327246 Senior et al. Dec 2012 A1
20130002866 Hanpapur Jan 2013 A1
20130002879 Weber Jan 2013 A1
20130002900 Gossweiler, III Jan 2013 A1
20130009865 Valik Jan 2013 A1
20130010071 Valik Jan 2013 A1
20130013452 Dennard Jan 2013 A1
20130016009 Godfrey Jan 2013 A1
20130016876 Wooley Jan 2013 A1
20130021434 Ahiska Jan 2013 A1
20130021578 Chen Jan 2013 A1
20130024819 Rieffel Jan 2013 A1
20130030283 Vortman et al. Jan 2013 A1
20130033640 Lee Feb 2013 A1
20130033700 Hallil Feb 2013 A1
20130035590 Ma et al. Feb 2013 A1
20130035612 Mason Feb 2013 A1
20130040720 Cammegh Feb 2013 A1
20130041368 Cunninghan Feb 2013 A1
20130049756 Ernst et al. Feb 2013 A1
20130053683 Hwang et al. Feb 2013 A1
20130057702 Chavan Mar 2013 A1
20130064426 Watkins, Jr. Mar 2013 A1
20130064427 Picard Mar 2013 A1
20130065517 Svensson Mar 2013 A1
20130066448 Alonso Mar 2013 A1
20130066526 Mondragon Mar 2013 A1
20130069773 Li Mar 2013 A1
20130070201 Shahidi Mar 2013 A1
20130070257 Wong Mar 2013 A1
20130072787 Wallace et al. Mar 2013 A1
20130076863 Rappel Mar 2013 A1
20130076944 Kosaka Mar 2013 A1
20130077823 Mestha Mar 2013 A1
20130079033 Gupta Mar 2013 A1
20130084980 Hammontree Apr 2013 A1
20130088584 Malhas Apr 2013 A1
20130093866 Ohlhues et al. Apr 2013 A1
20130096439 Lee Apr 2013 A1
20130102879 MacLaren et al. Apr 2013 A1
20130102893 Vollmer Apr 2013 A1
20130108979 Daon May 2013 A1
20130113791 Isaacs et al. May 2013 A1
20130211421 Abovitz et al. Aug 2013 A1
20130237811 Mihailescu et al. Sep 2013 A1
20130281818 Vija et al. Oct 2013 A1
20140005527 Nagarkar et al. Jan 2014 A1
20140055563 Jessop Feb 2014 A1
20140073908 Biber Mar 2014 A1
20140088410 Wu Mar 2014 A1
20140133720 Lee et al. May 2014 A1
20140148685 Liu et al. May 2014 A1
20140159721 Grodzki Jun 2014 A1
20140171784 Ooi et al. Jun 2014 A1
20140205140 Lovberg et al. Jul 2014 A1
20140343344 Saunders et al. Nov 2014 A1
20140378816 Oh et al. Dec 2014 A1
20150085072 Yan Mar 2015 A1
20150094597 Mestha et al. Apr 2015 A1
20150094606 Mestha et al. Apr 2015 A1
20150212182 Nielsen et al. Jul 2015 A1
20150245787 Kyal et al. Sep 2015 A1
20150257661 Mestha et al. Sep 2015 A1
20150265187 Bernal et al. Sep 2015 A1
20150265220 Ernst et al. Sep 2015 A1
20150289878 Tal et al. Oct 2015 A1
20150297120 Son et al. Oct 2015 A1
20150297314 Fowler Oct 2015 A1
20150316635 Stehning et al. Nov 2015 A1
20150323637 Beck et al. Nov 2015 A1
20150327948 Schoepp et al. Nov 2015 A1
20150331078 Speck et al. Nov 2015 A1
20150359464 Oleson Dec 2015 A1
20150366527 Yu et al. Dec 2015 A1
20160000383 Lee et al. Jan 2016 A1
20160000411 Raju et al. Jan 2016 A1
20160035108 Yu et al. Feb 2016 A1
20160045112 Weissler et al. Feb 2016 A1
20160073962 Yu et al. Mar 2016 A1
20160091592 Beall et al. Mar 2016 A1
20160166205 Ernst et al. Jun 2016 A1
20160189372 Lovberg et al. Jun 2016 A1
20160198965 Mestha et al. Jul 2016 A1
20160228005 Bammer et al. Aug 2016 A1
20160249984 Janssen Sep 2016 A1
20160256713 Saunders et al. Sep 2016 A1
20160262663 MacLaren et al. Sep 2016 A1
20160287080 Olesen et al. Oct 2016 A1
20160310093 Chen et al. Oct 2016 A1
20160310229 Bammer et al. Oct 2016 A1
20160313432 Feiweier et al. Oct 2016 A1
20170032538 Ernst Feb 2017 A1
20170038449 Voigt et al. Feb 2017 A1
20170143271 Gustafsson et al. May 2017 A1
20170276754 Ernst et al. Sep 2017 A1
20170303859 Robertson et al. Oct 2017 A1
20170319143 Yu et al. Nov 2017 A1
20170345145 Nempont et al. Nov 2017 A1
20180220925 Lauer Aug 2018 A1
20190004282 Park et al. Jan 2019 A1
20190059779 Ernst et al. Feb 2019 A1
20200022654 Yu et al. Jan 2020 A1
20200234434 Yu et al. Jul 2020 A1
Foreign Referenced Citations (45)
Number Date Country
1745717 Mar 2006 CN
1879574 Dec 2006 CN
100563551 Dec 2009 CN
104603835 May 2015 CN
105338897 Feb 2016 CN
105392423 Mar 2016 CN
106572810 Apr 2017 CN
106714681 May 2017 CN
29519078 Mar 1996 DE
102004024470 Dec 2005 DE
0904733 Mar 1991 EP
1319368 Jun 2003 EP
1354564 Oct 2003 EP
01524626 Apr 2005 EP
2515139 Oct 2012 EP
2948056 Dec 2015 EP
2950714 Dec 2015 EP
03023838 May 1991 JP
2015-526708 Sep 2015 JP
WO 9617258 Jun 1996 WO
WO 9938449 Aug 1999 WO
WO 0072039 Nov 2000 WO
WO 03003796 Jan 2003 WO
WO 2004023783 Mar 2004 WO
WO 2005077293 Aug 2005 WO
WO 2007025301 Mar 2007 WO
WO 2007085241 Aug 2007 WO
WO 2007136745 Nov 2007 WO
WO 2009101566 Aug 2009 WO
WO 2009129457 Oct 2009 WO
WO 2010066824 Jun 2010 WO
WO 2011047467 Apr 2011 WO
WO 2011113441 Sep 2011 WO
WO 2012046202 Apr 2012 WO
WO 2013032933 Mar 2013 WO
WO 2014005178 Jan 2014 WO
WO 2014116868 Jul 2014 WO
WO 2014120734 Aug 2014 WO
WO 2015022684 Feb 2015 WO
WO 2015042138 Mar 2015 WO
WO 2015092593 Jun 2015 WO
WO 2015148391 Oct 2015 WO
WO 2016014718 Jan 2016 WO
WO2017091479 Jun 2017 WO
WO2017189427 Nov 2017 WO
Non-Patent Literature Citations (90)
Entry
Armstrong et al., IEEE Publication, May 2002, RGR-3D: Simple, Cheap detection of 6-DOF Pose for Tele-Operation, and Robot Programming and Calibration (pp. 2938-2943). (Year: 2002).
Ashouri, H., L. et al., Unobtrusive Estimation of Cardiac Contractility and Stroke Volume Changes Using Ballistocardiogram Measurements on a High Bandwidth Force Plate, Sensors 2016, 16, 787; doi:10.3390/s16060787.
Communication pursuant to Article 94(3) EPC for application No. 14743670.3, which is an EP application related to the present application, dated Feb. 6, 2018.
Extended European Search Report for application No. 14743670.3, which is an EP application related to the present application, dated Aug. 17, 2017.
Extended European Search Report for application No. 15769296.3, which is an EP application related to the present application, dated Dec. 22, 2017.
Gordon, J. W. Certain molar movements of the human body produced by the circulation of the blood. J. Anat. Physiol. 11, 533-536 (1877).
Herbst et al., “Reproduction of Motion Artifacts for Performance Analysis of Prospective Motion Correction in MRI”, Magnetic Resonance in Medicine., vol. 71, No. 1, p. 182-190 (Feb. 25, 2013).
Horn, Berthold K. P., “Closed-form solution of absolute orientation using unit quaternions”, Journal of the Optical Society of America, vol. 4, p. 629-642 (Apr. 1987).
Kim, Chang-Sei et al. “Ballistocardiogram: Mechanism and Potential for Unobtrusive Cardiovascular Health Monitoring”, Scientific Reports, Aug. 9, 2016.
Maclaren et al., “Prospective Motion Correction in Brain Imaging: A Review”, Online Magnetic Resonance in Medicine, vol. 69, No. 3, pp. 621-636 (Mar. 1, 2013).
Tarvainen, M.P. et al., “An advanced de-trending method with application to HRV analysis,” IEEE Trans. Biomed. Eng., vol. 49, No. 2, pp. 172-175, Feb. 2002.
Aksoy et al., “Hybrid Prospective and Retrospective Head Motion Correction to Mitigate Cross-Calibration Errors”, NIH Publication, Nov. 2012.
Aksoy et al., “Real-Time Optical Motion Correction for Diffusion Tensor Imaging, Magnetic Resonance in Medicine” (Mar. 22, 2011) 66 366-378.
Andrews et al., “Prospective Motion Correction for Magnetic Resonance Spectroscopy Using Single Camera Retro-Grate Reflector Optical Tracking, Journal of Magnetic Resonance Imaging” (Feb. 2011) 33(2): 498-504.
Angeles et al., “The Online Solution of the Hand-Eye Problem”, IEEE Transactions on Robotics and Automation, 16(6): 720-731 (Dec. 2000).
Anishenko et al., “A Motion Correction System for Brain Tomography Based on Biologically Motivated Models.” 7th IEEE International Conference on Cybernetic Intelligent Systems, dated Sep. 9, 2008, in 9 pages.
Armstrong et al., RGR-6D: Low-cost, high-accuracy measurement of 6-DOF Pose from a Single Image. Publication date unknown.
Armstrong et al., “RGR-3D: Simple, cheap detection of 6-DOF pose for tele-operation, and robot programming and calibration”, In Proc. 2002 Int. Conf. on Robotics and Automation, IEEE, Washington (May 2002).
Bandettini, Peter A., et al., "Processing Strategies for Time-Course Data Sets in Functional MRI of the Human Brain", Magnetic Resonance in Medicine 30: 161-173 (1993).
Barmet et al., "Spatiotemporal Magnetic Field Monitoring for MR", Magnetic Resonance in Medicine (Feb. 1, 2008) 60: 187-197.
Bartels, LW, et al., “Endovascular interventional magnetic resonance imaging”, Physics in Medicine and Biology 48: R37-R64 (2003).
Benchoff, Brian, “Extremely Precise Positional Tracking”, https://hackaday.com/2013/10/10/extremely-precise-positional-tracking/, printed on Sep. 16, 2017, in 7 pages.
Carranza-Herrezuelo et al., "Motion estimation of tagged cardiac magnetic resonance images using variational techniques", Elsevier, Computerized Medical Imaging and Graphics 34 (2010), pp. 514-522.
Chou, Jack C. K., et al., “Finding the Position and Orientation of a Sensor on a Robot Manipulator Using Quaternions”, The International Journal of Robotics Research, 10(3): 240-254 (Jun. 1991).
Cofaru et al., "Improved Newton-Raphson digital image correlation method for full-field displacement and strain calculation", Department of Materials Science and Engineering, Ghent University, St-Pietersnieuwstraat, Nov. 20, 2010.
Ernst et al., "A Novel Phase and Frequency Navigator for Proton Magnetic Resonance Spectroscopy Using Water-Suppression Cycling", Magnetic Resonance in Medicine (Jan. 2011) 65(1): 13-7.
Eviatar et al., “Real time head motion correction for functional MRI”, In: Proceedings of the International Society for Magnetic Resonance in Medicine (1999) 269.
Forbes, Kristen P. N., et al., “Propeller MRI: Clinical Testing of a Novel Technique for Quantification and Compensation of Head Motion”, Journal of Magnetic Resonance Imaging 14: 215-222 (2001).
Fulton et al., “Correction for Head Movements in Positron Emission Tomography Using an Optical Motion-Tracking System”, IEEE Transactions on Nuclear Science, vol. 49(1):116-123 (Feb. 2002).
Glover, Gary H., et al., “Self-Navigated Spiral fMRI: Interleaved versus Single-shot”, Magnetic Resonance in Medicine 39: 361-368 (1998).
Gumus et al., “Elimination of DWI signal dropouts using blipped gradients for dynamic restoration of gradient moment”, ISMRM 20th Annual Meeting & Exhibition, May 7, 2012.
Herbst et al., "Preventing Signal Dropouts in DWI Using Continuous Prospective Motion Correction", Proc. Intl. Soc. Mag. Reson. Med. 19 (May 2011) 170.
Herbst et al., "Prospective Motion Correction With Continuous Gradient Updates in Diffusion Weighted Imaging", Magnetic Resonance in Medicine (2012) 67:326-338.
Hoff et al., “Analysis of Head Pose Accuracy in Augmented Reality”, IEEE Transactions on Visualization and Computer Graphics 6, No. 4 (Oct.-Dec. 2000): 319-334.
International Preliminary Report on Patentability for Application No. PCT/US2015/022041, dated Oct. 6, 2016, in 8 pages.
International Preliminary Report on Patentability for Application No. PCT/US2007/011899, dated Jun. 8, 2008, in 13 pages.
International Search Report and Written Opinion for Application No. PCT/US2007/011899, dated Nov. 14, 2007.
International Search Report and Written Opinion for Application No. PCT/US2014/012806, dated May 15, 2014, in 15 pages.
International Search Report and Written Opinion for Application No. PCT/US2015/041615, dated Oct. 29, 2015, in 13 pages.
International Preliminary Report on Patentability for Application No. PCT/US2014/013546, dated Aug. 4, 2015, in 9 pages.
International Search Report and Written Opinion for Application No. PCT/US2015/022041, dated Jun. 29, 2015, in 9 pages.
Jochen Triesch, et al., "Democratic Integration: Self-Organized Integration of Adaptive Cues", Neural Computation, vol. 13, No. 9, dated Sep. 1, 2001, pp. 2049-2074.
Josefsson et al. “A flexible high-precision video system for digital recording of motor acts through lightweight reflect markers”, Computer Methods and Programs in Biomedicine, vol. 49:111-129 (1996).
Katsuki, et al., “Design of an Artificial Mark to Determine 3D Pose by Monocular Vision”, 2003 IEEE International Conference on Robotics and Automation (Cat. No. 03CH37422), Sep. 14-19, 2003, pp. 995-1000 vol. 1.
Kiebel et al., “MRI and PET coregistration-a cross validation of statistical parametric mapping and automated image registration”, Neuroimage 5(4):271-279 (1997).
Kiruluta et al., “Predictive Head Movement Tracking Using a Kalman Filter”, IEEE Trans. on Systems, Man, and Cybernetics—Part B: Cybernetics, 27(2):326-331 (Apr. 1997).
Lerner, "Motion correction in fmri images", Technion-Israel Institute of Technology, Faculty of Computer Science (Feb. 2006).
Maclaren et al., “Combined Prospective and Retrospective Motion Correction to Relax Navigator Requirements”, Magnetic Resonance in Medicine (Feb. 11, 2011) 65:1724-1732.
MacLaren et al., “Navigator Accuracy Requirements for Prospective Motion Correction”, Magnetic Resonance in Medicine (Jan. 2010) 63(1): 162-70.
MacLaren, “Prospective Motion Correction in MRI Using Optical Tracking Tape”, Book of Abstracts, ESMRMB (2009).
Maclaren et al., “Measurement and correction of microscopic head motion during magnetic resonance imaging of the brain”, Plos One, vol. 7(11):1-9 (2012).
McVeigh et al., “Real-time, Interactive MRI for Cardiovascular Interventions”, Academic Radiology, 12(9): 1121-1127 (2005).
Nehrke et al., “Prospective Correction of Affine Motion for Arbitrary MR Sequences on a Clinical Scanner”, Magnetic Resonance in Medicine (Jun. 28, 2005) 54:1130-1138.
Norris et al., “Online motion correction for diffusion-weighted imaging using navigator echoes: application to Rare imaging without sensitivity loss”, Magnetic Resonance in Medicine, vol. 45:729-733 (2001).
Olesen et al., “Structured Light 3D Tracking System for Measuring Motions in Pet Brain Imaging”, Proceedings of SPIE, the International Society for Optical Engineering (ISSN: 0277-786X), vol. 7625:76250X (2010).
Olesen et al., "Motion Tracking in Narrow Spaces: A Structured Light Approach", Lecture Notes in Computer Science (ISSN: 0302-9743), vol. 6363:253-260 (2010).
Olesen et al., “Motion Tracking for Medical Imaging: A Nonvisible Structured Light Tracking Approach”, IEEE Transactions on Medical Imaging, vol. 31(1), Jan. 2012.
Ooi et al., “Prospective Real-Time Correction for Arbitrary Head Motion Using Active Markers”, Magnetic Resonance in Medicine (Apr. 15, 2009) 62(4): 943-54.
Orchard et al., “MRI Reconstruction using real-time motion tracking: A simulation study”, Signals, Systems and Computers, 42nd Annual Conference IEEE, Piscataway, NJ, USA (Oct. 26, 2008).
Park, Frank C. and Martin, Bryan J., “Robot Sensor Calibration: Solving AX-XB on the Euclidean Group”, IEEE Transaction on Robotics and Automation, 10(5): 717-721 (Oct. 1994).
PCT Search Report from the International Searching Authority, dated February 28, 2013, in 16 pages, regarding International Application No. PCT/US2012/052349.
Qin et al., “Prospective Head-Movement Correction for High-Resolution MRI Using an In-Bore Optical Tracking System”, Magnetic Resonance in Medicine (Apr. 13, 2009) 62: 924-934.
Schulz et al., “First Embedded In-Bore System for Fast Optical Prospective Head Motion-Correction in MRI”, Proceedings of the 28th Annual Scientific Meeting of the ESMRMB (Oct. 8, 2011) 369.
Shiu et al., “Calibration of Wrist-Mounted Robotic Sensors by Solving Homogeneous Transform Equations of the Form AX=XB”, IEEE Transactions on Robotics and Automation, 5(1): 16-29 (Feb. 1989).
Speck, et al., “Prospective real-time slice-by-slice Motion Correction for fMRI in Freely Moving Subjects”, Magnetic Resonance Materials in Physics, Biology and Medicine., 19(2), 55-61, published May 9, 2006.
Tremblay et al., “Retrospective Coregistration of Functional Magnetic Resonance Imaging Data using External monitoring”, Magnetic Resonance in Medicine 53:141-149 (2005).
Tsai et al., “A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration”, IEEE Transaction on Robotics and Automation, 5(3): 345-358 (Jun. 1989).
Wang, Ching-Cheng, “Extrinsic Calibration of a Vision Sensor Mounted on a Robot”, IEEE Transactions on Robotics and Automation, 8(2):161-175 (Apr. 1992).
Ward et al., “Prospective Multiaxial Motion Correction for fMRI”, Magnetic Resonance in Medicine 43:459-469 (2000).
Welch et al., "Spherical Navigator Echoes for Full 3D Rigid Body Motion Measurement in MRI", Magnetic Resonance in Medicine 47:32-41 (2002).
Wilm et al., “Accurate and Simple Calibration of DLP Projector Systems”, Proceedings of SPIE, the International Society for Optical Engineering (ISSN: 0277-786X), vol. 8979 (2014).
Wilm et al., “Correction of Motion Artifacts for Real-Time Structured Light”, R.R. Paulsen and K.S. Pedersen (Eds.): SCIA 2015, LNCS 9127, pp. 142-151 (2015).
Yeo, et al., "Motion correction in fMRI by mapping slice-to-volume with concurrent field-inhomogeneity correction", International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 752-760 (2004).
Zaitsev, M., et al., "Prospective Real-Time Slice-by-Slice 3D Motion Correction for EPI Using an External Optical Motion Tracking System", Proc. Intl. Soc. Mag. Reson. Med. 11:517 (2004).
Zaitsev et al., "Magnetic resonance imaging of freely moving objects: Prospective real-time motion correction using an external optical motion tracking system", NeuroImage 31 (Jan. 29, 2006) 1038-1050.
European Examination Report for application No. 15202598.7 dated Nov. 12, 2018.
European Examination Report for application No. 12826869.5 dated Mar. 4, 2019.
Extended European Search Report for application No. 15824707.2, which is an EP application related to the present application, dated Apr. 16, 2018.
Extended European Search Report for application No. 16869116.0, which is an EP application related to the present application, dated Aug. 2, 2019.
Fodor et al., Aesthetic Applications of Intense Pulsed Light, DOI: 10.1007/978-1-84996-4562_2, © Springer-Verlag London Limited 2011.
Gaul, Scott, Quiet Mind Cafe, https://www.youtube.com/watch?v=7wFX9Wn70eM.
https://www.innoveremedical.com/.
International Search Report and Written Opinion for Application No. PCT/US2019/013147, dated Apr. 29, 2019 in 10 pages.
International Search Report and Written Opinion for Application No. PCT/US2019/020593 dated Jun. 12, 2019 in 12 pages.
Ming-Zher Poh, D. J. McDuff, and R. W. Picard, "Advancements in Noncontact, Multiparameter Physiological Measurements Using a Webcam", IEEE Transactions on Biomedical Engineering, vol. 58, No. 1, Jan. 2011.
Rostaminia, A. Mayberry, D. Ganesan, B. Marlin, and J. Gummeson, “Low-power Sensing of Fatigue and Drowsiness Measures on a Computational Eyeglass”, Proc ACM Interact Mob Wearable Ubiquitous Technol.; 1(2): 23; doi: 10.1145/3090088, Jun. 2017.
Supplementary European Search Report for application No. 17790186.5, which is an EP application related to the present application, dated Nov. 4, 2019.
van Gemert MJ, Welch AJ. Time constants in thermal laser medicine. Lasers Surg Med. 1989;9(4):405-421.
Wallace et al., Head motion measurement and correction using FID navigators, Magnetic Resonance in Medicine, 2019;81:258-274.
Dold et al., “Advantages and Limitations of Prospective Head Motion Compensation for MRI Using an Optical Motion Tracking Device”, Academic Radiology, vol. 13(9):1093-1103 (2006).
Related Publications (1)
Number Date Country
20180249927 A1 Sep 2018 US
Provisional Applications (1)
Number Date Country
60802216 May 2006 US
Continuations (6)
Number Date Country
Parent 14828299 Aug 2015 US
Child 15837240 US
Parent 14698350 Apr 2015 US
Child 14828299 US
Parent 14034252 Sep 2013 US
Child 14698350 US
Parent 13735907 Jan 2013 US
Child 14034252 US
Parent 13338166 Dec 2011 US
Child 13735907 US
Parent 11804417 May 2007 US
Child 13338166 US