Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan

Information

  • Patent Grant
  • Patent Number
    10,660,541
  • Date Filed
    Tuesday, March 13, 2018
  • Date Issued
    Tuesday, May 26, 2020
Abstract
The systems, methods, and devices described herein generally relate to achieving accurate and robust motion correction by detecting and accounting for false movements in motion correction systems used in conjunction with medical imaging and/or therapeutic systems. In other words, some embodiments of the systems, methods, and devices described herein can be configured to detect false movements for motion correction during a medical imaging scan and/or therapeutic procedure, and thereby ensure that such false movements are not accounted for in the motion correction process. Upon detection of false movements, the imaging or therapeutic system can be configured to transiently suppress and/or subsequently repeat acquisitions.
Description
BACKGROUND

The disclosure relates generally to the field of motion tracking, and more specifically to systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan.


There are various modalities for performing medical imaging of patients. For example, magnetic resonance imaging (MRI) is a medical imaging technique used in radiology to visualize internal structures of the body in detail. An MRI scanner is a device in which the patient or a portion of the patient's body is positioned within a powerful magnet where a magnetic field is used to align the magnetization of some atomic nuclei (usually hydrogen nuclei—protons) and radio frequency magnetic fields are applied to systematically alter the alignment of this magnetization. This causes the nuclei to produce a rotating magnetic field detectable by the scanner and this information is recorded to construct an image of the scanned region of the body. These scans typically take several minutes (up to about one hour in some instances), and in prior art devices any movement can degrade or ruin the images and require the scan to be repeated. For example, a scanner can be any medical or biomedical imaging system, such as MRI, CAT, PET, SPECT, nuclear medicine or the like.


Additionally, there are various radiation therapies, proton therapies, and other therapies that can be applied to patients. For example, radiation therapy can be applied to a targeted tissue region. In some systems, radiation therapy can be dynamically applied in response to patient movements. However, in many such systems, the tracking of patient movements does not have a high degree of accuracy. Accordingly, the use of such systems can result in the application of radiation therapy to non-targeted tissue regions, thereby unintentionally harming healthy tissue while intentionally affecting diseased tissue. The foregoing is also true for proton therapies and other therapies.


In order to track patient movements during a medical imaging and/or therapeutic procedure, some modalities utilize one or more markers. For example, in some motion tracking technologies related to medical imaging, one or more markers can be placed on one or more portions of a patient's body, which are then tracked by one or more detectors. However, not all movement of such markers truly reflects motion of the patient, or the organ or organs of interest. For example, a marker may slip on the skin, or skin may slip relative to the organ(s) of interest, resulting in unwanted motion signals (referred to herein as “false movement” or “false motion”) and incorrect motion correction.


SUMMARY

The disclosure herein provides systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan, such as during a magnetic resonance imaging scan.


An accurate and reliable method of determining the dynamic position and orientation of a patient's head or other body portion during MRI scanning or therapeutic procedures is a requirement in any attempt to compensate for subject motion during such procedures. Toward this end, one or more markers may be placed on one or more portions of a subject's body, which are then detected by one or more motion detectors. Such markers can provide points of reference in detecting subject motion, thereby facilitating the motion detection process. However, in some instances, marker movement may not reflect true movement of the subject. For example, a marker may slip or a marker may move due to skin movement rather than rigid body movement. As such, it can be advantageous to detect such false movements in correcting motion of a subject for medical imaging and/or therapeutic procedures in order to correct for only true movements of the subject.


In some embodiments, a computer-implemented method for determining false motion tracked by a motion correction system during a medical imaging scan, comprises: obtaining tracking data, wherein the tracking data reflects motion of a subject of the medical imaging scan or a portion thereof; determining by a false motion discriminator a likelihood of the obtained tracking data reflecting false motion, wherein the false motion is not reflective of true motion of the subject of the medical imaging scan or the portion thereof; and generating by the false motion discriminator a confidence level of the determined likelihood of the obtained tracking data reflecting false motion, wherein the motion correction system is configured to adjust output based on the confidence level, wherein the false motion discriminator comprises a computer processor and an electronic storage medium. In certain embodiments, the likelihood of the obtained tracking data reflecting false motion is based on at least one of velocity, acceleration, center of rotation, and axis of rotation of the motion of the subject. In certain embodiments, the likelihood of the obtained tracking data reflecting false motion is based on machine learning.


In some embodiments, a false motion classifier system for determining false motion tracked by a motion correction system during a medical imaging scan comprises: a first false motion discriminator configured to determine a center of rotation of a detected motion and further determine a first confidence level based on the center of rotation of the detected motion, wherein the first confidence level is indicative of the detected motion being true or false; a second false motion discriminator configured to determine a velocity of the detected motion and further determine a second confidence level based on the velocity of the detected motion, wherein the second confidence level is indicative of the detected motion being true or false; and a combination module configured to combine the first and second confidence levels to generate an output indicative of the detected motion being true or false, wherein an output indicative of the detected motion being false causes the motion correction system not to apply one or more motion correction processes to the detected motion.


In certain embodiments, the false motion classifier system further comprises a third false motion discriminator configured to determine a third confidence level indicative of the detected motion being true or false based on machine learning. In certain embodiments, the false motion classifier system further comprises a differential motion discriminator configured to determine a third confidence level indicative of the detected motion being true or false based on relative motion of two or more motion trackers used by the motion correction system. In certain embodiments, the false motion classifier system further comprises one or more external discriminators configured to determine one or more external confidence levels indicative of the detected motion being true or false. In certain embodiments, the one or more external discriminators are configured to determine the one or more external confidence levels based on at least one or more of noise and video tracking. In certain embodiments, the output is binary.


In some embodiments, a motion correction system for detecting and correcting motion by a subject during a medical imaging scan comprises: one or more markers placed on one or more portions of the subject; one or more detectors configured to track motion of the one or more markers; and a tracking quality classifier for determining false motion of the one or more markers, wherein the tracking quality classifier comprises one or more false motion discriminators, and wherein a determination by the tracking quality classifier that a false motion of the one or more markers occurred causes the motion correction system not to apply one or more motion correction processes to the false motion.


In certain embodiments, the one or more false motion discriminators are configured to determine false motion of the one or more markers based on at least one or more of velocity, rotational center, and machine learning. In certain embodiments, the tracking quality classifier further comprises one or more external discriminators. In certain embodiments, the motion correction system further comprises a scanner control module configured to control a scanner for the medical imaging scan not to utilize acquired data during the false motion. In certain embodiments, the motion correction system further comprises a scanner control module configured to control a scanner for the medical imaging scan to repeat acquisitions of data initially acquired during the false motion.


For purposes of summarizing the invention and the advantages achieved over the prior art, certain objects and advantages are described herein. Of course, it is to be understood that not necessarily all such objects or advantages need to be achieved in accordance with any particular embodiment. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that can achieve or optimize one advantage or a group of advantages without necessarily achieving other objects or advantages.


All of these embodiments are intended to be within the scope of the invention herein disclosed. These and other embodiments will become readily apparent to those skilled in the art from the following detailed description having reference to the attached figures, the invention not being limited to any particular disclosed embodiment(s).





BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing and other features, aspects, and advantages of the present inventions are described in detail below with reference to the drawings of various embodiments, which are intended to illustrate and not to limit the inventions. The drawings comprise the following figures in which:



FIG. 1A illustrates an embodiment of a schematic diagram depicting a side view of a medical imaging scanner as a part of a motion compensation system.



FIG. 1B illustrates an embodiment of a schematic diagram depicting a front view of a medical imaging scanner as a part of a motion compensation system.



FIG. 2 illustrates a perspective view of an embodiment of one or more optical markers used for a motion compensation system.



FIG. 3 is a block diagram depicting an embodiment of a simple tracking quality classifier.



FIG. 4 is a block diagram depicting an embodiment of a neural network.



FIG. 5 illustrates an example of an embodiment of various confusion matrices showing the results of training, validation, and testing of a neural network.



FIG. 6 is a block diagram depicting an embodiment of a complex tracking quality classifier.



FIG. 7 is a block diagram depicting an embodiment of a complex tracking quality classifier with additional external input.



FIG. 8 is a block diagram depicting an embodiment of a biomedical or medical scanning system with a tracking quality classifier.



FIG. 9 is a block diagram depicting an embodiment of a complex tracking quality classifier with a differential motion discriminator.



FIG. 10 is a block diagram depicting an embodiment of a computer system configured to implement one or more embodiments of the methods, devices, and systems described herein.





DETAILED DESCRIPTION

Although several embodiments, examples, and illustrations are disclosed below, it will be understood by those of ordinary skill in the art that the inventions described herein extend beyond the specifically disclosed embodiments, examples, and illustrations and include other uses of the inventions and obvious modifications and equivalents thereof. Embodiments of the inventions are described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner simply because it is being used in conjunction with a detailed description of certain specific embodiments of the inventions. In addition, embodiments of the inventions can comprise several novel features and no single feature is solely responsible for its desirable attributes or is essential to practicing the inventions herein described.


With the use of diagnostic technologies and therapeutic technologies, it can be advantageous to track patient movement with a high degree of accuracy. Such high accuracy tracking can improve the imaging quality obtained and produced by diagnostic equipment, such as imaging technologies. Further, the use of high accuracy patient movement tracking technology can improve the application of patient therapies, such as radiation treatment, proton treatment, and the like. By accounting for patient movement with a high degree of accuracy, therapeutic technologies can apply therapies only to the targeted tissue and avoid healthy surrounding tissue. One such method to address motion during a medical imaging scan or therapeutic procedure is prospective motion correction, which involves tracking the object of interest, such as the head or brain of a subject, and adjusting scan planes in real-time or near real-time such that they follow the movement, resulting in images without motion artifacts.


When an external tracking system is used, such as a camera-based system, the tracking system usually tracks the position of a marker that is assumed to reflect the position of the object to be imaged, such as the brain. For example, many existing technologies relating to motion detection and/or correction in the field of medical imaging and/or therapeutic procedures utilize one or more markers. One or more markers can be attached or placed on one or more different portions of a subject's body that is subject to the medical imaging or therapeutic procedure. Such markers can be detected by the motion detection and/or correction system. As such, prospective motion correction can generally involve (1) continuously tracking the motion of the object of interest with a sensor device, commonly using a marker attached to the object of interest, (2) sending tracking information from the sensor to the imaging device, and (3) continuously adjusting scan parameters of the imaging device in real-time or near real-time such that the image acquisition is locked relative to the mobile object of interest. The results can be images that show no, or reduced, motion artifacts compared to a scan that does not use prospective motion correction.


However, some problems may arise with the use of such markers. It is possible that such markers transiently do not reflect true subject movement. For instance, prospective motion correction relies on the assumption that the tracking marker accurately represents the motion of the object of interest. However, the actual tissue of interest can be situated inside the subject, whereas the marker can be attached to the surface or skin of the subject. As such, while the motion of a marker attached to the forehead skin would normally be assumed to reflect movement of the brain inside the skull, the skin and thus the marker may slip relative to the brain, for example when a subject is squinting or twitching. In such circumstances, the marker may fail to accurately reproduce motion of the tissue of interest and result in false marker movements. This can result in false correction signals and image artifacts. Such false marker movements can greatly reduce the robustness of prospective motion correction when applied to uncooperative subjects, and in the worst-case scenario can render otherwise high-quality images unusable.


As such, although the markers have moved thereby triggering the motion correction process, it may be advantageous to ignore such movements of the markers. Such movements of markers that ideally should not be accounted for by the system in the motion correction process can be called “false movements” or “false motions.” In contrast, movements or motions of the markers that indeed should be accounted for in the motion correction process can be called “true motions” or “true movements.”


The systems, methods, and devices described herein generally relate to achieving accurate and robust motion correction by detecting and accounting for such false movements. In other words, some embodiments of the systems, methods, and devices described herein can be configured to detect false movements for motion correction during a medical imaging scan and/or therapeutic procedure, and thereby ensure that such false marker movements are not accounted for in the motion correction process. In certain embodiments, the systems, methods, and devices described herein detect the presence of, or a high likelihood of, such false marker movements. Upon detection of false movements, the imaging or therapeutic apparatus can be configured to transiently suppress acquisitions and subsequently repeat acquisitions or reconstruct images with partially missing data.


In other words, some of the systems, devices, and methods described herein involve detecting biomechanically unlikely motions of a marker, which can suggest a high probability that false marker movements are present. Based on such detection, a medical or biomedical imaging device or therapeutic device can be informed of potentially false marker movement conditions and may take measures to avoid applying false corrections, for instance, by not updating positions or by suspending acquisition of imaging data until the false motion condition is no longer present.


In certain embodiments, the systems, devices, and methods described herein can be utilized to determine false movement or motion without use of markers. In some embodiments, the systems, devices, and methods described herein can be applied to tracking systems where the sensing device or devices are directly located on the object of interest, such as the skin of the subject. In certain embodiments, the systems, devices, and methods described herein can be configured to detect false marker movement or false movement for body organs other than the head and/or brain of a subject.


Motion Compensation Systems and Medical Imaging Scanners or Therapeutic Procedures



FIG. 1A is a schematic diagram illustrating a side view of a medical imaging scanner 104 as part of a motion compensation system 100 that is configured to determine false movements for motion correction. FIG. 1B is a schematic diagram illustrating a front view of a medical imaging scanner 104 as part of a motion compensation system 100 that is configured to detect and account for false movements for motion correction during a medical imaging scan or therapeutic procedure.


The motion compensation system 100 illustrated in FIGS. 1A and 1B can comprise a motion tracking system 102, a scanner 104, a scanner controller 106, one or more detectors 108, and one or more motion tracking markers or targets 110. In some embodiments, one or more markers 110 can be attached and/or otherwise placed on a subject 112. For example, the one or more markers 110 can be placed on the face of a subject 112 for imaging or therapeutic procedures directed to the head or brain of the subject. Similarly, the one or more markers 110 can be placed on other portions of the body of a subject 112 for imaging or therapeutic procedures directed to those portions of the body. The subject 112 can be positioned to lie on a table 114 of the medical imaging scanner 104. The scanner 104 can be, for example, a magnetic resonance imaging scanner or MRI scanner.


As depicted in FIGS. 1A and 1B, a three-dimensional coordinate system or space can be applied to a subject that is positioned inside a medical imaging scanner. For example, the center or substantially the center of a particular portion of the subject 112 under observation can be thought of as having coordinates of (0, 0, 0). Further, a z-axis can be imposed along the longitudinal axis of the medical imaging scanner. In other words, the z-axis can be positioned along the length of the medical imaging scanner and along the height of a subject or patient that is positioned within the medical imaging scanner, thereby essentially extending out of the scanner bore. Similarly, an x-axis can be thought of as being positioned along the width of the medical imaging scanner or along the width of the patient or subject that is positioned within the medical imaging scanner. Furthermore, a y-axis can be thought of as extending along the height of the medical imaging scanner. For instance, the y-axis can be thought of as extending from the patient or subject located within the medical imaging scanner towards the one or more detectors 108.


Markers


As briefly discussed above, in many medical imaging scanning systems and/or therapeutic procedure systems that utilize motion detection and correction systems or methods, one or more markers can be utilized to track the position and movement or motion of the patient or subject. FIG. 2 illustrates an example of an embodiment where one or more markers are placed or attached to the face of a patient or subject for motion detection or correction purposes.


As depicted in FIG. 2, one or more markers 202 can be placed on the face or head of a patient or subject 112. In the example shown in FIG. 2, three such markers 202 are placed on the patient or subject's head. For example, one or more markers 202 can be placed on the forehead of a subject or patient 112 and/or one or more markers 202 can be placed via a mouthpiece that is held by the mouth of the subject. In certain embodiments, one or more markers 202 can be placed on other portions of the subject as well. For example, one or more markers 202 can be placed on one or more sides of the subject's nose, nose bridge, cheek, cheekbone, eyes, eyelids, neck, chest, stomach, and/or any other portion of the body.


However, while such markers 202 can help the motion correction or detection system detect motion of the subject and correct for any unwanted movement by the subject, one or more of such markers 202 can also move unexpectedly and/or separately from a rigid body movement. For example, when one or more markers 202 are placed on the face of a subject or patient, the markers 202 can move with any twitch of the patient's skin, for example when the patient sneezes, squints, frowns, or makes any other facial expression. When taking an MRI scan of a subject's brain, such false movements along the skin or surface of the patient, without any or substantial movement of the head or brain of the subject, ideally should not have any impact on the resulting scanned image. If all such false movements or motions along the skin of a patient are accounted for by the motion correction system, the resulting scanned image may in fact be even blurrier than if no motion correction had been applied, because the brain or head of the subject did not actually move.


Tracking Quality Classifier—General


In some embodiments, the system can comprise one or more motion classifiers. For example, such motion classifiers can be configured to detect and/or classify a particular motion as being a false movement and/or true movement. Such a motion classifier can be based on a physical model. In certain embodiments, the classifier may run in real-time, substantially real-time, or near real-time during a medical imaging process or therapeutic process and receive input from one or more motion tracking devices and/or systems.


In certain embodiments, tracking data obtained by the motion classifier can be sent to a false motion discriminator. A false motion discriminator can be configured to estimate a confidence level for false marker movement and forward the decision to the medical imaging or therapeutic device or system.



FIG. 3 illustrates a block diagram depicting an embodiment of a simple tracking quality classifier. As depicted in FIG. 3, the system or motion classifier can be configured to track motion data of a subject in block 302. Such tracked motion data can be further analyzed by the system to detect whether the tracked motion is true or false. For example, in certain embodiments, the system can comprise a false motion discriminator at block 304 that is configured to determine whether the tracked motion is false or true motion. In certain embodiments, the system can comprise a plurality of false motion discriminators 304.


In some embodiments, the one or more false motion discriminators 304 can be configured to output one or more confidence levels 306 of the detected movement. For example, the system can be configured to output a confidence level that is high when the false motion discriminator determines that the detected movement or motion of the subject is highly likely to be false movement that should not be accounted for by the motion correction system. In contrast, the system can be configured to output a confidence level that is high when the false motion discriminator determines that the detected movement or motion of the subject is highly likely to be true movement that should be accounted for by the motion correction system. Similarly, in some embodiments, a low confidence level can correspond to a low likelihood of false movement or high likelihood of true movement. In certain embodiments, a low confidence level can correspond to a low likelihood of true movement or high likelihood of false movement.


In some embodiments, the confidence level that is determined by the one or more false motion discriminators 304 can be a unit-less number. For example, the confidence level can be on a scale of 0 to 1, 1 to 10, 1 to 100, or 1 to 1,000. In certain embodiments, the confidence level can be a percentage or ratio.


Based on the confidence level that is determined by the false motion discriminator 304, the system can be configured to output a decision 308 of the nature of the detected movement, which can then be utilized by the motion correction system of a medical imaging scanner or therapeutic procedure to either account for or not account for the tracked motion.


In some embodiments, the system comprises only one false motion discriminator. In other embodiments, the system can comprise a plurality of false motion discriminators. In certain embodiments, such false motion discriminators can be based on one or more parameters. For example, in some embodiments, a false motion discriminator 304 can be configured to determine the likelihood of a false motion or movement based on the velocity, acceleration, axis of rotation, central point of rotation, machine learning, and/or any combination of the above.
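
By way of illustration only, the following Python sketch shows one possible realization of the simple classifier pipeline of FIG. 3. The function name, the (N, 6) pose layout, and the 0-to-1 confidence scale are assumptions made for this example rather than details of the disclosed implementation; each discriminator is modeled as a callable that maps a window of tracking samples to a confidence level.

```python
def classify_tracking(poses, dt, discriminators, threshold=0.5):
    """Pipeline of FIG. 3: tracking data (block 302) is passed to one
    or more false motion discriminators (block 304), each returning a
    unit-less confidence level (block 306); a binary decision (block
    308) is derived from the combined confidence.

    poses: (N, 6) array of 6-DOF samples [x, y, z, rx, ry, rz].
    dt: sampling interval in seconds.
    discriminators: callables (poses, dt) -> confidence in [0, 1],
        where higher values indicate more likely false motion.
    """
    levels = [d(poses, dt) for d in discriminators]   # confidence levels 306
    decision_is_false_motion = max(levels) >= threshold   # decision 308
    return decision_is_false_motion, levels
```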


Velocity/Acceleration-Based Discriminator


In some embodiments, the system or one or more false motion discriminators thereof can be configured to determine the likelihood of a false motion being present based on the velocity of movement or motion of one or more markers or trackers. For example, when a subject twitches, sneezes, or the like, movement of the marker can be substantially faster than during rigid body movement or physiological movement. Conditions associated with false marker movement, such as squinting, coughing, or frowning, are typically transient and involve velocities that are substantially higher than those of typical physiologic motions, since the head has a high inertia compared to the skin or marker. Accordingly, the velocity discriminator can be configured to provide a confidence level that the velocity of a given movement is outside of the physiologic range.


In certain embodiments, the system or one or more false motion discriminators thereof can be configured to detect or otherwise receive the raw tracking data and further analyze such data by obtaining its first and/or second temporal derivatives, namely velocity and/or acceleration. Such velocities and/or accelerations may be used by a velocity-based discriminator to determine a likelihood of false movement. In certain embodiments, such obtained velocities and/or accelerations may be further sent to a set of discriminators, each of which can be configured to estimate confidence levels for conditions of false marker movement.


In some embodiments, if the velocity of a particular movement or motion of a marker at any given time is greater than or equal to 10-20 mm/sec or 10-20°/sec, depending on whether the velocity is determined based on distance or angle of rotation, the system or velocity-based false motion discriminator can be configured to determine a high likelihood of false movement. Similarly, if the velocity of a particular movement or motion of a marker at any given time is less than 10-20 mm/sec or 10-20°/sec, the system or velocity-based false motion discriminator can be configured to determine a low likelihood of false movement or a high likelihood of true movement.


In certain embodiments, the boundary for determining a high likelihood of false or true movement based on velocity of a particular marker can be about 1 mm/sec, about 2 mm/sec, about 3 mm/sec, about 4 mm/sec, about 5 mm/sec, about 6 mm/sec, about 7 mm/sec, about 8 mm/sec, about 9 mm/sec, about 10 mm/sec, about 11 mm/sec, about 12 mm/sec, about 13 mm/sec, about 14 mm/sec, about 15 mm/sec, about 16 mm/sec, about 17 mm/sec, about 18 mm/sec, about 19 mm/sec, about 20 mm/sec, about 21 mm/sec, about 22 mm/sec, about 23 mm/sec, about 24 mm/sec, about 25 mm/sec, about 26 mm/sec, about 27 mm/sec, about 28 mm/sec, about 29 mm/sec, about 30 mm/sec, about 1°/sec, about 2°/sec, about 3°/sec, about 4°/sec, about 5°/sec, about 6°/sec, about 7°/sec, about 8°/sec, about 9°/sec, about 10°/sec, about 11°/sec, about 12°/sec, about 13°/sec, about 14°/sec, about 15°/sec, about 16°/sec, about 17°/sec, about 18°/sec, about 19°/sec, about 20°/sec, about 21°/sec, about 22°/sec, about 23°/sec, about 24°/sec, about 25°/sec, 26°/sec, 27°/sec, 28°/sec, 29°/sec, 30°/sec, and/or within a range defined by any two of the above-identified values.


Similarly, in some embodiments, the boundary for determining a high likelihood of false or true movement based on acceleration of a particular marker can be about 1 mm/sec2, about 2 mm/sec2, about 3 mm/sec2, about 4 mm/sec2, about 5 mm/sec2, about 6 mm/sec2, about 7 mm/sec2, about 8 mm/sec2, about 9 mm/sec2, about 10 mm/sec2, about 11 mm/sec2, about 12 mm/sec2, about 13 mm/sec2, about 14 mm/sec2, about 15 mm/sec2, about 16 mm/sec2, about 17 mm/sec2, about 18 mm/sec2, about 19 mm/sec2, about 20 mm/sec2, about 21 mm/sec2, about 22 mm/sec2, about 23 mm/sec2, about 24 mm/sec2, about 25 mm/sec2, about 26 mm/sec2, about 27 mm/sec2, about 28 mm/sec2, about 29 mm/sec2, about 30 mm/sec2, about 1°/sec2, about 2°/sec2, about 3°/sec2, about 4°/sec2, about 5°/sec2, about 6°/sec2, about 7°/sec2, about 8°/sec2, about 9°/sec2, about 10°/sec2, about 11°/sec2, about 12°/sec2, about 13°/sec2, about 14°/sec2, about 15°/sec2, about 16°/sec2, about 17°/sec2, about 18°/sec2, about 19°/sec2, about 20°/sec2, about 21°/sec2, about 22°/sec2, about 23°/sec2, about 24°/sec2, about 25°/sec2, 26°/sec2, 27°/sec2, 28°/sec2, 29°/sec2, 30°/sec2, and/or within a range defined by any two of the above-identified values.
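
A minimal sketch of a velocity-based discriminator along these lines is given below, assuming 6-DOF tracking samples of the form [x, y, z, rx, ry, rz]; the default bounds of 15 mm/sec and 15°/sec are illustrative picks from the middle of the ranges above, not values mandated by the disclosure.

```python
import numpy as np

def velocity_confidence(poses, dt, v_max_mm=15.0, w_max_deg=15.0):
    """Velocity-based discriminator: confidence in [0, 1] that the
    tracked motion is false, based on how far the peak speed exceeds
    an assumed physiologic bound.

    poses: (N, 6) samples of [x, y, z (mm), rx, ry, rz (deg)].
    """
    deriv = np.diff(poses, axis=0) / dt           # first temporal derivative
    v = np.linalg.norm(deriv[:, :3], axis=1)      # translational speed, mm/s
    w = np.linalg.norm(deriv[:, 3:], axis=1)      # angular speed, deg/s
    ratio = max(v.max() / v_max_mm, w.max() / w_max_deg)
    return float(np.clip(ratio, 0.0, 1.0))
```

An acceleration-based variant would apply `np.diff` twice and compare the result against acceleration bounds such as those enumerated above.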


Center of Rotation-Based Discriminator


In some embodiments, the system or one or more false motion discriminators thereof can be configured to determine the likelihood of a false motion being present based on the center of rotation of a movement or motion of one or more markers or trackers. In other words, in certain embodiments, a false motion discriminator can be configured to determine or estimate the likelihood of false movement or motion of a subject based on the anterior-posterior center (y) position of head rotations.


Physiological head movements commonly have a posterior rotational center. If the center or substantially the center of the head of a patient or subject is assumed to be at the y=0 position, physiological head movements commonly have a rotational center that is below 0 on the y axis. For example, the rotational center of a physiological head movement can be located along the y axis at around y=−50 mm, wherein y=0 is located at substantially the center of the subject's head. In contrast, a false marker motion or rotation can be thought of as being centered at an anterior attachment point. In a false marker motion, the rotational center can be assumed to be at a position along the y axis with a positive value. For example, the rotational center of a false marker motion can be at roughly y=80 mm. This is because most subjects inside a medical imaging scanner or therapeutic system are in a position with their head in contact with the bed of the scanner. As such, when the subject rotates his or her head, the rotational center will usually be located near or at the center of the back of the subject's head that is in contact with the bed of the scanner. In contrast, any movement along the skin or surface of the patient's head will tend to rotate about a rotational center that is located much higher than the back of the subject's head.


In other words, since brain MRI is generally performed with patients in a supine position, physiologic head rotations are typically centered where the back of the head is supported, which can be approximately 50 to 100 millimeters posterior of the MRI isocenter. Conversely, false marker movements typically involve skin motion at the marker attachment point, which can be at the forehead, nose, teeth, or any other position, and therefore the rotational center can be located 50 to 100 millimeters anterior of the MRI isocenter. Based on such assumptions, in some embodiments, a false motion discriminator can be configured to utilize the tracking data from a single marker to estimate the anterior-posterior position of the axis of rotation. Accordingly, in certain embodiments, the false motion discriminator can be configured to provide a confidence level that a given rotation is centered close to the skin attachment point.


In some embodiments, the system or one or more false motion discriminators can be configured to detect or otherwise receive tracking data of one or more markers and determine the center of rotation of the movement. To determine the center of rotation, the system or one or more false motion discriminators can be configured to utilize one or more processes or methods. An embodiment of one such method is described in Appendix A below.
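
Appendix A is not reproduced here. For illustration, one standard way to recover a point on the rotation axis from a single rigid transform x' = Rx + t is to note that any fixed point c satisfies (I - R)c = t, and to solve this singular system in the least-squares sense. The sketch below, including the example numbers, is one such approach and is not necessarily the method of Appendix A.

```python
import numpy as np

def rotation_center(R, t):
    """For a rigid transform x' = R @ x + t, every point c on the
    rotation axis is a fixed point, c = R @ c + t, i.e. (I - R) @ c = t.
    Because (I - R) is singular along the axis direction, the system is
    solved in the least-squares sense; lstsq then returns the point on
    the axis closest to the origin."""
    c, *_ = np.linalg.lstsq(np.eye(3) - R, t, rcond=None)
    return c

# Example: a 5 degree head nod about the x (left-right) axis centered
# at y = -50 mm, a typical posterior, physiologic rotation center.
a = np.radians(5.0)
R = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(a), -np.sin(a)],
              [0.0, np.sin(a), np.cos(a)]])
true_center = np.array([0.0, -50.0, 0.0])
t = true_center - R @ true_center     # translation induced by offset center
print(rotation_center(R, t))          # y component recovered near -50 mm
```

The sign of the recovered y component (posterior versus anterior) can then be mapped onto a confidence level as described above.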


A-P Axis Rotation-Based Discriminator


In certain embodiments, a false motion discriminator can be configured to determine false motion based on the fact that human head movements in the tight space of an MRI radiofrequency (RF) coil are unlikely to involve major rotations about the anterior-posterior (A-P) axis, which can be thought of as connecting the nose to the back of the head of a subject. Additionally, if such rotations occur, they are generally centered at a point towards the neck. In other words, A-P rotations generally occur during motions dominated by left-right movements of the head, or head shaking. Conversely, facial skin movements, such as squinting, can result in relatively large and fast A-P rotations without proportionate left-right head movements. As such, in certain embodiments, the system or a false motion discriminator thereof can be configured to track and determine the rotation of a particular movement about the A-P axis and the extent thereof and/or determine left-right head movement in order to determine a likelihood of false or true motion. Consequently, a false motion discriminator based on A-P rotations can provide a confidence level that a given A-P rotation is outside of a biomechanically reasonable range in terms of velocity and not associated with a rotational center distant from the neck.


As such, in some embodiments, the system or a false motion discriminator thereof can be configured to determine the velocity of a particular movement or motion and also the rotational movement around the A-P axis using one or more methods described herein. Based on such data, the system or false motion discriminator thereof can be further configured to determine whether the determined velocity, the A-P rotation trend or movement, or both are within or outside of a biomechanically reasonable range in order to determine the likelihood of false or true movement.
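
A sketch of such an A-P rotation-based check appears below; it flags windows whose rotation rate about the y (anterior-posterior) axis is high while left-right translation stays small. The pose layout and both thresholds are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

def ap_rotation_confidence(poses, dt, w_max_deg=10.0, v_lr_mm=2.0):
    """A-P axis rotation-based discriminator: confidence in [0, 1]
    that the motion is false, high when rotation about the A-P (y)
    axis is fast yet unaccompanied by proportionate left-right (x)
    head translation.

    poses: (N, 6) samples of [x, y, z (mm), rx, ry, rz (deg)], with
    ry the rotation angle about the A-P axis.
    """
    deriv = np.diff(poses, axis=0) / dt
    w_ap = np.abs(deriv[:, 4]).max()     # peak A-P rotation rate, deg/s
    v_lr = np.abs(deriv[:, 0]).max()     # peak left-right speed, mm/s
    fast_ap = np.clip(w_ap / w_max_deg, 0.0, 1.0)
    little_lr = np.clip(1.0 - v_lr / v_lr_mm, 0.0, 1.0)
    return float(fast_ap * little_lr)
```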


Neural Network-Based Discriminator


In some embodiments, the system can comprise one or more false motion discriminators that involve sophisticated pattern recognition or machine learning approaches. For instance, in certain embodiments, a false motion discriminator can be configured to use an artificial neural network that is trained on tracking data from sample subjects who perform rigid head movements as well as non-rigid face movements or false movements, such as squinting, coughing, frowning or the like.



FIG. 4 is a block diagram depicting an embodiment of a neural network. As shown in FIG. 4, in certain embodiments, a false motion discriminator can comprise one or more probabilistic networks. Further, in some embodiments, the network can be a feed-forward neural network comprising a hidden layer 404 of 18 neurons and an output layer 406 of 1 neuron. In other embodiments, the hidden layer 404 and/or the output layer 406 can comprise any number of neurons.


A neural network can be trained to classify motion into rigid-body motion and skin movement based on data collected from one or more volunteers or sample subjects who can perform both rigid and skin movements. For example, one or more markers can be attached to one or more portions of a volunteer, such as the forehead, nose, etc., and a plurality of rigid and skin movements can be inputted into the system. In some embodiments, while the network can be trained using data from one marker, the presence of two markers can provide the network with the true information about the type of motion. In certain embodiments, the output can be binary, for example, 0 can correspond to rigid-body motion and 1 can correspond to skin movement.
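
As a rough illustration of the topology described for FIG. 4, the following sketch trains a feed-forward network with one hidden layer of 18 neurons and a single output unit using scikit-learn. The feature dimensionality, activation function, and random placeholder data are assumptions for this example; real training would use labeled windows of tracking data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder training data: each row is a feature vector derived from
# a single-marker tracking window (e.g., velocities and rotation-center
# estimates); labels come from a second marker during training
# (0 = rigid-body motion, 1 = skin movement). Real data would replace
# the random arrays below.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 12))
y = rng.integers(0, 2, size=200)

# One hidden layer of 18 neurons and a single output unit, mirroring
# the topology described for FIG. 4.
net = MLPClassifier(hidden_layer_sizes=(18,), activation='tanh',
                    max_iter=2000, random_state=0)
net.fit(X, y)

p_skin = net.predict_proba(X[:1])[0, 1]   # confidence of skin movement
```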



FIG. 5 illustrates an example of an embodiment of various confusion matrices showing the results of training, validation, and testing of a neural network.


Additionally, other well-established and/or non-neural discrimination methods can be utilized to detect conditions indicating false marker movements. For example, some embodiments may utilize a support vector machine for detecting conditions indicating false marker movements.


Complex Tracking Quality Classifier


In some embodiments, the system can comprise a plurality of false motion discriminators, wherein the determined confidence levels or other output of each of the false motion discriminators can be combined in some manner to generate a final output. By utilizing a plurality of false motion discriminators, the accuracy and quality of the system to detect false motions can be substantially improved. FIG. 6 is a block diagram depicting an embodiment of a complex tracking quality classifier.


As depicted in FIG. 6, in some embodiments, tracking data from one or more detectors can be collected in block 602. Such tracking data can be further analyzed by one or more false motion discriminators. For example, in some embodiments, the system can comprise a false motion discriminator that determines the likelihood of a false motion being present based on the rotation center of the movement 604. Such false motion discriminators based on rotational centers can utilize one or more processes described herein.


Further, in certain embodiments, the system can further comprise one or more additional false motion discriminators 606. For example, the one or more additional false motion discriminators 606 can utilize one or more neural or probabilistic methods or processes in determining false motion as discussed herein. In some embodiments, the system can be configured to determine first and/or second temporal derivatives of the tracked motion at block 608, which correspond to velocity and acceleration as discussed above. In certain embodiments, the one or more additional false motion discriminators 606 can be configured to utilize the determined velocity and/or acceleration data of the tracked motion in determining the likelihood of false or true motion. In other embodiments, the one or more additional false motion discriminators 606 can be configured to determine a likelihood of false or true motion without utilizing data relating to the velocity and/or acceleration of the tracked motion. In certain embodiments, one or more of a plurality of false motion discriminators can be configured to utilize velocity and/or acceleration data generated from tracked motion while other one or more of the plurality of false motion discriminators can be configured not to utilize the velocity and/or acceleration data in determining the likelihood of false and/or true motion.


In some embodiments, the system can comprise a velocity-based false motion discriminator 610 configured to determine a confidence level regarding the true or false nature of tracked motion. Any such false motion discriminator 610 configured to utilize velocity of the tracked data as a basis for determining a likelihood of false motion can use any of the processes or methods described herein.


In certain embodiments, each or some subset of false motion discriminators 604, 606, 610 of the system can be configured to produce or output a confidence level 612. In some embodiments, such confidence levels 612 can be combined in block 614 according to one or more methods or processes. For example, in some embodiments, the system can comprise one or more combination modules.


The system or one or more combination modules thereof can be configured to use one or more methods to combine the one or more confidence levels. For example, the system can be configured to utilize a simple “winner takes all” process. In such embodiments, the overall probability of false marker movement can be defined as the highest confidence level value of individual discriminators, wherein a high confidence level corresponds to a high likelihood of false motion being present. Further, in certain embodiments, the system can be configured to utilize a more sophisticated approach, such as a Bayesian approach.


In some embodiments, the combination module 614 can be configured to combine the one or more confidence levels 612 and generate an output 616. The output 616 can be binary. For example, in certain embodiments, the system can be configured to generate an output of 0 when it determines that a particular movement or motion corresponds to rigid body movement and can generate an output of 1 when it determines that the movement corresponds to skin motion or false movement. In certain embodiments, if the output is 0, then the system can be configured to apply one or more motion correction processes to clarify the image. In contrast, if the output is 1, the system can be configured not to apply any motion correction process despite the fact that motion tracking data was obtained.
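
A minimal sketch of such a combination module is shown below. The "winner takes all" branch follows the description above, while the Bayesian branch is only a simple naive-Bayes style illustration of the "more sophisticated approach" mentioned, and the 0.5 decision threshold is an assumption.

```python
import numpy as np

def combine_confidences(levels, threshold=0.5, method="winner"):
    """Combination module (block 614): merges per-discriminator
    confidence levels (each in [0, 1], higher = more likely false
    motion) into a binary output 616: 0 = rigid-body motion (apply
    correction), 1 = false motion (suppress correction)."""
    levels = np.asarray(levels, dtype=float)
    if method == "winner":
        combined = levels.max()                      # winner takes all
    else:
        # Naive-Bayes style illustration: treat the levels as
        # independent probabilities and combine their odds.
        eps = 1e-6
        odds = np.prod(levels / np.clip(1.0 - levels, eps, None))
        combined = odds / (1.0 + odds)
    return int(combined >= threshold)

# Example: three discriminators mostly agree on skin movement.
print(combine_confidences([0.8, 0.6, 0.9]))          # -> 1 (false motion)
```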


Complex Tracking Quality Classifier with External Input


In some embodiments, a system can comprise an external input in addition to the one or more false motion discriminators. The external input can serve as an additional confidence level generator to further improve the accuracy of the one or more false motion discriminators. For example, in some embodiments, one or more external inputs can be configured to determine one or more external confidence levels. In certain embodiments, such one or more external confidence levels can be combined with the one or more confidence levels determined by the one or more false motion discriminators.


For example, in some embodiments where the system comprises more than one sensor to track motion, an external discriminator of the system may be configured to calculate a confidence measure based on internal consistency of data from multiple sensors. In certain embodiments, an external discriminator may be configured to detect and process acoustic noise from a patient in the medical imaging scanner or therapeutic device, since high acoustic noise levels can be associated with patient discomfort and/or pain and thus an increased probability of false marker movements. Further, in some embodiments, an external discriminator of the system can be configured to utilize video tracking to determine actual movement of the subject.



FIG. 7 is a block diagram depicting an embodiment of a complex tracking quality classifier with additional external input. As illustrated in FIG. 7, in some embodiments, the system can be configured to track motion data in block 702 using one or more sensors. The system can also comprise one or more external discriminators configured to determine one or more external confidence levels in block 712.


In some embodiments, one or more confidence levels determined by one or more false motion discriminators 704, 706, 710, utilizing one or more methods or processes described herein, can then be further combined in block 716 together with the one or more external confidence levels 712. The one or more confidence levels determined by the one or more false motion discriminators 704, 706, 710 and the one or more external discriminators can be combined based on a winner takes all process and/or a more complex statistical approach, such as a Bayesian approach.


Medical Imaging Scanning System with a Tracking Quality Classifier


As discussed herein, in some embodiments, one or more tracking quality classifiers can be combined with or be part of a medical imaging scanner and/or therapeutic device or system that utilizes motion detection and/or correction systems in order to improve scan quality. FIG. 8 is a block diagram depicting an embodiment of a biomedical scanning system with a tracking quality classifier. For example, a scanner can be any medical or biomedical imaging system, such as MRI, CAT, PET, SPECT, nuclear medicine, or the like.


The acquisition of images by the scanner can be controlled by a scanner control module. In some embodiments, prospective motion correction is achieved by sending tracking data to the scanner control module, which continuously adjusts scan parameters, such as slice positions and/or rotations, such that the scan tracks motion of the object of interest. In certain embodiments, the scanner control module can be configured to additionally receive information regarding false marker movement from any one or more tracking quality classifiers as described herein.


If the false marker movement information indicates false marker movement, the system or scanner control module can be configured to take appropriate action to avoid false motion corrections. For example, in some embodiments, the scanner control module or other part or device of the system can be configured to (1) not apply motion correction updates, (2) pause image acquisition until the false marker condition has resolved, (3) acquire “dummy” or place-holder acquisitions and re-acquire real acquisitions once the condition has resolved, and/or (4) apply other corrective measures.
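
The sketch below illustrates how a control loop might dispatch among reactions (1)-(3); the scanner methods (apply_motion_update, acquire, pause, acquire_dummy) are hypothetical placeholders for whatever interface a real scanner control module exposes.

```python
from enum import Enum, auto

class FalseMotionPolicy(Enum):
    HOLD_CORRECTION = auto()   # (1) do not apply motion correction updates
    PAUSE = auto()             # (2) pause acquisition until resolved
    DUMMY_ACQUIRE = auto()     # (3) acquire place-holders, re-acquire later

def control_step(scanner, tracking_update, false_motion,
                 policy=FalseMotionPolicy.HOLD_CORRECTION):
    """One control-loop step: forward tracking updates to the scanner
    only while the motion is believed to be true."""
    if not false_motion:
        scanner.apply_motion_update(tracking_update)
        scanner.acquire()
    elif policy is FalseMotionPolicy.PAUSE:
        scanner.pause()
    elif policy is FalseMotionPolicy.DUMMY_ACQUIRE:
        scanner.acquire_dummy()      # real acquisition repeated later
    else:
        scanner.acquire()            # keep scanning with correction frozen
```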


As illustrated in FIG. 8, in some embodiments, the system can be configured to detect and/or track motion data in block 802. In certain embodiments, the system can comprise one or more external discriminators that are configured to determine one or more external confidence levels 814 as discussed above. The tracking data 802 and/or external confidence level(s) 814 can be then inputted into a tracking quality classifier 804.


In some embodiments, the tracking quality classifier 804 can comprise one or more false motion discriminators. For example, the tracking quality classifier 804 can comprise a rotation center-based false motion discriminator 806. The tracking quality classifier 804 may comprise one or more additional false motion discriminators 808. For example, the one or more additional discriminators 808 can be configured to detect or determine false motion based on a neural network and/or via a probabilistic process. In certain embodiments, the one or more additional discriminators 808 can be configured to determine or detect false motion based on velocity 810 and/or acceleration of the tracked data.


In certain embodiments, the tracking quality classifier 804 can be configured to determine a first order temporal derivative 810 and/or second or higher order temporal derivatives. In some embodiments, the tracking quality classifier 804 can comprise a velocity-based false motion discriminator 812.


Confidence levels 816 determined by the one or more false motion discriminators and/or one or more external confidence levels 814 can be combined in block 818. For example, the combination may involve a winner takes all approach and/or statistical approach, such as a Bayesian process.


If the tracking quality classifier 804 determines that false movement or false marker movement was likely present for a particular period of time, the system can be configured to transmit the false marker movement related data 820 and the tracking data 802 of that period to a scanner control module 822. The scanner control module 822 can be configured to respond utilizing any of the methods discussed above in controlling the scanner 824.


False Motion Classifier with Differential Motion Discriminator


As discussed above, in some embodiments, a motion detection and/or correction system for a medical imaging scanner and/or therapeutic device can comprise one or more markers or trackers. For such systems, in certain embodiments, a false motion classifier or tracking quality classifier can comprise one or more differential motion discriminators. Such differential motion discriminators can be configured to track the motion of two or more markers relative to each other and utilize such data to determine whether rigid body movement or facial movement or false movement was involved.



FIG. 9 is a block diagram depicting an embodiment of a complex tracking quality classifier with a differential motion discriminator. In some embodiments, as illustrated in FIG. 9, a motion detection and/or correction system to be used in conjunction with a medical imaging scanner and/or therapeutic device or system can involve use of one or more markers 902, 904. Such markers can be configured to provide 6 degrees of freedom in some embodiments. In certain embodiments, in addition to the one or more false motion discriminators 910, 912, 914, 922, 924, 926, which can involve any one or more methods of processes described herein, the false motion classifier 906 can further comprise a differential motion discriminator 918.


A differential motion discriminator 918 can be configured to detect and/or track movement or motion of the plurality of markers 902, 904 relative to each other. For example, the differential motion discriminator 918 can be configured to determine the distance and/or relative location of two or more markers 902, 904 relative to each other. In certain embodiments, the differential motion discriminator 918 can further be configured to determine a first order and/or second order temporal derivative of the relative location of the two or more markers 902, 904 to determine a velocity and/or acceleration of movement of the two or more markers 902, 904 relative to each other. The basic assumption can be that when a rigid body movement occurs, all or substantially all markers located on the subject move together, thereby not moving or substantially not moving relative to each other. However, when skin movement or false movement occurs, the location of one or more markers relative to others can change, rather abruptly in certain circumstances.


As such, in some embodiments, the system can be configured to determine that a high likelihood of false movement is present when the relative movement between two or more markers present on a subject is above a certain threshold, percentage, or ratio. Similarly, in certain embodiments, the system can be configured to determine that a high likelihood of false movement is present when a first order temporal derivative of relative location of two or more markers or velocity of relative movement is above a certain threshold, percentage, or ratio.
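
A minimal sketch of a differential motion discriminator along these lines follows; the separation-drift and rate thresholds are illustrative assumptions, not disclosed values.

```python
import numpy as np

def differential_confidence(marker_a, marker_b, dt,
                            d_max_mm=1.0, v_max_mm=5.0):
    """Differential motion discriminator: rigid-body motion keeps the
    inter-marker separation constant, whereas skin movement changes it,
    often abruptly. Returns a confidence in [0, 1] that the motion is
    false, high when either the drift in separation or its first
    temporal derivative exceeds the thresholds.

    marker_a, marker_b: (N, 3) position tracks of two markers, in mm.
    """
    sep = np.linalg.norm(marker_a - marker_b, axis=1)   # separation, mm
    drift = np.abs(sep - sep[0]).max()                  # change from baseline
    rate = np.abs(np.diff(sep) / dt).max()              # rate of change, mm/s
    conf = max(drift / d_max_mm, rate / v_max_mm)
    return float(np.clip(conf, 0.0, 1.0))
```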


In certain embodiments, the differential motion discriminator 918 can be configured to generate a confidence level similar to those discussed above. In some embodiments, confidence levels 916 determined by one or more false movement discriminators 910, 912, 914 with respect to a first marker 902, confidence levels determined by one or more false movement discriminators 922, 924, 926 with respect to a second marker 904, and/or a confidence level determined by the differential motion discriminator 918 based on both the first and second markers 902, 904 can be combined in block 930 using any of the combination methods or processes discussed herein. Further, the system can be configured to generate one or more outputs, such as a binary output as discussed above, in block 932.


In some embodiments, the differential motion discriminator 918 may provide the highest overall accuracy compared to the other false motion discriminators, which are based on a single marker. However, for systems that utilize only a single marker, false motion discriminators configured to determine false motion based on tracking data provided by a single marker may be important. Further, since false-negative events, in which false tracking goes undetected, may cause deterioration of MRI scan quality, whereas false-positive events generally do not, it can be important to minimize the false-negative rate. The false-negative rate can be minimized by optimizing one or more thresholds for the one or more false motion discriminators.
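
One simple way to carry out such threshold optimization is to sweep candidate thresholds over labeled validation data and keep the largest one whose false-negative rate stays below a target. The sketch below assumes such labeled data (for example, ground truth from a second marker) is available.

```python
import numpy as np

def pick_threshold(confidences, labels, max_fn_rate=0.01):
    """Choose the largest decision threshold whose false-negative rate
    (false motion events missed) stays at or below max_fn_rate,
    favoring sensitivity over specificity as discussed above.

    confidences: discriminator outputs on validation data;
    labels: 1 = false motion, 0 = rigid-body motion.
    """
    confidences = np.asarray(confidences, dtype=float)
    labels = np.asarray(labels)
    best = 0.0
    for thr in np.unique(confidences):
        fn_rate = np.mean(confidences[labels == 1] < thr)  # missed events
        if fn_rate <= max_fn_rate:
            best = max(best, float(thr))
    return best
```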


Variations


Specific embodiments have been described in detail above with emphasis on medical applications, and in particular MRI examination of a patient's head. However, the teachings of the present invention can be utilized for other MRI examinations of other body parts where movements of up to six degrees of freedom are possible. In addition, medical procedures involving imaging devices other than MRI equipment (e.g., CT, PET, ultrasound, plain radiography, and others) may benefit from the teachings of the present invention. The teachings of the present invention may be useful in many non-medical applications where tracking of a target having several degrees of freedom is possible. Some of these applications could be military applications. Furthermore, while particular algorithms are disclosed, variations, combinations, and subcombinations are also possible.


Computing System


In some embodiments, the computer clients and/or servers described above take the form of a computing system illustrated in FIG. 10, which is a block diagram of one embodiment of a computing system that is in communication with one or more computing systems 1020 and/or one or more data sources 1022 via one or more networks 1018. The computing system 1000 may be used to implement one or more of the systems and methods described herein. In addition, in one embodiment, the computing system 1000 may be configured to apply one or more of the methods and systems described herein. While FIG. 10 illustrates an embodiment of a computing system 1000, it is recognized that the functionality provided for in the components and modules of computing system 1000 may be combined into fewer components and modules or further separated into additional components and modules.


False Motion Detection System


In an embodiment, the system 1000 comprises a false motion detection system module 1014 that carries out the functions described herein with reference to false motion detection, including any one of the false motion detection and/or combination methods described above. The false motion detection system module 1014 may be executed on the computing system 1000 by a central processing unit 1004 discussed further below.


In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language such as, for example, COBOL, CICS, Java, Lua, C, C++, or Objective-C. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may comprise connected logic units, such as gates and flip-flops, and/or programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


Computing System Components


In an embodiment, the computing system 1000 also comprises one or more workstations or other computing devices suitable for controlling and/or communicating with large databases, performing transaction processing, and generating reports from large databases. The computing system 1000 also comprises a central processing unit (“CPU”) 1004, which may comprise a conventional microprocessor. The computing system 1000 further comprises a memory 1008, such as random access memory (“RAM”) for temporary storage of information and/or a read only memory (“ROM”) for permanent storage of information, and a mass storage device 1002, such as a hard drive, diskette, or optical media storage device. Typically, the modules of the computing system 1000 are connected to the computer using a standards-based bus system. In different embodiments, the standards-based bus system could be Peripheral Component Interconnect (PCI), MicroChannel, SCSI, Industry Standard Architecture (ISA), or Extended ISA (EISA), for example.


The computing system 1000 comprises one or more commonly available input/output (I/O) devices and interfaces 1012, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O devices and interfaces 1012 comprise one or more display devices, such as a monitor, that allow the visual presentation of data to a user. More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia presentations, for example. In the embodiment of FIG. 10, the I/O devices and interfaces 1012 also provide a communications interface to various external devices. The computing system 1000 may also comprise one or more multimedia devices 1006, such as speakers, video cards, graphics accelerators, and microphones, for example.


Computing System Device/Operating System


The computing system 1000 may run on a variety of computing devices, such as, for example, a mobile device, a server, a desktop computer, a workstation, a Windows server, a Structured Query Language (SQL) server, a Unix server, a personal computer, a mainframe computer, a laptop computer, a cell phone, a personal digital assistant, a kiosk, an audio player, a smartphone, a tablet computing device, and so forth. The computing system 1000 is generally controlled and coordinated by operating system software, such as iOS, z/OS, Windows 95, Windows 98, Windows NT, Windows 2000, Windows XP, Windows Vista, Windows 7, Linux, BSD, SunOS, Solaris, or other compatible operating systems. On Macintosh systems, the operating system may be any available operating system, such as Mac OS X. In other embodiments, the computing system 1000 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.


Network


In the embodiment of FIG. 10, the computing system 1000 is coupled to a network 1018, such as a LAN, WAN, or the Internet, for example, via a wired or wireless communication link 1016, or a combination of the two. The network 1018 communicates with various computing devices and/or other electronic devices via wired or wireless communication links. In the embodiment of FIG. 10, the network 1018 communicates with one or more computing systems 1020 and/or one or more data sources 1022.


Access to the false motion detection system module 1014 of the computer system 1000 by computing systems 1020 and/or by data sources 1022 may be through a web-enabled user access point, such as the computing system's 1020 or data source's 1022 personal computer, cellular phone, laptop, or other device capable of connecting to the network 1018. Such a device may have a browser module implemented as a module that uses text, graphics, audio, video, and other media to present data and to allow interaction with data via the network 1018.


The browser module may be implemented as a combination of an all-points-addressable display, such as a cathode-ray tube (CRT), a liquid crystal display (LCD), a plasma display, a touch-screen display, or other types and/or combinations of displays. In addition, the browser module may be implemented to communicate with input devices 1012 and may also comprise software with the appropriate interfaces that allow a user to access data through the use of stylized screen elements, such as, for example, menus, windows, dialog boxes, toolbars, and controls (for example, radio buttons, check boxes, sliding scales, and so forth). Furthermore, the browser module may communicate with a set of input and output devices to receive signals from the user.


The input device(s) may comprise a keyboard, roller ball, pen and stylus, mouse, trackball, voice recognition system, or pre-designated switches or buttons. The output device(s) may comprise a speaker, a display screen, a printer, or a voice synthesizer. In addition, a touch screen may act as a hybrid input/output device. In another embodiment, a user may interact with the system more directly, such as through a system terminal connected directly to the computing system 1000, without communications over the Internet, a WAN, a LAN, or a similar network.


In some embodiments, the system 1000 may comprise a physical or logical connection established between a remote microprocessor and a mainframe host computer for the express purpose of uploading, downloading, or viewing interactive data and databases on-line in real time. The remote microprocessor may be operated by an entity operating the computer system 1000, including the client server systems or the main server system, and/or may be operated by one or more of the data sources 1022 and/or one or more of the computing systems 1020. In some embodiments, terminal emulation software may be used on the microprocessor for participating in the micro-mainframe link.


In some embodiments, computing systems 1020 that are internal to an entity operating the computer system 1000 may access the false motion detection system module 1014 internally as an application or process run by the CPU 1004.


User Access Point


In an embodiment, the computing system 1000 comprises a computing system, a smartphone, a tablet computing device, a mobile device, a personal computer, a laptop computer, a portable computing device, a server, a computer workstation, a local area network of individual computers, an interactive kiosk, a personal digital assistant, an interactive wireless communications device, a handheld computer, an embedded computing device, or the like.


Other Systems


In addition to the systems that are illustrated in FIG. 10, the network 1018 may communicate with other data sources or other computing devices. The computing system 1000 may also comprise one or more internal and/or external data sources. In some embodiments, one or more of the data repositories and the data sources may be implemented using a relational database, such as DB2, Sybase, Oracle, CodeBase, or Microsoft® SQL Server, as well as other types of databases, such as, for example, a signal database, an object-oriented database, and/or a record-based database.


Although this invention has been disclosed in the context of certain embodiments and examples, it will be understood by those skilled in the art that the invention extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses of the invention and obvious modifications and equivalents thereof. In addition, while several variations of the embodiments of the invention have been shown and described in detail, other modifications, which are within the scope of this invention, will be readily apparent to those of skill in the art based upon this disclosure. It is also contemplated that various combinations or sub-combinations of the specific features and aspects of the embodiments may be made and still fall within the scope of the invention. It should be understood that various features and aspects of the disclosed embodiments can be combined with, or substituted for, one another in order to form varying modes of the embodiments of the disclosed invention. Any methods disclosed herein need not be performed in the order recited. Thus, it is intended that the scope of the invention herein disclosed should not be limited by the particular embodiments described above.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The headings used herein are for the convenience of the reader only and are not meant to limit the scope of the inventions or claims.


The ranges disclosed herein also encompass any and all overlap, sub-ranges, and combinations thereof. Language such as “up to,” “at least,” “greater than,” “less than,” “between,” and the like includes the number recited. Numbers preceded by a term such as “about” or “approximately” include the recited numbers. For example, “about 3 mm” includes “3 mm.”



Claims
  • 1. A computer-implemented method for determining false motion tracked by a motion correction system during a medical imaging scan, the method comprising: determining, by a false motion discriminator, a likelihood of motion tracking data of a subject during the medical imaging scan reflecting false motion, wherein the false motion does not correspond to true motion of the subject during the medical imaging scan; and causing, by the false motion discriminator, adjustment of an output of the motion correction system based at least in part on the determined likelihood of the motion tracking data of the subject during the medical imaging scan reflecting false motion, wherein the false motion discriminator comprises a computer processor and an electronic storage medium.
  • 2. The computer-implemented method of claim 1, wherein the likelihood of the motion tracking data of the subject of the medical imaging scan reflecting false motion is based on one or more of velocity, acceleration, center of rotation, or axis of rotation of motion of the subject during the medical imaging scan.
  • 3. The computer-implemented method of claim 1, wherein the likelihood of the motion tracking data of the subject of the medical imaging scan reflecting false motion is based at least in part on machine learning.
  • 4. The computer-implemented method of claim 1, further comprising generating a confidence level of the determined likelihood of the motion tracking data of the subject during the medical imaging scan reflecting false motion.
  • 5. The computer-implemented method of claim 1, wherein the motion tracking data of the subject of the medical imaging scan is obtained by using one or more markers attached to the subject.
  • 6. The computer-implemented method of claim 1, wherein the motion tracking data of the subject of the medical imaging scan is obtained without using a marker attached to the subject.
  • 7. A computer-implemented method for determining false motion tracked by a motion correction system during a medical imaging scan, the method comprising: determining, by a false motion discriminator, a center of rotation of a detected motion of a subject during the medical imaging scan; determining, by the false motion discriminator, a velocity of the detected motion of the subject during the medical imaging scan; and generating, by the false motion discriminator, an output indicative of the detected motion of the subject during the medical imaging scan being true or false based at least in part on the determined center of rotation of the detected motion and the determined velocity of the detected motion, wherein an output indicative of the detected motion of the subject during the medical imaging scan being false causes the motion correction system to adjust one or more motion correction processes to the detected motion.
  • 8. The computer-implemented method of claim 7, wherein the output indicative of the detected motion of the subject during the medical imaging scan being true or false is further generated based at least in part on machine learning.
  • 9. The computer-implemented method of claim 7, wherein the output indicative of the detected motion of the subject during the medical imaging scan being true or false is further generated based at least in part on relative motion of the subject during the medical imaging scan detected by two or more detectors.
  • 10. The computer-implemented method of claim 7, wherein the output indicative of the detected motion of the subject during the medical imaging scan being true or false is further generated based at least in part on one or more external confidence levels indicative of the detected motion being true or false.
  • 11. The computer-implemented method of claim 10, wherein the one or more external confidence levels are determined based on one or more of noise or video tracking during the medical imaging scan.
  • 12. The computer-implemented method of claim 7, wherein the detected motion of the subject during the medical imaging scan is detected using one or more markers attached to the subject.
  • 13. The computer-implemented method of claim 7, wherein the detected motion of the subject during the medical imaging scan is detected without using a marker attached to the subject.
  • 14. The computer-implemented method of claim 7, wherein the output is binary.
  • 15. A motion correction system for detecting and correcting motion by a subject during a medical imaging scan, the motion correction system comprising: one or more optical detectors configured to track motion of the subject during the medical imaging scan; and a tracking quality classifier for determining false motion of the tracked motion of the subject during the medical imaging scan, wherein the tracking quality classifier comprises a computer processor and an electronic storage medium, wherein determination by the tracking quality classifier that a portion of the tracked motion of the subject during the medical imaging scan corresponds to false motion causes the motion correction system to adjust motion correction output to account for the false motion.
  • 16. The motion correction system of claim 15, wherein the one or more detectors are configured to track motion of the subject during the medical imaging scan using one or more markers attached to the subject.
  • 17. The motion correction system of claim 15, wherein the one or more detectors are configured to track motion of the subject during the medical imaging scan without using a marker attached to the subject.
  • 18. The motion correction system of claim 15, wherein the tracking quality classifier is configured to determine false motion of the tracked motion of the subject based on at least one or more of velocity, rotational center, or machine learning.
  • 19. The motion correction system of claim 15, wherein the tracking quality classifier comprises one or more external discriminators.
  • 20. The motion correction system of claim 19, wherein the one or more external discriminators are configured to track one or more of noise or video during the medical imaging scan.
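
Purely as an illustration of the kind of test recited in claim 7 above, the Python sketch below classifies a detected motion as true or false from its center of rotation and velocity; the thresholds, the expected anatomical center, and all names are hypothetical assumptions, not values disclosed herein.

```python
import numpy as np

def classify_motion(positions: np.ndarray, times: np.ndarray,
                    center_of_rotation: np.ndarray,
                    expected_center: np.ndarray,
                    max_center_offset_mm: float = 30.0,
                    max_speed_mm_s: float = 100.0) -> bool:
    """Return True for plausibly genuine motion, False for likely false motion.

    positions: (N, 3) tracked marker positions in mm; times: (N,) seconds.
    A rotation centered far from the expected anatomical center (as when a
    marker slips on the skin) or an implausibly fast movement is flagged
    as false motion."""
    speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1) / np.diff(times)
    center_offset = np.linalg.norm(center_of_rotation - expected_center)
    return (center_offset <= max_center_offset_mm
            and float(speeds.max()) <= max_speed_mm_s)
```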
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/222,811, filed Jul. 28, 2016, and titled “SYSTEMS, DEVICES, AND METHODS FOR DETECTING FALSE MOVEMENTS FOR MOTION CORRECTION DURING A MEDICAL IMAGING SCAN,” which claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Application No. 62/198,079, filed Jul. 28, 2015, and titled “DETECTION OF FALSE MOVEMENTS FOR PROSPECTIVE MOTION CORRECTION FOR BIOMEDICAL IMAGING.” Each of the foregoing applications is hereby incorporated herein by reference in its entirety under 37 C.F.R. § 1.57.

STATEMENT REGARDING FEDERALLY SPONSORED R&D

This invention was made with government support under grant number R01DA021146-01A1 awarded by the National Institutes of Health. The government has certain rights in the invention.

Related Publications (1)
  • US 20190059779 A1, Feb 2019, US

Provisional Applications (1)
  • U.S. Provisional Application No. 62/198,079, filed Jul. 2015, US

Continuations (1)
  • Parent: U.S. application Ser. No. 15/222,811, filed Jul. 2016, US; Child: U.S. application Ser. No. 15/920,020, US