Motion tracking system for real time adaptive motion compensation in biomedical imaging

Information

  • Patent Grant
  • Patent Number
    10,653,381
  • Date Filed
    Wednesday, September 6, 2017
  • Date Issued
    Tuesday, May 19, 2020
Abstract
The disclosure herein provides methods, systems, and devices for tracking motion of a patient or object of interest during biomedical imaging and for compensating for that motion in the biomedical imaging scanner and/or the resulting images to reduce or eliminate motion artifacts. In an embodiment, a motion tracking system is configured to overlay tracking data over biomedical imaging data in order to display the tracking data along with its associated image data. In an embodiment, one or more detectors are configured to detect images of a patient, and a detector processing interface is configured to analyze the images to estimate motion or movement of the patient and to generate tracking data describing the patient's motion. The detector processing interface is configured to send the tracking data to a scanner controller to enable adjustment of scanning parameters in real-time in response to the patient's motion.
Description
BACKGROUND
Field

The disclosure relates generally to the field of biomedical imaging machines, and more specifically to a system for adaptive motion correction of medical imaging scans, such as magnetic resonance scans.


Description of the Related Art

“Tomographic” imaging techniques generate images of multiple slices of an object. Some commonly used tomographic imaging techniques include but are not limited to magnetic resonance imaging (MRI) and magnetic resonance spectroscopy (MRS) techniques, which are ideal for assessing the structure, physiology, chemistry and function of the living brain and other organs, in vivo and non-invasively. Because the object of interest is often imaged in many scanning steps in order to build a complete two- or three-dimensional view, scans are of long duration, usually lasting several minutes or more. To increase resolution (detail) of a tomographic scan, more slices and more scanning steps must be used, which further increases the duration of a scan. Scans may also be of long duration in order to obtain sufficient signal-to-noise ratio. Magnetic resonance techniques (including tomographic techniques) that are currently known or to be developed in the future (hereinafter collectively referred to as “MR” or “MRI”) can also afford relatively high spatial and temporal resolution, are non-invasive and repeatable, and may be performed in children and infants. However, due to their duration, MR scans can be subject to the problem of patient or object motion.


SUMMARY OF THE INVENTION

For purposes of this summary, certain aspects, advantages, and novel features of the invention are described herein. It is to be understood that not necessarily all such advantages may be achieved in accordance with any particular embodiment of the invention. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.


In an embodiment, a biomedical system for tracking motion of an object during biomedical imaging and for compensating for motion of the object comprises a biomedical imaging scanner configured to perform scanning of the object to generate biomedical images of the object; at least one detector for generating data describing at least one landmark of the object, wherein the at least one detector is configured to be positioned relative to the object to enable the at least one detector to detect movement of said landmark during the scanning; a detector processing interface configured to determine motion of the object based on analyzing said data received from the at least one detector, the detector processing interface configured to generate motion tracking data of the object; and a scanner controller for controlling at least one parameter of the biomedical imaging scanner, wherein the scanner controller is configured to adjust scanner parameters based on the motion tracking data, the scanner parameters configured for controlling the biomedical imaging scanner to account for motion of the object during the scanning of the object.


In an embodiment, the at least one detector is positioned within a bore of the biomedical imaging scanner. In an embodiment, the at least one detector only comprises components configured to not interfere with the biomedical imaging scanner. In an embodiment, the at least one landmark comprises a facial feature of the subject. In an embodiment, the facial feature comprises at least one tooth of the upper jawbone. In an embodiment, the landmark comprises an organ of the subject. In an embodiment, the at least one landmark comprises an image projected onto the subject. In an embodiment, the detector processing interface is configured to utilize an atlas-segmentation technique for identifying the at least one landmark of the object.


In an embodiment, the at least one detector is configured to generate data describing a first landmark and a second landmark of the object, wherein the detector processing interface is configured to utilize a first motion tracking technique to determine motion of the first landmark, and a second motion tracking technique to determine the motion of the second landmark, the detector processing interface configured to determine motion of the object based on analyzing the determined motion of the first landmark and the second landmark. In an embodiment, the detector processing interface is configured to apply a first weighting factor to the determined motion of the first landmark and apply a second weighting factor to the determined motion of the second landmark, wherein the first weighting factor is based on a historical accuracy of the first motion tracking technique and the second weighting factor is based on a historical accuracy of the second motion tracking technique.


In an embodiment, a computer-implemented method for tracking motion of an object during biomedical imaging by a scanner and for compensating for motion of the object comprises accessing, by a computer system, an image of the object; identifying, by the computer system, in the image a landmark of the object, the landmark being a feature naturally existing in the object; accessing, by the computer system, a plurality of images of the object; tracking, by the computer system, movement of the landmark in the plurality of images of the object; translating, by the computer system, the movement in a first reference plane to a second reference plane of the scanner; generating, by the computer system, data parameters based on the movement in the second reference plane, the data parameters configured to adjust the scanning parameters of the scanner to account for motion of the object; and transmitting, by the computer system, the data parameters to a scanner controller, the scanner controller configured to control the scanning parameters of the scanner.


In an embodiment, the image is from a video. In an embodiment, the accessing of the image of the object is from at least one detector that is positioned within a bore of the scanner. In an embodiment, the at least one detector only comprises components configured to not interfere with the scanner. In an embodiment, the landmark comprises a facial feature. In an embodiment, the facial feature comprises at least one tooth of the upper jawbone. In an embodiment, the landmark comprises an organ. In an embodiment, the identifying comprises utilizing an atlas-segmentation technique for identifying the landmark of the object.


In an embodiment, the computer-implemented method further comprises identifying, by the computer system, in the image a second landmark, the identifying of the landmark performed by utilizing a first motion tracking technique to determine motion of the landmark, and the identifying of the second landmark performed by utilizing a second motion tracking technique to determine the motion of the second landmark, the tracking comprises determining the movement of the landmark and the second landmark in the plurality of images of the object, wherein the movement is an average of the motion of the landmark and the motion of the second landmark. In an embodiment, the movement is determined by applying a first weighting factor to the determined motion of the landmark to generate a first weighted motion, and applying a second weighting factor to the determined motion of the second landmark to generate a second weighted motion, and averaging the first and second weighted motions, wherein the first weighting factor is based on a historical accuracy of the first motion tracking technique and the second weighting factor is based on a historical accuracy of the second motion tracking technique.
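By way of illustration, the weighted averaging described in the preceding embodiments can be sketched in a few lines of Python; the function name, the six-degree-of-freedom vector layout, and the accuracy scores below are illustrative assumptions rather than part of the claimed method:

```python
import numpy as np

def combine_landmark_motions(motion_a, motion_b, accuracy_a, accuracy_b):
    """Weighted average of two landmark motion estimates.

    Each motion is a 6-DOF vector (three translations, three rotations);
    the weights are derived from the historical accuracy of the first and
    second motion tracking techniques, as described above.
    """
    total = accuracy_a + accuracy_b
    weighted = (accuracy_a * np.asarray(motion_a, dtype=float) +
                accuracy_b * np.asarray(motion_b, dtype=float))
    return weighted / total

# Example: technique A (historically more accurate) reports 2 mm of x
# translation, technique B reports 1 mm; the average leans toward A.
estimate = combine_landmark_motions([2.0, 0, 0, 0, 0, 0],
                                    [1.0, 0, 0, 0, 0, 0],
                                    accuracy_a=0.9, accuracy_b=0.6)
```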





BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing and other features, aspects, and advantages of the present invention are described in detail below with reference to the drawings of various embodiments, which are intended to illustrate and not to limit the invention. The drawings comprise the following figures in which:



FIG. 1 is an embodiment of a schematic diagram illustrating a motion tracking system for a biomedical imaging machine.



FIG. 2 is a block diagram depicting an embodiment of a motion tracking system.



FIG. 3 is a block diagram depicting an embodiment of a motion tracking system.



FIG. 4 is a block diagram depicting an embodiment of a motion tracking system.



FIG. 5 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system.



FIG. 6 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system.



FIG. 7 depicts an embodiment of a process flow diagram illustrating an example of combining position estimates from more than one motion tracking controller or filter to produce a single or unitary position estimate.



FIG. 8 depicts an embodiment of a process flow diagram illustrating an example of estimating the tracking of a feature during an imaging scan.



FIG. 9 is an embodiment of a schematic diagram illustrating a motion tracking system.



FIG. 10 is a block diagram depicting an embodiment of a motion tracking system.



FIG. 11 is a block diagram depicting an embodiment of a motion tracking system.



FIG. 12 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system.



FIG. 13 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system.



FIG. 14 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system.



FIG. 15 depicts another embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system.



FIG. 16 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system.



FIG. 17 is a block diagram depicting an embodiment of a motion tracking system.



FIG. 18 illustrates an embodiment of a scanner image combined with a tracking data overlay and a pictorial tracking overlay.



FIG. 19 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system.



FIG. 20 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system.



FIG. 21 illustrates an embodiment of a tracked motion display.



FIG. 22A illustrates an embodiment of a tracked motion display.



FIG. 22B illustrates an embodiment of a tracked motion display.



FIG. 22C illustrates an embodiment of a tracked motion display.



FIG. 22D illustrates an embodiment of a tracked motion display.



FIG. 23A illustrates an embodiment of a tracked motion display.



FIG. 23B illustrates an embodiment of a tracked motion display.



FIG. 23C illustrates an embodiment of a tracked motion display.



FIG. 24 is a schematic diagram illustrating a side view of the medical imaging scanner as a part of the motion compensation system.



FIG. 25 is another embodiment of a schematic diagram illustrating a front view of a medical imaging scanner as part of a motion compensation system.



FIG. 26 is a schematic diagram illustrating a side view of the medical imaging scanner as a part of the motion compensation system of FIG. 25.



FIG. 27 is another embodiment of a schematic diagram illustrating a front view of a medical imaging scanner as part of a motion compensation system.



FIG. 28 is a schematic diagram illustrating a side view of the medical imaging scanner as a part of the motion compensation system of FIG. 27.



FIG. 29 is another embodiment of a schematic diagram illustrating a front view of a medical imaging scanner as part of a motion compensation system.



FIG. 30 is another embodiment of a schematic diagram illustrating a side view of a medical imaging scanner as part of a motion compensation system.



FIG. 31 is another embodiment of a schematic diagram illustrating a side view of a medical imaging scanner as part of a motion compensation system.



FIG. 32 is another embodiment of a schematic diagram illustrating a side view of a medical imaging scanner as part of a motion compensation system.



FIG. 33 is another embodiment of a schematic diagram illustrating a front view of a medical imaging scanner as part of a motion compensation system.



FIG. 34 is a block diagram depicting an embodiment of a computer hardware system configured to run software for implementing one or more embodiments of the motion tracking systems described herein.





DETAILED DESCRIPTION OF THE EMBODIMENTS

Although several embodiments, examples, and illustrations are disclosed below, it will be understood by those of ordinary skill in the art that the invention described herein extends beyond the specifically disclosed embodiments, examples, and illustrations and includes other uses of the invention and obvious modifications and equivalents thereof. Embodiments of the invention are described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner simply because it is being used in conjunction with a detailed description of certain specific embodiments of the invention. In addition, embodiments of the invention can comprise several novel features and no single feature is solely responsible for its desirable attributes or is essential to practicing the inventions herein described.


The disclosure herein provides methods, systems, and devices for tracking motion of a patient or object of interest during biomedical imaging and for compensating for the patient motion by adjusting the imaging parameters of the biomedical imaging scanner and/or the resulting images to reduce or eliminate motion artifacts. In an embodiment, one or more detectors are configured to detect images of or signals reflected from or spatial information of a patient, and a detector processing interface is configured to analyze the images or signals or spatial information to estimate motion or movement of the patient and to generate tracking data describing the patient's motion. The detector processing interface is configured to send the tracking data to a scanner controller to enable adjustment of scanning parameters in real-time in response to the patient's motion.


In order to assess the structure, physiology, chemistry and function of the human brain or other organs, physicians may employ any number of tomographic medical imaging techniques. Some of the more commonly used tomographic imaging techniques include computerized tomography (CT), magnetic resonance imaging (MRI), magnetic resonance spectroscopy (MRS), positron emission tomography (PET), and single-photon emission computed tomography (SPECT). These techniques take a series of images that correspond to individual slices of the object of interest (for example, the brain), and use computer algorithms to align and assemble the slice images into three dimensional views. Because the object of interest is often imaged in many slices and scanning steps, the resulting scan time can be relatively long, typically lasting several minutes or longer.


Biomedical imaging techniques with long scan times can tend to be sensitive to subject motion, which can lead to image artifacts and/or loss of resolution. Due to the typical duration of a tomographic imaging scan, subject motion can become a significant obstacle to acquiring accurate or clear image data. Although subjects are typically instructed to remain still during a scan, remaining motionless is a near impossible task for many patients, especially infants, children, the elderly, animals, patients with movement disorders, and other patients who might be agitated or cannot control body movements due to, for example, disability, impairment, injury, severe sickness, anxiety, nervousness, drug use, or other disorder. Often, the resulting scans of such patients are obscured by significant motion artifacts, making adequate diagnosis and analysis difficult.


One method to reduce motion artifacts is to use physical restraints to prevent subject movement. Such restraints, however, can be difficult to employ due to both the limited space within the scanning volume of the tomographic imager and the uncomfortable nature of the restraints themselves.


Another method to reduce motion artifacts involves tracking and adapting to subject movement in real time (for example, “adaptive imaging” or “adaptive motion correction”). This approach involves tracking the position and rotation (together referred to as “pose”) of the object of interest in real time during a scan. The pose information is used to compensate for detected motion in subsequent data acquisitions. Although these techniques can have the benefit of being highly accurate, they can require periodic recalibration to maintain such accuracy. Additionally, some embodiments of motion tracking systems use one or more cameras to track the position of one or more markers attached to a subject or to an organ to be evaluated (such as the subject's head) to determine subject motion. However, the use of markers creates additional steps in the clinical workflow, which can be undesirable. Attachment of tracking markers may also not be accepted by certain subjects, such as young children, who may remove markers.


The systems, methods, and devices disclosed herein provide solutions to the foregoing problems as well as to other challenges related to biomedical imaging. Some embodiments disclosed herein provide systems for adaptive motion correction for biomedical imaging that do not require specialized removable markers for tracking (also referred to herein as “markerless” tracking or landmark tracking). In some embodiments, a motion tracking system includes a biomedical scanner, such as an MRI, CAT, PET, or other scanner, that uses tracking information from a markerless optical or non-optical tracking system to continuously adjust scanning parameters (such as scan planes, locations, and orientations) to result in biomedical images showing no or attenuated motion artifacts. In an embodiment, the tracking information is based on using detectors to track landmarks that are naturally existing on a subject, as opposed to attaching removable markers to a subject.


As used herein, the terms “landmark” and “feature”, when used in the context of describing a quality or characteristic of a subject or object, are interchangeable terms and are broad terms, and unless otherwise indicated the terms can include within their meaning, without limitation, features of the subject (for example, facial features including but not limited to indentations, protrusions, folds, curves, outlines, moles, skin pigmentations, or the like), projected images or other projections onto a subject, distances to a point or area of a subject, surfaces of a subject (for example, three-dimensional surface modeling), openings or orifices of a subject, bones or bone structures of a subject (for example, teeth or cheek bones, or the like), and hair features of a subject (for example, hairlines, eyebrows, or the like).


The term “detector” as used herein is a broad term, and unless otherwise indicated the term can include within its meaning, without limitation, a camera (either digital or analog, and either capable of capturing still images or movies) that can detect the visible spectrum or other portions of the electromagnetic spectrum, a proximity sensor, an ultrasonic sensor, a radar sensor, a laser-based sensor, or any other kind of detector. In embodiments where the detector is positioned within the bore of a medical imaging device, the term “detector” includes within its meaning a detector that is configured to not interfere, or that only comprises components that do not interfere, with the imaging capability of the medical imaging device; for example, such a detector does not generate electrical or magnetic interference that could cause artifacts in the images generated by the medical imaging device.


In an embodiment, the system can be configured to track subject motion using landmarks of a subject in a variety of ways. For example, the system can be configured for tracking different types of body organs or facial features or the like. For each type of body organ or other feature, the system can comprise an atlas or a normative database showing a typical shape of a particular body organ or feature. In an embodiment, the system can be configured to utilize the atlas in order to perform atlas-segmentation to identify an organ or feature within an image generated by a detector. Based on detection of the organ or feature, the system can be configured to track the movement of the organ or feature in subsequent images generated by the detector. In an embodiment, the system can be configured with a different detection algorithm and/or atlas for each type of body organ. For example, the system can be configured with one detection algorithm for the head and a different detection algorithm for the knee of the patient.
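As a rough sketch of the identification step, atlas-based landmark localization can be approximated with normalized template matching against a stored atlas patch; a clinical implementation would use full atlas registration rather than 2-D matching, and the names and threshold below are assumptions:

```python
import cv2

def locate_landmark(frame, atlas_patch, min_score=0.7):
    """Find the atlas patch (a normative image of the organ or feature)
    within a detector frame and return its top-left (x, y) location, or
    None when the best match is too weak to trust."""
    scores = cv2.matchTemplate(frame, atlas_patch, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    return best_loc if best_score >= min_score else None
```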


In another example, the system can be configured to identify one or more teeth of the upper jaw. The detection of one or more teeth of the upper jaw can be ideal for landmark-based motion tracking because the upper teeth are rigidly affixed to the skull of a patient. Any movement of the skull translates into direct movement of the upper teeth. In contrast, the teeth on the lower jawbone are subject to movement not only due to movement of the skull, but also due to movement of the lower jawbone. As disclosed above, the system can be configured to utilize atlas-segmentation techniques in order to locate and identify the upper teeth in an image generated by a detector. Based on detection of the upper teeth, the system can be configured to track the movement of the upper teeth in subsequent images generated by the detector. In an embodiment, the system can be configured to utilize the motion tracking of the upper teeth to generate data instructions for transmission to the scanner in order to adjust the scanner parameters. By adjusting the scanner parameters, the system can be configured to account for patient movement during the scanning process in order to produce clearer or better images of the subject. In an embodiment, a mouth insert or mouth guard configured to expose the upper teeth can be inserted into a subject's mouth in order for the detector to generate images of the upper teeth during the scanning process. In an embodiment, the mouth insert or guard need not be customized for the subject's particular mouth. In an embodiment, the mouth insert or guard is a “one size fits all” mouth insert or guard that is configured to move the upper lip to an upward position in order to expose the upper teeth during the scanning process.


In an embodiment, the system can be configured to identify a characteristic of a subject. For example, the system can be configured to detect a distance to a particular point on a subject, or a surface texture of a subject, or an image that is projected onto the subject. Based on detecting the characteristic of a subject, the system can be configured to track the movement of the characteristic in subsequent images generated by the detector. In an embodiment, the system can be configured to track subject movement using a combination of any of the landmark tracking techniques disclosed above. Based on the tracked movements of the subject, the system can be configured to utilize the data in order to generate instructions for adjusting the parameters of a scanner in order to generate a better image.


In an embodiment, the detected motion that is determined by the system can be an estimated motion of the subject, because the system can only detect the position of the subject at the time the image of the subject was captured. Generally, subjects are continuously moving, and therefore a subject may have moved by the time an image generated by the detector has been analyzed.


In an embodiment, the system can be configured to estimate the accuracy of a detected motion. For example, the system can be configured to track the movements of an eyebrow of a subject. If the system detects the location of an eyebrow in a first image but cannot detect the location of the eyebrow in a second, subsequent image, then the system can be configured to discount the second image because any motion tracking data generated based on the first and second images is likely to be inaccurate. In an embodiment, the system can be configured to assume that the eyebrow was truncated in the second image, or that tracking of the eyebrow has been lost, and therefore that the second image is not a reliable image for determining or tracking motion.
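That discounting logic reduces to a simple guard in code; `detect_fn` here is a hypothetical landmark detector that returns None on failure:

```python
def track_frame(previous_position, new_frame, detect_fn):
    """Return the updated landmark position plus a reliability flag.

    When the landmark (e.g., an eyebrow) cannot be found in the new frame,
    the frame is discounted: the previous position is retained and no
    motion is derived from this frame, as described above.
    """
    position = detect_fn(new_frame)
    if position is None:
        return previous_position, False  # tracking lost; unreliable frame
    return position, True
```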


In an embodiment, a motion tracking system utilizes one or more detectors, such as cameras, to continuously record partial or full views of an object of interest. A detector processing interface continuously analyzes the patient movement data from the detectors to estimate motion of the object of interest. The detector processing interface can be configured to analyze and track motion using a variety of filters or techniques, either individually or in combination, including anatomical landmark tracking, three dimensional surface modeling, distance estimation, or other similar techniques.


In an embodiment, the detector processing interface can be configured to average the detected estimated motion that has been determined using the variety of techniques or filters. The detector processing interface can be configured to employ a weighted average in combining the detected estimated motion that has been determined using the variety of techniques or filters. In an embodiment, the detector processing interface can be configured to select the detected estimated motion values that are determined to be the most accurate. In an embodiment, accuracy can be determined by historical accuracy, or by whether a threshold change has been satisfied, or by the current size or contrast of an object, or by the like.


In an embodiment, a motion tracking system tracks object motion with respect to a motion tracking system reference or coordinate frame and then transforms the positional data into a biomedical imaging device reference or coordinate frame. The positional data in the reference frame of the biomedical imaging device is then used by the biomedical imaging device to update scanning parameters in real-time, resulting in images that show no or fewer motion artifacts and/or increased resolution.
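The frame-to-frame transformation is a standard rigid mapping; assuming a rotation R and translation t obtained from a cross-calibration between the tracking system and the scanner (calibration values are not specified in the disclosure), the conversion might look like:

```python
import numpy as np

def tracking_to_scanner_frame(position, R, t):
    """Map a position from the motion tracking system's coordinate frame
    into the biomedical imaging device's coordinate frame using a known
    calibration (3x3 rotation R, 3-vector translation t)."""
    return R @ np.asarray(position, dtype=float) + np.asarray(t, dtype=float)
```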


In some embodiments, the positional data in the reference frame of the biomedical imaging device is analyzed to determine an amount or magnitude of motion present or tracked. One of ordinary skill in the art will appreciate that the foregoing can be accomplished using any other possible reference frames in lieu of the reference frame of the biomedical imaging device. If the amount or magnitude of motion exceeds a predetermined threshold, then the positional data in the reference frame of the biomedical imaging device is used by the biomedical imaging device to update scanning parameters in real-time, resulting in images that show no or fewer motion artifacts and/or increased resolution.
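A minimal version of the threshold test, with illustrative tolerance values (the disclosure does not specify them):

```python
import numpy as np

def motion_exceeds_threshold(translation_mm, rotation_deg,
                             trans_tol=0.5, rot_tol=0.3):
    """Return True when the tracked motion is large enough that scanner
    parameters should be updated; below the threshold, the update is
    skipped and the most recent parameters are retained."""
    return (np.linalg.norm(translation_mm) > trans_tol or
            np.linalg.norm(rotation_deg) > rot_tol)
```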



FIG. 1 is an embodiment of a schematic diagram illustrating a motion tracking system 100. The motion tracking system 100 comprises one or more detectors 102, a detector processing interface 104, a scanner controller 106, and a scanner 108. In an embodiment, the one or more detectors 102 are positioned generally within an interior volume of the scanner 108 (one of ordinary skill in the art will appreciate that the one or more detectors can be positioned in other locations, for example, outside the volume of the scanner) and positioned to each have a different viewpoint from which to view the subject 110 or to detect information describing at least one feature or quality of the subject 110. For example, features or qualities of the subject 110 that may be detected by various detectors 102 include but are not limited to a visual image or depiction of the subject 110 or a portion of the subject 110, a distance of the subject 110 or a portion of the subject 110 to the detector 102, a surface texture of the subject 110 or a portion of the subject 110, an indentation or protrusion of the subject, an opening or orifice of the subject, a structural outline of the subject or a portion of the subject, or other anatomical landmark or feature of the subject. Various embodiments may be configured to employ various numbers of detectors 102, and the detectors 102 can be positioned places other than within an interior volume of a scanner, as long as the detectors 102 are positioned to enable viewing the subject 110 or detecting information describing at least one quality of the subject 110 (for example, “patient movement data”).


During an imaging scan, the detectors 102 are configured to acquire patient movement data and send the data to the detector processing interface 104. The detector processing interface 104 is configured to analyze the patient movement data using one or more tracking controllers or filters and to create tracking data describing movement or motion of the patient/object of interest in detector and/or scanner reference or coordinate frames. The tracking data is sent from the detector processing interface 104 to the scanner controller 106. The scanner controller 106 is configured to adjust the scanner 108 in real time based on patient/object of interest movement described in the tracking data to enable creation of scanned images with no or few motion artifacts. For example, the scanner controller 106 can be configured to adjust scan planes, locations, and/or orientations of the scanner 108 in real time.


In some embodiments, such as the motion tracking system 900 illustrated in FIG. 9, the tracking data generated by the detector processing interface 104 is used to compensate for motion during image reconstruction or post-processing, rather than to directly adjust the scanner 108. In some embodiments, tracking data is used to both compensate for motion during image reconstruction and to directly adjust the scanner 108.


Various embodiments of motion tracking systems can be configured to use various types of detectors. In some embodiments, the detectors 102 are all cameras, with each detector 102 being configured to continuously record a partial or full view of the object of interest, such as a subject's face in the case of tracking a patient's head. Recording the partial or full views from various detector vantage points can enable increased accuracy and/or redundancy of various tracking techniques. In some embodiments, the detectors 102 may be cameras, laser-based sensors, projection-based sensors, radar sensors, ultrasonic sensors, other remote sensors, or any combination thereof.


Referring to FIGS. 1 and 2, patient movement data (for example, images, distance measurements, or the like) from the one or more detectors 102 is sent to the detector processing interface 104, where one or more tracking controllers or filters analyze the data to estimate movement of the object of interest. Several possible tracking controllers or filters 202, as shown in FIG. 2, either in isolation or in combination, can be configured to track the object of interest. One embodiment of a tracking controller or filter 202, for example Tracking Controller 1 shown in FIG. 2, is configured to track the position and orientation of anatomical features or “landmarks” during subject movement, and uses this information to derive the object of interest's (for example, the subject's head) movement. For example, when tracking a subject's head, if the position of the subject's two eyes and the position of the tip of the subject's nose are known in detector coordinates, then the three translations and three rotations of the subject's head can be derived by means of triangulation or other methods. In general, accuracy of such a tracking controller or filter 202 can be improved by tracking a greater number of anatomical features. For example, if the position of a subject's nostrils and/or the bridge of the nose are tracked in addition to the nose tip and the eyes, then tracking of the subject's head can be generally more accurate. Tracking accuracy can also be improved by utilizing a greater number of detectors 102 and/or positioning the detectors 102 to view the subject's head from a variety of angles. Furthermore, in some embodiments, a single tracking controller or filter can be configured to provide data for less than all six degrees of freedom, i.e. less than three translations and three rotations, in which case information from one or more other tracking controllers or filters may be used to complete the tracking of all six degrees of freedom.
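One common way to derive the three translations and three rotations from a set of matched landmark positions (triangulation or other methods are equally contemplated above) is a least-squares rigid fit such as the Kabsch algorithm; the sketch below assumes 3-D landmark coordinates are already available in detector space:

```python
import numpy as np

def rigid_pose_from_landmarks(reference_pts, current_pts):
    """Least-squares rigid transform (Kabsch algorithm) from reference
    landmark positions (e.g., both eyes and the nose tip at baseline) to
    their current positions, yielding rotation R and translation t such
    that current ~= R @ reference + t."""
    P = np.asarray(reference_pts, dtype=float)
    Q = np.asarray(current_pts, dtype=float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t
```

Tracking more landmarks (nostrils, nose bridge, and so on) simply adds rows to `reference_pts` and `current_pts`, which is one way to picture the accuracy improvement noted above.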


Another embodiment of a tracking controller or filter 202, for example Tracking Controller 2 shown in FIG. 2, is configured to create a three-dimensional surface model of the object of interest (for example, a subject's head), and to calculate motion tracking information based on changes to the three-dimensional surface model as it is updated when the subject moves. A three-dimensional surface model tracking controller or filter can be configured to employ various types of detectors 102 and modeling methods. For example, the controller or filter is configured to create a surface model based on a surface texture of the object as detected by a detector or as detected by the scanner. In an embodiment, the controller or filter is configured to create a surface model based on changes in lighting and/or shading of the object of interest.


Some embodiments of tracking controllers or filters 202, for example Tracking Controller 3 shown in FIG. 2, are configured to use estimates of a distance of the object of interest (or a portion or portions of the object of interest) to one or more of the detectors 102. The position of the object of interest can then be estimated or derived by combining the distance estimates from multiple detectors 102 and/or by monitoring changes in the distance estimates from an individual detector 102. Some distance estimation controller embodiments are configured to utilize, for example, range imaging, stereo triangulation, interferometry, or the like.
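For the stereo-triangulation case, the distance estimate follows the classic relation depth = focal length × baseline / disparity; the sketch assumes calibrated detectors with a known focal length and baseline:

```python
def stereo_depth(focal_length_px, baseline_mm, disparity_px):
    """Estimate the distance to a feature seen by two detectors separated
    by a known baseline; a larger disparity between the two views means
    the feature is closer."""
    return focal_length_px * baseline_mm / disparity_px
```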


Other embodiments of tracking controllers or filters 202, for example Tracking Controller 4 shown in FIG. 2, are configured to track changes in a known pattern, for example, a regular grid, projected onto the object of interest. A projector projects one or more patterns onto the object of interest from one or more projection locations, and one or more detectors 102 detect images of the pattern projected onto the object of interest. The tracking controller or filter 202 is configured to analyze deformations and/or changes to the projection(s) as the subject 110 moves to derive an estimate of the object of interest's positioning.


Some embodiments of tracking controllers or filters 202 are configured to track light reflected from reflective and/or absorbent particles suspended or contained in a compound applied to a subject's skin. The compound can be, for example, a paste, a cream, a glue, a temporary tattoo, an ink, and the like. The compound can be painted, smeared, drawn, brushed, or otherwise applied to the subject's skin. The reflective particles can be configured to reflect light in different directions as the subject moves or rotates the skin area having the compound applied. For example, the reflective particles can be prisms that refract light in a known fashion, glitter particles, or the like. The absorbent particles can also be configured to absorb light in different directions as the subject moves or rotates the skin area having the compound applied. For example, the absorbent particles can be dark spheres that absorb light in a known fashion, or the like. This embodiment of a tracking controller or filter 202 is configured to analyze images detected by the detectors 102 to track light reflections and/or alterations from the various reflective and/or absorbent particles in order to determine movement of the object of interest. In some embodiments, the tracking controller or filter 202 is configured to track reflections and/or absorption of ambient light. In some embodiments, the tracking controller or filter 202 is configured to track reflections and/or absorptions of an auxiliary light source directed generally toward the reflective and/or absorbent particles.


In some embodiments, various embodiments of tracking controllers or filters 202 (including those described above and those using various other techniques) can be used either independently or in combination with other tracking controllers or filters, including markerless tracking controllers or filters, and modules utilizing markers for motion tracking. A tracking combination interface, such as the tracking combination interface 204 shown in FIG. 2, can be configured to receive position or movement estimates from a variety of tracking controllers or filters 202 and to either select one of the estimates to send to the scanner controller 106 or to combine one or more of the estimates to form a single or unitary, more accurate estimate to send to the scanner controller 106. In some embodiments, the position or movement estimates received by the tracking combination interface 204 each describe six degrees of freedom (for example, three translations and three rotations). In some embodiments, the position or movement estimates received by the tracking combination interface 204 each describe fewer than six degrees of freedom. In some embodiments, some of the position or movement estimates received by the tracking combination interface describe six degrees of freedom, while others describe fewer than six degrees of freedom. Tracking combination interface 204 can be configured to combine estimates from tracking controllers or filters 202, for example, as shown in FIG. 7 and described in greater detail below. In some embodiments, a tracking combination interface can be configured to send no motion updates to the scanner controller if the difference in motion or an amount or magnitude of tracked motion does not exceed a predetermined threshold.



FIG. 2 is a block diagram depicting an embodiment of a motion tracking system 200. The motion tracking system 200 comprises one or more detectors 102, a detector processing interface 104, a scanner controller 106, and a scanner 108. The detector processing interface further comprises several tracking controllers or filters 202 and a tracking combination interface 204. In the motion tracking system 200, the one or more detectors 102 send patient movement data (for example, camera images, distance estimates, signals, or the like) to the detector processing interface 104, and each of the several tracking controllers or filters 202 uses the patient movement data (or a portion of the patient movement data) to generate an estimate of movement of the patient/object of interest (for example, describing all six degrees of freedom or fewer than six degrees of freedom). The tracking combination interface 204 is configured to receive each tracking controller's individual estimate and to combine them (or to select one of them) to create tracking data comprising a single or unitary movement estimate to send to the scanner controller 106. The tracking combination interface 204 may also be configured to send no motion updates to the scanner controller 106, for example, to retain the most recent motion data, if the difference in motion or amount or magnitude of tracked motion does not exceed a predetermined threshold. The scanner controller 106 is configured to update one or more parameters of the scanner 108 in real time based on this tracking data received from the detector processing interface 104.
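A compact sketch of the select-or-combine behavior of the tracking combination interface 204 follows; the confidence scores and the selection rule are illustrative assumptions, not a prescribed algorithm:

```python
import numpy as np

def fuse_tracking_estimates(estimates, confidences, select_above=0.9):
    """Either select the single most confident controller's 6-DOF estimate
    outright, or form a confidence-weighted average of all estimates, as
    the tracking combination interface is described as doing."""
    est = np.asarray(estimates, dtype=float)
    conf = np.asarray(confidences, dtype=float)
    if conf.max() >= select_above:      # one clearly reliable controller
        return est[conf.argmax()]
    return np.average(est, axis=0, weights=conf)
```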


As described above, each of the tracking controllers or filters 202 of the motion tracking system 200 can be configured to track motion using a different technique (for example, anatomical landmark tracking, three-dimensional surface model tracking, distance tracking, or the like). In some embodiments, all or some of the tracking controllers or filters 202 can be configured to use the same technique, but with different configurations. For example, a detector processing interface 104 can comprise multiple tracking controllers or filters 202 utilizing anatomical landmark tracking, with each tracking controller or filter 202 being configured to track a different anatomical landmark or set of anatomical landmarks. Additionally, in some embodiments, tracking controllers or filters 202 can be configured to utilize more than one tracking technique. For example, a tracking controller or filter 202 can be configured to utilize both anatomical landmark tracking and three-dimensional surface model tracking, but to send one unitary tracking estimate based on a combination of both methods to the tracking combination interface 204 for combination with the estimates of other tracking controllers or filters 202.


The embodiment of a motion tracking system shown in FIG. 2 may be advantageous, because, in general, accuracy of a motion tracking system can be improved by tracking motion in a variety of ways (for example, utilizing a variety of tracking controllers or filters) and then combining the data derived from the various methods. Another advantage to using multiple tracking controllers or filters 202 (for example, equal to or greater than 2) is redundancy of data and measurements to improve the robustness of the tracking data. For example, when a patient is in certain positions, some tracking controllers or filters 202 may be able to produce more accurate estimates than others. Therefore, the most accurate tracking controller or controllers can be used at one time, and then a different controller or subset of controllers can be used at another time, to create the most accurate overall positioning estimates for a particular point in time or particular position of the subject at a particular point in time.


Redundancy in detectors 102 can also be advantageous. For example, some tracking controllers or filters 202 may only require one or two detectors 102, even though a tracking system, such as the tracking system shown in FIG. 1, has more than two detectors. However, in some cases, a patient's movement may block one or more detectors 102 from being able to view the object of interest. For example, if a patient turns his or her head to the left, a detector 102 on the patient's right may no longer be able to see, for example, the patient's left eye. In a system with redundant detectors 102, a tracking controller or filter 202 can be configured to, for example, use detectors 102 on the left side of a patient when the patient's head is turned to the left, but use detectors 102 on the right side when the patient's head is turned to the right.


Redundancy in detectors 102 and/or tracking controllers or filters 202 can enable, for example, the obstruction of an anatomical feature or landmark with respect to one detector 102 to not result in overall loss of tracking data, since other detectors 102 and/or tracking controllers or filters 202 can be configured to still have sufficient data to allow continued tracking.


Some embodiments of motion tracking systems utilize redundancy in tracking combination interfaces 204. For example, a detector processing interface 104 can comprise multiple tracking controllers or filters 202, with a first tracking combination interface 204 configured to combine the position/movement data from half of the tracking controllers or filters 202, and a second tracking combination interface 204 configured to combine the position/movement data from the other half of the tracking controllers or filters 202. A third tracking combination interface 204 is configured to combine the position/movement data from the first and second tracking combination interfaces 204. This configuration may be advantageous in various situations, for example, when the second half of the tracking controllers or filters 202 are known to produce only intermittently accurate data. The third tracking combination interface 204 can then be configured to only take data from the second tracking combination interface 204 into account when the second tracking combination interface 204 indicates its position/movement data is accurate. This configuration may also be advantageous to allow grouping of tracking controllers or filters 202 with similar features together. For example, one tracking combination interface 204 can be configured to combine the estimates of all visual image-based tracking controllers or filters 202, while another tracking combination interface 204 can be configured to combine the estimates of tracking controllers or filters 202 using non-image-based tracking, such as distance-based tracking.



FIG. 3 is a block diagram depicting an embodiment of a motion tracking system 300. The motion tracking system 300 includes, among other features, an anatomy configuration module 302 configured to allow changes to be made to configurations of the various tracking controllers or filters 202 used in the detector processing interface 104. For example, the anatomy configuration module 302 can configure the tracking controllers or filters 202 based on the specific anatomical region of the subject being tracked. If, for example, a subject's brain is being scanned, a tracking controller or filter 202 utilizing anatomical landmark tracking can be configured to track the subject's eyes, nostrils, or the like. But if a subject's knee is being scanned, a tracking controller or filter 202 utilizing anatomical landmark tracking can be configured to track, for example, regions above and below the knee and the kneecap.


The anatomy configuration module 302 can be configured to adjust the configuration of the tracking controllers or filters 202 based on various factors, such as the anatomical region or organ being scanned, a patient's age or sex, or even to compensate for situations where certain anatomical features are not available to be viewed, such as after surgery (where, for instance, an eye or another part of the face may be covered).


In some embodiments, an operator of the motion tracking system 300 provides data to the anatomy configuration module 302 to enable it to configure the various tracking controllers or filters 202. For example, the operator can use a computer interface to indicate that the scanner 108 will be scanning a subject's head, knee, or the like. In some embodiments, the anatomy configuration module 302 is configured to detect the portion of a subject that is being scanned and to automatically configure the tracking controllers or filters 202 without requiring operator input. For example, the anatomy configuration module 302 can be configured to analyze data from the detectors 102 to automatically determine whether the detectors are viewing a subject's head, knee, or the like.
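The anatomy configuration module's behavior can be pictured as a lookup from scanned region to landmark set; the profiles and the `set_landmarks` method below are hypothetical stand-ins, since the disclosure does not prescribe specific landmark sets or a controller API:

```python
# Hypothetical region-to-landmark profiles, for illustration only.
ANATOMY_PROFILES = {
    "head": ["left_eye", "right_eye", "nose_tip", "nostrils"],
    "knee": ["above_knee", "below_knee", "kneecap"],
}

def configure_trackers(region, tracking_controllers):
    """Point each landmark-based tracking controller at the landmark set
    appropriate for the anatomical region being scanned."""
    landmarks = ANATOMY_PROFILES[region]
    for controller in tracking_controllers:
        controller.set_landmarks(landmarks)  # hypothetical controller API
```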



FIG. 4 is a block diagram depicting an embodiment of a motion tracking system 400. The motion tracking system 400 includes, among other features, one or more deformation detectors 404, an internal consistency controller 402, and a veto controller 406. During the tracking of motion of a patient during a scan, the deformation detectors 404 and internal consistency controller 402 are configured to monitor data from the detectors 102 and/or tracking controllers or filters 202 for certain conditions that may adversely affect tracking data. When one of these conditions occurs, the deformation detector 404 or internal consistency controller 402 is configured to notify the veto controller 406 of the condition. The veto controller 406 is configured to analyze the condition(s) and send a veto signal to the scanner controller 106 if it determines the tracking data is sufficiently untrustworthy. The scanner controller 106 can be configured to pause or suppress scanner acquisitions if the veto controller 406 indicates that the tracking data are temporarily unreliable. Alternatively, the scanner controller 106 can be configured to not compensate for movement using the tracking data when the veto controller 406 indicates that the tracking data are temporarily unreliable.


In some embodiments, the veto controller 406 is configured to receive and analyze data from the deformation detectors 404 and internal consistency controller 402 substantially simultaneously. The veto controller 406 is configured to combine these data and make a determination as to whether to send a veto signal to the scanner controller 106. The combination of the data may be based on a simple “winner takes all” approach (for example, if data from one deformation detector or internal consistency controller indicates unreliable tracking, the veto controller 406 sends the veto signal), or the combination may involve weighting of different probabilities of the various discrepancies encountered, a Bayesian probability approach, or other probability or statistical-based approaches.


In the embodiment shown in FIG. 4, the deformation detectors 404 and internal consistency controller 402 both notify the veto controller 406 of potentially untrustworthy tracking data being generated by the tracking controllers or filters 202. However, the deformation detectors 404 and internal consistency controller 402 perform this function in different ways. The deformation detectors 404 monitor data from the detectors 102 for conditions likely to cause untrustworthy or degraded tracking data. For example, when tracking a head/brain, when a patient in the scanner sneezes or is squinting, the patient's skin may deform locally, resulting in loss of tracking accuracy, because, while the patient's skin moved, the patient's brain likely did not move in synchronization with the skin movement. The deformation detectors 404 can be configured to analyze the data from the detectors 102, and to flag these or other conditions detrimental to accurate tracking, such as sudden appearance of skin folds or changes in the shape of anatomical features.


The internal consistency controller 402 is configured to monitor the data output by the various tracking controllers or filters 202 to detect discrepancies between the tracking controllers or filters 202. For example, the internal consistency controller 402 can be configured to compare position estimates from each tracking controller 202 and to send a signal to the veto controller or filter 406 when the differences in position estimates from different tracking controllers or filters 202 exceed a threshold level or an estimated maximum magnitude of error. The threshold level that, if exceeded, triggers a signal to the veto controller or filter 406, can be a predetermined value or a continuously modified value based on, for example, weighting of different probabilities of the various discrepancies encountered, a Bayesian probability approach, or other methods.
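The internal consistency check amounts to a pairwise comparison of controller outputs against a threshold; a minimal sketch, assuming a fixed threshold in millimeters:

```python
import numpy as np

def consistency_veto(controller_estimates, max_discrepancy_mm=1.0):
    """Return True (signal the veto controller) when any two tracking
    controllers' position estimates differ by more than the threshold."""
    est = [np.asarray(e, dtype=float) for e in controller_estimates]
    for i in range(len(est)):
        for j in range(i + 1, len(est)):
            if np.linalg.norm(est[i] - est[j]) > max_discrepancy_mm:
                return True
    return False
```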



FIG. 5 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system. At block 502 the process begins. At block 504 the system provides baseline data representing a patient position. For example, the detectors 102 as shown in the motion tracking system 100 of FIG. 1 acquire information about a subject, such as images of the subject, and send this data to the detector processing interface 104. The detector processing interface 104 is configured to analyze this data and determine a baseline positioning of the patient or the object of interest. At block 506 a scanner, such as the scanner 108 of the motion tracking system 100, begins an imaging scan of the patient. For example, an MRI scanner begins a magnetic resonance imaging scan of the patient.


At block 508 the detectors acquire new patient movement data. For example, the detectors acquire new images, camera frames, distance estimates, or the like of the patient or the object of interest. At block 510 the system analyzes the new patient movement data to estimate a new patient positioning. For example, the data from the detectors 102 is analyzed by each of the tracking controllers or filters 202 as described above, and each tracking controller 202 generates an estimate of the new patient position. The estimates from the various tracking controllers or filters 202 are then fed into the tracking combination interface 204. The tracking combination interface 204 combines the various estimates from the tracking controllers or filters 202 and generates a single estimate to send to the scanner controller 106. At block 512 the tracking combination interface generates tracking data containing the single estimate derived from the various estimates from the tracking controllers or filters 202. At block 514 the scanner controller utilizes the tracking data from the tracking combination interface to adjust the scanner to compensate for patient movement. For example, the scanner controller 106 adjusts in real time scan planes, locations, or orientations of the scanner.


At block 516 the process varies depending on whether the imaging scan is complete. If the imaging scan is not complete, the process returns to block 508 and acquires new patient movement data from the detectors. This process continues throughout the imaging scan to continuously adjust the scanner based on patient motion. When the imaging scan is complete, the process moves from block 516 to the end of the process at block 518.
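The FIG. 5 loop can be summarized in a few lines; every object and method name here is an illustrative stand-in for the components described above, not an API defined by the disclosure:

```python
def run_adaptive_scan(detectors, detector_interface, scanner_controller,
                      scanner):
    """Acquire baseline data, then repeatedly estimate patient motion and
    adjust the scanner in real time until the imaging scan completes."""
    baseline = detector_interface.baseline([d.acquire() for d in detectors])
    scanner.start_scan()                                       # block 506
    while not scanner.scan_complete():                         # block 516
        frames = [d.acquire() for d in detectors]              # block 508
        tracking = detector_interface.estimate(frames, baseline)  # 510-512
        scanner_controller.adjust(tracking)                    # block 514
```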



FIG. 6 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system. In this embodiment, the process includes blocks that analyze the quality of the tracking information and potentially notify the scanner controller of inconsistencies in the tracking information. The process begins at block 602. At block 604 baseline data representing the patient position is provided. For example, detectors, such as the detectors 102 shown in the motion tracking system 100, detect patient data and send that data to a detector processing interface, such as the detector processing interface 104 shown in the motion tracking system 100. The detector processing interface analyzes the data from the detectors and determines a baseline positioning of the patient or object of interest as previously described.


At block 606 the scanner begins an imaging scan of the patient. At block 608 new patient movement data is acquired from the detectors. For example, the detectors acquire new images, distance estimates, or the like of the current patient position or orientation. At block 610 the new patient movement data is analyzed to estimate a new patient position. For example, the detector processing interface 104 shown in the motion tracking system 100 utilizes its tracking controllers or filters 202 and tracking combination interface 204 to generate an estimate of the new patient position, as described above. At block 612 the system analyzes the detector data and/or the new patient position data to determine a quality of the movement data. For example, multiple deformation detectors, such as the deformation detectors 404 shown in the motion tracking system 400, analyze the new patient data from the detectors 102 to determine if the object being tracked is experiencing, for example, skin deformations that may reduce the quality of the tracking data. Additionally, an internal consistency controller, such as the internal consistency controller 402 of the motion tracking system 400, analyzes the output of each tracking controller or filter to determine if, for example, outputs of the various tracking controllers or filters differ by more than a predetermined threshold amount.


At block 614 the system generates tracking data describing the estimated positioning of the patient or object of interest. The tracking data can be generated, for example, by using the tracking combination interface 204 shown in the motion tracking system 400. At block 616 the scanner controller uses the generated tracking data to adjust the scanner to compensate for patient movement. For example, the scanner controller instructs the scanner to adjust scan planes, locations, or orientations.


At block 618 the process determines, for example by using the veto controller 406, whether the tracking quality is sufficient. If the veto controller 406 determines that an output from the internal consistency controller 402 or one of the deformation detectors 404 indicates unreliable tracking data, the veto controller can send a veto signal indicating that the tracking quality is insufficient. At block 620, if the tracking quality is insufficient, the scanner 108 can be instructed to pause acquisition of scanner data and/or to acquire dummy scanner data, for example, by sending a veto signal from the veto controller 406 to the scanner controller 106. The process then moves back to block 608 and acquires new patient movement data, continuing the process as before. This process can continue until the tracking quality is determined to be sufficient. If the tracking quality is determined to be sufficient at block 618, the process moves to block 622. At block 622, the process varies depending on whether the imaging scan is complete. If the imaging scan is not complete, the process moves to block 624 and acquires new scanner data with the imaging scanner. The process then moves back to block 608 and acquires new patient movement data and continues the process as previously described. If at block 622 the scan is complete, the process moves to block 626 and ends. In an embodiment, the system can be configured to move to block 626 if the system fails to complete a scan, times out, or exceeds a certain number of pauses or dummy scans at block 618 or block 622.


In some embodiments, the order of blocks 616 and 618 is reversed. In these embodiments, the process determines whether the tracking quality is sufficient before it adjusts the scanner to compensate for patient movement.



FIG. 7 depicts an embodiment of a process flow diagram illustrating an example of combining position estimates from more than one motion tracking controller or filter to produce a single or unitary position estimate. This embodiment illustrates an example of how a motion tracking system can use multiple tracking controllers or filters, such as the tracking controllers or filters 202 shown in FIG. 2, to individually calculate an estimate of a patient position, and then combine the various estimates to develop a single or unitary estimate using a tracking combination interface, such as the tracking combination interface 204 shown in FIG. 2. At block 702 the process begins. At block 704, the system receives both new and old patient movement data, such as images, distance estimates, or the like from the detectors 102. The new and old patient movement data is received by the detector processing interface 104 and sent to the various tracking controllers or filters 202.


At blocks 710, 712, 714, 716, and 718 various tracking controllers or filters 202 estimate a new patient position using the new and old patient movement data received at block 704. For example, one tracking controller or filter 202 estimates a new patient position using anatomical landmark tracking, one tracking controller estimates a patient position using three dimensional surface modeling, another tracking controller estimates the new patient position using distance estimation, or the like, as described above. At blocks 720, 722, 724, 726, and 728 the various tracking controllers or filters provide a weighting factor for their respective position estimates. For example, a weighting factor may include an error estimate, a probability, a confidence level, or another measure related to accuracy. Each weighting factor indicates, at least in part, the weight that should be applied to the patient positioning estimate output by each tracking controller. For example, if one tracking controller 202 develops an estimate that it determines to be relatively accurate, that tracking controller's weighting factor may be 95 (on a scale of 1-100). If another tracking controller 202 develops an estimate that it determines to be relatively inaccurate or to have a relatively large margin of error, that tracking controller's weighting factor may be 20 (on the same scale of 1-100).


At block 730 the system estimates a single or unitary new patient position, for example, by using the tracking combination interface 204 to combine the estimates from each tracking controller 202. This process of combining estimates from the various tracking controllers or filters can take various forms. For example, the estimates can be combined using a simple average or a weighted average based on the weighting factors provided by each tracking controller 202. Another option is a winner-takes-all approach, in which the tracking combination interface simply selects the estimate from the tracking controller having the highest weighting factor. The tracking combination interface may also use other more complex approaches, such as Bayesian probability or other statistical approaches. In some embodiments, at block 730 the tracking combination interface 204 also considers prior patient position estimates in estimating the new patient position. For example, the tracking combination interface can use Kalman filtering or other prediction approaches. The process ends at block 732. In a complete motion tracking system, such as the motion tracking system 200 shown in FIG. 2, the process illustrated in FIG. 7 can be performed continuously throughout an imaging scan to continuously develop position estimates and adjust the scanner in real time.
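

For illustration only, the combination performed at block 730 could be sketched in Python as follows, assuming each estimate is a six-element rigid-body pose (three translations and three rotations) and each weighting factor lies on the 1-100 scale described above. Linear averaging of rotations is adequate only for small angles, and the Kalman filtering and Bayesian approaches mentioned above are omitted from this sketch.

    import numpy as np

    def combine_estimates(estimates, weights, mode="weighted_average"):
        # blocks 710-728: one pose estimate and one weighting factor
        # per tracking controller or filter 202
        estimates = np.asarray(estimates, dtype=float)
        weights = np.asarray(weights, dtype=float)
        if mode == "winner_takes_all":
            # select the estimate with the highest weighting factor
            return estimates[np.argmax(weights)]
        # block 730: weighted average across all estimates
        return np.average(estimates, axis=0, weights=weights)

Under this sketch, an estimate carrying a weighting factor of 95 dominates an estimate carrying a weighting factor of 20, consistent with the example given above.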



FIG. 8 depicts an embodiment of a process flow diagram illustrating an example of estimating tracking quality during an imaging scan. The process shown in FIG. 8 illustrates estimating deformation of an object of interest and estimating the internal consistency of tracking data, and then combining each of those estimates to estimate an overall tracking quality. The overall tracking quality is used to create a veto flag or signal as needed when tracking quality is insufficient, as described above. The process begins at block 802. At block 804 new and old patient movement data is received, for example, from detectors, such as the detectors 102 shown in the motion tracking system 400 of FIG. 4. The patient movement data may comprise, for example, images, distance estimates, or the like.


At blocks 810, 812, 814, 816, and 818 deformation of the subject or object of interest is estimated using various deformation filters, such as the deformation detectors 404 shown in FIG. 4. The various deformation filters can use different detection methods, such as anatomy shape analysis, three dimensional surface modeling, distance estimation, and/or skin folding analysis, as described above. At blocks 820, 822, 824, 826, and 828, each deformation detector provides a deformation factor representing at least partially the estimated accuracy of the estimate produced by each deformation detector. The deformation factors may include an absolute measure of deformation, a measure of nonlinear warping, an error estimate, a probability, a confidence level, or another measure related to the accuracy of the estimate of deformation of the object of interest.


At block 830 the internal consistency of the tracking data from the tracking controllers or filters is estimated. This function may be performed by, for example, an internal consistency controller, such as the internal consistency controller 402 shown in FIG. 4. The internal consistency controller 402, as described above, analyzes the positional data from the various tracking controllers or filters and determines if there are inconsistencies between the various controllers or filters that exceed a certain level.


At block 832 a controller, such as the veto controller 406 shown in FIG. 4, estimates the overall tracking quality of the motion tracking data. For example, the veto controller 406, as described above, combines the deformation detector data with the internal consistency controller data and determines whether the tracking data is of sufficient quality. At block 834, if the veto controller determines that the overall tracking quality is insufficient, the veto controller creates a veto flag or signal for a scanner controller or an image processing system, such as the scanner controller 106 shown in FIG. 4 or the image processing system 902 shown in FIG. 10. At block 836 the process is complete. In a complete motion tracking system, such as the motion tracking system 400 shown in FIG. 4, the process illustrated in FIG. 8 can be performed continuously throughout an imaging scan to continuously develop tracking quality estimates and inform the scanner controller or image processing system when tracking quality is insufficient.
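

As a non-limiting illustration, the decision made by the veto controller 406 at blocks 832 and 834 could take the following form in Python. The convention that each deformation factor is a value between 0 and 1 (larger meaning greater estimated deformation) and the fixed limit are assumptions of this sketch.

    def overall_quality_veto(deformation_factors, consistency_ok,
                             deformation_limit=0.5):
        # blocks 820-828: one deformation factor per deformation detector 404
        too_deformed = any(f > deformation_limit for f in deformation_factors)
        # block 830: consistency_ok is the internal consistency result
        # blocks 832-834: raise the veto flag when quality is insufficient
        return too_deformed or not consistency_ok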



FIG. 9 is an embodiment of a schematic diagram illustrating a motion tracking system 900. The motion tracking system 900 comprises detectors 102, a detector processing interface 104, an image processing system 902, a scanner image acquisition interface 904, and a scanner 108. In the motion tracking system 900, the detector processing interface analyzes patient movement data from the detectors 102 to estimate patient/object of interest movement during a scan. The detector processing interface 104 generates tracking data defining the estimates of the patient/object of interest's movement and sends the tracking data to an image processing system 902. In this embodiment, the motion tracking system 900 corrects for patient motion during image reconstruction or post-processing rather than adjusting a scanner in real time to compensate for patient movement. One advantage of the embodiment shown in FIG. 9 is that the motion tracking system 900 does not require a scanner with the ability to adjust imaging parameters, such as scan planes, locations, or orientations, in real time. In some embodiments, a motion tracking system includes features of both the motion tracking system 100 shown in FIG. 1 and the motion tracking system 900 shown in FIG. 9. For example, a motion tracking system can be configured to adjust some scanner parameters in real time while other parameters are compensated for during image reconstruction or post-processing.



FIG. 10 is a block diagram depicting an embodiment of a motion tracking system 1000. The motion tracking system 1000 comprises one or more detectors 102, a detector processing interface 104, an image processing system 902, a scanner image acquisition interface 904, and a scanner 108. The motion tracking system 1000 operates similarly to the motion tracking system 200 shown in FIG. 2. However, the motion tracking system 1000 sends tracking data from the detector processing interface 104 to an image processing system 902 instead of a scanner controller 106. The scanner image acquisition interface 904 receives images from the scanner 108 and sends the images to the image processing system 902. The image processing system 902 is configured to receive image data from the scanner image acquisition interface 904 and tracking data from the detector processing interface 104. The image processing system 902 is configured to adjust the image data based on the tracking data received from the detector processing interface 104 to compensate for patient movement.
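

A greatly simplified illustration of such a retrospective correction is sketched below in Python. It assumes the tracked motion has already been expressed as a rigid transform in voxel coordinates of a reconstructed volume; a practical magnetic resonance implementation would more typically apply the correction to the raw acquisition data during reconstruction.

    import numpy as np
    from scipy.ndimage import affine_transform

    def correct_volume(volume, rotation, translation):
        # rotation: 3x3 matrix; translation: 3-vector; both describe the
        # tracked patient motion in voxel coordinates.
        # affine_transform maps output coordinates to input coordinates,
        # so supplying the forward motion applies its inverse to the
        # image content, undoing the tracked movement.
        return affine_transform(volume, np.asarray(rotation),
                                offset=np.asarray(translation), order=1)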



FIG. 11 is a block diagram depicting an embodiment of a motion tracking system 1100. The motion tracking system 1100 is similar to the motion tracking system 400 shown in FIG. 4; however, the motion tracking system 1100 is configured to correct for patient movement during image reconstruction or post-processing rather than adjusting a scanner in real time due to patient movement. In the motion tracking system 1100, tracking data and/or a veto signal from the detector processing interface 104 are sent to an image processing system 902, instead of to a scanner controller 106. The image processing system 902 uses tracking data from the detector processing interface 104 to correct images received from the scanner image acquisition interface 904 for patient motion during image reconstruction or post-processing. The image processing system 902 can be configured to not adjust certain images for motion during image reconstruction when the image processing system 902 receives a veto signal from the detector processing interface 104. The veto controller 406 operates to generate the veto signal as described above in reference to various other embodiments.



FIG. 12 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system. The process shown in FIG. 12 can be implemented by, for example, the motion tracking system 900 shown in FIG. 9. At block 1202 the process begins. The system provides baseline data representing a patient's pre-scan position at block 1204. For example, detectors 102 detect information, such as images of a patient or object of interest, and send this information to a detector processing interface 104. The detector processing interface 104 uses various tracking controllers or filters 202 and a tracking combination interface 204, as described above, to then determine a baseline positioning of the patient or object of interest. At block 1206 the imaging scan of the patient or object of interest is begun.


At block 1208 new patient movement data, for example images, distance estimates, or the like, is acquired using the detectors 102. At block 1210 the new patient movement data is analyzed and compared to the baseline patient data to determine a new patient positioning estimate as described above. Block 1210 is performed by, for example, the detector processing interface 104 shown in FIG. 10. At block 1212 the system generates motion tracking data. The motion tracking data can be generated by, for example, the tracking combination interface 204 shown in FIG. 10, and describes the motion estimate generated by the tracking combination interface 204. At block 1214 scanner data is acquired. For example, the scanner 108 shown in FIG. 10 acquires scanner image data and sends the data to the scanner image acquisition interface 904.


At block 1216 the image processing system, such as the image processing system 902 shown in FIG. 10, utilizes the acquired scanner data and generated tracking data to modify scanner images to compensate for patient movement. At block 1218 the process varies depending on whether the imaging scan is complete. If the imaging scan is not complete, the process proceeds back to block 1208 and acquires new patient movement data from the detectors 102. The process then continues as described above. This process continues throughout the imaging scan to continuously modify the scanner images based on patient motion. If the imaging scan is complete at block 1218, the process proceeds to block 1220 and the process is complete.



FIG. 13 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system. The process illustrated in FIG. 13 can be performed by, for example, the motion tracking system 1100 shown in FIG. 11. The process begins at block 1302. At block 1304 baseline data representing a baseline patient position is provided. For example, the detectors 102 can detect images of the patient and send that data to the detector processing interface 104 for determination of a baseline patient position. At block 1306 an imaging scan of the patient is begun by the scanner 108.


At block 1308 new patient movement data (for example, images, distance estimates, or the like) is acquired from the detectors. At block 1310 the new patient movement data from the detectors is analyzed to estimate a new patient position. For example, the various tracking controllers or filters 202 analyze the data from the detectors 102 to develop estimates of the new patient positioning, as described above. The tracking combination interface 204 then combines the estimates from the various tracking controllers or filters to generate one unitary patient positioning estimate, as described above.


At block 1312 the system analyzes the detector data and/or the patient position data from the tracking controllers or filters to estimate a quality of the movement data. For example, as described above, the deformation detectors 404 and internal consistency controller 402 can analyze data from the detectors 102 and/or tracking controllers or filters 202 to estimate a level of quality. At block 1314 tracking data is generated. For example, the tracking combination interface 204 generates tracking data based on a combination of the various estimates from the tracking controllers or filters 202.


At block 1316 the process determines whether the tracking quality is sufficient. For example, the veto controller 406 analyzes the data from the internal consistency controller 402 and deformation detectors 404, as described above, to determine whether a certain level of quality has been met and therefore whether a veto signal should be generated and sent to, for example, the image processing system 902. If the tracking quality is not sufficient, at block 1318 the process pauses or holds scanner acquisition and/or instructs the scanner 108 to acquire dummy scanner data. The process then proceeds back to block 1308 and acquires new patient movement data from detectors. If the tracking quality is determined to be sufficient at block 1316, then the process proceeds to block 1320. At block 1320 the process varies depending on whether the scan is complete. If the scan is not complete, the process moves to block 1322 and scanner data is acquired by the scanner 108. At block 1324 the image processing system 902 utilizes the scanner data from the scanner 108 and scanner image acquisition interface 904 to adjust the image data to compensate for patient movement based on the tracking data received from the detector processing interface 104. The process then proceeds back to block 1308 and acquires new patient movement data from the detectors. If the scan is determined to be complete at block 1320, then the process proceeds to block 1326 and the process is complete.



FIG. 14 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system. The process embodied in FIG. 14 illustrates another example of tracking patient movement and analyzing the quality of the movement tracking data. If the movement tracking data is not of sufficient quality, then the process is configured to ignore scanner data. The process begins at block 1402. At block 1404 baseline data representing the patient's baseline position is provided. For example, the detectors 102 of the motion tracking system 1100 detect patient movement data and send that data to the detector processing interface 104. The detector processing interface 104 uses its tracking controllers or filters 202 and tracking combination interface 204 to determine the baseline patient positioning estimate.


At block 1406 the imaging scan of the patient is begun. At block 1408 new patient movement data from the detectors 102 is acquired. At block 1410, the detector processing interface 104 analyzes the new patient movement data to determine a new patient position estimate. The detector processing interface 104 determines the new patient positioning estimate using its tracking controllers or filters 202 and tracking combination interface 204 as described above. At block 1412 the detector processing interface 104 analyzes the detector data and/or the new patient position estimate data to determine a quality of the overall patient movement estimate data. For example, the detector processing interface 104 utilizes the internal consistency controller 402 and deformation detectors 404 to analyze a quality of the data from the detectors 102 and tracking controller 202, as described above.


At block 1414 tracking data is generated. The tracking data is generated by the tracking combination interface 204 to be sent to the image processing system 902, as described above. At block 1416 the scanner 108 acquires scanner data. At block 1418 the process varies depending on whether the tracking quality is sufficient. For example, the veto controller 406 determines whether the quality as estimated by the deformation detectors 404 and internal consistency controller 402 exceeds a certain quality level. If the tracking quality is not sufficient, the process moves to block 1420, wherein the image processing system 902 is instructed to ignore the relevant scanner data. The process then moves back to block 1408 and acquires new patient movement data from the detectors. The process repeats in this fashion until the tracking quality is found to be sufficient. When the tracking quality is found to be sufficient at block 1418, the process moves to block 1422. At block 1422 the process varies depending on whether the scan is complete. If the scan is not complete, the process moves to block 1424. At block 1424 the image processing system 902 utilizes the tracking data from the detector processing interface 104 to compensate for patient movement in the acquired images. The process then moves back to block 1408 and acquires new patient movement data from detectors. The process continues in this fashion until the imaging scan is complete. When the scan is complete at block 1422, the process moves to block 1426 and the process is complete.



FIG. 15 depicts another embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system. The process shown in FIG. 15 illustrates an example of tracking a jointed object, such as a human knee joint. The process shown in FIG. 15 can be performed by various motion tracking systems, for example the motion tracking system 200 shown in FIG. 2. The process begins at block 1502. At block 1504 baseline data representing the position of an object of interest is provided. For example, the object of interest may be a knee joint of a human being. At blocks 1506 and 1508 baseline data is provided representing positions of a first related object and a second related object. For example, at block 1506 baseline positional data representing a position of the patient's upper leg is provided. At block 1508, baseline positional data representing the position of the patient's lower leg is provided. The baseline positional data provided in blocks 1504, 1506, and 1508 can be provided as described above using detectors 102 and a detector processing interface 104.


At block 1510 an imaging scan of the patient/object of interest is begun. At block 1512 new patient movement data is acquired from the detectors, such as the detectors 102 shown in FIG. 2. At block 1514 the new patient movement data is analyzed to determine the new position of related object 1, such as the patient's upper leg. At block 1516 the new patient movement data is analyzed to estimate the new position of related object 2, such as the patient's lower leg. The new positions of related objects 1 and 2 can be determined as described above using, for example, the detector processing interface 104 shown in FIG. 2.


At block 1518 a new position of the object of interest is derived from the new positions of related objects 1 and 2. For example, a knee joint position or orientation can be derived from an estimated positioning of the patient's upper leg and lower leg. At block 1520, tracking data is generated to enable the scanner to track movement of the object of interest, such as the patient's knee joint. The tracking data can be generated by the detector processing interface 104 as described above.
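

One illustrative way to perform the derivation of block 1518 is sketched below in Python. The representation of each related object as a segment endpoint nearest the joint together with a unit direction vector along the segment is an assumption of this sketch.

    import numpy as np

    def derive_joint_pose(upper_end, upper_dir, lower_end, lower_dir):
        # blocks 1514-1516 supply the poses of related objects 1 and 2
        # (for example, the upper leg and the lower leg)
        joint_position = (np.asarray(upper_end) + np.asarray(lower_end)) / 2.0
        # flexion angle between the two segment directions, in degrees
        cos_angle = np.clip(np.dot(upper_dir, lower_dir), -1.0, 1.0)
        joint_angle = np.degrees(np.arccos(cos_angle))
        return joint_position, joint_angle  # block 1518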


At block 1522, a scanner controller, such as the scanner controller 106 shown in FIG. 2, utilizes the tracking data to adjust the scanner in real time to compensate for movement of the object of interest. At block 1524 the process varies depending on whether the imaging scan is complete. If the imaging scan is not complete, the process goes back to block 1512 and acquires new patient movement data from the detectors. The process continues in this fashion until the imaging scan is complete. If the imaging scan is complete, the process moves to block 1526, and the process is complete.



FIG. 16 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system. The process shown in FIG. 16 can be used, for example, to increase the accuracy of tracking an object of interest by additionally tracking related objects and combining estimates based on directly tracking the object of interest with estimates derived from tracking the related objects. The process begins at block 1602. At block 1604 baseline data representing the position of the object of interest is provided. For example, when tracking a knee joint of a human being, baseline positional information of the knee joint is provided. The baseline positional information can be provided by, for example, utilizing the motion tracking system 200 shown in FIG. 2. At blocks 1606 and 1608 baseline data representing position estimates of two related objects is provided. At block 1606 an estimate of a position of a first related object, such as the patient's upper leg, is provided. At block 1608 a position estimate of a second related object, such as the patient's lower leg, is provided.


At block 1610 the imaging scan of the patient is begun. At block 1612 new patient movement data is acquired from the detectors 102. At block 1614 the new patient movement data is analyzed to estimate a new position of related object 1. For example, the detector processing interface 104 shown in FIG. 2 is used as described above to estimate the new position of the patient's upper leg. At block 1616 the new patient movement data is analyzed to estimate the new position of related object 2. For example, the detector processing interface 104 is used to estimate a position of the patient's lower leg. At block 1618 the patient movement data is analyzed to determine a first estimate of the new position of the object of interest. For example, the detector processing interface 104 is used as described above to estimate the new position of the patient's knee joint.


At block 1620 a confidence level is provided for the first estimate of the position of the object of interest. The confidence level can be a weighting factor, a probability, or another measure related to accuracy. The confidence level can be an indication of how accurately the detector processing interface has estimated the new position of the object of interest.


At block 1622 a second estimate of the new position of the object of interest is calculated by deriving the estimate from the new position estimates of related objects 1 and 2. For example, when tracking a knee joint, an estimate of the position or orientation of the knee joint can be derived from estimates of the patient's upper leg and lower leg positioning. At block 1624 the system provides a confidence level for the second estimate of the object of interest's position. The confidence level can be an error estimate, a probability, or other measure related to accuracy.


At block 1626 a third estimate of the new positioning of the object of interest is calculated by combining the first and second estimates. In some embodiments, the first and second estimates are combined with a simple average or a weighted average, weighting each estimate based on its relative confidence level. In other embodiments, the first estimate and second estimate are combined in a winner-takes-all approach. For example, the estimate with the highest relative confidence level may be used and the other discarded. In other examples, the first estimate and second estimate can be combined using Bayesian probability or other statistical approaches. At block 1628 the system generates tracking data based on a differential between the third estimate of the patient's new positioning and the old or prior positioning estimate of the object of interest. This tracking data can be generated, for example, by the tracking combination interface 204 as described above.
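

For illustration only, the combination of block 1626 could be sketched in Python as follows, assuming the confidence levels are non-negative numbers on a common scale; the Bayesian and other statistical approaches mentioned above are omitted from this sketch.

    import numpy as np

    def combine_first_and_second(direct_est, direct_conf,
                                 derived_est, derived_conf,
                                 winner_takes_all=False):
        direct_est = np.asarray(direct_est, dtype=float)
        derived_est = np.asarray(derived_est, dtype=float)
        if winner_takes_all:
            # use the estimate with the higher confidence; discard the other
            return direct_est if direct_conf >= derived_conf else derived_est
        # weighted average based on the relative confidence levels
        total = direct_conf + derived_conf
        return (direct_conf * direct_est + derived_conf * derived_est) / total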


At block 1630 the scanner controller utilizes the tracking data to adjust the scanner to compensate for movement of the patient or object of interest. At block 1632 the process varies depending on whether the imaging scan is complete. If the imaging scan is not complete, the process goes back to block 1612 and acquires new patient movement data from the detectors. The process continues in this fashion until the imaging scan is complete. When the imaging scan is complete at block 1632, the process proceeds to block 1634 and is complete.


In some embodiments of motion tracking systems, the motion tracking system is configured to associate subject motion or movement tracking data with image data acquired from a scanner and to display the tracking data along with the associated image data by, for example, overlaying the tracking data over the image data. For example, FIG. 18 illustrates an embodiment of a scanner image 1802 combined with a tracking data overlay 1804 and a pictorial tracking overlay 1806. The scanner image 1802 is, for example, a magnetic resonance image acquired using a magnetic resonance scanner, such as the scanner 108 shown in FIG. 17. While the scanner image 1802 shown in FIG. 18 depicts an entire human body, the scanner image 1802 can be an image of any object being scanned, for example, a human brain, a knee joint, or the like.


The tracking data overlay 1804 shown in FIG. 18 can be configured to display information related to motion of the subject or object of interest that occurred during the scan represented by the scanner image 1802 and that was tracked by a motion tracking system, such as the motion tracking system 1700 shown in FIG. 17. For example, the tracking data overlay 1804 can be configured to display a speed or velocity of tracked movement. The speed or velocity can be displayed in numerical form (for example, 10 mm/sec), or in pictorial form, for example, by displaying a horizontal bar having a relatively long length to represent a relatively fast speed or a relatively short length to represent a relatively slow speed, or by displaying a graph representing the temporal evolution of motion during the scan. The tracking data overlay 1804 can also be configured to display a magnitude of tracked movement. The magnitude can be displayed in numerical form (for example, 10 mm), or in pictorial form, for example, by displaying a horizontal bar having a relatively long length to represent a relatively large movement or a relatively short length to represent a relatively small movement.


The tracking data overlay 1804 can additionally be configured to display a direction of tracked movement. The direction can be displayed in numerical or pictorial form. For example, the direction can be depicted as numerical values representing the three translations and three rotations in the detector and/or scanner coordinate systems. In some embodiments, the direction can be depicted using a pictorial representation or representations of a rotated or translated coordinate system or of a motion path of the tracked subject (for example, using the motion indicators 2104 shown in FIGS. 21, 22A-22D, and 23A-23C).


In some embodiments, a pictorial representation can be configured to show a speed, magnitude, or direction of tracked motion, or any combination thereof. For example, an arrow, such as the motion indicator 2104 shown in FIG. 21, can be configured to display directions by the direction or directions the arrow or segments of the arrow are pointing, magnitude by lengths of the arrow segments, and/or velocity by a color or thickness of the arrow segments.


In some embodiments, the tracking data overlay 1804 can be configured to display absolute values, average values, median values, minimum values, maximum values, variance values, range values, and the like, or any combination thereof.


The tracking data overlay 1804 can also be configured to indicate whether or not motion compensation was applied to the scanner image 1802. For example, the tracking data overlay 1804 can be configured to display text, such as "Comp: ON" or "Comp: OFF" to indicate that motion compensation was or was not applied, respectively. The motion tracking system can alternatively be configured to display whether motion compensation was applied to the scanner image 1802 in various other ways. For example, a portion of the scanner image 1802, such as a border, a graphic, a bar, text, or the like, can be configured to be a different color depending on whether or not motion compensation was applied.
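

For illustration only, the textual content of such a tracking data overlay 1804 could be assembled as follows in Python. The particular fields and formats shown are assumptions of this sketch; the disclosure equally contemplates bars, graphs, and other pictorial forms.

    def format_tracking_overlay(speed_mm_per_s, magnitude_mm, compensated):
        # compose the text of a tracking data overlay 1804
        lines = [
            "Speed: {:.1f} mm/sec".format(speed_mm_per_s),
            "Magnitude: {:.1f} mm".format(magnitude_mm),
            "Comp: ON" if compensated else "Comp: OFF",
        ]
        return "\n".join(lines)

For example, format_tracking_overlay(10.0, 10.0, True) yields the three lines "Speed: 10.0 mm/sec", "Magnitude: 10.0 mm", and "Comp: ON".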


In some embodiments, a scanner image 1802 can be combined with multiple tracking data overlays 1804. For example, in a motion tracking system configured to adjust or update scanner parameters based on tracked motion more than one time during the creation of each scanner image 1802, the scanner image 1802 can be configured to display a separate tracking data overlay 1804 for each adjustment or update to the scanner parameters. Alternatively, the system can be configured to combine all adjustments or updates into one tracking data overlay 1804 by providing, for example, average values, median values, minimum values, maximum values, variance values, range values, or the like.


The pictorial tracking overlay 1806 shown in FIG. 18 can be configured to indicate pictorially the position of the subject or object of interest during the creation of the scanner image 1802. For example, the pictorial tracking overlay 1806 illustrates a human head turned slightly to the left. The positioning of the head shown in the pictorial tracking overlay 1806 can indicate, for example, the positioning of the subject's head at the beginning of the scan illustrated by the scanner image 1802, at the end of the scan, at the middle of the scan, or, for example, an average position of the subject's head during the scan.


In some embodiments, the pictorial tracking overlay 1806 can additionally or alternatively be configured to display motion that was tracked during the creation of scanner image 1802. For example, a series of semi-transparent depictions of a human head can be shown on top of one another but slightly translated or rotated with respect to each other to depict the tracked motion. In other examples, as illustrated in FIGS. 21, 22A-22D, and 23A-23C, various motion indicators 2104 can be configured to display tracked motion.


In some embodiments, a motion tracking system, such as the motion tracking system 1700 shown in FIG. 17, can be configured to display a video depiction of tracked motion. For example, the system can be configured to electronically display the scanner image 1802 with an animated pictorial tracking overlay 1806 showing the subject's tracked motion. If the system tracked a subject's head moving from right to left during creation of the scanner image 1802, then the pictorial tracking overlay 1806 can, for example, depict an animated head moving from right to left.


Although the pictorial tracking overlay 1806 illustrated in FIG. 18 shows a representation of a human head, in some embodiments the pictorial tracking overlay 1806 can alternatively include a representation of any other organ being scanned or even an arbitrary shape, cross, coordinate system axes depiction, or the like. In some embodiments, the pictorial tracking overlay 1806 can include a visual photographic image and/or video of the subject, for example, as acquired by one or more of the detectors 102.



FIG. 21 illustrates an embodiment of a tracked motion display 2100. The tracked motion display 2100 includes a subject representation 2102 and a motion indicator 2104. The subject representation 2102 can be, for example, a representation of a human head or any other object of interest being scanned. The motion indicator 2104 comprises an arrow with multiple segments indicating motion that was tracked during a scan. For example, in this embodiment, the motion indicator 2104 is displaying that the patient rotated his or her head generally up and to the left during a scan. The tracked motion display 2100 can be used as a pictorial tracking overlay 1806 as described above. The tracked motion display 2100 can alternatively be displayed on a separate electronic display or on a separate printout.



FIGS. 22A-22D illustrate various embodiments of tracked motion displays 2200. The tracked motion displays 2200 include a subject representation 2102, a motion indicator 2104, and a compensation indicator 2202. In some embodiments, the tracked motion displays 2200 represent individual frames of an animated video showing tracked motion and whether or not motion compensation was applied. In other embodiments, the tracked motion displays 2200 are static displays associated with specific scanner images and displayed along with their associated scanner images by, for example, being used as a pictorial tracking overlay 1806 as described above or being displayed on an electronic display while a user is viewing the scanned images.


The compensation indicators 2202 are configured to display whether or not motion compensation was applied to the scanner image or images associated with each tracked motion display 2200. For example, if compensation was not applied, the compensation indicator 2202 is configured to be colored red and to say “No Prospective Motion Correction.” If compensation was applied, the compensation indicator 2202 is configured to be colored green and to say “Prospective Motion Correction Enabled.” In other embodiments, the compensation indicators 2202 can be configured to display whether motion compensation was applied in various other ways. For example, the compensation indicators 2202 can be a colored border or background that changes colors depending on whether motion compensation was applied.


The motion indicator 2104 is configured to indicate motion of the patient or object of interest that was tracked during the scan. In some embodiments, the motion indicator 2104 is configured to only display motion tracked during creation of the scanned image associated with that tracked motion display 2200. In other embodiments, the motion indicator 2104 is configured to be cumulative. For example, in some embodiments, the motion indicator 2104 is configured to display motion tracked during creation of the scanned image associated with that tracked motion display 2200, but also to display motion tracked during prior scanned images. In some embodiments, the subject representation 2102 is also configured to display tracked motion. For example, in FIG. 22C, the subject representation 2102 is shown tilted to the right, indicating the patient had his or her head tilted to the right during the creation of the scanned image or images associated with that tracked motion display 2200.



FIGS. 23A-23C illustrate additional embodiments of tracked motion displays 2300. The tracked motion displays 2300 include a subject representation 2102, a motion indicator 2104, and a reference indicator 2304. The motion indicators 2104 comprise a representation of coordinate system axes configured to show all three translations and all three rotations of the object of interest through rotations and/or translations of the motion indicators 2104. The reference indicator 2304 is configured to show where the patient's head or other object of interest was located at the start of a scan. In some embodiments, as shown in FIG. 23B, the subject representation 2102 remains static, along with the reference indicator 2304, and only the motion indicator 2104 moves to display tracked motion. In other embodiments, as shown in FIG. 23C, both the subject representation 2102 and motion indicator 2104 move to display tracked motion. In some embodiments, the motion indicator 2104 and/or reference indicator 2304 are displayed using different colors to allow a user to more easily differentiate between them. For example, as shown in FIGS. 23A-23C, the motion indicator 2104 is shown using a red color and the reference indicator 2304 is shown using a blue color. In some embodiments, the indicators are illustrated using different line styles to allow a user to more easily differentiate between them. For example, as shown in FIGS. 23A-23C, the motion indicator 2104 is shown using solid lines and the reference indicator 2304 is shown using dashed lines. In various embodiments, motion indicators, such as those shown in FIGS. 21, 22A-22D, and 23A-23C, can be configured to be displayed using a different color than the subject representation to allow a user to more easily differentiate between the subject representation and motion indicator. For example, the subject representations in various figures are illustrated as black, while the motion indicators are illustrated as blue or red.



FIG. 17 is a block diagram depicting an embodiment of a motion tracking system 1700. The motion tracking system 1700 comprises one or more detectors 102, a detector processing interface 104, a scanner controller 106, a scanner 108, a scanner image acquisition interface 904, an image overlay interface 1702, and an image data database 1704. The detector processing interface 104 further comprises several tracking controllers or filters 202 and a tracking combination interface 204, as described above and illustrated in, for example, motion tracking system 200. The motion tracking system 1700 operates similarly to the motion tracking system 200 shown in FIG. 2, with the addition of the scanner image acquisition interface 904, image overlay interface 1702, and image data database 1704, as described below.


Although motion tracking system 1700 is illustrated using multiple tracking controllers or filters 202 utilizing both markerless tracking techniques (for example, anatomical landmark tracking, distance tracking, or the like) and marker-based tracking techniques, the concepts described herein relating to image overlay techniques can be applied to any motion tracking system, including, but not limited to, systems using markerless tracking controllers, tracking controllers utilizing markers, or any combination thereof. The image overlay techniques described herein can additionally be used with motion tracking systems that utilize only one method of tracking and therefore do not comprise a tracking combination interface 204.


In operation, the scanner controller 106 shown in FIG. 17 receives tracking data describing tracked motion of the object of interest from the detector processing interface 104. The scanner controller 106 optionally uses this tracking data to adjust one or more parameters of the scanner 108 to compensate for the tracked motion. The scanner controller 106 additionally sends the tracking data and an indicator of whether or not the scanner controller 106 adjusted the scanner 108 for the tracked motion to the image overlay interface 1702. The image overlay interface 1702 utilizes the tracking data and indicator from the scanner controller 106 to generate data representing, for example, the tracking data overlay 1804 and/or the pictorial tracking overlay 1806 shown in FIG. 18, as described above.


In some embodiments, the image overlay interface 1702 communicates with the scanner image acquisition interface 904 to apply one or more tracking overlays to the scanner images acquired by the scanner image acquisition interface 904. In some embodiments, the scanner image acquisition interface 904 sends acquired scanner images to the image data database 1704 for later retrieval and display. The image overlay interface 1702 can additionally be configured to send data representing, for example, the tracking data overlay 1804 and/or the pictorial tracking overlay 1806, to the image data database 1704 and to associate this overlay data with the acquired scanner image or images in the database to which it should be applied. Scanner images can be retrieved from the image data database 1704 along with the associated overlay data to be, for example, printed, displayed on an electronic display device, transmitted through a network for display at a remote terminal, or the like.
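

As a non-limiting illustration, the association between scanner images and overlay data in the image data database 1704 could be sketched with an embedded database as follows. The schema, table names, and choice of SQLite are assumptions of this sketch rather than features of the disclosure.

    import sqlite3

    def store_image_with_overlay(db, image_id, image_bytes, overlay_json):
        # scanner images and their tracking overlays are kept in separate
        # tables joined on the image identifier, so either can be
        # retrieved alone or together for display or printing
        db.execute("CREATE TABLE IF NOT EXISTS images "
                   "(id TEXT PRIMARY KEY, data BLOB)")
        db.execute("CREATE TABLE IF NOT EXISTS overlays "
                   "(image_id TEXT, overlay TEXT)")
        db.execute("INSERT INTO images VALUES (?, ?)", (image_id, image_bytes))
        db.execute("INSERT INTO overlays VALUES (?, ?)",
                   (image_id, overlay_json))
        db.commit()

A caller would first open a connection, for example db = sqlite3.connect("image_data.db"), and could later rejoin the two tables on the image identifier when the images are printed or displayed.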



FIG. 19 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system. This embodiment additionally illustrates an example of storing and/or overlaying tracking data for display along with acquired scanner images. At block 1902 the process begins. At block 1904 the system provides baseline data representing a patient position. For example, the detectors 102 as shown in the motion tracking system 1700 of FIG. 17 acquire information about a subject, such as images of the subject, and send this data to the detector processing interface 104. The detector processing interface 104 is configured to analyze this data and determine a baseline positioning of the patient or the object of interest. At block 1906 a scanner, such as the scanner 108 of the motion tracking system 1700, begins an imaging scan of the patient. For example, an MRI scanner begins a magnetic resonance imaging scan of the patient.


At block 1908 the detectors acquire new patient movement data. For example, the detectors acquire new images, camera frames, distance estimates, or the like of the patient or the object of interest. At block 1910 the system analyzes the new patient movement data to estimate a new patient positioning. For example, the data from the detectors 102 is analyzed by each of the tracking controllers or filters 202 as described above, and each tracking controller 202 generates an estimate of the new patient position. The estimates from the various tracking controllers or filters 202 are then fed into the tracking combination interface 204. The tracking combination interface 204 combines the various estimates from the tracking controllers or filters 202 and generates a single estimate to send to the scanner controller 106. At block 1912 the tracking combination interface generates tracking data containing the single estimate derived from the various estimates from the tracking controllers or filters 202. At block 1914 the scanner controller optionally utilizes the tracking data from the tracking combination interface to adjust the scanner to compensate for patient movement. For example, the scanner controller 106 adjusts the scanner's scan planes, locations, or orientations in real time. In some cases the scanner controller may not adjust the scanner, because, for example, a veto signal indicates the current tracking data is unreliable.


At block 1916, scanner data is acquired. For example, the scanner image acquisition interface 904 shown in FIG. 17 receives data from the scanner 108 representing an image of the subject or object of interest. At block 1918, tracking data associated with the acquired scanner data is stored in a database and associated with the scanner data and/or overlaid onto an image generated by the scanner image acquisition interface 904. For example, the image overlay interface 1702 shown in FIG. 17 associates tracking data received from the scanner controller 106 with the appropriate image data in the image data database 1704 and/or generates data describing a tracking overlay, as described above.


At block 1920 the process varies depending on whether the imaging scan is complete. If the imaging scan is not complete, the process returns to block 1908 and acquires new patient movement data from the detectors. This process continues throughout the imaging scan to continuously adjust the scanner based on patient motion and to store tracking data to be overlaid onto the resulting scanner images. When the imaging scan is complete, the process moves from block 1920 to the end of the process at block 1922.



FIG. 20 depicts an embodiment of a process flow diagram illustrating an example of tracking and compensating for motion in biomedical imaging using a motion tracking system. This embodiment additionally illustrates an example of storing and/or overlaying tracking data for display along with acquired scanner images. The process shown in FIG. 20 can be implemented by, for example, the motion tracking system 900 shown in FIG. 9. At block 2002 the process begins. The system provides baseline data representing a patient's pre-scan position at block 2004. For example, detectors 102 detect information, such as images of a patient or object of interest, and send this information to a detector processing interface 104. The detector processing interface 104 uses various tracking controllers or filters 202 and a tracking combination interface 204, as described above, to then determine a baseline positioning of the patient or object of interest. At block 2006 the imaging scan of the patient or object of interest is begun.


At block 2008 new patient movement data, for example images, distance estimates, or the like, is acquired using the detectors 102. At block 2010 the new patient movement data is analyzed and compared to the baseline patient data to determine a new patient positioning estimate as described above. Block 2010 is performed by, for example, the detector processing interface 104 shown in FIG. 10. At block 2012 the system generates motion tracking data. The motion tracking data can be generated by, for example, the tracking combination interface 204 shown in FIG. 10, and describes the motion estimate generated by the tracking combination interface 204. At block 2014 scanner data is acquired. For example, the scanner 108 shown in FIG. 10 acquires scanner image data and sends the data to the scanner image acquisition interface 904.


At block 2016 the image processing system, such as the image processing system 902 shown in FIG. 10, optionally utilizes the acquired scanner data and generated tracking data to modify scanner images to compensate for patient movement. The image processing system 902 may not modify the scanner images in some cases, because, for example, a veto signal indicates the tracking data is unreliable.


At block 2018, tracking data associated with the scanner images from the scanner image acquisition interface 904 is stored in a database and associated with the scanner images and/or overlaid onto the scanner images. For example, the image processing system 902 may further comprise an image overlay interface 1702 and/or image data database 1704, as shown in FIG. 17, to generate and/or store data representing tracking overlays associated with scanner images, such as the tracking data overlay 1804 and/or pictorial tracking overlay 1806 shown in FIG. 18.


At block 2020 the process varies depending on whether the imaging scan is complete. If the imaging scan is not complete, the process proceeds back to block 2008 and acquires new patient movement data from the detectors 102. The process then continues as described above. This process continues throughout the imaging scan to continuously modify the scanner images based on patient motion and to store tracking data to be overlaid onto the scanner images. If the imaging scan is complete at block 2020, the process proceeds to block 2022 and the process is complete.


Detector Positions


For any of the embodiments disclosed herein, one of ordinary skill in the art will appreciate that there can be a number of ways to position the detectors with respect to the medical imaging scanner. Disclosed below are several embodiments for positioning detectors with respect to the medical imaging scanner.


Any of the embodiments disclosed herein can be combined with the system illustrated in FIG. 24. FIG. 24 is a schematic diagram illustrating a side view of the medical imaging scanner 108 as part of the motion compensation system 2400. The motion compensation system 2400 is similar to the motion compensation system 100 illustrated in FIG. 1. However, the motion compensation system 100, as described above, illustrates three detectors 102. In the motion compensation system 2400, the detectors 2408 are positioned at a 90 degree angle 422 (also referred to as a scissor angle) to each other. The detectors 2408 of the motion compensation system 2400 are configured to view the landmark 110a along two different lines of sight 420. The motion compensation system 2400 illustrates that the detectors 2408 can be positioned in various ways, as long as each detector 2408 views the landmark 110a along a different line of sight. The angle 422 can vary and can be larger or smaller. In an embodiment, the angle 422 can be between 100 degrees and 70 degrees. In an embodiment, the angle 422 can be between 100 degrees and 20 degrees. In an embodiment, the angle 422 can be 30 degrees. For example, FIG. 33 illustrates a motion compensation system 490 similar to the motion compensation system 2400, except that the angle 422 is 30 degrees. In other embodiments, the angle can be various other angles, as long as the two lines of sight 420 are different.
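

The geometric benefit of two distinct lines of sight 420 can be illustrated, by way of example only, with the following Python sketch, which recovers the landmark position as the midpoint of the shortest segment between the two (generally skew) sight lines; the ray representation is an assumption of this sketch. For unit direction vectors, the denominator below equals the squared sine of the scissor angle 422, so larger angles generally yield a better-conditioned depth estimate.

    import numpy as np

    def triangulate_landmark(p1, d1, p2, d2):
        # p1, p2: detector positions; d1, d2: unit direction vectors of
        # the two lines of sight 420 toward the landmark 110a
        p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
        d1, d2 = np.asarray(d1, float), np.asarray(d2, float)
        w0 = p1 - p2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b  # sin^2 of the scissor angle for unit vectors
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
        # midpoint of the shortest segment between the two sight lines
        return (p1 + s * d1 + p2 + t * d2) / 2.0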


Any of the embodiments disclosed herein can be combined with the system illustrated in FIGS. 25 and 26. FIG. 25 is another embodiment of a schematic diagram illustrating a front view of a medical imaging scanner 108 as part of a motion compensation system 430. FIG. 26 is a schematic diagram illustrating a side view of the medical imaging scanner 108 as a part of the motion compensation system 430. The motion compensation system 430 is similar to the motion compensation system 100 illustrated in FIG. 1. However, the motion compensation system 430 further comprises a head cage or head coil 432 configured to be positioned around a patient's head. In certain medical imaging tasks, such as certain MRI head scans, a head cage 432 can be utilized and positioned around the patient's head. The head cage can make it more difficult for a detector 2408 to image the landmark 110a when the detectors 2408 are mounted to the bore of the scanner body 108. Accordingly, the motion compensation system 430 comprises two detectors 2408 mounted to the head cage instead of the scanner body. The detectors 2408 and the motion tracking system 102a are configured to operate as described above. The term head cage as utilized herein may be used to describe a device configured to help position the head of a patient during an MRI scan. The term head cage may also refer to a head coil device configured to wrap around a patient's head to perform MRI scanning functions.


Any of the embodiments disclosed herein can be combined with the system illustrated in FIGS. 27 and 28. FIG. 27 is another embodiment of a schematic diagram illustrating a front view of a medical imaging scanner 108 as part of a motion compensation system 440. FIG. 28 is a schematic diagram illustrating a side view of the medical imaging scanner 108 as a part of the motion compensation system 440. The motion compensation system 440 is similar to the motion compensation system 430 illustrated in FIGS. 25 and 26. However, in some cases, there can be limited space within the bore of a scanner 108 and/or the head cage 432. In those cases, it can be difficult to position detectors 2408 to have a direct line of sight between their lenses and the landmark 110a. Accordingly, the motion compensation system 440 comprises two detectors 2408 positioned flat against the head cage 432 with a line of sight 420 being through a mirror 442 to the landmark 110a. The mirrors 442 enable an indirect line of sight to make the system more compact but still enable viewing of the landmark 110a from along two different lines of sight 420. Although this embodiment illustrates the use of mirrors with detectors mounted to a head cage, various other embodiments may use mirrors and/or detectors attached to the scanner body, the head cage, or any other location, as long as the detectors can view the landmark 110a through the mirrors. In some embodiments, multiple mirrors are used to redirect the line of sight 420 multiple times. For example, a detector 2408 may be positioned outside of the scanner and have its line of sight pass through one or more mirrors positioned within the scanner to image the landmark 110a.


Although the motion compensation system 440 comprises mirrors to redirect the lines of sight, other methods of redirecting a line of sight may be used, alone or in combination with mirrors. For example, fiber optics or prisms may be used to redirect a line of sight and create a virtual scissor angle.
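

A mirror-folded line of sight can be modeled by reflecting the sight direction across the mirror plane; the resulting "virtual" direction is what effectively defines the scissor angle. The following is a minimal sketch assuming a single planar mirror with a known unit normal; the values shown are illustrative only.

```python
import numpy as np

def reflect_direction(d, n):
    """Reflect a line-of-sight direction d across a mirror with unit normal n.

    Implements the standard reflection formula d' = d - 2 (d . n) n, giving
    the virtual line of sight a detector effectively has after one bounce.
    """
    n = n / np.linalg.norm(n)
    return d - 2.0 * (d @ n) * n

# A detector looking along the bore axis (+z) with a mirror tilted 45 degrees
# has its line of sight redirected toward -y (e.g., toward the landmark):
d = np.array([0.0, 0.0, 1.0])
n = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0)  # 45-degree mirror normal
print(reflect_direction(d, n))  # -> [0. -1. 0.]
```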


Any of the embodiments disclosed herein can be combined with the system illustrated in FIG. 29. FIG. 29 is another embodiment of a schematic diagram illustrating a front view of a medical imaging scanner 108 as part of a motion compensation system 450. The motion compensation system 450 is similar to the motion compensation system 100 illustrated in FIG. 1. However, the motion compensation system 450 comprises two landmarks 110a. In this embodiment, each of the two landmarks 110a is directly in the line of sight of one of the detectors 2408. In other embodiments, multiple landmarks 110a may be selected in other ways. For example, multiple landmarks can be selected at various rigid or substantially rigid portions of the object being imaged. For example, as further described below, one landmark 110a can be a patient's top teeth, while one or more other landmarks can be selected from a patient's forehead.


Landmarks may also be selected from locations that are not rigid or substantially rigid. For example, a landmark may be selected from a patient's skin. In such an embodiment, due to skin movement or skin elasticity, the landmark may at times move relative to the object being scanned, which can introduce inaccuracies into a medical imaging scan. Accordingly, in some embodiments, a motion compensation system can be configured to differentiate between movements of the object being scanned, such as a patient's head, and skin movement, which may not correlate to actual movement of the object being scanned. In some embodiments, the system can be configured to compare the positioning of two or more landmarks relative to each other in order to differentiate between head movement and skin movement, as in the sketch below.
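

One plausible implementation of this comparison, sketched below, monitors pairwise distances between landmarks: on a rigid object the inter-landmark distances remain essentially constant under head movement, so a distance change beyond a tolerance suggests skin movement. The tolerance value, units, and array layout are assumptions made for illustration only.

```python
import numpy as np

def classify_motion(ref_positions, cur_positions, tol_mm=1.0):
    """Flag likely skin movement by checking rigidity of the landmark set.

    ref_positions, cur_positions: (N, 3) arrays of landmark coordinates at a
    reference time and at the current frame, in millimeters. If all pairwise
    inter-landmark distances are preserved within tol_mm, the motion is
    consistent with rigid object (e.g., head) movement; otherwise at least
    one landmark likely moved with the skin relative to the object.
    """
    ref = np.asarray(ref_positions, dtype=float)
    cur = np.asarray(cur_positions, dtype=float)
    n = len(ref)
    for i in range(n):
        for j in range(i + 1, n):
            d_ref = np.linalg.norm(ref[i] - ref[j])
            d_cur = np.linalg.norm(cur[i] - cur[j])
            if abs(d_ref - d_cur) > tol_mm:
                return "skin movement suspected"
    return "rigid object movement"

ref = np.array([[0, 0, 0], [80, 0, 0], [40, 60, 0]])  # mm
cur = ref + np.array([5.0, 2.0, 0.0])                 # pure translation
print(classify_motion(ref, cur))                      # rigid object movement
```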


Utilizing multiple landmarks 110a can have various benefits. For example, multiple landmarks may be used for redundancy, in case one or more landmarks is not currently visible to one or more detectors based on the object's current pose. Another advantage is that multiple landmarks can be analyzed simultaneously by the motion tracking system 102a to obtain multiple object pose estimates. Those multiple object pose estimates can then be combined to generate a single, more accurate estimate. For example, the multiple estimates can be averaged to produce an average estimate. In another example, a margin of error may be available for each estimate, and the estimates may be combined using a weighted average based on those margins of error (see the sketch below). In other embodiments, only the most accurate estimate is used and the other estimates are discarded.
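

The weighted combination described above might be implemented, for the translational part of a pose, as an inverse-error weighted average, as in the following sketch. This is one reading of the paragraph rather than the disclosed algorithm; rotational components would additionally require averaging on the rotation group (for example, via quaternions), which is omitted here.

```python
import numpy as np

def fuse_estimates(estimates, margins):
    """Combine translational pose estimates using error-based weights.

    estimates: (K, 3) array of position estimates, one per landmark.
    margins:   length-K margins of error; a smaller margin yields a larger
               weight (inverse-variance style). With equal margins this
               reduces to a plain average of the estimates.
    """
    est = np.asarray(estimates, dtype=float)
    w = 1.0 / np.square(np.asarray(margins, dtype=float))
    return (w[:, None] * est).sum(axis=0) / w.sum()

poses = np.array([[1.0, 0.2, 0.0],
                  [1.2, 0.1, 0.1],
                  [0.9, 0.3, -0.1]])
print(fuse_estimates(poses, margins=[0.5, 1.0, 2.0]))
```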


Any of the embodiments disclosed herein can be combined with the system illustrated in FIGS. 30-31. FIGS. 30-31 illustrate additional embodiments of motion compensation systems configured to use indirect lines of sight. Given that many medical imaging systems have limited space within the bore of the device, it can be advantageous to position detectors generally flat against the bore of the device or flush within the bore of the device. The embodiment of a motion compensation system 460 shown in FIG. 30 illustrates a system wherein two optical detectors 2408 are positioned flat against the bore of the medical imaging scanner 108. In this embodiment, the detectors 2408 are positioned facing each other along a longitudinal axis of the bore. Two mirrors 442 are positioned relatively close to the detectors to redirect their lines of sight 420 toward the landmark 110a. In this embodiment, the scissor angle 422 is significantly smaller than 90 degrees. However, in other embodiments, the detectors and/or mirrors may be positioned differently to increase or decrease the scissor angle 422.


The motion compensation system 470 illustrated in FIG. 31 is similar to the motion compensation system 460 illustrated in FIG. 30. However, the motion compensation system 470 comprises two detectors 2408 and two mirrors 442 mounted within the medical imaging scanner 108 such that they do not protrude into the bore of the scanner 108. The body of the scanner 108 can comprise openings to enable the lines of sight 420 to pass from the landmark 110a to the detectors 2408. In some embodiments, detectors may be positioned on a surface of the scanner bore, partially within the body of the scanner, fully within the body of the scanner, and/or the like. One determining factor of whether the detectors can be mounted within the scanner body, and of whether any of the detectors must protrude beyond the scanner body, is the size of the detectors relative to the space available within the scanner body. More space available within the scanner body and/or smaller detectors may enable more or all of the detectors to be positioned within the scanner body.


Any of the embodiments disclosed herein can be combined with the system illustrated in FIG. 32. FIG. 32 illustrates a motion compensation system 480. The motion compensation system 480 is similar to the motion compensation system 460 illustrated in FIG. 30. However, the motion compensation system 480 comprises a head cage 432, and the detectors 2408 and mirrors 442 are mounted opposite each other on opposite ends of the head cage 432, rather than being mounted to the bore of the scanner. In various embodiments, the detectors 2408 may be mounted in various positions, not necessarily facing each other. For example, both detectors 2408 may be positioned on the same side of the head cage 432. As can be seen in FIG. 32, each of the two detectors 2408 is configured to view the landmark 110a along a line of sight 420 at a different angle relative to the landmark 110a. The line of sight 420 on the left-hand side is at a shallower angle than the line of sight 420 on the right-hand side. In other embodiments, the positioning of the detectors, the landmark, and/or the mirrors may be adjusted to adjust the angles of each of the lines of sight relative to the landmark and/or to adjust the scissor angle.


Computing System



FIG. 34 is a block diagram depicting an embodiment of a computer hardware system configured to run software for implementing one or more embodiments of the motion tracking systems described herein.


In some embodiments, the computer clients and/or servers described above take the form of a computing system 3400 illustrated in FIG. 34, which is a block diagram of one embodiment of a computing system that is in communication with one or more computing systems 3417 and/or one or more data sources 3419 via one or more networks 3416. The computing system 3400 may be used to implement one or more of the systems and methods described herein. In addition, in one embodiment, the computing system 3400 may be configured to manage access to, or administer, a software application. While FIG. 34 illustrates one embodiment of a computing system 3400, it is recognized that the functionality provided for in the components and modules of the computing system 3400 may be combined into fewer components and modules or further separated into additional components and modules.


Detector Processing Interface


In one embodiment, the computing system 3400 comprises a detector processing interface 3406 that carries out the functions described herein with reference to tracking motion during a scan, including any one of the motion tracking techniques described above. In some embodiments, the computing system 3400 additionally comprises a scanner controller, an anatomy configuration module, an image processing system, a scanner image acquisition module, and/or an image overlay module that carry out the functions described herein with reference to tracking motion during a scan and/or storing or overlaying tracking data with associated scanner images. The detector processing interface 3406 and/or other modules may be executed on the computing system 3400 by a central processing unit 3402 discussed further below.
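

By way of illustration only, the following sketch outlines the per-frame data flow such a detector processing interface might carry out: capture detector images, estimate a pose, and forward the tracking data to the scanner controller. All class and method names here are hypothetical placeholders, not an API defined by this disclosure.

```python
from dataclasses import dataclass

@dataclass
class TrackingData:
    """Hypothetical six-degree-of-freedom motion estimate."""
    tx: float = 0.0  # translations, mm
    ty: float = 0.0
    tz: float = 0.0
    rx: float = 0.0  # rotations, degrees
    ry: float = 0.0
    rz: float = 0.0

class DetectorProcessingInterface:
    """Illustrative sketch of the per-frame data flow; not a disclosed API."""

    def __init__(self, detectors, pose_estimator, scanner_controller):
        self.detectors = detectors                    # objects with a capture() method
        self.pose_estimator = pose_estimator          # callable: images -> TrackingData
        self.scanner_controller = scanner_controller  # object with adjust_parameters()

    def process_frame(self):
        # One image per line of sight (e.g., the two detectors 2408 above).
        images = [det.capture() for det in self.detectors]
        tracking = self.pose_estimator(images)
        # Push tracking data so scan parameters can be adjusted in real time.
        self.scanner_controller.adjust_parameters(tracking)
        return tracking
```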


In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, COBOL, CICS, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


Computing System Components


In one embodiment, the computing system 3400 also comprises a mainframe computer suitable for controlling and/or communicating with large databases, performing high volume transaction processing, and generating reports from large databases. The computing system 3400 also comprises a central processing unit ("CPU") 3402, which may comprise a conventional microprocessor. The computing system 3400 further comprises a memory 3404, such as random access memory ("RAM") for temporary storage of information and/or a read only memory ("ROM") for permanent storage of information, and a mass storage device 3408, such as a hard drive, diskette, or optical media storage device. Typically, the modules of the computing system 3400 are connected to the computer using a standards-based bus system. In different embodiments, the standards-based bus system could include, for example, Peripheral Component Interconnect (PCI), Microchannel, SCSI, Industrial Standard Architecture (ISA), and Extended ISA (EISA) architectures.


The computing system 3400 comprises one or more commonly available input/output (I/O) devices and interfaces 3412, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O devices and interfaces 3412 comprise one or more display devices, such as a monitor, that allow the visual presentation of data to a user. More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia presentations, for example. In one or more embodiments, the I/O devices and interfaces 3412 comprise a microphone and/or motion sensor that allow a user to generate input to the computing system 3400 using sounds, voice, motion, gestures, or the like. In the embodiment of FIG. 34, the I/O devices and interfaces 3412 also provide a communications interface to various external devices. The computing system 3400 may also comprise one or more multimedia devices 3410, such as speakers, video cards, graphics accelerators, and microphones, for example.


Computing System Device/Operating System


The computing system 3400 may run on a variety of computing devices, such as, for example, a server, a Windows server, a Structured Query Language server, a Unix server, a personal computer, a mainframe computer, a laptop computer, a tablet computer, a cell phone, a smartphone, a personal digital assistant, a kiosk, an audio player, an e-reader device, and so forth. The computing system 3400 is generally controlled and coordinated by operating system software, such as z/OS, Windows 95, Windows 98, Windows NT, Windows 2000, Windows XP, Windows Vista, Windows 7, Windows 8, Linux, BSD, SunOS, Solaris, Android, iOS, BlackBerry OS, or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the computing system 3400 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface ("GUI"), among other things.


Network


In the embodiment of FIG. 34, the computing system 3400 is coupled to a network 3416, such as a LAN, WAN, or the Internet, for example, via a wired, wireless, or combination of wired and wireless, communication link 3414. The network 3416 communicates with various computing devices and/or other electronic devices via wired or wireless communication links. In the embodiment of FIG. 34, the network 3416 is communicating with one or more computing systems 3417 and/or one or more data sources 3419.


Access to the detector processing interface 3406 of the computer system 3400 by the computing systems 3417 and/or by the data sources 3419 may be through a web-enabled user access point, such as a personal computer, cellular phone, smartphone, laptop, tablet computer, e-reader device, audio player, or other device of the computing systems 3417 or data sources 3419 capable of connecting to the network 3416. Such a device may have a browser module implemented as a module that uses text, graphics, audio, video, and other media to present data and to allow interaction with data via the network 3416.


The browser module may be implemented as a combination of an all points addressable display such as a cathode-ray tube (CRT), a liquid crystal display (LCD), a plasma display, or other types and/or combinations of displays. In addition, the browser module may be implemented to communicate with input devices 3412 and may also comprise software with the appropriate interfaces which allow a user to access data through the use of stylized screen elements such as, for example, menus, windows, dialog boxes, toolbars, and controls (for example, radio buttons, check boxes, sliding scales, and so forth). Furthermore, the browser module may communicate with a set of input and output devices to receive signals from the user.


The input device(s) may comprise a keyboard, roller ball, pen and stylus, mouse, trackball, voice recognition system, or pre-designated switches or buttons. The output device(s) may comprise a speaker, a display screen, a printer, or a voice synthesizer. In addition, a touch screen may act as a hybrid input/output device. In another embodiment, a user may interact with the system more directly, such as through a system terminal connected to the computing system 3400, without communications over the Internet, a WAN, a LAN, or a similar network.


In some embodiments, the system 3400 may comprise a physical or logical connection established between a remote microprocessor and a mainframe host computer for the express purpose of uploading, downloading, or viewing interactive data and databases on-line in real time. The remote microprocessor may be operated by an entity operating the computer system 3400, including the client server systems or the main server system, and/or may be operated by one or more of the data sources 3419 and/or one or more of the computing systems 3417. In some embodiments, terminal emulation software may be used on the microprocessor for participating in the micro-mainframe link.


In some embodiments, computing systems 3417 that are internal to an entity operating the computer system 3400 may access the detector processing interface 3406 internally as an application or process run by the CPU 3402.


User Access Point


In an embodiment, a user access point or user interface comprises a personal computer, a laptop computer, a tablet computer, an e-reader device, a cellular phone, a smartphone, a GPS system, a Blackberry® device, a portable computing device, a server, a computer workstation, a local area network of individual computers, an interactive kiosk, a personal digital assistant, an interactive wireless communications device, a handheld computer, an embedded computing device, an audio player, or the like.


Other Systems


In addition to the systems that are illustrated in FIG. 34, the network 3416 may communicate with other data sources or other computing devices. The computing system 3400 may also comprise one or more internal and/or external data sources. In some embodiments, one or more of the data repositories and the data sources may be implemented using a relational database, such as DB2, Sybase, Oracle, CodeBase, or Microsoft® SQL Server, as well as other types of databases such as, for example, a flat-file database, an entity-relationship database, an object-oriented database, and/or a record-based database.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The headings used herein are for the convenience of the reader only and are not meant to limit the scope of the inventions or claims.


Although this invention has been disclosed in the context of certain preferred embodiments and examples, it will be understood by those skilled in the art that the present invention extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses of the invention and obvious modifications and equivalents thereof. Additionally, the skilled artisan will recognize that any of the above-described methods can be carried out using any appropriate apparatus. Further, the disclosure herein of any particular feature, aspect, method, property, characteristic, quality, attribute, element, or the like in connection with an embodiment can be used in all other embodiments set forth herein. For all of the embodiments described herein, the steps of the methods need not be performed sequentially. Thus, it is intended that the scope of the present invention herein disclosed should not be limited by the particular disclosed embodiments described above.

Claims
  • 1. A biomedical system for tracking motion of an object during biomedical imaging and for compensating for motion of the object, the biomedical system comprising: a biomedical imaging scanner configured to perform scanning of the object to generate biomedical images of the object; at least one detector for generating data describing at least a first landmark and a second landmark of the object; one or more computer readable storage media configured to store a plurality of computer executable instructions; and one or more hardware computer processors in communication with the one or more computer readable storage media and configured to execute the plurality of computer executable instructions in order to cause the biomedical system to: determine motion of the first landmark using a first motion tracking technique; determine motion of the second landmark using a second motion tracking technique; differentiate between object movement and skin movement based at least in part on the determined motion of the first landmark and the determined motion of the second landmark; generate motion tracking data of the object based at least in part on the object movement differentiated based at least in part on the determined motion of the first landmark and the determined motion of the second landmark; and control one or more scanner parameters of the biomedical imaging scanner based on the generated motion tracking data, the scanner parameters configured for controlling the biomedical imaging scanner to account for motion of the object during the scanning of the object.
  • 2. The biomedical system of claim 1, wherein the object movement and the skin movement are differentiated based at least in part on comparing the determined motion of the first landmark and the determined motion of the second landmark.
  • 3. The biomedical system of claim 1, wherein the first motion tracking technique and the second motion tracking technique are the same.
  • 4. The biomedical system of claim 1, wherein the first motion tracking technique and the second motion tracking technique are different.
  • 5. The biomedical system of claim 1, wherein the first and second landmarks comprise one or more of a facial feature of the subject, an organ of the subject, or an image projected onto the subject.
  • 6. The biomedical system of claim 1, wherein the biomedical system is further caused to utilize an atlas-segmentation technique for identifying the first landmark or the second landmark of the object.
  • 7. The biomedical system of claim 1, wherein the biomedical system is further caused to apply a first weighting factor to the determined motion of the first landmark and apply a second weighting factor to the determined motion of the second landmark, wherein the first weighting factor is based on a historical accuracy of the first motion tracking technique and the second weighting factor is based on a historical accuracy of the second motion tracking technique.
  • 8. The biomedical system of claim 1, wherein the biomedical system is further caused to perform calculations of a characteristic of the object.
  • 9. The biomedical system of claim 8, wherein the biomedical system is further caused to perform the calculations of the characteristic of the object by measuring distances of points on the object to the at least one detector.
  • 10. The biomedical system of claim 1, wherein the biomedical system is further caused to characterize different types of body organs and/or facial features of the object.
  • 11. The biomedical system of claim 1, wherein the biomedical system is further caused to measure at least the first and second landmarks of the object in coordinates of the at least one detector.
  • 12. The biomedical system of claim 1, wherein the biomedical imaging scanner comprises one or more of a magnetic resonance imaging (MRI) scanner, computerized tomography (CT) scanner, or positron emission tomography (PET) scanner.
  • 13. The biomedical system of claim 1, wherein the at least one detector is positioned within a bore of the biomedical imaging scanner.
  • 14. The biomedical system of claim 1, wherein the at least one detector is positioned in a head cage of the biomedical imaging scanner.
  • 15. The biomedical system of claim 1, wherein the at least one detector is positioned within a body of the biomedical imaging scanner.
  • 16. The biomedical system of claim 1, wherein the at least one detector is positioned flat against a bore of the biomedical imaging scanner.
  • 17. The biomedical system of claim 1, further comprising a first detector and a second detector, wherein the first detector and the second detector are positioned to view the first landmark at a different angle.
  • 18. The biomedical system of claim 17, wherein the first detector and the second detector are positioned at a 90 degree angle to each other.
  • 19. The biomedical system of claim 1, wherein each of the first motion tracking technique and the second motion tracking technique comprise one of anatomical landmark tracking, three-dimensional surface modeling tracking, or distance tracking.
  • 20. The biomedical system of claim 1, wherein the determined motion of the first landmark and the determined motion of the second landmark each comprise six degrees of freedom.
INCORPORATION BY REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/762,583, titled MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE MOTION COMPENSATION IN BIOMEDICAL IMAGING, filed on Jul. 22, 2015, which is a National Stage of International Application No. PCT/US2014/013546, titled MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE MOTION COMPENSATION IN BIOMEDICAL IMAGING, filed on Jan. 29, 2014, which claims the benefit of U.S. Provisional Patent Application No. 61/759,883, titled MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE MOTION COMPENSATION IN BIOMEDICAL IMAGING, filed on Feb. 1, 2013. Each of the foregoing applications is hereby incorporated herein by reference in its entirety.

STATEMENT REGARDING FEDERALLY SPONSORED R&D

This invention was made with government support under grant number R01DA021146-01A1 awarded by the National Institutes of Health. The government has certain rights in the invention.

US Referenced Citations (771)
Number Name Date Kind
3811213 Eaves May 1974 A
4689999 Shkedi Sep 1987 A
4724386 Haacke et al. Feb 1988 A
4894129 Leiponen et al. Jan 1990 A
4923295 Sireul et al. May 1990 A
4953554 Zerhouni et al. Sep 1990 A
4988886 Palum et al. Jan 1991 A
5075562 Greivenkamp et al. Dec 1991 A
5318026 Pelc Jun 1994 A
5515711 Hinkle May 1996 A
5545993 Taguchi et al. Aug 1996 A
5615677 Pelc et al. Apr 1997 A
5687725 Wendt Nov 1997 A
5728935 Czompo Mar 1998 A
5802202 Yamada et al. Sep 1998 A
5808376 Gordon et al. Sep 1998 A
5835223 Zawemer et al. Nov 1998 A
5877732 Ziarati Mar 1999 A
5886257 Gustafson et al. Mar 1999 A
5889505 Toyama Mar 1999 A
5891060 McGregor Apr 1999 A
5936722 Armstrong et al. Aug 1999 A
5936723 Schmidt et al. Aug 1999 A
5947900 Derbyshire et al. Sep 1999 A
5987349 Schulz Nov 1999 A
6016439 Acker Jan 2000 A
6031888 Ivan et al. Feb 2000 A
6044308 Huissoon Mar 2000 A
6057680 Foo et al. May 2000 A
6057685 Zhou May 2000 A
6061644 Leis May 2000 A
6088482 He Jul 2000 A
6144875 Schweikard et al. Nov 2000 A
6175756 Ferre Jan 2001 B1
6236737 Gregson et al. May 2001 B1
6246900 Cosman et al. Jun 2001 B1
6279579 Riaziat et al. Aug 2001 B1
6285902 Kienzle, III et al. Sep 2001 B1
6289235 Webber Sep 2001 B1
6292683 Gupta et al. Sep 2001 B1
6298262 Franck et al. Oct 2001 B1
6381485 Hunter et al. Apr 2002 B1
6384908 Schmidt et al. May 2002 B1
6390982 Bova et al. May 2002 B1
6402762 Hunter et al. Jun 2002 B2
6405072 Cosman Jun 2002 B1
6421551 Kuth et al. Jul 2002 B1
6467905 Stahl et al. Oct 2002 B1
6474159 Foxlin et al. Nov 2002 B1
6484131 Amoral-Moriya et al. Nov 2002 B1
6490475 Seeley et al. Dec 2002 B1
6501981 Schweikard et al. Dec 2002 B1
6587707 Nehrke et al. Jul 2003 B2
6621889 Mostafavi Sep 2003 B1
6650920 Schaldach et al. Nov 2003 B2
6662036 Cosman Dec 2003 B2
6687528 Gupta et al. Feb 2004 B2
6690965 Riaziat et al. Feb 2004 B1
6711431 Sarin et al. Mar 2004 B2
6731970 Schlossbauer et al. May 2004 B2
6758218 Anthony Jul 2004 B2
6771997 Schaffer Aug 2004 B2
6794869 Brittain Sep 2004 B2
6856827 Seeley et al. Feb 2005 B2
6856828 Cossette et al. Feb 2005 B2
6876198 Watanabe et al. Apr 2005 B2
6888924 Claus et al. May 2005 B2
6891374 Brittain May 2005 B2
6892089 Prince et al. May 2005 B1
6897655 Brittain et al. May 2005 B2
6913603 Knopp et al. Jul 2005 B2
6937696 Mostafavi Aug 2005 B1
6959266 Mostafavi Oct 2005 B1
6973202 Mostafavi Dec 2005 B2
6980679 Jeung et al. Dec 2005 B2
7007699 Martinelli et al. Mar 2006 B2
7024237 Bova et al. Apr 2006 B1
7107091 Jutras et al. Sep 2006 B2
7110805 Machida Sep 2006 B2
7123758 Jeung et al. Oct 2006 B2
7171257 Thomson Jan 2007 B2
7173426 Bulumulla et al. Feb 2007 B1
7176440 Cofer et al. Feb 2007 B2
7191100 Mostafavi Mar 2007 B2
7204254 Riaziat et al. Apr 2007 B2
7209777 Saranathan et al. Apr 2007 B2
7209977 Acharya et al. Apr 2007 B2
7260253 Rahn et al. Aug 2007 B2
7260426 Schweikard et al. Aug 2007 B2
7295007 Dold Nov 2007 B2
7313430 Urquhart et al. Dec 2007 B2
7327865 Fu et al. Feb 2008 B2
7348776 Aksoy et al. Mar 2008 B1
7403638 Jeung et al. Jul 2008 B2
7494277 Setala Feb 2009 B2
7498811 Macfarlane et al. Mar 2009 B2
7502413 Guillaume Mar 2009 B2
7505805 Kuroda Mar 2009 B2
7535411 Falco May 2009 B2
7551089 Sawyer Jun 2009 B2
7561909 Pai et al. Jul 2009 B1
7567697 Mostafavi Jul 2009 B2
7573269 Yao Aug 2009 B2
7602301 Stirling et al. Oct 2009 B1
7603155 Jensen Oct 2009 B2
7623623 Raanes et al. Nov 2009 B2
7657300 Hunter et al. Feb 2010 B2
7657301 Mate et al. Feb 2010 B2
7659521 Pedroni Feb 2010 B2
7660623 Hunter et al. Feb 2010 B2
7668288 Conwell et al. Feb 2010 B2
7689263 Fung et al. Mar 2010 B1
7702380 Dean Apr 2010 B1
7715604 Sun et al. May 2010 B2
7742077 Sablak et al. Jun 2010 B2
7742621 Hammoud et al. Jun 2010 B2
7742804 Faul et al. Jun 2010 B2
7744528 Wallace et al. Jun 2010 B2
7760908 Curtner et al. Jul 2010 B2
7766837 Pedrizzetti et al. Aug 2010 B2
7769430 Mostafavi Aug 2010 B2
7772569 Bewersdorf et al. Aug 2010 B2
7787011 Zhou et al. Aug 2010 B2
7787935 Dumoulin et al. Aug 2010 B2
7791808 French et al. Sep 2010 B2
7792249 Gertner et al. Sep 2010 B2
7796154 Senior et al. Sep 2010 B2
7798730 Westerweck Sep 2010 B2
7801330 Zhang et al. Sep 2010 B2
7805987 Smith Oct 2010 B1
7806604 Bazakos et al. Oct 2010 B2
7817046 Coveley et al. Oct 2010 B2
7817824 Liang et al. Oct 2010 B2
7819818 Ghajar Oct 2010 B2
7825660 Yui et al. Nov 2010 B2
7833221 Voegele Nov 2010 B2
7834846 Bell Nov 2010 B1
7835783 Aletras Nov 2010 B1
7839551 Lee et al. Nov 2010 B2
7840253 Tremblay et al. Nov 2010 B2
7844094 Jeung et al. Nov 2010 B2
7844320 Shahidi Nov 2010 B2
7850526 Zalewski et al. Dec 2010 B2
7860301 Se et al. Dec 2010 B2
7866818 Schroeder et al. Jan 2011 B2
7868282 Lee et al. Jan 2011 B2
7878652 Chen et al. Feb 2011 B2
7883415 Larsen et al. Feb 2011 B2
7889907 Engelbart et al. Feb 2011 B2
7894877 Lewin et al. Feb 2011 B2
7902825 Bammer et al. Mar 2011 B2
7907987 Dempsey Mar 2011 B2
7908060 Basson et al. Mar 2011 B2
7908233 Angell et al. Mar 2011 B2
7911207 Macfarlane et al. Mar 2011 B2
7912532 Schmidt et al. Mar 2011 B2
7920250 Robert et al. Apr 2011 B2
7920911 Hoshino et al. Apr 2011 B2
7925066 Ruohonen et al. Apr 2011 B2
7925549 Looney et al. Apr 2011 B2
7931370 Prat Bartomeu Apr 2011 B2
7944354 Kangas et al. May 2011 B2
7944454 Zhou et al. May 2011 B2
7945304 Feinberg May 2011 B2
7946921 Ofek et al. May 2011 B2
7962197 Rioux et al. Jun 2011 B2
7971999 Zinser Jul 2011 B2
7977942 White Jul 2011 B2
7978925 Souchard Jul 2011 B1
7988288 Donaldson Aug 2011 B2
7990365 Marvit et al. Aug 2011 B2
8005571 Sutherland et al. Aug 2011 B2
8009198 Alhadef Aug 2011 B2
8019170 Wang et al. Sep 2011 B2
8021231 Walker et al. Sep 2011 B2
8022982 Thorn Sep 2011 B2
8024026 Groszmann Sep 2011 B2
8031909 Se et al. Oct 2011 B2
8031933 Se et al. Oct 2011 B2
8036425 Hou Oct 2011 B2
8041077 Bell Oct 2011 B2
8041412 Glossop et al. Oct 2011 B2
8048002 Ghajar Nov 2011 B2
8049867 Bridges et al. Nov 2011 B2
8055020 Meuter et al. Nov 2011 B2
8055049 Stayman et al. Nov 2011 B2
8060185 Hunter et al. Nov 2011 B2
8063929 Kurtz et al. Nov 2011 B2
8073197 Xu et al. Dec 2011 B2
8077914 Kaplan Dec 2011 B1
8085302 Zhang et al. Dec 2011 B2
8086026 Schulz Dec 2011 B2
8086299 Adler et al. Dec 2011 B2
RE43147 Aviv Jan 2012 E
8094193 Peterson Jan 2012 B2
8095203 Wright et al. Jan 2012 B2
8095209 Flaherty Jan 2012 B2
8098889 Zhu et al. Jan 2012 B2
8113991 Kutliroff Feb 2012 B2
8116527 Sabol Feb 2012 B2
8121356 Friedman Feb 2012 B2
8121361 Ernst et al. Feb 2012 B2
8134597 Thorn Mar 2012 B2
8135201 Smith et al. Mar 2012 B2
8139029 Boillot Mar 2012 B2
8139896 Ahiska Mar 2012 B1
8144118 Hildreth Mar 2012 B2
8144148 El Dokor Mar 2012 B2
8150063 Chen Apr 2012 B2
8150498 Gielen et al. Apr 2012 B2
8160304 Rhoads Apr 2012 B2
8165844 Luinge et al. Apr 2012 B2
8167802 Baba et al. May 2012 B2
8172573 Sonenfeld et al. May 2012 B2
8175332 Herrington May 2012 B2
8179604 Prada Gomez et al. May 2012 B1
8180428 Kaiser et al. May 2012 B2
8180432 Sayeh May 2012 B2
8187097 Zhang May 2012 B1
8189869 Bell May 2012 B2
8189889 Pearlstein et al. May 2012 B2
8189926 Sharma May 2012 B2
8190233 Dempsey May 2012 B2
8191359 White et al. Jun 2012 B2
8194134 Furukawa Jun 2012 B2
8195084 Xiao Jun 2012 B2
8199983 Qureshi Jun 2012 B2
8206219 Shum Jun 2012 B2
8207967 El Dokor Jun 2012 B1
8208758 Wang Jun 2012 B2
8213693 Li Jul 2012 B1
8214012 Zuccolotto et al. Jul 2012 B2
8214016 Lavallee et al. Jul 2012 B2
8216016 Yamagishi et al. Jul 2012 B2
8218818 Cobb Jul 2012 B2
8218819 Cobb Jul 2012 B2
8218825 Gordon Jul 2012 B2
8221399 Amano Jul 2012 B2
8223147 El Dokor Jul 2012 B1
8224423 Faul Jul 2012 B2
8226574 Whillock Jul 2012 B2
8229163 Coleman Jul 2012 B2
8229166 Teng Jul 2012 B2
8229184 Benkley Jul 2012 B2
8232872 Zeng Jul 2012 B2
8235529 Raffle Aug 2012 B1
8235530 Maad Aug 2012 B2
8241125 Huges Aug 2012 B2
8243136 Aota Aug 2012 B2
8243269 Matousek Aug 2012 B2
8243996 Steinberg Aug 2012 B2
8248372 Saila Aug 2012 B2
8249691 Chase et al. Aug 2012 B2
8253770 Kurtz Aug 2012 B2
8253774 Huitema Aug 2012 B2
8253778 Atsushi Aug 2012 B2
8259109 El Dokor Sep 2012 B2
8260036 Hamza et al. Sep 2012 B2
8279288 Son Oct 2012 B2
8284157 Markovic Oct 2012 B2
8284847 Adermann Oct 2012 B2
8287373 Marks et al. Oct 2012 B2
8289390 Aggarwal Oct 2012 B2
8289392 Senior et al. Oct 2012 B2
8290208 Kurtz Oct 2012 B2
8290229 Qureshi Oct 2012 B2
8295573 Bredno et al. Oct 2012 B2
8301226 Csavoy et al. Oct 2012 B2
8306260 Zhu Nov 2012 B2
8306267 Gossweiler, III Nov 2012 B1
8306274 Grycewicz Nov 2012 B2
8306663 Wickham Nov 2012 B2
8310656 Zalewski Nov 2012 B2
8310662 Mehr Nov 2012 B2
8311611 Csavoy et al. Nov 2012 B2
8314854 Yoon Nov 2012 B2
8315691 Sumanaweera et al. Nov 2012 B2
8316324 Boillot Nov 2012 B2
8320621 McEldowney Nov 2012 B2
8320709 Arartani et al. Nov 2012 B2
8323106 Zalewski Dec 2012 B2
8325228 Mariadoss Dec 2012 B2
8330811 Maguire, Jr. Dec 2012 B2
8330812 Maguire, Jr. Dec 2012 B2
8331019 Cheong Dec 2012 B2
8334900 Qu et al. Dec 2012 B2
8339282 Noble Dec 2012 B2
8351651 Lee Jan 2013 B2
8368586 Mohamadi Feb 2013 B2
8369574 Hu Feb 2013 B2
8374393 Cobb Feb 2013 B2
8374411 Ernst et al. Feb 2013 B2
8374674 Gertner Feb 2013 B2
8376226 Dennard Feb 2013 B2
8376827 Cammegh Feb 2013 B2
8379927 Taylor Feb 2013 B2
8380284 Saranathan et al. Feb 2013 B2
8386011 Wieczorek Feb 2013 B2
8390291 Macfarlane et al. Mar 2013 B2
8390729 Long Mar 2013 B2
8395620 El Dokor Mar 2013 B2
8396654 Simmons et al. Mar 2013 B1
8400398 Schoen Mar 2013 B2
8400490 Apostolopoulos Mar 2013 B2
8405491 Fong Mar 2013 B2
8405656 El Dokor Mar 2013 B2
8405717 Kim Mar 2013 B2
8406845 Komistek et al. Mar 2013 B2
8411931 Zhou Apr 2013 B2
8427538 Ahiska Apr 2013 B2
8428319 Tsin et al. Apr 2013 B2
8571293 Ernst et al. Oct 2013 B2
8615127 Fitzpatrick Dec 2013 B2
8744154 Van Den Brink Jun 2014 B2
8747382 D'Souza Jun 2014 B2
8788020 Mostafavi Jul 2014 B2
8805019 Jeanne et al. Aug 2014 B2
8848977 Bammer et al. Sep 2014 B2
8862420 Ferran et al. Oct 2014 B2
8953847 Moden Feb 2015 B2
8996094 Schouenborg et al. Mar 2015 B2
9076212 Ernst et al. Jul 2015 B2
9082177 Sebok Jul 2015 B2
9084629 Rosa Jul 2015 B1
9103897 Herbst et al. Aug 2015 B2
9138175 Ernst et al. Sep 2015 B2
9173715 Baumgartner Nov 2015 B2
9176932 Baggen et al. Nov 2015 B2
9194929 Siegert et al. Nov 2015 B2
9305365 Lovberg et al. Apr 2016 B2
9318012 Johnson Apr 2016 B2
9395386 Corder et al. Jul 2016 B2
9451926 Kinahan et al. Sep 2016 B2
9453898 Nielsen et al. Sep 2016 B2
9606209 Ernst et al. Mar 2017 B2
9607377 Lovberg et al. Mar 2017 B2
9629595 Walker e Apr 2017 B2
9717461 Yu et al. Aug 2017 B2
9734589 Yu et al. Aug 2017 B2
9779502 Lovberg et al. Oct 2017 B1
9782141 Yu Oct 2017 B2
9785247 Horowitz et al. Oct 2017 B1
9943247 Ernst et al. Apr 2018 B2
10327708 Yu et al. Jun 2019 B2
10339654 Lovberg et al. Jul 2019 B2
20020082496 Kuth Jun 2002 A1
20020087101 Barrick et al. Jul 2002 A1
20020091422 Greenberg et al. Jul 2002 A1
20020115931 Strauss et al. Aug 2002 A1
20020118373 Eviatar et al. Aug 2002 A1
20020180436 Dale et al. Dec 2002 A1
20020188194 Cosman Dec 2002 A1
20030063292 Mostafavi Apr 2003 A1
20030088177 Totterman et al. May 2003 A1
20030116166 Anthony Jun 2003 A1
20030130574 Stoyle Jul 2003 A1
20030195526 Vilsmeir Oct 2003 A1
20040071324 Norris et al. Apr 2004 A1
20040116804 Mostafavi Jun 2004 A1
20040140804 Polzin et al. Jul 2004 A1
20040171927 Lowen et al. Sep 2004 A1
20050027194 Adler et al. Feb 2005 A1
20050054910 Tremblay et al. Mar 2005 A1
20050070784 Komura et al. Mar 2005 A1
20050105772 Voronka et al. May 2005 A1
20050107685 Seeber May 2005 A1
20050137475 Dold et al. Jun 2005 A1
20050148845 Dean et al. Jul 2005 A1
20050148854 Ito et al. Jul 2005 A1
20050283068 Zuccoloto et al. Dec 2005 A1
20060004281 Saracen Jan 2006 A1
20060045310 Tu et al. Mar 2006 A1
20060074292 Thomson et al. Apr 2006 A1
20060241405 Leitner et al. Oct 2006 A1
20070049794 Glassenberg et al. Mar 2007 A1
20070093709 Abernathie Apr 2007 A1
20070189386 Imagawa et al. Aug 2007 A1
20070206836 Yoon Sep 2007 A1
20070239169 Plaskos et al. Oct 2007 A1
20070276224 Lang Nov 2007 A1
20070280508 Ernst et al. Dec 2007 A1
20080039713 Thomson et al. Feb 2008 A1
20080129290 Yao Jun 2008 A1
20080181358 Van Kampen et al. Jul 2008 A1
20080183074 Carls et al. Jul 2008 A1
20080208012 Ali Aug 2008 A1
20080212835 Tavor Sep 2008 A1
20080221442 Tolowsky et al. Sep 2008 A1
20080221520 Nagel et al. Sep 2008 A1
20080273754 Hick et al. Nov 2008 A1
20080287728 Mostafavi Nov 2008 A1
20080287780 Chase et al. Nov 2008 A1
20080317313 Goddard et al. Dec 2008 A1
20090028411 Pfeuffer Jan 2009 A1
20090041200 Lu et al. Feb 2009 A1
20090052760 Smith et al. Feb 2009 A1
20090116719 Jaffray et al. May 2009 A1
20090185663 Gaines, Jr. et al. Jul 2009 A1
20090187112 Meir et al. Jul 2009 A1
20090209846 Bammer Aug 2009 A1
20090253985 Shachar et al. Oct 2009 A1
20090304297 Adabala et al. Dec 2009 A1
20090306499 Van Vorhis et al. Dec 2009 A1
20100054579 Okutomi Mar 2010 A1
20100057059 Makino Mar 2010 A1
20100059679 Albrecht Mar 2010 A1
20100069742 Partain et al. Mar 2010 A1
20100091089 Cromwell et al. Apr 2010 A1
20100099981 Fishel Apr 2010 A1
20100125191 Sahin May 2010 A1
20100137709 Gardner et al. Jun 2010 A1
20100148774 Kamata Jun 2010 A1
20100149099 Elias Jun 2010 A1
20100149315 Qu Jun 2010 A1
20100160775 Pankratov Jun 2010 A1
20100164862 Sullivan Jul 2010 A1
20100165293 Tanassi et al. Jul 2010 A1
20100167246 Ghajar Jul 2010 A1
20100172567 Prokoski Jul 2010 A1
20100177929 Kurtz Jul 2010 A1
20100178966 Suydoux Jul 2010 A1
20100179390 Davis Jul 2010 A1
20100179413 Kadour et al. Jul 2010 A1
20100183196 Fu et al. Jul 2010 A1
20100191631 Weidmann Jul 2010 A1
20100194879 Pasveer Aug 2010 A1
20100198067 Mahfouz Aug 2010 A1
20100198101 Song Aug 2010 A1
20100198112 Maad Aug 2010 A1
20100199232 Mistry Aug 2010 A1
20100210350 Walker Aug 2010 A9
20100214267 Radivojevic Aug 2010 A1
20100231511 Henty Sep 2010 A1
20100231692 Perlman Sep 2010 A1
20100245536 Huitema Sep 2010 A1
20100245593 Kim Sep 2010 A1
20100251924 Taylor Oct 2010 A1
20100253762 Cheong Oct 2010 A1
20100268072 Hall et al. Oct 2010 A1
20100277571 Xu Nov 2010 A1
20100282902 Rajasingham Nov 2010 A1
20100283833 Yeh Nov 2010 A1
20100284119 Coakley Nov 2010 A1
20100289899 Hendron Nov 2010 A1
20100290668 Friedman Nov 2010 A1
20100292841 Wickham Nov 2010 A1
20100295718 Mohamadi Nov 2010 A1
20100296701 Hu Nov 2010 A1
20100302142 French Dec 2010 A1
20100303289 Polzin Dec 2010 A1
20100311512 Lock Dec 2010 A1
20100321505 Kokubun Dec 2010 A1
20100328055 Fong Dec 2010 A1
20100328201 Marbit Dec 2010 A1
20100328267 Chen Dec 2010 A1
20100330912 Saila Dec 2010 A1
20110001699 Jacobsen Jan 2011 A1
20110006991 Elias Jan 2011 A1
20110007939 Teng Jan 2011 A1
20110007946 Liang Jan 2011 A1
20110008759 Usui Jan 2011 A1
20110015521 Faul Jan 2011 A1
20110019001 Rhoads Jan 2011 A1
20110025853 Richardson Feb 2011 A1
20110038520 Yui Feb 2011 A1
20110043631 Marman Feb 2011 A1
20110043759 Bushinsky Feb 2011 A1
20110050562 Schoen Mar 2011 A1
20110050569 Marvit Mar 2011 A1
20110050947 Marman Mar 2011 A1
20110052002 Cobb Mar 2011 A1
20110052003 Cobb Mar 2011 A1
20110052015 Saund Mar 2011 A1
20110054870 Dariush Mar 2011 A1
20110057816 Noble Mar 2011 A1
20110058020 Dieckmann Mar 2011 A1
20110064290 Punithakaumar Mar 2011 A1
20110069207 Steinberg Mar 2011 A1
20110074675 Shiming Mar 2011 A1
20110081000 Gertner Apr 2011 A1
20110081043 Sabol Apr 2011 A1
20110085704 Han Apr 2011 A1
20110087091 Olson Apr 2011 A1
20110092781 Gertner Apr 2011 A1
20110102549 Takahashi May 2011 A1
20110105883 Lake et al. May 2011 A1
20110105893 Akins et al. May 2011 A1
20110115793 Grycewicz May 2011 A1
20110115892 Fan May 2011 A1
20110116683 Kramer et al. May 2011 A1
20110117528 Marciello et al. May 2011 A1
20110118032 Zalewski May 2011 A1
20110133917 Zeng Jun 2011 A1
20110142411 Camp Jun 2011 A1
20110150271 Lee Jun 2011 A1
20110157168 Bennett Jun 2011 A1
20110157358 Bell Jun 2011 A1
20110157370 Livesey Jun 2011 A1
20110160569 Cohen et al. Jun 2011 A1
20110172060 Morales Jul 2011 A1
20110172521 Zdeblick et al. Jul 2011 A1
20110175801 Markovic Jul 2011 A1
20110175809 Markovic Jul 2011 A1
20110175810 Markovic Jul 2011 A1
20110176723 Ali et al. Jul 2011 A1
20110180695 Li Jul 2011 A1
20110181893 MacFarlane Jul 2011 A1
20110182472 Hansen Jul 2011 A1
20110187640 Jacobsen Aug 2011 A1
20110193939 Vassigh Aug 2011 A1
20110199461 Horio Aug 2011 A1
20110201916 Duyn et al. Aug 2011 A1
20110201939 Hubschman et al. Aug 2011 A1
20110202306 Eng Aug 2011 A1
20110205358 Aota Aug 2011 A1
20110207089 Lagettie Aug 2011 A1
20110208437 Teicher Aug 2011 A1
20110216002 Weising Sep 2011 A1
20110216180 Pasini Sep 2011 A1
20110221770 Kruglick Sep 2011 A1
20110229862 Parikh Sep 2011 A1
20110230755 MacFarlane et al. Sep 2011 A1
20110234807 Jones Sep 2011 A1
20110234834 Sugimoto Sep 2011 A1
20110235855 Smith Sep 2011 A1
20110237933 Cohen Sep 2011 A1
20110242134 Miller Oct 2011 A1
20110244939 Cammegh Oct 2011 A1
20110250929 Lin Oct 2011 A1
20110251478 Wieczorek Oct 2011 A1
20110255845 Kikuchi Oct 2011 A1
20110257566 Burdea Oct 2011 A1
20110260965 Kim Oct 2011 A1
20110262002 Lee Oct 2011 A1
20110267427 Goh Nov 2011 A1
20110267456 Adermann Nov 2011 A1
20110275957 Bhandari Nov 2011 A1
20110276396 Rathod Nov 2011 A1
20110279663 Fan Nov 2011 A1
20110285622 Marti Nov 2011 A1
20110286010 Kusik et al. Nov 2011 A1
20110291925 Israel Dec 2011 A1
20110293143 Narayanan et al. Dec 2011 A1
20110293146 Grycewicz Dec 2011 A1
20110298708 Hsu Dec 2011 A1
20110298824 Lee Dec 2011 A1
20110300994 Verkaaik Dec 2011 A1
20110301449 Maurer, Jr. Dec 2011 A1
20110301934 Tardis Dec 2011 A1
20110303214 Welle Dec 2011 A1
20110304541 Dalal Dec 2011 A1
20110304650 Canpillo Dec 2011 A1
20110304706 Border et al. Dec 2011 A1
20110306867 Gopinadhan Dec 2011 A1
20110310220 McEldowney Dec 2011 A1
20110310226 McEldowney Dec 2011 A1
20110316994 Lemchen Dec 2011 A1
20110317877 Bell Dec 2011 A1
20120002112 Huang Jan 2012 A1
20120004791 Buelthoff Jan 2012 A1
20120007839 Tsao et al. Jan 2012 A1
20120019645 Maltz Jan 2012 A1
20120020524 Ishikawa Jan 2012 A1
20120021806 Maltz Jan 2012 A1
20120027226 Desenberg Feb 2012 A1
20120029345 Mahfouz et al. Feb 2012 A1
20120032882 Schlachta Feb 2012 A1
20120033083 Horvinger Feb 2012 A1
20120035462 Maurer, Jr. et al. Feb 2012 A1
20120039505 Bastide et al. Feb 2012 A1
20120044363 Lu Feb 2012 A1
20120045091 Kaganovich Feb 2012 A1
20120049453 Morichau-Beauchant et al. Mar 2012 A1
20120051588 McEldowney Mar 2012 A1
20120051664 Gopalakrishnan et al. Mar 2012 A1
20120052949 Weitzner Mar 2012 A1
20120056982 Katz Mar 2012 A1
20120057640 Shi Mar 2012 A1
20120065492 Gertner et al. Mar 2012 A1
20120065494 Gertner et al. Mar 2012 A1
20120072041 Miller Mar 2012 A1
20120075166 Marti Mar 2012 A1
20120075177 Jacobsen Mar 2012 A1
20120076369 Abramovich Mar 2012 A1
20120081504 Ng Apr 2012 A1
20120083314 Ng Apr 2012 A1
20120083960 Zhu Apr 2012 A1
20120086778 Lee Apr 2012 A1
20120086809 Lee Apr 2012 A1
20120092445 McDowell Apr 2012 A1
20120092502 Knasel Apr 2012 A1
20120093481 McDowell Apr 2012 A1
20120098938 Jin Apr 2012 A1
20120101388 Tripathi Apr 2012 A1
20120105573 Apostolopoulos May 2012 A1
20120106814 Gleason et al. May 2012 A1
20120108909 Slobounov et al. May 2012 A1
20120113140 Hilliges May 2012 A1
20120113223 Hilliges May 2012 A1
20120116202 Bangera May 2012 A1
20120119999 Harris May 2012 A1
20120120072 Se May 2012 A1
20120120237 Trepess May 2012 A1
20120120243 Chien May 2012 A1
20120120277 Tsai May 2012 A1
20120121124 Bammer May 2012 A1
20120124604 Small May 2012 A1
20120127319 Rao May 2012 A1
20120133616 Nishihara May 2012 A1
20120133889 Bergt May 2012 A1
20120143029 Silverstein Jun 2012 A1
20120143212 Madhani Jun 2012 A1
20120147167 Mason Jun 2012 A1
20120154272 Hildreth Jun 2012 A1
20120154511 Hsu Jun 2012 A1
20120154536 Stoker Jun 2012 A1
20120154579 Hanpapur Jun 2012 A1
20120156661 Smith Jun 2012 A1
20120158197 Hinman Jun 2012 A1
20120162378 El Dokor et al. Jun 2012 A1
20120165964 Flaks Jun 2012 A1
20120167143 Longet Jun 2012 A1
20120169841 Chemali Jul 2012 A1
20120176314 Jeon Jul 2012 A1
20120184371 Shum Jul 2012 A1
20120188237 Han Jul 2012 A1
20120188371 Chen Jul 2012 A1
20120194422 El Dokor Aug 2012 A1
20120194517 Izadi et al. Aug 2012 A1
20120194561 Grossinger Aug 2012 A1
20120195466 Teng Aug 2012 A1
20120196660 El Dokor et al. Aug 2012 A1
20120197135 Slatkine Aug 2012 A1
20120200676 Huitema Aug 2012 A1
20120201428 Joshi et al. Aug 2012 A1
20120206604 Jones Aug 2012 A1
20120212594 Barns Aug 2012 A1
20120218407 Chien Aug 2012 A1
20120218421 Chien Aug 2012 A1
20120220233 Teague Aug 2012 A1
20120224666 Speller Sep 2012 A1
20120224743 Rodriguez Sep 2012 A1
20120225718 Zhang Sep 2012 A1
20120229643 Chidanand Sep 2012 A1
20120229651 Takizawa Sep 2012 A1
20120230561 Qureshi Sep 2012 A1
20120235896 Jacobsen Sep 2012 A1
20120238337 French Sep 2012 A1
20120238864 Piferi et al. Sep 2012 A1
20120242816 Cruz Sep 2012 A1
20120249741 Maciocci Oct 2012 A1
20120253201 Reinhold Oct 2012 A1
20120253241 Levital et al. Oct 2012 A1
20120262540 Rondinelli Oct 2012 A1
20120262558 Boger Oct 2012 A1
20120262583 Bernal Oct 2012 A1
20120268124 Herbst et al. Oct 2012 A1
20120275649 Cobb Nov 2012 A1
20120276995 Lansdale Nov 2012 A1
20120277001 Lansdale Nov 2012 A1
20120281093 Fong Nov 2012 A1
20120281873 Brown Nov 2012 A1
20120288142 Gossweiler, III Nov 2012 A1
20120288852 Willson Nov 2012 A1
20120289334 Mikhailov Nov 2012 A9
20120289822 Shachar et al. Nov 2012 A1
20120293412 El Dokor Nov 2012 A1
20120293506 Vertucci Nov 2012 A1
20120293663 Liu Nov 2012 A1
20120294511 Datta Nov 2012 A1
20120300961 Moeller Nov 2012 A1
20120303839 Jackson Nov 2012 A1
20120304126 Lavigne Nov 2012 A1
20120307075 Margalit Dec 2012 A1
20120307207 Abraham Dec 2012 A1
20120314066 Lee Dec 2012 A1
20120315016 Fung Dec 2012 A1
20120319946 El Dokor Dec 2012 A1
20120319989 Argiro Dec 2012 A1
20120320219 David Dec 2012 A1
20120326966 Rauber Dec 2012 A1
20120326976 Markovic Dec 2012 A1
20120326979 Geisert Dec 2012 A1
20120327241 Howe Dec 2012 A1
20120327246 Senior et al. Dec 2012 A1
20130002866 Hanpapur Jan 2013 A1
20130002879 Weber Jan 2013 A1
20130002900 Gossweiler, III Jan 2013 A1
20130009865 Valik Jan 2013 A1
20130010071 Valik Jan 2013 A1
20130013452 Dennard Jan 2013 A1
20130016009 Godfrey Jan 2013 A1
20130016876 Wooley Jan 2013 A1
20130021434 Ahiska Jan 2013 A1
20130021578 Chen Jan 2013 A1
20130024819 Rieffel Jan 2013 A1
20130030283 Vortman et al. Jan 2013 A1
20130033640 Lee Feb 2013 A1
20130033700 Hallil Feb 2013 A1
20130035590 Ma et al. Feb 2013 A1
20130035612 Mason Feb 2013 A1
20130040720 Cammegh Feb 2013 A1
20130041368 Cunningham Feb 2013 A1
20130053683 Hwang et al. Feb 2013 A1
20130057702 Chavan Mar 2013 A1
20130064426 Watkins, Jr. Mar 2013 A1
20130064427 Picard Mar 2013 A1
20130065517 Svensson Mar 2013 A1
20130066448 Alonso Mar 2013 A1
20130066526 Mondragon Mar 2013 A1
20130069773 Li Mar 2013 A1
20130070201 Shahidi Mar 2013 A1
20130070257 Wong Mar 2013 A1
20130072787 Wallace et al. Mar 2013 A1
20130076863 Rappel Mar 2013 A1
20130076944 Kosaka Mar 2013 A1
20130077823 Mestha Mar 2013 A1
20130079033 Gupta Mar 2013 A1
20130084980 Hammontree Apr 2013 A1
20130088584 Malhas Apr 2013 A1
20130093866 Ohlhues et al. Apr 2013 A1
20130096439 Lee Apr 2013 A1
20130102879 MacLaren et al. Apr 2013 A1
20130102893 Vollmer Apr 2013 A1
20130108979 Daon May 2013 A1
20130113791 Isaacs et al. May 2013 A1
20130211421 Abovitz et al. Aug 2013 A1
20130237811 Mihailescu et al. Sep 2013 A1
20130281818 Vija Oct 2013 A1
20140005527 Nagarkar et al. Jan 2014 A1
20140055563 Jessop Feb 2014 A1
20140073908 Biber Mar 2014 A1
20140088410 Wu Mar 2014 A1
20140133720 Lee et al. May 2014 A1
20140148685 Liu et al. May 2014 A1
20140159721 Grodzki Jun 2014 A1
20140171784 Ooi et al. Jun 2014 A1
20140378816 Oh et al. Dec 2014 A1
20150085072 Yan Mar 2015 A1
20150212182 Nielsen et al. Jul 2015 A1
20150265220 Ernst et al. Sep 2015 A1
20150289878 Tal et al. Oct 2015 A1
20150297120 Son et al. Oct 2015 A1
20150297314 Fowler Oct 2015 A1
20150316635 Stehning et al. Nov 2015 A1
20150323637 Beck et al. Nov 2015 A1
20150327948 Schoepp et al. Nov 2015 A1
20150331078 Speck et al. Nov 2015 A1
20150359464 Olesen Dec 2015 A1
20150366527 Yu et al. Dec 2015 A1
20160000383 Lee et al. Jan 2016 A1
20160000411 Raju et al. Jan 2016 A1
20160045112 Weissler et al. Feb 2016 A1
20160091592 Beall et al. Mar 2016 A1
20160166205 Ernst et al. Jun 2016 A1
20160228005 Bammer et al. Aug 2016 A1
20160249984 Janssen Sep 2016 A1
20160256713 Saunders et al. Sep 2016 A1
20160262663 MacLaren et al. Sep 2016 A1
20160287080 Olesen et al. Oct 2016 A1
20160310229 Bammer et al. Oct 2016 A1
20160313432 Feiweier et al. Oct 2016 A1
20170032538 Ernst Feb 2017 A1
20170038449 Voigt et al. Feb 2017 A1
20170143271 Gustafsson et al. May 2017 A1
20170303859 Robertson et al. Oct 2017 A1
20170319143 Yu et al. Nov 2017 A1
20170345145 Nempont et al. Nov 2017 A1
20180220925 Lauer Aug 2018 A1
20190004282 Park et al. Jan 2019 A1
20190059779 Ernst et al. Feb 2019 A1
Foreign Referenced Citations (46)
Number Date Country
100563551 Dec 2009 CN
104603835 May 2015 CN
105392423 Mar 2016 CN
106572810 Apr 2017 CN
106714681 May 2017 CN
29519078 Mar 1996 DE
102004024470 Dec 2005 DE
0904733 Mar 1991 EP
1319368 Jun 2003 EP
1354564 Oct 2003 EP
1524626 Apr 2005 EP
2023812 Feb 2009 EP
2515139 Oct 2012 EP
2747641 Jul 2014 EP
2948056 Dec 2015 EP
2950714 Dec 2015 EP
3157422 Apr 2017 EP
3188660 Jul 2017 EP
03023838 May 1991 JP
2015-526708 Sep 2015 JP
WO 9617258 Jun 1996 WO
WO 9938449 Aug 1999 WO
WO 0072039 Nov 2000 WO
WO 03003796 Jan 2003 WO
WO 2004023783 Mar 2004 WO
WO 2005077293 Aug 2005 WO
WO 2007025301 Mar 2007 WO
WO 2007085241 Aug 2007 WO
WO 2007136745 Nov 2007 WO
WO 2009101566 Aug 2009 WO
WO 2009129457 Oct 2009 WO
WO 2010066824 Jun 2010 WO
WO 2011047467 Apr 2011 WO
WO 2011113441 Sep 2011 WO
WO 2012046202 Apr 2012 WO
WO 2013032933 Mar 2013 WO
WO 2014005178 Jan 2014 WO
WO 2014116868 Jul 2014 WO
WO 2014120734 Aug 2014 WO
WO 2015022684 Feb 2015 WO
WO 2015042138 Mar 2015 WO
WO 2015092593 Jun 2015 WO
WO 2015148391 Oct 2015 WO
WO 2016014718 Jan 2016 WO
WO2017091479 Jun 2017 WO
WO2017189427 Nov 2017 WO
Non-Patent Literature Citations (89)
Entry
Ashouri, H., L. et al., Unobtrusive Estimation of Cardiac Contractility and Stroke Volume Changes Using Ballistocardiogram Measurements on a High Bandwidth Force Plate, Sensors 2016, 16, 787; doi:10.3390/s16060787.
Communication pursuant to Article 94(3) EPC for application No. 14743670.3, which is an EP application related to the present application, dated Feb. 6, 2018.
Extended Europen Search Report for application No. 14743670.3 which is a EP application related to the present application, dated Aug. 17, 2017.
Extended Europen Search Report for application No. 15769296.3 which is a EP application related to the present application, dated Dec. 22, 2017.
Extended European Search Report for application No. 15824707.2 which is a EP application related to the present appliation, dated Apr. 16, 2018.
Gordon, J. W. Certain molar movements of the human body produced by the circulation of the blood. J. Anat. Physiol. 11, 533-536 (1877).
Herbst et al., “Reproduction of Motion Artifacts for Performance Analysis of Prospective Motion Correction in MRI”, Magnetic Resonance in Medicine., vol. 71, No. 1, p. 182-190 (Feb. 25, 2013).
Kim, Chang-Sei et al. “Ballistocardiogram: Mechanism and Potential for Unobtrusive Cardiovascular Health Monitoring”, Scientific Reports, Aug. 9, 2016.
Maclaren et al., “Prospective Motion Correction in Brain Imaging: A Review” Online Magnetic Resonance in Medicine, vol. 69, No. 3, pp. 621-636 (Mar. 1, 2013.
Tarvainen, M.P. et al., “An advanced de-trending method with application to HRV analysis,” IEEE Trans. Biomed. Eng., vol. 49, No. 2, pp. 172-175, Feb. 2002.
Aksoy et al., “Hybrind Prospective and Retrospective Head Motion Correction to Mitigate Cross-Calibration Errors”, NIH Publication, Nov. 2012.
Aksoy et al., “Real-Time Optical Motion Correction for Diffusion Tensor Imaging, Magnetic Resonance in Medicine” (Mar. 22, 2011) 66 366-378.
Andrews et al., “Prospective Motion Correction for Magnetic Resonance Spectroscopy Using Single Camera Retro-Grate Reflector Optical Tracking, Journal of Magnetic Resonance Imaging” (Feb. 2011) 33(2): 498-504.
Angeles et al., “The Online Solution of the Hand-Eye Problem”, IEEE Transactions on Robotics and Automation, 16(6): 720-731 (Dec. 2000).
Anishenko et al., “A Motion Correction System for Brain Tomography Based on Biologically Motivated Models.” 7th IEEE International Conference on Cybernetic Intelligent Systems, dated Sep. 9, 2008, in 9 pages.
Armstrong et al., RGR-6D: Low-cost, high-accuracy measurement of 6-DOF Pose from a Single Image. Publication date unknown.
Armstrong et al., “RGR-3D: Simple, cheap detection of 6-DOF pose for tele-operation, and robot programming and calibration”, In Proc. 2002 Int. Conf. on Robotics and Automation, IEEE, Washington (May 2002).
Bandettini, Peter A., et al., “Processing Strategies for Time-Course Data Sets in Functional MRI of the Human Breain”, Magnetic Resonance in Medicine 30: 161-173 (1993).
Barmet et al, Spatiotemporal Magnetic Field Monitoring for MR, Magnetic Resonance in Medicine (Feb. 1, 2008) 60: 187-197.
Bartels, LW, et al., “Endovascular interventional magnetic resonance imaging”, Physics in Medicine and Biology 48: R37-R64 (2003).
Benchoff, Brian, “Extremely Precise Positional Tracking”, https://hackaday.com/2013/10/10/extremely-precise-positional-tracking/, printed on Sep. 16, 2017, in 7 pages.
Carranza-Herrezuelo et al, “Motion estimation of tagged cardiac magnetric resonance images using variational techniques” Elsevier, Computerized Medical Imaging and Graphics 34 (2010), pp. 514-522.
Chou, Jack C. K., et al., “Finding the Position and Orientation of a Sensor on a Robot Manipulator Using Quaternions”, The International Journal of Robotics Research, 10(3): 240-254 (Jun. 1991).
Cofaru et al “Improved Newton-Raphson digital image correlation method for full-field displacement and strain calculation,” Department of Materials Science and Engineering, Ghent University St-Pietersnieuwstraat, Nov. 20, 2010.
Ernst et al., “A Novel Phase and Frequency Navigator for Proton Magnetic Resonance Spectroscopy Using Water-Suppression Cycling, Magnetic Resonance in Medicine” (Jan. 2011) 65(1): 13-7.
Eviatar et al., “Real time head motion correction for functional MRI”, In: Proceedings of the International Society for Magnetic Resonance in Medicine (1999) 269.
Forbes, Kristen P. N., et al., “Propeller MRI: Clinical Testing of a Novel Technique for Quantification and Compensation of Head Motion”, Journal of Magnetic Resonance Imaging 14: 215-222 (2001).
Fulton et al., “Correction for Head Movements in Positron Emission Tomography Using an Optical Motion-Tracking System”, IEEE Transactions on Nuclear Science, vol. 49(1):116-123 (Feb. 2002).
Glover, Gary H., et al., “Self-Navigated Spiral fMRI: Interleaved versus Single-shot”, Magnetic Resonance in Medicine 39: 361-368 (1998).
Gumus et al., “Elimination of DWI signal dropouts using blipped gradients for dynamic restoration of gradient moment”, ISMRM 20th Annual Meeting & Exhibition, May 7, 2012.
Herbst et al., “Preventing Signal Dropouts in DWI Using Continous Prospective Motion Correction”, Proc. Intl. Soc. Mag. Reson. Med. 19 (May 2011) 170.
Herbst et al., “Prospective Motion Correction With Continuous Gradient Updates in Diffusion Weighted Imaging, Magnetic Resonance in Medicine” (2012) 67:326-338.
Hoff et al., “Analysis of Head Pose Accuracy in Augmented Reality”, IEEE Transactions on Visualization and Computer Graphics 6, No. 4 (Oct.-Dec. 2000): 319-334.
Horn, Berthold K. P., “Closed-form solution of absolute orientation using unit quaternions”, Journal of the Optical Society of America, vol. 4, p. 629-642 (Apr. 1987).
International Preliminary Report on Patentability for Application No. PCT/US2015/022041, dated Oct. 6, 2016, in 8 pages.
International Preliminary Report on Patentability for Application No. PCT/US2007/011899, dated Jun. 8, 2008, in 13 pages.
International Search Report and Written Opinion for Application No. PCT/US2007/011899, dated Nov. 14, 2007.
International Search Report and Written Opinion for Application No. PCT/US2014/012806, dated May 15, 2014, in 15 pages.
International Search Report and Written Opinion for Application No. PCT/US2015/041615, dated Oct. 29, 2015, in 13 pages.
International Preliminary Report on Patentability for Application No. PCT/US2014/013546, dated Aug. 4, 2015, in 9 pages.
International Search Report and Written Opinion for Application No. PCT/US2015/022041, dated Jun. 29, 2015, in 9 pages.
Josefsson et al., “A flexible high-precision video system for digital recording of motor acts through lightweight reflex markers”, Computer Methods and Programs in Biomedicine, vol. 49:111-129 (1996).
Katsuki, et al., “Design of an Artificial Mark to Determine 3D Pose by Monocular Vision”, 2003 IEEE International Conference on Robotics and Automation (Cat. No. 03CH37422), Sep. 14-19, 2003, pp. 995-1000 vol. 1.
Kiebel et al., “MRI and PET coregistration—a cross validation of statistical parametric mapping and automated image registration”, Neuroimage 5(4):271-279 (1997).
Kiruluta et al., “Predictive Head Movement Tracking Using a Kalman Filter”, IEEE Trans. On Systems, Man, and Cybernetics—Part B: Cybernetics, 27(2):326-331 (Apr. 1997).
Lerner, “Motion correction in fMRI images”, Technion-Israel Institute of Technology, Faculty of Computer Science (Feb. 2006).
Maclaren et al., “Combined Prospective and Retrospective Motion Correction to Relax Navigator Requirements”, Magnetic Resonance in Medicine (Feb. 11, 2011) 65:1724-1732.
Maclaren et al., “Navigator Accuracy Requirements for Prospective Motion Correction”, Magnetic Resonance in Medicine (Jan. 2010) 63(1): 162-70.
Maclaren, “Prospective Motion Correction in MRI Using Optical Tracking Tape”, Book of Abstracts, ESMRMB (2009).
Maclaren et al., “Measurement and correction of microscopic head motion during magnetic resonance imaging of the brain”, PLOS One, vol. 7(11):1-9 (2012).
McVeigh et al., “Real-time, Interactive MRI for Cardiovascular Interventions”, Academic Radiology, 12(9): 1121-1127 (2005).
Nehrke et al., “Prospective Correction of Affine Motion for Arbitrary MR Sequences on a Clinical Scanner”, Magnetic Resonance in Medicine (Jun. 28, 2005) 54:1130-1138.
Norris et al., “Online motion correction for diffusion-weighted imaging using navigator echoes: application to RARE imaging without sensitivity loss”, Magnetic Resonance in Medicine, vol. 45:729-733 (2001).
Olesen et al., “Structured Light 3D Tracking System for Measuring Motions in PET Brain Imaging”, Proceedings of SPIE, the International Society for Optical Engineering (ISSN: 0277-786X), vol. 7625:76250X (2010).
Olesen et al., “Motion Tracking in Narrow Spaces: A Structured Light Approach”, Lecture Notes in Computer Science (ISSN: 0302-9743) vol. 6363:253-260 (2010).
Olesen et al., “Motion Tracking for Medical Imaging: A Nonvisible Structured Light Tracking Approach”, IEEE Transactions on Medical Imaging, vol. 31(1), Jan. 2012.
Ooi et al., “Prospective Real-Time Correction for Arbitrary Head Motion Using Active Markers”, Magnetic Resonance in Medicine (Apr. 15, 2009) 62(4): 943-54.
Orchard et al., “MRI Reconstruction using real-time motion tracking: A simulation study”, Signals, Systems and Computers, 42nd Annual Conference IEEE, Piscataway, NJ, USA (Oct. 26, 2008).
Park, Frank C. and Martin, Bryan J., “Robot Sensor Calibration: Solving AX=XB on the Euclidean Group”, IEEE Transactions on Robotics and Automation, 10(5): 717-721 (Oct. 1994).
PCT Search Report from the International Searching Authority, dated February 28, 2013, in 16 pages, regarding International Application No. PCT/US2012/052349.
Qin et al., “Prospective Head-Movement Correction for High-Resolution MRI Using an In-Bore Optical Tracking System”, Magnetic Resonance in Medicine (Apr. 13, 2009) 62: 924-934.
Schulz et al., “First Embedded In-Bore System for Fast Optical Prospective Head Motion-Correction in MRI”, Proceedings of the 28th Annual Scientific Meeting of the ESMRMB (Oct. 8, 2011) 369.
Shiu et al., “Calibration of Wrist-Mounted Robotic Sensors by Solving Homogeneous Transform Equations of the Form AX=XB”, IEEE Transactions on Robotics and Automation, 5(1): 16-29 (Feb. 1989).
Speck et al., “Prospective real-time slice-by-slice Motion Correction for fMRI in Freely Moving Subjects”, Magnetic Resonance Materials in Physics, Biology and Medicine, 19(2), 55-61, published May 9, 2006.
Tremblay et al., “Retrospective Coregistration of Functional Magnetic Resonance Imaging Data using External monitoring”, Magnetic Resonance in Medicine 53:141-149 (2005).
Tsai et al., “A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration”, IEEE Transactions on Robotics and Automation, 5(3): 345-358 (Jun. 1989).
Wang, Ching-Cheng, “Extrinsic Calibration of a Vision Sensor Mounted on a Robot”, IEEE Transactions on Robotics and Automation, 8(2):161-175 (Apr. 1992).
Ward et al., “Prospective Multiaxial Motion Correction for fMRI”, Magnetic Resonance in Medicine 43:459-469 (2000).
Welch et al., “Spherical Navigator Echoes for Full 3D Rigid Body Motion Measurement in MRI”, Magnetic Resonance in Medicine 47:32-41 (2002).
Wilm et al., “Accurate and Simple Calibration of DLP Projector Systems”, Proceedings of SPIE, the International Society for Optical Engineering (ISSN: 0277-786X), vol. 8979 (2014).
Wilm et al., “Correction of Motion Artifacts for Real-Time Structured Light”, R.R. Paulsen and K.S. Pedersen (Eds.): SCIA 2015, LNCS 9127, pp. 142-151 (2015).
Yeo et al., “Motion correction in fMRI by mapping slice-to-volume with concurrent field-inhomogeneity correction”, International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 752-760 (2004).
Zaitsev, M., et al., “Prospective Real-Time Slice-by-Slice 3D Motion Correction for EPI Using an External Optical Motion Tracking System”, Proc. Intl. Soc. Mag. Reson. Med. 11:517 (2004).
Zaitsev et al., “Magnetic resonance imaging of freely moving objects: Prospective real-time motion correction using an external optical motion tracking system”, NeuroImage 31 (Jan. 29, 2006) 1038-1050.
Triesch, Jochen, et al., “Democratic Integration: Self-Organized Integration of Adaptive Cues”, Neural Computation, vol. 13, No. 9, dated Sep. 1, 2001, pp. 2049-2074.
European Examination Report for application No. 15202598.7 dated Nov. 12, 2018.
European Examination Report for application No. 12826869.5 dated Mar. 4, 2019.
Gaul, Scott, Quiet Mind Cafe, https://www.youtube.com/watch?v=7wFX9Wn70eM, Jan. 2019.
Innovere Medical, https://www.innoveremedical.com/, Jan. 2019.
International Search Report and Written Opinion for Application No. PCT/US2019/013147, dated Apr. 29, 2019 in 10 pages.
Poh, Ming-Zher, et al., “Advancements in Noncontact, Multiparameter Physiological Measurements Using a Webcam”, IEEE Transactions on Biomedical Engineering, vol. 58, No. 1, Jan. 2011.
Rostaminia et al., “Low-power Sensing of Fatigue and Drowsiness Measures on a Computational Eyeglass”, Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., 1(2): 23, doi: 10.1145/3090088, Jun. 2017.
Dold et al., “Advantages and Limitations of Prospective Head Motion Compensation for MRI Using an Optical Motion Tracking Device”, Academic Radiology, vol. 13(9):1093-1103 (2006).
Extended European Search Report for application No. 16869116.0, which is an EP application related to the present application, dated Aug. 2, 2019.
Fodor et al., Aesthetic Applications of Intense Pulsed Light, Springer-Verlag London Limited, 2011, DOI: 10.1007/978-1-84996-456-2_2.
International Search Report and Written Opinion for Application No. PCT/US2019/020593 dated Jun. 12, 2019 in 12 pages.
Supplementary European Search Report for application No. 17790186.5 which is an EP application related to the present application, dated Nov. 4, 2019.
Van Gemert, M.J. and Welch, A.J., “Time constants in thermal laser medicine”, Lasers Surg. Med. 1989;9(4):405-421.
Wallace et al., “Head motion measurement and correction using FID navigators”, Magnetic Resonance in Medicine, 2019;81:258-274.
Related Publications (1)
    Number: US 20180070904 A1; Date: Mar. 2018
Provisional Applications (1)
    Number: US 61759883; Date: Feb. 2013
Continuations (1)
    Parent: US 14762583; Child: US 15696920