The disclosure relates generally to the field of motion tracking, and more specifically to systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan.
There are various modalities for performing medical imaging of patients. For example, magnetic resonance imaging (MRI) is a medical imaging technique used in radiology to visualize internal structures of the body in detail. An MRI scanner is a device in which the patient or a portion of the patient's body is positioned within a powerful magnet where a magnetic field is used to align the magnetization of some atomic nuclei (usually hydrogen nuclei—protons) and radio frequency magnetic fields are applied to systematically alter the alignment of this magnetization. This causes the nuclei to produce a rotating magnetic field detectable by the scanner and this information is recorded to construct an image of the scanned region of the body. These scans typically take several minutes (up to about 40 minutes in some scanners) and in some devices any significant movement can ruin the images and require the scan to be repeated.
Additionally, there are various radiation therapies, proton therapies, and other therapies that can be applied to patients. For example, radiation therapy can be applied to a targeted tissue region. In some systems, radiation therapy can be dynamically applied in response to patient movements. However, in many such systems, the tracking of patient movements does not have a high degree of accuracy. Accordingly, the use of such systems can result in the application of radiation therapy to non-targeted tissue regions, thereby unintentionally harming healthy tissue while intentionally affecting diseased tissue. The foregoing is also true for proton therapies and other therapies.
An accurate and reliable method of determining the dynamic position and orientation of a patient's head or other body portion during MRI scanning or therapeutic procedures is a requirement in any attempt to compensate for subject motion during such procedures. Toward this end, disclosed herein are systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan and/or therapeutic procedures, such as during a magnetic resonance imaging (MRI) scan and/or radiation therapy.
In some embodiments, a motion detection and correction system and/or device for tracking and correcting or compensating for patient motion during a medical imaging scan can be adapted to be integrated into a medical imaging scanner, such as an MRI scanner, or be adapted to retrofit a pre-existing medical imaging scanner. In certain embodiments, the motion detection system and/or device can comprise one or more carriers and a motion correction device housing, which can further comprise one or more camera modules or detectors and a power unit. In further embodiments, motion data of a subject collected and detected by a motion detection and correction device can be further analyzed by an image processing unit of the motion tracking and/or correction system.
In some embodiments, a motion correction device for a medical imaging scanner comprises: a device housing, wherein the device housing comprises an arcuate surface, and wherein the device housing comprises: one or more optics openings on the arcuate surface; one or more camera modules or detectors configured to detect motion of a subject of the medical imaging scanner through the one or more optics openings, wherein each of the one or more camera modules or detectors further comprises: a camera module or detector housing; and a sensor module placed within the camera module or detector housing, wherein the sensor module is configured to be removably coupled to the camera module or detector housing; a power unit configured to regulate power to the one or more camera modules or detectors; and one or more wires configured to connect the one or more camera modules or detectors to the power unit, wherein the device housing is configured to be removably coupled to a top inner surface of a bore of the medical imaging scanner.
In certain embodiments, the device is configured to be removably coupled to a plurality of medical imaging scanners, wherein each of the plurality of medical imaging scanners comprises a bore of a different size. In some embodiments, the device is configured to detect motion of the subject of the medical imaging scanner and transmit the detected motion to a motion tracking system for processing the detected motion. In certain embodiments, the device is configured to be removed and reattached to the medical imaging scanner without losing alignment of the one or more camera modules or detectors.
In some embodiments, the one or more optics openings comprises indium tin oxide coated glass. In certain embodiments, the one or more optics openings protrude from the arcuate surface at an angle. In some embodiments, the device housing further comprises one or more radiofrequency chokes. In certain embodiments, the device housing further comprises one or more mounting clips, wherein the one or more mounting clips are configured to be removably attached to a mounting bracket, wherein the mounting bracket is attached to the top inner surface of the bore. In some embodiments, the camera module or detector housing is flash plated with a material configured to delay oxidation.
In certain embodiments, the camera module or detector housing comprises a top cover and a bottom cover, wherein the top cover comprises one or more non-parallel walls to eliminate standing waves. In some embodiments, the top cover comprises copper and/or nickel. In certain embodiments, an optics module is mechanically fixated to a sensor module within the camera module or detector housing. In some embodiments, the optics module comprises an optics and a sensor. In some embodiments, the optics module further comprises one or more mirrors. In certain embodiments, the optics is placed within the optics module in a longitudinal direction of the optics module. In some embodiments, the sensor module includes an imaging sensor, sensor electronics, a processing unit, and one or more light sources for illumination.
For purposes of this summary, certain aspects, advantages, and novel features of the invention are described herein. It is to be understood that not necessarily all such advantages may be achieved in accordance with any particular embodiment of the invention. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
All of these embodiments are intended to be within the scope of the invention herein disclosed. These and other embodiments will become readily apparent to those skilled in the art from the following detailed description having reference to the attached figures, the invention not being limited to any particular disclosed embodiment(s).
The foregoing and other features, aspects, and advantages of the present inventions are described in detail below with reference to the drawings of various embodiments, which are intended to illustrate and not to limit the inventions. The drawings comprise the following figures in which:
Although several embodiments, examples, and illustrations are disclosed below, it will be understood by those of ordinary skill in the art that the inventions described herein extend beyond the specifically disclosed embodiments, examples, and illustrations and include other uses of the inventions and obvious modifications and equivalents thereof. Embodiments of the inventions are described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner simply because it is being used in conjunction with a detailed description of certain specific embodiments of the inventions. In addition, embodiments of the inventions can comprise several novel features and no single feature is solely responsible for its desirable attributes or is essential to practicing the inventions herein described.
With the use of diagnostic technologies and therapeutic technologies, it can be advantageous to track patient movement with a high degree of accuracy. Such high accuracy tracking can improve the imaging quality obtained and produced by diagnostic equipment, such as imaging technologies. Further, the use of high accuracy patient movement tracking technology can improve the application of patient therapies, such as radiation treatment, proton treatment, and the like. By accounting for patient movement with a high degree of accuracy, therapeutic technologies can apply therapies only to the targeted tissue and avoid healthy surrounding tissue.
U.S. Pat. No. 8,121,361, issued Feb. 21, 2012, entitled “MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE IMAGING AND SPECTROSCOPY,” describes a system that adaptively compensates for subject motion, and the disclosure therein is hereby incorporated herein by reference. U.S. Pat. No. 9,305,365, issued Apr. 5, 2016, and entitled “SYSTEMS, DEVICES, AND METHODS FOR TRACKING MOVING TARGETS,” U.S. patent application Ser. No. 14/762,583, filed Jul. 22, 2015, and entitled “MOTION TRACKING SYSTEM FOR REAL TIME ADAPTIVE MOTION COMPENSATION IN BIOMEDICAL IMAGING,” U.S. patent application Ser. No. 13/594,563, filed Aug. 24, 2012, and entitled “METHODS, SYSTEMS, AND DEVICES FOR INTRA-SCAN MOTION CORRECTION,” U.S. patent application Ser. No. 14/806,521, filed Jul. 22, 2015, and entitled “SYSTEMS, DEVICES, AND METHODS FOR TRACKING AND COMPENSATING FOR PATIENT MOTION DURING A MEDICAL IMAGING SCAN,” U.S. patent application Ser. No. 14/762,581, filed Jul. 22, 2015, and entitled “SYSTEMS, DEVICES, AND METHODS FOR TRACKING AND COMPENSATING FOR PATIENT MOTION DURING A MEDICAL IMAGING SCAN,” and U.S. patent application Ser. No. 14/666,049, filed Mar. 23, 2015, and entitled “SYSTEMS, METHODS, AND DEVICES FOR REMOVING PROSPECTIVE MOTION CORRECTION FROM MEDICAL IMAGING SCANS,” are also incorporated herein by reference in their entirety.
The embodiments disclosed herein relate to patient motion tracking and/or correction systems, devices, and methods. In some embodiments, motion tracking and/or correction systems, devices, and methods can be adapted to track and/or correct motion of a subject of a medical imaging scan so as to produce high quality medical image scans despite movement by the subject. Similarly, in certain embodiments, motion tracking and/or correction systems, devices, and methods can be adapted to track and/or correct motion of a subject of a therapeutic procedure so as to better apply therapy to a targeted area of the body. The embodiments disclosed herein can track patient movement with translation accuracies of about 0.1 mm and angle accuracies of about 0.1 degrees in order to obtain high quality medical image scans that correct for subject movement and/or to better apply radiation therapy, proton therapy, or any other therapy to the targeted tissue or area of the body.
More specifically, as disclosed herein, the system can be adapted to track patient movement in order to feed such movement data to an MRI scanner such that the MRI scanner can adjust the focus and position of the scanner in order to produce a clear MRI image of the patient. Further, the system can be adapted to connect to therapeutic technologies. For example, the system can be adapted to track patient movement in order to direct a therapeutic radiation beam at a diseased tissue region while avoiding surrounding healthy tissue.
There are various technologies for therapeutic radiation and other therapeutics. For example, it can be advantageous in radiation therapy, proton therapy, or other therapies to dynamically apply the radiation to a targeted area in order to account for patient movement. Patient movement can include respiration, twitches or any other voluntary or involuntary movements of the patient. By dynamically and automatically tracking patient movement, radiation therapy, proton therapy, and any other kind of therapy can be applied in a more targeted way, thereby allowing surrounding healthy tissue to be avoided and/or unharmed.
Further, the patient movement tracking system, as disclosed herein, can be utilized to track periodic involuntary movement of the patient, such as breathing. By tracking the periodic patient movement with a high degree of accuracy, the system can be adapted to apply a radiation therapy, a proton therapy, or the like during strategic moments when the target tissue is in a certain position while the patient's involuntary movements continue. Additionally, the system can be adapted to track not only normal breathing movement of the patient but also irregular movement of the patient caused by patient activity or by diseased tissue of the patient. For example, when a patient is running, the ribs of the patient have a larger excursion that the system can track in order to continuously identify a target tissue area. In another example, the patient may be suffering from Chronic Obstructive Pulmonary Disease (COPD) or another breathing disorder or diaphragmatic issue. For example, the patient could be suffering from a pleural effusion, in which fluid outside the lung prevents the patient from breathing normally, or a tumor may be irritating a lung region, thereby preventing normal breathing. The system can be adapted to track such irregular patient movements due to such conditions.
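By way of a non-limiting illustration, the gating concept described above can be sketched in a few lines of Python. The window size, the stream of marker positions, and the beam_on/beam_off callbacks are assumptions introduced only for this sketch and do not form part of the disclosed system; the sketch merely shows enabling therapy only while the tracked target remains within a predefined positional window.

def within_window(target_pos_mm, window_center_mm, tolerance_mm=2.0):
    # True when every coordinate of the tracked target lies within the gating window.
    return all(abs(p - c) <= tolerance_mm for p, c in zip(target_pos_mm, window_center_mm))

def gate_therapy(marker_positions, window_center_mm, beam_on, beam_off):
    # Enable the beam only while the tracked target stays inside the window.
    beam_enabled = False
    for pos in marker_positions:  # e.g., marker positions in mm streamed by the tracking system
        inside = within_window(pos, window_center_mm)
        if inside and not beam_enabled:
            beam_on()
            beam_enabled = True
        elif not inside and beam_enabled:
            beam_off()
            beam_enabled = False
    if beam_enabled:
        beam_off()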
In certain embodiments, motion tracking and/or correction systems, devices, and methods can be integrated into one or more medical imaging scanners and/or therapeutic systems. In other embodiments, motion tracking and/or correction systems, devices, and methods can be adapted to be retrofitted into one or more pre-existing medical imaging scanners and/or therapeutic systems.
General Overview of Motion Tracking and/or Correction System
As discussed above, the motion tracking and/or correction systems, devices, and/or methods described herein can be used in conjunction with a medical imaging scanner and/or a therapeutic system.
The motion tracking and/or correction system 100, as illustrated in
In the illustrated embodiment, the optical marker 110 is configured to be viewable by each of the two camera modules or detectors 108. The camera modules or detectors 108 can be, for example, digital cameras capable of acquiring images of the optical marker 110 and transmitting those images to the motion tracking system 102. In this embodiment, each of the camera modules or detectors 108 is configured to view the optical marker 110 from along a different line of sight. This can be helpful, for example, to enable the motion tracking system 102 to analyze two dimensional images of the optical marker 110 from different vantage points to help in locating the optical marker 110 to estimate patient motion or pose. In the illustrated embodiment, the camera modules or detectors 108 each are configured to view the optical marker 110 along a line of sight 120 separated from each other by an angle 122. In this embodiment, the angle 122 is approximately 90 degrees. Other angles may be used, such as 30 degrees, 45 degrees, 60 degrees, 70 degrees, etc. In some embodiments, 90 degrees is an optimal angle to enable maximum differentiation of in plane and out of plane motion of the optical marker 110, as further described below. For example, if the optical marker 110 moves in a direction that is directly along the line of sight of one detector, that detector may have a harder time distinguishing motion of the optical marker 110 than the other detector. On the other hand, the other detector may relatively easily detect the motion of the optical marker 110, as the motion is perpendicular to that detector's line of sight.
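The benefit of an approximately 90 degree scissor angle can be illustrated with a short numerical sketch in Python. The unit line-of-sight vectors and the example motion below are assumed values chosen only to show that motion directed along one detector's line of sight remains clearly visible to a second detector viewing from a perpendicular direction.

import math

def apparent_displacement(motion, line_of_sight):
    # Component of a 2D motion vector perpendicular to a detector's line of sight,
    # i.e., the portion of the motion that the detector can readily observe.
    mx, my = motion
    lx, ly = line_of_sight
    norm = math.hypot(lx, ly)
    lx, ly = lx / norm, ly / norm
    along = mx * lx + my * ly                            # motion along the sight line (hard to see)
    return math.hypot(mx - along * lx, my - along * ly)  # motion across the sight line (easy to see)

line_of_sight_a = (1.0, 0.0)   # detector A
line_of_sight_b = (0.0, 1.0)   # detector B, separated from A by a 90 degree scissor angle
motion = (1.0, 0.0)            # marker moves directly along detector A's line of sight
print(apparent_displacement(motion, line_of_sight_a))  # 0.0: detector A barely sees the motion
print(apparent_displacement(motion, line_of_sight_b))  # 1.0: detector B sees the full motion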
In some embodiments, the angle 122 may be referred to as a scissor angle. In the embodiment illustrated in
Mirrors or other devices used to redirect a line of sight can have both advantages and disadvantages. For example, disadvantages of mirrors include that they can vibrate, potentially introducing error into the object orientation determination process. As another example, the further away a mirror is from a camera module or detector, generally the larger the mirror needs to be to enable an equivalent range of vision. Accordingly, it can be advantageous to position a mirror relatively close to a camera module or detector to enable the mirror to be relatively small. One advantage of using mirrors or other sight line redirection methods is that a virtual scissor angle can be configured to be closer to an optimal scissor angle of 90°, even when a particular medical imaging scanner configuration may not allow for camera modules or detectors that are positioned to directly view a marker using a 90° scissor angle. Further, some mirrors are not conductive, which can be advantageous in magnetic resonance imaging, because nonconductive mirrors will not introduce artifacts into MRI images. A digital camera, on the other hand, may include conductive components, and/or a wire leading to the camera module or detector may include conductive components. When a digital camera and/or its wire are within the medical imaging envelope, they may introduce artifacts into MRI images.
The embodiment of a motion tracking and/or correction system 100 illustrated in
As discussed above, in some embodiments, a motion tracking and/or correction system can be integrated into a medical imaging scanner or therapeutic device or adapted to be retrofitted to a pre-produced and/or pre-existing medical imaging scanner or therapeutic device. More specifically, the one or more camera modules or detectors 108 can be integrated in a medical imaging scanner or therapeutic device in some embodiments, whereas the one or more camera modules or detectors 108 can be retrofitted to a medical imaging scanner or therapeutic device in other embodiments.
In general, MRI or other medical imaging scanners can be of different sizes. The embodiments of integrated and retrofitted motion tracking and/or correction systems disclosed herein can be applied to MRI scanners and/or other medical imaging scanners of various sizes, including MRI scanners with a diameter of about 70 cm and/or with a diameter of about 60 cm. Moreover, an integrated and/or retrofit motion tracking and/or correction system can be adapted to fit a medical imaging scanner or MRI scanner with a diameter of about 40 cm, 50 cm, 60 cm, 70 cm, 80 cm, 90 cm, 100 cm, 110 cm, 120 cm, 130 cm, 140 cm, 150 cm, or within a range defined by any two of the values mentioned above. In certain embodiments, an MRI scanner, other medical imaging scanner, and/or therapeutic device can have a diameter of about 694.5 mm, about 685.5 mm, about 684.52 mm, about 694.24 mm, about 597 mm, about 595 mm, and/or about 596.34 mm.
Further, in some embodiments, a head coil 124 can be adapted to be used in conjunction with a medical imaging scanner. Head coils 124 of different configurations and/or sizes may be used. For example, in some embodiments, a head/neck 64-channel configuration coil and/or a head/neck 20-channel configuration coil may be used in conjunction with a medical imaging scanner and/or a motion tracking and/or correction system and/or device. In other embodiments, head/neck (HN) configurations of 2-, 4-, 6-, 8-, 10-, 12-, 14-, 16-, 18-, 22-, 24-, 26-, 28-, 30-, 32-, 34-, 36-, 38-, 40-, 42-, 44-, 46-, 48-, 50-, 52-, 54-, 56-, 58-, 60-, or 62-channels, or configurations of a number of channels within a range defined by any two of the aforementioned values, can be used. The above-identified number of channels and/or head/neck coil configurations can be related to the resolution of the coil.
Furthermore, in certain embodiments, one or more markers 110 can be adapted to be used in conjunction with a motion tracking and/or correction system and/or device. As shown in
A retrofit and/or integrated motion tracking and/or correction system can comprise one or more camera modules or detectors 108. For example, a retrofit and/or integrated motion tracking and/or correction system can comprise one, two, three, four, five, six, seven, eight, nine, or ten camera modules or detectors 108. In certain embodiments, all camera modules or detectors 108 installed can be adapted to detect, track, and/or collect motion data of the subject. In some embodiments, a subset of the total number of camera modules or detectors 108 installed can be configured to detect, track, and/or collect motion data of the subject depending on the line of sight of each camera module or detector 108 at any given point in time. For example, in some embodiments, one of a total of two camera modules or detectors 108, one of a total of three camera modules or detectors 108, two of a total of three camera modules or detectors 108, one of a total of four camera modules or detectors 108, two of a total of four camera modules or detectors 108, three of a total of four camera modules or detectors 108, one of a total of five camera modules or detectors 108, two of a total of five camera modules or detectors 108, three of a total of five camera modules or detectors 108, four of a total of five camera modules or detectors 108, one of a total of six camera modules or detectors 108, two of a total of six camera modules or detectors 108, three of a total of six camera modules or detectors 108, four of a total of six camera modules or detectors 108, and/or five of a total of six camera modules or detectors 108 can be configured to detect, track, and/or collect motion data of the subject.
In certain embodiments, a motion tracking and/or correction system, whether integrated or retrofit, comprises dynamic switching capabilities such that the system is configured to identify which of the one or more camera modules or detectors 108 are actually in a position to view the target and utilize only those camera modules or detectors 108 to track, maintain tracking, and/or continuously track the target. Further, in some embodiments, the specific angle of each camera module or detector 108 can be optimized for collection of motion data of the subject. For example, a motion tracking and/or correction system, whether integrated or retrofit, can be configured to determine an optimal position of one or more camera modules or detectors 108 for viewing the subject and alter the position, angle, or direction of the camera module or detector 108 accordingly.
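One non-limiting way to picture such dynamic switching is sketched below in Python. The detector names, positions, directions, and the conical field-of-view half angle are assumed values used only to illustrate selecting the subset of camera modules or detectors 108 that can currently view the target.

import math

def visible_detectors(marker_pos, detectors, half_fov_deg=30.0):
    # Return the names of detectors whose (assumed conical) field of view contains the marker.
    selected = []
    for det in detectors:
        to_marker = [m - p for m, p in zip(marker_pos, det["position"])]
        dist = math.sqrt(sum(c * c for c in to_marker))
        cos_angle = sum((c / dist) * d for c, d in zip(to_marker, det["direction"]))
        angle_deg = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
        if angle_deg <= half_fov_deg:
            selected.append(det["name"])
    return selected

detectors = [
    {"name": "108A", "position": (-300.0, 0.0, 0.0), "direction": (1.0, 0.0, 0.0)},
    {"name": "108D", "position": (300.0, 0.0, 0.0), "direction": (-1.0, 0.0, 0.0)},
]
print(visible_detectors((0.0, 0.0, 50.0), detectors))  # both detectors can see this marker position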
Viewing and/or Monitoring Subject
In some embodiments, a retrofit or integrated motion tracking and/or correction system can be configured to capture and/or detect the position and movement of a subject and transfer such data in real-time, near real-time, or substantially in real-time to one or more computing systems to allow a user to view and/or monitor the subject. In certain embodiments, one or more computing systems can further be adapted to receive data collected by the retrofit or integrated motion tracking and/or correction system to generate one or more images and/or video feeds of the subject viewable by a user in real-time, near real-time, or substantially in real-time. For example, in some embodiments, the computing device can comprise one or more displays adapted to visually display the current position and/or motion of the subject to a user, such that the user can monitor the subject.
In certain embodiments, one or more camera modules or detectors 108 of a retrofit or integrated motion tracking and/or correction system can be adapted to capture and/or detect the position and movement of a subject, which can further be visually displayed to a user. For example, in some embodiments, a retrofit or integrated motion tracking and/or correction system can comprise a plurality of camera modules or detectors 108 configured to collect data. One or more computing devices and/or an image processing module of the retrofit or integrated motion tracking and/or correction system can be adapted to generate an image, video, composite image, and/or composite video of a subject, based on the collected data, in real-time, near real-time, or substantially in real-time. In some embodiments, one or more computing devices and/or an image processing module of the retrofit or integrated motion tracking and/or correction system and/or software thereof can be configured to generate a composite view of a subject in which the head coil 124 is virtually removed from the view displayed to a healthcare provider, thereby providing an unobstructed view of the subject without the head coil. For viewing body portions of the subject other than the head, virtual removal of the head coil 124 may not be required.
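By way of a non-limiting illustration, such a composite view can be pictured with the short numpy sketch below. The pre-registered frames and the static per-camera masks marking pixels occluded by the head coil 124 are assumptions made only for this sketch and are not a description of any particular image processing module.

import numpy as np

def composite_without_coil(frames, coil_masks):
    # Build a per-pixel composite that prefers, for each pixel, a camera whose view of that
    # pixel is not blocked by the head coil. frames: list of aligned HxW arrays;
    # coil_masks: list of HxW boolean arrays, True where the head coil occludes that camera's view.
    composite = np.zeros_like(frames[0], dtype=float)
    filled = np.zeros(frames[0].shape, dtype=bool)
    for frame, mask in zip(frames, coil_masks):
        usable = (~mask) & (~filled)
        composite[usable] = frame[usable]
        filled |= usable
    return composite

frame_a = np.full((4, 4), 0.2); mask_a = np.zeros((4, 4), dtype=bool); mask_a[:, 2:] = True
frame_b = np.full((4, 4), 0.8); mask_b = np.zeros((4, 4), dtype=bool); mask_b[:, :2] = True
print(composite_without_coil([frame_a, frame_b], [mask_a, mask_b]))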
In some embodiments, the same one or more camera modules or detectors 108 used for tracking and/or detecting motion data of a subject are used to collect data for generating a visual display of the subject. In other embodiments, one or more additional camera modules or detectors different from camera modules or detectors 108 used for tracking and/or detecting motion data of a subject are used to collect data for generating a visual display of the subject. For example, in certain embodiments, the one or more camera modules or detectors used to collect data for generating a visual display of the subject can comprise a larger field of view than the camera modules or detectors 108 used to detect and/or track subject movement.
In certain embodiments, the one or more camera modules or detectors 108 used to collect data for generating a visual display of the subject can be tuned to one or more different light spectrums, for example visible light, infrared, or near infrared. For example, in certain embodiments, the one or more camera modules or detectors 108 can be configured to collect data to produce a night vision-type display of the subject so as to allow a user to view the subject even in a dark setting. In some embodiments, the one or more camera modules or detectors 108 can comprise an adjustable field of view, which can, for example, be narrowed to view a marker or widened to view the subject.
In some embodiments, the one or more camera modules or detectors 108 can be configured to view the subject with or without any subject motion. For example, in certain embodiments, the one or more detectors 108 can be adapted to collect data to generate a visual display of a subject even when the subject is not moving, so as to allow a user to check on the subject. In other words, in certain embodiments, subject or patient viewing can be active even when motion tracking is not active.
In some embodiments, a retrofit or integrated motion tracking and/or correction system is adapted to automatically determine which of the one or more camera modules or detectors 108 to utilize to view the patient, based on a detected pose and/or position of the subject. For example, in certain embodiments, if a subject turns left, the system can be configured to automatically collect data using one or more detectors or cameras 108 located on the left side of a medical imaging scanner in order to generate a visual display of the subject and/or collect motion tracking data. Similarly, in some embodiments, if a subject subsequently turns right, the system can be configured to automatically collect data using one or more camera modules or detectors 108 located on the right side of the medical imaging scanner in order to generate a visual display of the subject and/or collect motion tracking data.
Integrated Motion Tracking and/or Correction System
As illustrated in
In order to attach the camera modules or detectors 108, the bore 104 can comprise one or more through holes 302 for fixating the one or more camera modules or detectors 108 and allowing the same to view the interior of the bore 104. The number of through holes 302 in the bore 104 can be equal to the number of camera modules or detectors. For example, a system with four camera modules or detectors 108A, 108B, 108C, 108D can also have four through holes 302A, 302B, 302C, 302D in the bore. In other embodiments, the bore can comprise more through holes 302 than the number of camera modules or detectors 108 to allow for installation of additional camera modules or detectors 108.
The camera modules or detectors 108 can be fixated or anchored to the outside or exterior surface of the bore 104 in a manner such that the lens or optics of each camera module or detector 108 faces inwards towards the interior of the bore 104. Attaching the detectors 108 in a manner such that substantially all of the detectors 108 are located outside of the interior of the bore 104 can allow for maximization of space inside the bore 104 of the medical imaging scanner or MRI scanner or therapeutic device. Accordingly, in some embodiments of the integrated motion tracking and/or correction system, the total interior volume of the bore 104 is not changed due to installation of one or more camera modules or detectors 108 of the motion tracking and/or correction system.
Retrofit Motion Tracking and/or Correction System
However, integrating a motion tracking and/or correction system and/or device may not be possible for some pre-existing medical imaging scanners, for example due to one or more compatibility issues. As such, in some embodiments, a retrofit motion tracking and/or correction system can be installed and used in conjunction with a medical imaging scanner, such as an MRI scanner.
Contrary to the integrated system, however, one or more camera modules or detectors 108 can be anchored or fixated on the interior surface of the bore 104 in a retrofit system. Accordingly, the whole camera module or detector 108 can be located inside the bore 104 in a retrofit system. This configuration may decrease the total volume inside the bore 104 when compared to before installation of the one or more camera modules or detectors 108 of the retrofit motion tracking and/or correction system. As such, it can be advantageous in such embodiments to minimize the thickness of the one or more camera modules or detectors 108 and/or the motion correction device in order to maximize space inside the bore 104 of the medical imaging scanner or therapeutic device for the patient subject.
Further, retrofit motion tracking and/or correction systems may not require any through holes, as described above in relation to the integrated system, to be punctured in the bore 104. As such, while an integrated motion tracking and/or correction system may provide for a more permanent system, a retrofit motion tracking and/or correction system can provide a flexible system in which the motion correction device or one or more components thereof, such as one or more camera modules or detectors 108, may be installed and removed as needed. Further, without the need to puncture any relatively large through holes in the bore 104, as described above in relation to the integrated system, radiofrequency (RF) emissions may be controlled in a similar manner as originally designed for the medical imaging scanner or other therapeutic system to which the motion tracking and/or correction system is coupled.
As illustrated, a retrofit motion correction device is attached to the interior surface of the bore 104 at the top of a pre-existing medical imaging scanner and/or MRI scanner. The particular embodiment as illustrated also comprises a head coil 124. However, other embodiments may not comprise a head coil 124 or may comprise a head coil of a different shape, configuration or size.
The motion correction device can comprise a device housing 502. The device housing 502 can comprise a wing or curved shape. It can be advantageous to provide a housing 502 for the motion correction device. For example, when installing each camera module or detector 108 of a retrofit motion detection and/or correction system to a pre-existing medical imaging scanner or therapeutic device separately, one may need to alter or modify the positioning and/or angle of one or more camera modules or detectors 108 of the motion detection and/or correction system for optimal results every time the motion detection and/or correction system is reinstalled. However, if these one or more camera or detector modules 108 are pre-formed or pre-configured in a particular position and/or angle within the housing, the motion detection and/or correction system can be installed, removed, and/or reinstalled without losing alignment or without substantially losing alignment of one or more camera or detector modules. As such, with a housing 502, a calibration or recalibration process of the one or more camera modules or detectors 108 may not be necessary or may be simplified when attaching or reattaching a motion correction device to a medical imaging scanner, such as an MRI scanner, or therapeutic device. Further, by use of a device housing 502 and/or mounting configuration as described herein, the exact position of the device housing 502 and the detector modules 108 thereof relative to the isocenter of the scanner or therapeutic device can be controlled, which may eliminate a need for cross calibration between a plurality of detectors and the scanner or therapeutic device. Cross calibration can refer to calibration performed to ensure identical coordinate systems for the motion detection system and the scanner or therapeutic device.
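For context, cross calibration can be pictured as a fixed rigid transform that maps detector coordinates into scanner coordinates, as in the short numpy sketch below; the rotation and offset values are assumed examples and are not taken from any particular scanner or mounting.

import numpy as np

def detector_to_scanner(point_detector_mm, rotation, translation_mm):
    # Map a point measured in the detector's coordinate system into the scanner's
    # coordinate system using a fixed rigid transform known from the mounting geometry.
    return rotation @ np.asarray(point_detector_mm) + np.asarray(translation_mm)

rotation = np.array([[0.0, -1.0, 0.0],        # assumed: detector frame rotated 90 degrees about Z
                     [1.0,  0.0, 0.0],
                     [0.0,  0.0, 1.0]])
translation_mm = np.array([0.0, 0.0, 350.0])  # assumed: detector mounted 350 mm from the isocenter
print(detector_to_scanner([10.0, 0.0, 0.0], rotation, translation_mm))  # -> [0., 10., 350.]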
The device housing 502 can comprise a bottom surface facing towards the subject of interest. The bottom surface can be arcuate. The bottom surface can be substantially parallel to the interior surface of the bore 104. The bottom surface can comprise an arcuate shape that is substantially equal to the arcuate shape or configuration of the interior surface of the bore 104. As such, the interior space within the bore 104 can be maximized.
The device housing 502 can also comprise one or more side portions or surfaces. The one or more side portions or surfaces can be substantially perpendicular to the bottom surface. The one or more side portions or surfaces can also be substantially perpendicular to the interior surface of the bore when the device housing 502 is installed or coupled to the bore 104.
The device housing 502 can comprise one or more optics openings on the bottom surface to allow for the one or more camera modules or detectors 108 to view the subject. The one or more openings can be RF shielded, for example by a dual layer comprising an indium tin oxide (ITO) coated glass window and/or a wire mesh. Other materials and/or configurations can be used for RF shielding as well. The one or more openings can be uncovered in certain embodiments. The one or more openings can include optical filters to protect the inside of the device housing 502 and to block light emissions originating from inside or outside the scanner bore from affecting the image detection of the detector modules 108. The one or more optics openings can comprise an oblong shape. In other embodiments, the one or more optics openings can be substantially circular, rectangular, triangular, or any other shape. The optics openings can comprise a shape or configuration that is substantially similar to or the same as the shape or configuration of a camera module or detector as described herein. In the illustrated embodiment, each of the one or more optics openings comprises two substantially straight sides and two arcuate sides connecting the two substantially straight sides.
In the illustrated embodiment, the device housing 502 comprises four optics openings. In other embodiments, the device housing 502 can comprise one, two, three, four, five, six, seven, eight, nine, or ten optics openings. The number of optics openings can be within a range defined by two of the aforementioned values. In certain embodiments, the number of optics openings can be equal to the number of camera modules or detectors 108 present in the device or system. In other embodiments, the number of optics openings can be greater than the number of camera modules or detectors 108 and can allow for installation of additional camera modules or detectors.
The one or more optics openings can be angled with respect to the bottom surface of the device housing 502. For example, the one or more openings can protrude at an angle from the bottom surface of the device housing 502 at about 5°, about 10°, about 15°, about 20°, about 25°, about 30°, about 35°, about 40°, about 45°, about 50°, about 55°, about 60°, about 65°, about 70°, about 75°, about 80°, about 85°, about 90°, and/or within a range defined by two of the aforementioned angles. All of the optics openings can protrude from the bottom surface of the device housing 502 at a substantially equal angle. In certain embodiments, some of the optics openings can protrude from the bottom surface of the device housing 502 at a substantially equal angle while others protrude from the bottom surface at different angles. For example, in the illustrated embodiment, optics openings for top detector modules 108B and 108C can protrude from the bottom surface at a substantially equal angle but in opposite directions or mirror images with respect to a vertical plane drawn at the center of the device housing 502 along a longitudinal axis of the bore 104. Similarly, optics openings for side detector modules 108A and 108D can protrude from the bottom surface at a substantially equal angle but in opposite directions or mirror images with respect to a vertical plane drawn at the center of the device housing 502 along a longitudinal axis of the bore 104. In other embodiments, each of the plurality of optics openings can protrude from the bottom surface at different angles.
The angle of protrusion of the one or more optics openings can be made to optimize viewing by the one or more camera modules or detectors 108. The angle of protrusion of the one or more optics openings can be made to be optimal for viewing for a particular medical imaging scanner, such as an MRI scanner, or therapeutic device according to the shape of the interior of the bore 104. The angle of protrusion of the one or more optics openings can be made to be optimal for viewing for a plurality of medical imaging scanners, such as MRI scanners, or therapeutic devices.
In some embodiments, a single device housing 502, comprising one or more optics openings and one or more camera modules or detectors 108, can be installed or coupled to a medical imaging scanner or therapeutic device. In other embodiments, a plurality of device housings 502, each comprising one or more optics openings and one or more camera modules or detectors 108, can be installed or coupled to a medical imaging scanner or therapeutic device.
In some embodiments, the motion correction device housing 502 and/or a portion thereof is made of a material that does not affect the medical imaging scanner or therapeutic device. For example, the motion correction device housing 502 and/or a portion thereof, for example other than the optics openings, can be made from a plastic material that is transparent to medical imaging scanners or therapeutic devices in general and/or to a particular medical imaging scanner, such as an MRI scanner, or therapeutic device. In some embodiments, the motion correction device housing 502 and/or a portion thereof, for example other than the optics openings, comprises ABS plastic.
Mounting Bracket
As discussed above, the device housing 502 can be attached or coupled to the top of the interior surface of the bore 104. More specifically, the device housing 502 can be attached mechanically using a mounting bracket and/or mounting clips.
As illustrated, in some embodiments, a retrofit motion detection and/or correction system comprises one or more mounting brackets 702 for attaching a device housing 502 to a medical imaging scanner or therapeutic device. In certain embodiments, one or more mounting brackets 702 can be configured to be attached to the interior surface of the bore 104 of a medical imaging scanner or therapeutic device. For example, one or more mounting brackets 702 can be configured to be attached to the top, left, right, side, bottom, and/or diagonal position inside a bore 104 of a medical imaging scanner or therapeutic device along the interior wall of the bore 104. In some embodiments, one or more mounting brackets 702 can be configured to be attached to the bore 104 via one or more adhesives and/or one or more mechanical configurations.
In some embodiments, one or more mounting brackets 702 can be configured to be semi-permanently or permanently attached to the bore 104 of a medical imaging scanner or therapeutic device, and a motion correction device housing 502 can be configured to attach to the one or more mounting brackets 702. For example, in such embodiments, the motion correction device housing 502 can be configured to be easily attached and/or removed from the mounting bracket 702. In certain embodiments, one or more mounting brackets 702 can be configured to be attached to a device housing 502 via one or more adhesives and/or one or more mechanical configurations. For example, a mounting bracket 702 can be permanently or semi-permanently mounted to the top of an MRI bore 104 and a motion correction device housing 502 can be configured to attach to the mounting bracket 702 via a mechanical locking configuration.
The device housing 502 can comprise one or more mounting clips 706. In the illustrated embodiment, the device housing 502 comprises four mounting clips 706A, 706B, 706C, 706D. Similarly, the mounting bracket 702 can comprise four corresponding mechanical receivers for receiving the four mounting clips 706A, 706B, 706C, 706D. In some embodiments, the device housing 502 can comprise one, two, three, four, five, six, seven, eight, nine, or ten mounting clips 706. The number of mounting clips 706 of a device housing 502 can also be within a range defined by two of the aforementioned values. Similarly, in certain embodiments, the mounting bracket 702 can comprise one, two, three, four, five, six, seven, eight, nine, or ten receivers for receiving mounting clips 706 of the device housing 502. The number of receivers on the mounting bracket 702 for receiving mounting clips 706 of a device housing 502 can also be within a range defined by two of the aforementioned values.
The mechanical locking procedure for attaching the device housing 502 to the mounting bracket 702 can comprise a single step. For example, an operator may only need to push the device housing 502 in a generally upward direction towards the mounting bracket 702 to attach and fixate the device housing 502 to the mounting bracket 702. In other embodiments, the attachment procedure can be twofold. For example, an operator may first push the device housing 502 in a generally upward direction towards the mounting bracket 702 and then horizontally push or pull the device housing 502 in a longitudinal direction along the bore 104 to attach and fixate the device housing 502 to the mounting bracket 702. In certain embodiments, the procedure for attaching the device housing 502 to a mounting bracket 702 can comprise three or more steps.
Device Components
As illustrated, the device housing 502 can provide a cover for and/or comprise one or more device components. In some embodiments, as shown in
In the embodiment shown in
As discussed herein, in some embodiments, a motion correction device housing 502 can comprise one or more cables and/or wires to connect one or more device components. For example, the one or more cables and/or wires can comprise one or more power cables and/or signal transmission cables, such as fiber optics. The one or more power cables and/or one or more signal transmission cables can be configured to connect to an image processing unit. The image processing unit can be an image processing computer, digital signal processor (DSP), field-programmable gate array (FPGA), or others. In some embodiments, an FPGA on a sensor module is used for image processing, the results of which are transmitted to a Raspberry Pi-type processor (ARM) for further analysis. In certain embodiments, an image processing unit can be configured to send to the scanner data comprising the recent or most recent head pose of the subject in six degrees of freedom. In some embodiments, a motion correction device and/or system and/or medical imaging scanner is configured to utilize such data in order to alter the image acquisition plane.
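By way of a non-limiting illustration, the pose data handed to the scanner can be pictured as a simple six-degree-of-freedom record, as in the Python sketch below. The field names, units, and the transmit callback are assumptions made only for this sketch and do not represent any particular scanner interface.

from dataclasses import dataclass

@dataclass
class HeadPose6DoF:
    # Most recent head pose of the subject in six degrees of freedom (assumed fields and units).
    timestamp_s: float
    x_mm: float
    y_mm: float
    z_mm: float
    pitch_deg: float
    roll_deg: float
    yaw_deg: float

def send_pose_to_scanner(pose, transmit):
    # Hand the latest pose to the scanner so it can update the image acquisition plane.
    # `transmit` stands in for whatever scanner-specific interface is actually used.
    transmit((pose.timestamp_s, pose.x_mm, pose.y_mm, pose.z_mm,
              pose.pitch_deg, pose.roll_deg, pose.yaw_deg))

send_pose_to_scanner(HeadPose6DoF(12.5, 0.3, -0.1, 0.2, 0.05, 0.0, 0.1), print)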
Camera Module/Detector
As discussed above, a motion tracking and/or correction system and/or device can comprise one or more camera modules or detectors 108 configured to detect, track, and/or collect motion data of a subject.
In some embodiments, the top cover 1002 or other portion of the camera module or detector housing comprises one or more rounded corners to reduce RF emission. In certain embodiments, the top cover 1002, bottom cover 1004, and/or other portions of the camera module or detector housing comprises non-parallel walls to eliminate standing waves. In some embodiments, the top cover 1002, bottom cover 1004, optics module 1006, and/or any portion of the camera module or detector housing and/or optics module 1006 comprises ceramic material for rigidity and/or high thermal conductivity. Further, in some embodiments, the bottom cover 1004 and/or other portion of the camera module or detector housing comprises one or more waveguides 1008 to provide an exit for one or more wires, fiber-optics, and/or cables.
In some embodiments, the camera module or detector housing is configured such that one or more sensor modules 1005 and/or optics modules 1006 thereof can be switched. For example, it may be advantageous to easily replace one or more sensor modules 1005 and/or optics modules 1006 configured to be used in conjunction with and/or optimized for use with a particular medical imaging scanner with another sensor module 1005 and/or optics module 1006 configured to be used in conjunction with and/or optimized for use with another particular medical imaging scanner. In some embodiments, to replace a sensor module 1005 and/or optics module 1006, an operator can selectively remove a top cover 1002 from the bottom cover 1004 and replace the pre-installed sensor module 1005 comprising optics module 1006A with another sensor module comprising optics module 1006B. The sensor module 1005 and/or optics module 1006 can be replaced for maintenance or repair reasons. Also or alternatively, a sensor module 1005 comprising optics module 1006A for use with a medical imaging scanner or therapeutic device with a 60 cm bore can be replaced with a sensor module comprising optics module 1006B for use with a medical imaging scanner or therapeutic device with a 70 cm bore.
In some embodiments, the camera module or detector housing is goniometrically mounted to the device housing 502 to allow for optical alignment with the subject. Further, in certain embodiments, the camera module or detector housing comprises an optics opening 1104. The optics opening, in some embodiments, can be RF shielded by a dual layer comprising ITO coated glass window and/or wire mesh. Other materials and/or configurations can be used for RF shielding as well.
The line of vision and/or visual field of the optics 1204 through a first end of the optics 1204 can be configured to be bent by a mirror 1206A and directed through the opening 1212 to view the subject. In other words, the first end of the optics 1204 and the opening 1212 can be configured in a perpendicular or angular configuration. Such a configuration can be advantageous to allow for the optics 1204 to be placed horizontally along the longitudinal axis of the bore 104 of the medical imaging scanner, therapeutic device, and/or the sensor module 1005 to minimize the space occupied by the motion correction device. In other embodiments, the optics 1204 can be placed vertically, perpendicular to the longitudinal axis of the bore 104 of the medical imaging scanner, therapeutic device, and/or the sensor module 1005, and may not require a mirror 1206A for the first end of the optics 1204 to view the subject through the opening 1212. In other words, the first end of the optics 1204 and the opening 1212 can generally be along a straight line. The mirror 1206A may also be configured to bend light from the light source towards the opening 1212. In certain embodiments, the light source may be configured to directly shine light through the opening 1212 without the light being bent through a mirror 1206A.
The motion data and/or visual data collected by the optics 1204 can then be transmitted through a second end of the optics 1204 and be bent by another mirror 1206B to reach a sensor or imager 1208. In other words, the second end of the optics 1204 and the sensor 1208 can be configured in a perpendicular or angular configuration. Such a configuration can be advantageous to allow for the optics 1204 to be placed horizontally along the longitudinal axis of the bore 104 of the medical imaging scanner, therapeutic device, and/or the sensor module to minimize the space occupied by the motion correction device. In other embodiments, the second end of the optics 1204 and the sensor 1208 can generally be along a straight line. For example, the sensor 1208 can be in a vertical configuration that is perpendicular or angular to the sensor module 1005. In the illustrated embodiment, the sensor 1208 is in a horizontal configuration and generally parallel to the sensor module 1005.
In some embodiments, screws, nuts, or the like can be used to mechanically fixate the optics 1204 within the optics module housing 1202. For example, one or more screws, nuts, or the like can be placed through one or more holes 1302 to fixate the optics 1204. In the illustrated embodiment with three holes 1302A, 1302B, 1302C, screws, nuts, or the like can be placed through one, two, or all three holes 1302A, 1302B, 1302C to fixate the optics 1204. In some embodiments, the optics 1204 can be fixated by use of a single screw through a single hole 1302B. The optics 1204 can be fixated by use of glue or other chemical compound.
The optics 1204 can be 16 mm optics, for example for a large medical imaging scanner or therapeutic device with a diameter of about 70 cm. In addition or alternatively, the optics 1204 can be 12 mm optics, for example for a small medical imaging scanner or therapeutic device with a diameter of about 60 cm. In certain embodiments, wider angle optics, such as 10 mm optics or 8 mm optics, can also or alternatively be used. To ensure an appropriate focus distance to the patient subject, the position of the optics 1204 can be aligned through one or more of the holes 1302 prior to fixating the optics 1204.
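The trade-off between focal length and field of view can be approximated with the standard pinhole relation, field of view = 2*atan(d/(2f)). In the Python sketch below, the sensor dimension is an assumed value chosen only to show that shorter focal lengths yield wider fields of view.

import math

def field_of_view_deg(focal_length_mm, sensor_dim_mm):
    # Angular field of view for a simple pinhole model: 2 * atan(d / (2 * f)).
    return math.degrees(2.0 * math.atan(sensor_dim_mm / (2.0 * focal_length_mm)))

sensor_dim_mm = 6.0  # assumed sensor dimension, for illustration only
for focal_length_mm in (16.0, 12.0, 10.0, 8.0):
    fov = field_of_view_deg(focal_length_mm, sensor_dim_mm)
    print(f"{focal_length_mm:.0f} mm optics -> {fov:.1f} degree field of view")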
Parameters
As discussed above, in some embodiments, one or more camera modules or detectors of a motion tracking and/or correction system and/or device can be optimally located in order to maximize the quality of motion data that is collected of a subject. Some of the following parameters are defined and/or described in the context of obtaining motion data of a head of a subject. However, it is to be understood that similar parameters can also be defined for other body portions of a subject that are of interest in a medical imaging scan and/or therapeutic procedure.
As illustrated, a subject of interest 112 can lie on a bed 114 of a medical imaging scanner and/or therapeutic device. One or more detectors or cameras 108 can be placed on or along the bore 104 of the scanner or therapeutic device.
A longitudinal detector position 1402 along the Z-axis between the center of the MRI scanner or therapeutic device and the camera module or detector 108 can exist for different detectors used in conjunction with different medical imaging scanners and/or therapeutic devices. In some embodiments, the longitudinal detector position 1402 can be about +/−0 mm, about +/−10 mm, about +/−20 mm, about +/−30 mm, about +/−40 mm, about +/−50 mm, about +/−60 mm, about +/−70 mm, about +/−80 mm, about +/−90 mm, about +/−100 mm, about +/−150 mm, about +/−200 mm, about +/−250 mm, about +/−300 mm, about +/−350 mm, about +/−400 mm, about +/−450 mm, about +/−500 mm, about +/−550 mm, about +/−600 mm, about +/−650 mm, about +/−700 mm, about +/−750 mm, about +/−800 mm, about +/−850 mm, about +/−900 mm, about +/−950 mm, about +/−1000 mm, and/or in a range defined by any two of the aforementioned values.
Similarly, a transversal detector position 1404, 1406 on the X-Y plane as defined herein can exist for different detectors used in conjunction with different medical imaging scanners and/or therapeutic devices. In certain embodiments, the transversal detector position 1404, 1406 can be about 0°, about 10°, about 20°, about 30°, about 40°, about 50°, about 60°, about 70°, about 80°, about 90°, about 110°, about 120°, about 130°, about 140°, about 150°, about 160°, about 170°, about 180°, and/or in a range defined by any two of the aforementioned values.
A transversal detector direction 1408, 1410 on the X-Y plane as defined herein can exist for different detectors used in conjunction with different medical imaging scanners and/or therapeutic devices, as the direction of a detector can be different from the position thereof. In some embodiments, the transversal detector direction 1408, 1410 can be about 0°, about 10°, about 20°, about 30°, about 40°, about 50°, about 60°, about 70°, about 80°, about 90°, about 110°, about 120°, about 130°, about 140°, about 150°, about 160°, about 170°, about 180°, and/or in a range defined by any two of the aforementioned values.
A transversal detector offset on the X-Y plane as defined herein can exist for different detectors used in conjunction with different medical imaging scanners and/or therapeutic devices, in which the transversal detector offset can be defined as the difference between the transversal detector position and the transversal detector direction. In certain embodiments, the transversal detector offset can be about 0°, about 10°, about 20°, about 30°, about 40°, about 50°, about 60°, about 70°, about 80°, about 90°, about 110°, about 120°, about 130°, about 140°, about 150°, about 160°, about 170°, about 180°, and/or in a range defined by any two of the aforementioned values.
An overlap point 1412 between the fields of view of the different detectors can exist on the X-Y plane as defined herein for different detectors used in conjunction with different medical imaging scanners and/or therapeutic devices. In some embodiments, x-y coordinates for the overlap point 1412 can each comprise about 0 mm, about 10 mm, about 20 mm, about 30 mm, about 40 mm, about 50 mm, about 60 mm, about 70 mm, about 80 mm, about 90 mm, about 100 mm, about 110 mm, about 120 mm, about 130 mm, about 140 mm, about 150 mm, about 160 mm, about 170 mm, about 180 mm, about 190 mm, about 200 mm, and/or in a range defined by any two of the aforementioned values.
Lastly, a scissor angle 1414, 1416 can exist between two detectors on the X-Y plane as defined herein for different detectors used in conjunction with different medical imaging scanners and/or therapeutic devices. In certain embodiments, a larger scissor angle can result in increased accuracy of tracking. In some embodiments, the scissor angle can be about 0°, about 10°, about 20°, about 30°, about 40°, about 50°, about 60°, about 70°, about 80°, about 90°, about 110°, about 120°, about 130°, about 140°, about 150°, about 160°, about 170°, about 180°, and/or in a range defined by any two of the aforementioned values.
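The transversal parameters defined above can be related to one another with a short Python sketch. The bore radius, detector positions, and detector directions used below are assumed example values and are not specific to any particular scanner or therapeutic device.

import math

def angle_difference_deg(a_deg, b_deg):
    # Unsigned angular difference on the X-Y plane, folded into 0-180 degrees.
    d = abs(a_deg - b_deg) % 360.0
    return 360.0 - d if d > 180.0 else d

def overlap_point(p1, dir1_deg, p2, dir2_deg):
    # Intersection of two detector sight lines (points in mm, directions in degrees).
    d1 = (math.cos(math.radians(dir1_deg)), math.sin(math.radians(dir1_deg)))
    d2 = (math.cos(math.radians(dir2_deg)), math.sin(math.radians(dir2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    s = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + s * d1[0], p1[1] + s * d1[1])

bore_radius_mm = 350.0
pos_a_deg, dir_a_deg = 60.0, 240.0    # detector A aimed at the bore center (assumed)
pos_b_deg, dir_b_deg = 120.0, 300.0   # detector B aimed at the bore center (assumed)
p_a = (bore_radius_mm * math.cos(math.radians(pos_a_deg)),
       bore_radius_mm * math.sin(math.radians(pos_a_deg)))
p_b = (bore_radius_mm * math.cos(math.radians(pos_b_deg)),
       bore_radius_mm * math.sin(math.radians(pos_b_deg)))
print("transversal detector offset A:", angle_difference_deg(pos_a_deg, dir_a_deg))  # 180.0
print("scissor angle:", angle_difference_deg(dir_a_deg, dir_b_deg))                  # 60.0
print("overlap point (mm):", overlap_point(p_a, dir_a_deg, p_b, dir_b_deg))          # near (0.0, 0.0)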
Each of the above-identified parameters can be different for an integrated motion tracking and/or correction system, for a retrofit motion tracking and/or correction system, and/or for specific head coils. Some example values of the above-identified parameters are included below. However, other values are also possible for each of the parameters depending on the detector and/or specifics of the medical imaging scanner and/or therapeutic device.
As illustrated, a subject of interest 112 can lie on a bed 114 of a medical imaging scanner and/or therapeutic device. A marker 110 as described herein may be placed on or near the nose of the subject 112. The head of the subject 112 may rotate, thereby rotating the marker 110 as well. As illustrated, a distance between the body coil and the marker 110 may be obtained.
More specifically, a minimum distance between the body coil and the marker 110 may be present, measured when the nose of the subject 112 is at the end of the head coil and/or is in contact with the head coil. A default distance between the body coil and the marker 110 may exist, measured when the head of the subject 112 is at a default position in the center of the head coil facing straight up towards the center of the top of the bore 104. Lastly, a maximum distance between the body coil and the marker 110 can be measured, for example when the head of the subject 112 is rotated 10 degrees for certain head coils. For other head coils, smaller or larger rotations are also or alternatively possible. For example, in certain embodiments, the head of the subject 112 can be rotated about 5 degrees, about 10 degrees, about 15 degrees, about 20 degrees, about 25 degrees, about 30 degrees, about 35 degrees, about 40 degrees, about 45 degrees, about 50 degrees, about 55 degrees, about 60 degrees, about 65 degrees, about 70 degrees, about 75 degrees, about 80 degrees, about 85 degrees, about 90 degrees, and/or within a range defined by two of the aforementioned values.
The minimum distance, default distance, and maximum distance between the body coil and the marker 110 can each be about 100 mm, about 110 mm, about 120 mm, about 130 mm, about 140 mm, about 150 mm, about 160 mm, about 170 mm, about 180 mm, about 190 mm, about 200 mm, about 210 mm, about 220 mm, about 230 mm, about 240 mm, about 250 mm, about 260 mm, about 270 mm, about 280 mm, about 290 mm, about 300 mm, about 310 mm, about 320 mm, about 330 mm, about 340 mm, about 350 mm, about 360 mm, about 370 mm, about 380 mm, about 390 mm, about 400 mm, and/or within a range defined by two of the aforementioned values.
A range of the distance between the body coil and the marker 110 can be obtained based on the above-identified parameters. Further, based on this range of the distance between the body coil and the marker 110, an alignment distance, defined as the distance from the detector module 108C, 108D to the marker 110, can be obtained, accounting for a range of +10 mm and rounded by −2/+4 mm.
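One plausible reading of this derivation, with the body-coil-to-detector offset and the rounding convention treated as assumptions rather than values taken from this disclosure, is sketched below:

```python
# Rough sketch under stated assumptions (not the specification's method): the
# alignment-distance range is derived from the body-coil-to-marker distance
# range by adding an assumed fixed body-coil-to-detector offset, widening the
# upper end by the +10 mm allowance, and rounding the bounds outward by
# -2 mm / +4 mm.
def alignment_distance_range(min_marker_mm: float, max_marker_mm: float,
                             coil_to_detector_offset_mm: float = 0.0):
    low = min_marker_mm + coil_to_detector_offset_mm - 2.0
    high = max_marker_mm + coil_to_detector_offset_mm + 10.0 + 4.0
    return (low, high)

# Example with a hypothetical 130-240 mm body-coil-to-marker range:
print(alignment_distance_range(130.0, 240.0))  # (128.0, 254.0)
```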
Each of the above-identified parameters can be different for an integrated motion tracking and/or correction system and a retrofit motion tracking and/or correction system. Further, each of the above-identified parameters can be different for different head coils being used, for example a head coil with a 64 head-neck (HN64) configuration or a 20 head-neck (HN20) configuration.
Some example values of the above-identified parameters are included below. However, other values are also possible for each of the parameters depending on the detector and/or specifics of the medical imaging scanner and/or therapeutic device.
In certain embodiments, the minimum distance 1602, 1604 between a body coil and a head coil 508 can be about 0 mm, about 10 mm, about 20 mm, about 30 mm, about 40 mm, about 50 mm, about 60 mm, about 70 mm, about 80 mm, about 90 mm, about 100 mm, about 110 mm, about 120 mm, about 130 mm, about 140 mm, about 150 mm, about 160 mm, about 170 mm, about 180 mm, about 190 mm, about 200 mm, about 210 mm, about 220 mm, about 230 mm, about 240 mm, about 250 mm, and/or within a range defined by two of the aforementioned values.
Each of the above-identified parameters can be different for an integrated motion tracking and/or correction system and a retrofit motion tracking and/or correction system. Further, each of the above-identified parameters can be different for different head coils being used, for example a head coil with a 64 head-neck configuration or a 20 head-neck configuration.
Some example values of the above-identified parameters are included below. However, other values are also possible for each of the parameters depending on the detector and/or specifics of the medical imaging scanner and/or therapeutic device.
Tables 11-14 below list parameters of various distances and fields of view of four embodiments of camera modules or detectors. More specifically, Tables 11 and 12 list parameters of various distances and fields of view of two embodiments of camera modules or detectors of an integrated motion tracking and/or correction system. Tables 13 and 14 list parameters of various distances and fields of view of two embodiments of camera modules or detectors of a retrofit motion tracking and/or correction system.
In Tables 11-14, the optics value refers to the field of view, depth of field, and resolution for tracking. The marker distance refers to the distance between the body coil and the marker. The lens working distance refers to the distance between the marker and the front lens of the detector module. The lens center distance refers to the distance between the marker and the center of the lens of the detector module. The camera alignment distance refers to the distance from the front of the detector module to the marker. The back focal length refers to the lens data at the lens working distance. The field of view radius refers to the distance between the optics center and the scanner center. The field of view (half angle) and field of view at marker refer to lens data. The body coil to head coil distance refers to the distance between the body coil and the exterior of the head coil. The LED hazard distance refers to the distance between the LED source and the head coil. The LED illumination distance refers to the distance from the LED source to the marker.
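For readers following the parameter definitions above, the container below simply restates them as named fields; the field names and units are assumptions, and the table values themselves are not reproduced:

```python
from dataclasses import dataclass

# Organizational sketch mirroring the Table 11-14 parameter definitions above.
@dataclass
class DetectorOpticsParameters:
    marker_distance_mm: float            # body coil to marker
    lens_working_distance_mm: float      # marker to front lens of the detector module
    lens_center_distance_mm: float       # marker to center of the lens
    camera_alignment_distance_mm: float  # front of the detector module to marker
    back_focal_length_mm: float          # lens data at the lens working distance
    field_of_view_radius_mm: float       # optics center to scanner center
    field_of_view_half_angle_deg: float  # lens data
    field_of_view_at_marker_mm: float    # lens data
    body_coil_to_head_coil_mm: float     # body coil to exterior of the head coil
    led_hazard_distance_mm: float        # LED source to head coil
    led_illumination_distance_mm: float  # LED source to marker
```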
L can refer to a length between the center of an optics opening or opening of the camera or detector module and the LED or other light source. In some embodiments, the length between the center of an optics opening or opening of the camera or detector module and the LED or other light source can be about 5 mm, about 6 mm, about 7 mm, about 8 mm, about 9 mm, about 10 mm, about 11 mm, about 12 mm, about 13 mm, about 14 mm, about 15 mm, about 16 mm, about 17 mm, about 18 mm, about 19 mm, about 20 mm, and/or within a range defined by two of the aforementioned values.
An observation angle of the camera or detector module, denoted α, can be calculated according to the following formula:
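The formula itself does not survive in this text. A plausible reconstruction, assuming the usual retroreflective geometry in which L is the lens-to-LED separation defined above and D is the distance from the detector module to the marker (for example, the camera alignment distance), is:

```latex
\alpha = \arctan\!\left(\frac{L}{D}\right)
```

With L between about 5 mm and about 20 mm and D on the order of a few hundred millimeters, this expression gives observation angles of roughly 1° to 5°, which is consistent with the values listed below.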
In some embodiments, the observation angle of the camera or detector module can be about 1.0°, about 1.2°, about 1.4°, about 1.6°, about 1.8°, about 2.0°, about 2.2°, about 2.4°, about 2.6°, about 2.8°, about 3.0°, about 3.2°, about 3.4°, about 3.6°, about 3.8°, about 4.0°, about 4.2°, about 4.4°, about 4.6°, about 4.8°, about 5.0°, and/or within a range defined by two of the aforementioned values.
The observation angle can be different with respect to a particular LED or other light source for a particular camera or detector module. For example, the observation angle with respect to a first LED or light source on the right side of
Some example values of the above-identified parameters are included below. The following example values are for embodiments in which L1 is about 8.75 mm and L2 is about 14.25 mm. However, other values are also possible for each of the parameters depending on the particular positioning and/or distance of the LED or other light source relative to the detector module and/or specifics of the medical imaging scanner and/or therapeutic device.
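As a worked example under the assumed relation above, with the L1 and L2 values just mentioned and an assumed (not tabulated) detector-to-marker distance of 250 mm:

```python
import math

# Worked example under the assumed relation alpha = arctan(L / D); the 250 mm
# detector-to-marker distance is a representative assumption, not a table value.
def observation_angle_deg(L_mm: float, D_mm: float) -> float:
    return math.degrees(math.atan(L_mm / D_mm))

for L_mm in (8.75, 14.25):
    print(L_mm, round(observation_angle_deg(L_mm, 250.0), 2))
# 8.75 -> ~2.0 degrees, 14.25 -> ~3.26 degrees, within the ~1-5 degree range above
```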
The entrance angle can also depend on the rotational configuration of the head of the subject. When the head of the subject is tilted, for example by 10°, the entrance angle can vary. Further, the entrance angle can be different for an integrated motion tracking and/or correction system and a retrofit motion tracking and/or correction system. The entrance angle can also depend on the particular size of a medical imaging scanner or therapeutic device. Further, the entrance angle can be different for different head coils being used, for example a head coil with a 64 head-neck configuration or a 20 head-neck configuration. This can be because the view of the marker at certain entrance angles may be blocked by the particular shape or configuration of the head coil.
In some embodiments, the entrance angle can be about −90°, about −85°, about −80°, about −75°, about −70°, about −65°, about −60°, about −55°, about −50°, about −45°, about −40°, about −35°, about −30°, about −25°, about −20°, about −15°, about −10°, about −5°, about 0°, about 5°, about 10°, about 15°, about 20°, about 25°, about 30°, about 35°, about 40°, about 45°, about 50°, about 55°, about 60°, about 65°, about 70°, about 75°, about 80°, about 85°, about 90°, and/or within a range defined by two of the aforementioned values.
Some example values of the entrance angle are included below. However, other values are also possible for the entrance angle depending on the detector and/or specifics of the medical imaging scanner and/or therapeutic device.
As illustrated, in embodiments comprising one or more detector modules, the line of sight or visual field of the detector modules may overlap. For example, the visual field of one or more detectors 108A, 108B, 108C, 108D may overlap with the visual field of one or more other detectors 108A, 108B, 108C, 108D. The visual field of two top-positioned detectors 108B, 108C may overlap. A horizontal distance of overlap 2008 between the two top detectors 108B, 108C, when viewed from a front view into the bore of the medical imaging scanner or therapeutic device, can be formed on the bottom or bed of the scanner or device. Similarly, a horizontal distance of overlap 2010 between the two top detectors 108B, 108C, when viewed from a side view into the bore of the medical imaging scanner or therapeutic device, can be formed on the bottom or bed of the scanner or device. These two distances 2008, 2010 can form the dimensions of the overlap in visual field between two top detectors 108B, 108C.
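The overlap dimensions 2008, 2010 can be thought of as the intersection of the two detectors' projected footprints on the bed; the sketch below uses an assumed axis-aligned-rectangle model that is not taken from this disclosure:

```python
# Illustrative sketch with an assumed geometry: each detector's visual field,
# projected onto the bed, is modeled as an axis-aligned rectangle
# (x = left-right across the bore, z = along the bore). The overlap
# corresponding to distances 2008 and 2010 is the intersection of the two.
def footprint_overlap(rect_a, rect_b):
    """Each rect is (x_min, x_max, z_min, z_max) in mm; returns (width, length)."""
    width = min(rect_a[1], rect_b[1]) - max(rect_a[0], rect_b[0])
    length = min(rect_a[3], rect_b[3]) - max(rect_a[2], rect_b[2])
    return (max(width, 0.0), max(length, 0.0))

# Example with two hypothetical top-detector footprints:
print(footprint_overlap((-120, 60, 0, 300), (-60, 120, 50, 350)))  # (120.0, 250.0)
```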
The motion tracking and/or correction system may have a tracking working distance, which can be defined as a range of distances in which the motion tracking is functional or operative. In some embodiments, the tracking working distance can be about 100 mm, about 110 mm, about 120 mm, about 130 mm, about 140 mm, about 150 mm, about 160 mm, about 170 mm, about 180 mm, about 190 mm, about 200 mm, about 210 mm, about 220 mm, about 230 mm, about 240 mm, about 250 mm, about 260 mm, about 270 mm, about 280 mm, about 290 mm, about 300 mm, about 310 mm, about 320 mm, about 330 mm, about 340 mm, about 350 mm, and/or within a range defined by two of the aforementioned values.
The motion tracking and/or correction system may also have a calibration target distance. The calibration target distance may be defined as a range of distances between a visual length 2004 of a top detector 108B, as limited by the particulars of the medical imaging scanner or therapeutic device, and a visual length 2006 of a side detector 108A, similarly limited by the particulars of the medical imaging scanner or therapeutic device.
In certain embodiments, the calibration target distance can be about 100 mm, about 110 mm, about 120 mm, about 130 mm, about 140 mm, about 150 mm, about 160 mm, about 170 mm, about 180 mm, about 190 mm, about 200 mm, about 210 mm, about 220 mm, about 230 mm, about 240 mm, about 250 mm, about 260 mm, about 270 mm, about 280 mm, about 290 mm, about 300 mm, about 310 mm, about 320 mm, about 330 mm, about 340 mm, about 350 mm, about 360 mm, about 370 mm, about 380 mm, about 390 mm, about 400 mm, about 410 mm, about 420 mm, about 430 mm, about 440 mm, about 450 mm, about 460 mm, about 470 mm, about 480 mm, about 490 mm, about 500 mm, about 510 mm, about 520 mm, about 530 mm, about 540 mm, about 550 mm, about 560 mm, about 570 mm, about 580 mm, about 590 mm, about 600 mm, about 610 mm, about 620 mm, about 630 mm, about 640 mm, about 650 mm, about 660 mm, about 670 mm, about 680 mm, about 690 mm, about 700 mm, and/or within a range defined by two of the aforementioned values.
In certain embodiments, an area of overlap between two detectors, for example two top detectors or two side detectors, can be a rectangle. Each of the length and width of the area of overlap can be about 10 mm, about 20 mm, about 30 mm, about 40 mm, about 50 mm, about 60 mm, about 70 mm, about 80 mm, about 90 mm, about 100 mm, about 110 mm, about 120 mm, about 130 mm, about 140 mm, about 150 mm, about 160 mm, about 170 mm, about 180 mm, about 190 mm, about 200 mm, about 210 mm, about 220 mm, about 230 mm, about 240 mm, about 250 mm, about 260 mm, about 270 mm, about 280 mm, about 290 mm, about 300 mm, and/or within a range defined by two of the aforementioned values.
Some values of the tracking working distance, calibration target distance, and detector overlap of the top detectors and/or side detectors are provided below. Each of the above-identified parameters can be different for an integrated motion tracking and/or correction system, a retrofit motion tracking and/or correction system, and/or different sizes or particulars of the medical imaging scanner or therapeutic device. Other values are also possible for each of the parameters depending on the detector and/or specifics of the medical imaging scanner and/or therapeutic device.
Marker Position with Respect to Subject
In some embodiments, one or more markers are configured to be used in conjunction with a motion tracking and/or correction system and/or device. The operation of a motion tracking and/or correction system can vary depending on the positioning of one or more markers, for example with respect to the subject.
As illustrated in
As illustrated in
As illustrated in
In addition or as an alternative to positioning one or more markers 110 on the side of the nose for motion compensation of head movement, one or more markers 110 can also be positioned anywhere on the subject's head that is within the field of view of the detector or camera systems. Such positions include, but are not limited to, the forehead, the cheeks, the chin, the upper lip, and the mouth, for example using a mouth guard or holder.
One or more markers 110 can also be positioned on other body portions for motion compensation of movements of those body portions. As with head movements, the one or more markers 110 can be positioned on the body portion and within the field of view of the detector or camera systems.
Marker Position with Respect to Medical Imaging Scanner and/or Therapeutic Device
The operation of a motion tracking and/or correction system can also or alternatively vary depending on the positioning of one or more markers, for example with respect to the medical imaging scanner and/or therapeutic device.
The particular position of a marker with respect to a medical imaging scanner and/or therapeutic device can be ascertained. For example, when viewed in a front view as illustrated in
Further, a first vertical distance 2204 along the y axis between the back or bottom of the subject's head and the center of the marker can be determined. In certain embodiments, the first vertical distance 2204 can be about 150 mm, about 160 mm, about 170 mm, about 180 mm, about 190 mm, about 200 mm, about 210 mm, about 220 mm, about 230 mm, about 240 mm, about 250 mm, and/or within a range defined by two of the aforementioned values.
Similarly, a second vertical distance 2206 along the y axis between the center of the subject's head and the center of a marker 110 can be determined. In some embodiments, the second vertical distance 2206 can be about 50 mm, about 60 mm, about 70 mm, about 80 mm, about 90 mm, about 100 mm, about 110 mm, about 120 mm, about 130 mm, about 140 mm, about 150 mm, and/or within a range defined by two of the aforementioned values.
Further, when viewed in a side view as illustrated in
Some values of the first horizontal distance 2202, first vertical distance 2204, second vertical distance 2206, and second horizontal distance 2210 are provided below. Each of the above-identified parameters can be different for an integrated motion tracking and/or correction system, a retrofit motion tracking and/or correction system, particulars of the head coil, and/or different sizes or particulars of the medical imaging scanner or therapeutic device. Other values are also possible for each of the parameters depending on the detector and/or specifics of the medical imaging scanner and/or therapeutic device.
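Purely as an organizational aid, the marker-position distances named above can be grouped as follows; the field names, units, and axis comments are assumptions tied to the reference numerals rather than values from this disclosure:

```python
from dataclasses import dataclass

# Organizational sketch grouping the marker-position distances named above.
@dataclass
class MarkerPosition:
    first_horizontal_distance_2202_mm: float   # horizontal offset of the marker center (front view)
    first_vertical_distance_2204_mm: float     # back/bottom of the head to the marker center (y axis)
    second_vertical_distance_2206_mm: float    # center of the head to the marker center (y axis)
    second_horizontal_distance_2210_mm: float  # horizontal offset of the marker center (side view)
```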
As discussed herein, in certain embodiments, one or more markers 110 can also be positioned on other body portions for motion compensation of movements of those body portions. One or more of the above-identified distances may be different in embodiments in which one or more other body portions are tracked.
Marker Rotation
In some embodiments, an optimal rotation of one or more markers and/or a range thereof can exist for optimal motion detection and correction by a motion tracking and/or correction system.
As illustrated in
In certain embodiments, each of the angle 2302 and a camera or detector entrance angle viewing the subject from the top of the subject's head as illustrated in
In other embodiments, a marker 110 can be attached to a nose mount instead of being attached to the side of a subject's nose. By attaching a marker 110 to a nose mount, the angle 2304 between the vertical line along the y axis, as illustrated in
In certain embodiments, each of the angle 2304 and the camera or detector entrance angle viewing the subject from the top of the subject's head as illustrated in
Further, due to such configuration, retroreflection on one or more cameras or detectors may be improved and/or be symmetric or more symmetric compared to an embodiment in which a marker 110 is attached to the side of the nose for viewing the subject in the illustrated direction of
Similarly, in embodiments in which a marker 110 is attached to the side of a nose, an angle 2306 may exist between an extended line along the marker 110 and a vertical line along the z axis when viewed from a front view as illustrated in
In certain embodiments, each of the angle 2306 and camera or detector entrance angle viewing the subject from above the subject's face as illustrated in
In other embodiments, by attaching a marker 110 to a nose mount, the angle 2308 between an extended line along the marker 110 and a vertical line along the z axis when viewed from a front view as illustrated in
In certain embodiments, each of the angle 2308 and the camera or detector entrance angle viewing the subject from the top of the subject's head as illustrated in FIG. 23B can be about −90°, about −85°, about −80°, about −75°, about −70°, about −65°, about −60°, about −55°, about −50°, about −45°, about −40°, about −35°, about −30°, about −25°, about −20°, about −15°, about −10°, about −5°, about 0°, about 5°, about 10°, about 15°, about 20°, about 25°, about 30°, about 35°, about 40°, about 45°, about 50°, about 55°, about 60°, about 65°, about 70°, about 75°, about 80°, about 85°, about 90° and/or within a range defined by two of the aforementioned values.
Further, due to such configuration, retroreflection on one or more cameras or detectors may be improved and/or be symmetric or more symmetric compared to an embodiment in which a marker 110 is attached to the side of the nose for viewing the subject in the illustrated direction of
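The symmetry argument above can be made concrete with a simple planar sketch; the geometry below (two detectors viewing the marker from directions at +θ and −θ about the vertical, with the marker normal tilted by δ) is an assumption used only for illustration:

```python
# Illustrative planar sketch (assumed geometry, not from the specification):
# two detectors view the marker from directions tilted by +theta and -theta
# about the vertical, and the marker normal is tilted by delta. The entrance
# angles at the two detectors are then theta - delta and theta + delta; a nose
# mount holding delta near zero makes the two entrance angles symmetric.
def entrance_angles(detector_half_angle_deg: float, marker_tilt_deg: float):
    return (detector_half_angle_deg - marker_tilt_deg,
            detector_half_angle_deg + marker_tilt_deg)

print(entrance_angles(30.0, 15.0))  # (15.0, 45.0): marker on the side of the nose
print(entrance_angles(30.0, 0.0))   # (30.0, 30.0): nose mount, symmetric
```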
Although this invention has been disclosed in the context of certain preferred embodiments and examples, it will be understood by those skilled in the art that the present invention extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses of the invention and obvious modifications and equivalents thereof. Additionally, the skilled artisan will recognize that any of the above-described methods can be carried out using any appropriate apparatus 100. Further, the disclosure herein of any particular feature, aspect, method, property, characteristic, quality, attribute, element, or the like in connection with an embodiment can be used in all other embodiments set forth herein. For all of the embodiments described herein the steps of the methods need not be performed sequentially. Thus, it is intended that the scope of the present invention herein disclosed should not be limited by the particular disclosed embodiments described above.
Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The headings used herein are for the convenience of the reader only and are not meant to limit the scope of the inventions or claims.
The ranges disclosed herein also encompass any and all overlap, sub-ranges, and combinations thereof. Language such as “up to,” “at least,” “greater than,” “less than,” “between,” and the like includes the number recited. Numbers preceded by a term such as “about” or “approximately” include the recited numbers. For example, “about 3 mm” includes “3 mm.”
The headings provided herein, if any, are for convenience only and do not necessarily affect the scope or meaning of the devices and methods disclosed herein.
The present application claims the benefit under 35 U.S.C. §119(e) of U.S. Provisional Application No. 62/258,915, filed Nov. 23, 2015, and entitled “SENSORS FOR MOTION COMPENSATION IN MEDICAL IMAGING SCANS,” U.S. Provisional Application No. 62/332,402, filed May 5, 2016, and entitled “SYSTEMS, DEVICES, AND METHODS FOR TRACKING AND COMPENSATING FOR PATIENT MOTION DURING A MEDICAL IMAGING SCAN,” and U.S. Provisional Application No. 62/333,023, filed May 6, 2016, and entitled “SYSTEMS, DEVICES, AND METHODS FOR TRACKING AND COMPENSATING FOR PATIENT MOTION DURING A MEDICAL IMAGING SCAN.” The foregoing applications are hereby incorporated herein by reference in their entirety under 37 C.F.R. §1.57.
Number | Name | Date | Kind |
---|---|---|---|
3811213 | Eaves | May 1974 | A |
4689999 | Shkedi | Sep 1987 | A |
4724386 | Haacke et al. | Feb 1988 | A |
4894129 | Leiponen et al. | Jan 1990 | A |
4923295 | Sireul et al. | May 1990 | A |
4953554 | Zerhouni et al. | Sep 1990 | A |
4988886 | Palum et al. | Jan 1991 | A |
5075562 | Greivenkamp et al. | Dec 1991 | A |
5318026 | Pelc | Jun 1994 | A |
5515711 | Hinkle | May 1996 | A |
5545993 | Taguchi et al. | Aug 1996 | A |
5615677 | Pelc et al. | Apr 1997 | A |
5687725 | Wendt | Nov 1997 | A |
5728935 | Czompo | Mar 1998 | A |
5802202 | Yamada et al. | Sep 1998 | A |
5808376 | Gordon et al. | Sep 1998 | A |
5835223 | Zawemer et al. | Nov 1998 | A |
5877732 | Ziarati | Mar 1999 | A |
5886257 | Gustafson et al. | Mar 1999 | A |
5889505 | Toyama | Mar 1999 | A |
5891060 | McGregor | Apr 1999 | A |
5936722 | Armstrong et al. | Aug 1999 | A |
5936723 | Schmidt et al. | Aug 1999 | A |
5947900 | Derbyshire et al. | Sep 1999 | A |
5987349 | Schulz | Nov 1999 | A |
6016439 | Acker | Jan 2000 | A |
6031888 | Ivan et al. | Feb 2000 | A |
6044308 | Huissoon | Mar 2000 | A |
6057680 | Foo et al. | May 2000 | A |
6061644 | Leis | May 2000 | A |
6088482 | He | Jul 2000 | A |
6144875 | Schweikard et al. | Nov 2000 | A |
6175756 | Ferre | Jan 2001 | B1 |
6236737 | Gregson et al. | May 2001 | B1 |
6246900 | Cosman et al. | Jun 2001 | B1 |
6279579 | Riaziat et al. | Aug 2001 | B1 |
6285902 | Kienzle, III et al. | Sep 2001 | B1 |
6289235 | Webber | Sep 2001 | B1 |
6292683 | Gupta et al. | Sep 2001 | B1 |
6298262 | Franck et al. | Oct 2001 | B1 |
6381485 | Hunter et al. | Apr 2002 | B1 |
6384908 | Schmidt et al. | May 2002 | B1 |
6390982 | Bova et al. | May 2002 | B1 |
6402762 | Hunter et al. | Jun 2002 | B2 |
6405072 | Cosman | Jun 2002 | B1 |
6421551 | Kuth et al. | Jul 2002 | B1 |
6467905 | Stahl et al. | Oct 2002 | B1 |
6474159 | Foxlin et al. | Nov 2002 | B1 |
6484131 | Amoral-Moriya et al. | Nov 2002 | B1 |
6490475 | Seeley et al. | Dec 2002 | B1 |
6501981 | Schweikard et al. | Dec 2002 | B1 |
6587707 | Nehrke et al. | Jul 2003 | B2 |
6621889 | Mostafavi | Sep 2003 | B1 |
6650920 | Schaldach et al. | Nov 2003 | B2 |
6662036 | Cosman | Dec 2003 | B2 |
6687528 | Gupta et al. | Feb 2004 | B2 |
6690965 | Riaziat et al. | Feb 2004 | B1 |
6711431 | Sarin et al. | Mar 2004 | B2 |
6731970 | Schlossbauer et al. | May 2004 | B2 |
6758218 | Anthony | Jul 2004 | B2 |
6771997 | Schaffer | Aug 2004 | B2 |
6794869 | Brittain | Sep 2004 | B2 |
6856827 | Seeley et al. | Feb 2005 | B2 |
6856828 | Cossette et al. | Feb 2005 | B2 |
6876198 | Watanabe et al. | Apr 2005 | B2 |
6888924 | Claus et al. | May 2005 | B2 |
6891374 | Brittain | May 2005 | B2 |
6892089 | Prince et al. | May 2005 | B1 |
6897655 | Brittain et al. | May 2005 | B2 |
6913603 | Knopp et al. | Jul 2005 | B2 |
6937696 | Mostafavi | Aug 2005 | B1 |
6959266 | Mostafavi | Oct 2005 | B1 |
6973202 | Mostafavi | Dec 2005 | B2 |
6980679 | Jeung et al. | Dec 2005 | B2 |
7007699 | Martinelli et al. | Mar 2006 | B2 |
7107091 | Jutras et al. | Sep 2006 | B2 |
7110805 | Machida | Sep 2006 | B2 |
7123758 | Jeung et al. | Oct 2006 | B2 |
7171257 | Thomson | Jan 2007 | B2 |
7173426 | Bulumulla et al. | Feb 2007 | B1 |
7176440 | Cofer et al. | Feb 2007 | B2 |
7191100 | Mostafavi | Mar 2007 | B2 |
7204254 | Riaziat et al. | Apr 2007 | B2 |
7209777 | Saranathan et al. | Apr 2007 | B2 |
7209977 | Acharya et al. | Apr 2007 | B2 |
7260253 | Rahn et al. | Aug 2007 | B2 |
7260426 | Schweikard et al. | Aug 2007 | B2 |
7295007 | Dold | Nov 2007 | B2 |
7313430 | Urquhart et al. | Dec 2007 | B2 |
7327865 | Fu et al. | Feb 2008 | B2 |
7348776 | Aksoy et al. | Mar 2008 | B1 |
7403638 | Jeung et al. | Jul 2008 | B2 |
7494277 | Setala | Feb 2009 | B2 |
7498811 | Macfarlane et al. | Mar 2009 | B2 |
7502413 | Guillaume | Mar 2009 | B2 |
7505805 | Kuroda | Mar 2009 | B2 |
7535411 | Falco | May 2009 | B2 |
7551089 | Sawyer | Jun 2009 | B2 |
7561909 | Pai et al. | Jul 2009 | B1 |
7567697 | Mostafavi | Jul 2009 | B2 |
7573269 | Yao | Aug 2009 | B2 |
7602301 | Stirling et al. | Oct 2009 | B1 |
7603155 | Jensen | Oct 2009 | B2 |
7623623 | Raanes et al. | Nov 2009 | B2 |
7657300 | Hunter et al. | Feb 2010 | B2 |
7657301 | Mate et al. | Feb 2010 | B2 |
7659521 | Pedroni | Feb 2010 | B2 |
7660623 | Hunter et al. | Feb 2010 | B2 |
7668288 | Conwell et al. | Feb 2010 | B2 |
7689263 | Fung et al. | Mar 2010 | B1 |
7702380 | Dean | Apr 2010 | B1 |
7715604 | Sun et al. | May 2010 | B2 |
7742077 | Sablak et al. | Jun 2010 | B2 |
7742621 | Hammoud et al. | Jun 2010 | B2 |
7742804 | Faul et al. | Jun 2010 | B2 |
7744528 | Wallace et al. | Jun 2010 | B2 |
7760908 | Curtner et al. | Jul 2010 | B2 |
7766837 | Pedrizzetti et al. | Aug 2010 | B2 |
7769430 | Mostafavi | Aug 2010 | B2 |
7772569 | Bewersdorf et al. | Aug 2010 | B2 |
7787011 | Zhou et al. | Aug 2010 | B2 |
7787935 | Dumoulin et al. | Aug 2010 | B2 |
7791808 | French et al. | Sep 2010 | B2 |
7792249 | Gertner et al. | Sep 2010 | B2 |
7796154 | Senior et al. | Sep 2010 | B2 |
7798730 | Westerweck | Sep 2010 | B2 |
7801330 | Zhang et al. | Sep 2010 | B2 |
7805987 | Smith | Oct 2010 | B1 |
7806604 | Bazakos et al. | Oct 2010 | B2 |
7817046 | Coveley et al. | Oct 2010 | B2 |
7817824 | Liang et al. | Oct 2010 | B2 |
7819818 | Ghajar | Oct 2010 | B2 |
7833221 | Voegele | Nov 2010 | B2 |
7834846 | Bell | Nov 2010 | B1 |
7835783 | Aletras | Nov 2010 | B1 |
7839551 | Lee et al. | Nov 2010 | B2 |
7840253 | Tremblay et al. | Nov 2010 | B2 |
7844094 | Jeung et al. | Nov 2010 | B2 |
7844320 | Shahidi | Nov 2010 | B2 |
7850526 | Zalewski et al. | Dec 2010 | B2 |
7860301 | Se et al. | Dec 2010 | B2 |
7866818 | Schroeder et al. | Jan 2011 | B2 |
7868282 | Lee et al. | Jan 2011 | B2 |
7878652 | Chen et al. | Feb 2011 | B2 |
7883415 | Larsen et al. | Feb 2011 | B2 |
7889907 | Engelbart et al. | Feb 2011 | B2 |
7894877 | Lewin et al. | Feb 2011 | B2 |
7902825 | Bammer et al. | Mar 2011 | B2 |
7907987 | Dempsey | Mar 2011 | B2 |
7908060 | Basson et al. | Mar 2011 | B2 |
7908233 | Angell et al. | Mar 2011 | B2 |
7911207 | Macfarlane et al. | Mar 2011 | B2 |
7912532 | Schmidt et al. | Mar 2011 | B2 |
7920250 | Robert et al. | Apr 2011 | B2 |
7920911 | Hoshino et al. | Apr 2011 | B2 |
7925066 | Ruohonen et al. | Apr 2011 | B2 |
7925549 | Looney et al. | Apr 2011 | B2 |
7931370 | Prat Bartomeu | Apr 2011 | B2 |
7944354 | Kangas et al. | May 2011 | B2 |
7944454 | Zhou et al. | May 2011 | B2 |
7945304 | Feinberg | May 2011 | B2 |
7946921 | Ofek et al. | May 2011 | B2 |
7962197 | Rioux et al. | Jun 2011 | B2 |
7971999 | Zinser | Jul 2011 | B2 |
7977942 | White | Jul 2011 | B2 |
7978925 | Souchard | Jul 2011 | B1 |
7988288 | Donaldson | Aug 2011 | B2 |
7990365 | Marvit et al. | Aug 2011 | B2 |
8005571 | Sutherland et al. | Aug 2011 | B2 |
8009198 | Alhadef | Aug 2011 | B2 |
8019170 | Wang et al. | Sep 2011 | B2 |
8021231 | Walker et al. | Sep 2011 | B2 |
8022982 | Thorn | Sep 2011 | B2 |
8024026 | Groszmann | Sep 2011 | B2 |
8031909 | Se et al. | Oct 2011 | B2 |
8031933 | Se et al. | Oct 2011 | B2 |
8036425 | Hou | Oct 2011 | B2 |
8041077 | Bell | Oct 2011 | B2 |
8041412 | Glossop et al. | Oct 2011 | B2 |
8048002 | Ghajar | Nov 2011 | B2 |
8049867 | Bridges et al. | Nov 2011 | B2 |
8055020 | Meuter et al. | Nov 2011 | B2 |
8055049 | Stayman et al. | Nov 2011 | B2 |
8060185 | Hunter et al. | Nov 2011 | B2 |
8063929 | Kurtz et al. | Nov 2011 | B2 |
8073197 | Xu et al. | Dec 2011 | B2 |
8077914 | Kaplan | Dec 2011 | B1 |
8085302 | Zhang et al. | Dec 2011 | B2 |
8086026 | Schulz | Dec 2011 | B2 |
8086299 | Adler et al. | Dec 2011 | B2 |
RE43147 | Aviv | Jan 2012 | E |
8094193 | Peterson | Jan 2012 | B2 |
8095203 | Wright et al. | Jan 2012 | B2 |
8095209 | Flaherty | Jan 2012 | B2 |
8098889 | Zhu et al. | Jan 2012 | B2 |
8113991 | Kutliroff | Feb 2012 | B2 |
8116527 | Sabol | Feb 2012 | B2 |
8121356 | Friedman | Feb 2012 | B2 |
8121361 | Ernst et al. | Feb 2012 | B2 |
8134597 | Thorn | Mar 2012 | B2 |
8135201 | Smith et al. | Mar 2012 | B2 |
8139029 | Boillot | Mar 2012 | B2 |
8139896 | Ahiska | Mar 2012 | B1 |
8144118 | Hildreth | Mar 2012 | B2 |
8144148 | El Dokor | Mar 2012 | B2 |
8150063 | Chen | Apr 2012 | B2 |
8150498 | Gielen et al. | Apr 2012 | B2 |
8160304 | Rhoads | Apr 2012 | B2 |
8165844 | Luinge et al. | Apr 2012 | B2 |
8167802 | Baba et al. | May 2012 | B2 |
8172573 | Sonenfeld et al. | May 2012 | B2 |
8175332 | Herrington | May 2012 | B2 |
8179604 | Prada Gomez et al. | May 2012 | B1 |
8180428 | Kaiser et al. | May 2012 | B2 |
8180432 | Sayeh | May 2012 | B2 |
8187097 | Zhang | May 2012 | B1 |
8189869 | Bell | May 2012 | B2 |
8189889 | Pearlstein et al. | May 2012 | B2 |
8189926 | Sharma | May 2012 | B2 |
8190233 | Dempsey | May 2012 | B2 |
8191359 | White et al. | Jun 2012 | B2 |
8194134 | Furukawa | Jun 2012 | B2 |
8195084 | Xiao | Jun 2012 | B2 |
8199983 | Qureshi | Jun 2012 | B2 |
8206219 | Shum | Jun 2012 | B2 |
8207967 | El Dokor | Jun 2012 | B1 |
8208758 | Wang | Jun 2012 | B2 |
8213693 | Li | Jul 2012 | B1 |
8214012 | Zuccolotto et al. | Jul 2012 | B2 |
8214016 | Lavallee et al. | Jul 2012 | B2 |
8216016 | Yamagishi et al. | Jul 2012 | B2 |
8218818 | Cobb | Jul 2012 | B2 |
8218819 | Cobb | Jul 2012 | B2 |
8218825 | Gordon | Jul 2012 | B2 |
8221399 | Amano | Jul 2012 | B2 |
8223147 | El Dokor | Jul 2012 | B1 |
8224423 | Faul | Jul 2012 | B2 |
8226574 | Whillock | Jul 2012 | B2 |
8229163 | Coleman | Jul 2012 | B2 |
8229166 | Teng | Jul 2012 | B2 |
8229184 | Benkley | Jul 2012 | B2 |
8232872 | Zeng | Jul 2012 | B2 |
8235529 | Raffle | Aug 2012 | B1 |
8235530 | Maad | Aug 2012 | B2 |
8241125 | Hughes | Aug 2012 | B2 |
8243136 | Aota | Aug 2012 | B2 |
8243269 | Matousek | Aug 2012 | B2 |
8243996 | Steinberg | Aug 2012 | B2 |
8248372 | Saila | Aug 2012 | B2 |
8249691 | Chase et al. | Aug 2012 | B2 |
8253770 | Kurtz | Aug 2012 | B2 |
8253774 | Huitema | Aug 2012 | B2 |
8253778 | Atsushi | Aug 2012 | B2 |
8259109 | El Dokor | Sep 2012 | B2 |
8260036 | Hamza et al. | Sep 2012 | B2 |
8279288 | Son | Oct 2012 | B2 |
8284157 | Markovic | Oct 2012 | B2 |
8284847 | Adermann | Oct 2012 | B2 |
8287373 | Marks et al. | Oct 2012 | B2 |
8289390 | Aggarwal | Oct 2012 | B2 |
8289392 | Senior et al. | Oct 2012 | B2 |
8290208 | Kurtz | Oct 2012 | B2 |
8290229 | Qureshi | Oct 2012 | B2 |
8295573 | Bredno et al. | Oct 2012 | B2 |
8301226 | Csavoy et al. | Oct 2012 | B2 |
8306260 | Zhu | Nov 2012 | B2 |
8306267 | Gossweiler, III | Nov 2012 | B1 |
8306274 | Grycewicz | Nov 2012 | B2 |
8306663 | Wickham | Nov 2012 | B2 |
8310656 | Zalewski | Nov 2012 | B2 |
8310662 | Mehr | Nov 2012 | B2 |
8311611 | Csavoy et al. | Nov 2012 | B2 |
8314854 | Yoon | Nov 2012 | B2 |
8315691 | Sumanaweera et al. | Nov 2012 | B2 |
8316324 | Boillot | Nov 2012 | B2 |
8320621 | McEldowney | Nov 2012 | B2 |
8320709 | Arartani et al. | Nov 2012 | B2 |
8323106 | Zalewski | Dec 2012 | B2 |
8325228 | Mariadoss | Dec 2012 | B2 |
8330811 | Maguire, Jr. | Dec 2012 | B2 |
8330812 | Maguire, Jr. | Dec 2012 | B2 |
8331019 | Cheong | Dec 2012 | B2 |
8334900 | Qu et al. | Dec 2012 | B2 |
8339282 | Noble | Dec 2012 | B2 |
8351651 | Lee | Jan 2013 | B2 |
8368586 | Mohamadi | Feb 2013 | B2 |
8369574 | Hu | Feb 2013 | B2 |
8374393 | Cobb | Feb 2013 | B2 |
8374411 | Ernst et al. | Feb 2013 | B2 |
8374674 | Gertner | Feb 2013 | B2 |
8376226 | Dennard | Feb 2013 | B2 |
8376827 | Cammegh | Feb 2013 | B2 |
8379927 | Taylor | Feb 2013 | B2 |
8380284 | Saranathan et al. | Feb 2013 | B2 |
8386011 | Wieczorek | Feb 2013 | B2 |
8390291 | Macfarlane et al. | Mar 2013 | B2 |
8390729 | Long | Mar 2013 | B2 |
8395620 | El Dokor | Mar 2013 | B2 |
8396654 | Simmons et al. | Mar 2013 | B1 |
8400398 | Schoen | Mar 2013 | B2 |
8400490 | Apostolopoulos | Mar 2013 | B2 |
8405491 | Fong | Mar 2013 | B2 |
8405656 | El Dokor | Mar 2013 | B2 |
8405717 | Kim | Mar 2013 | B2 |
8406845 | Komistek et al. | Mar 2013 | B2 |
8411931 | Zhou | Apr 2013 | B2 |
8427538 | Ahiska | Apr 2013 | B2 |
8428319 | Tsin et al. | Apr 2013 | B2 |
8571293 | Ernst et al. | Oct 2013 | B2 |
8600213 | Mestha et al. | Dec 2013 | B2 |
8615127 | Fitzpatrick | Dec 2013 | B2 |
8617081 | Mestha et al. | Dec 2013 | B2 |
8744154 | Van Den Brink | Jun 2014 | B2 |
8747382 | D'Souza | Jun 2014 | B2 |
8768438 | Mestha et al. | Jul 2014 | B2 |
8790269 | Xu et al. | Jul 2014 | B2 |
8792969 | Bernal et al. | Jul 2014 | B2 |
8805019 | Jeanne et al. | Aug 2014 | B2 |
8848977 | Bammer et al. | Sep 2014 | B2 |
8855384 | Kyal et al. | Oct 2014 | B2 |
8862420 | Ferran et al. | Oct 2014 | B2 |
8873812 | Larlus-Larrondo et al. | Oct 2014 | B2 |
8953847 | Moden | Feb 2015 | B2 |
8971985 | Bernal et al. | Mar 2015 | B2 |
8977347 | Mestha et al. | Mar 2015 | B2 |
8995754 | Wu et al. | Mar 2015 | B2 |
8996094 | Schouenborg et al. | Mar 2015 | B2 |
9020185 | Mestha et al. | Apr 2015 | B2 |
9036877 | Kyal et al. | May 2015 | B2 |
9076212 | Ernst et al. | Jul 2015 | B2 |
9082177 | Sebok | Jul 2015 | B2 |
9084629 | Rosa | Jul 2015 | B1 |
9103897 | Herbst et al. | Aug 2015 | B2 |
9138175 | Ernst et al. | Sep 2015 | B2 |
9173715 | Baumgartner | Nov 2015 | B2 |
9176932 | Baggen et al. | Nov 2015 | B2 |
9194929 | Siegert et al. | Nov 2015 | B2 |
9226691 | Bernal et al. | Jan 2016 | B2 |
9305365 | Lovberg et al. | Apr 2016 | B2 |
9318012 | Johnson | Apr 2016 | B2 |
9336594 | Kyal et al. | May 2016 | B2 |
9395386 | Corder et al. | Jul 2016 | B2 |
9433386 | Mestha et al. | Sep 2016 | B2 |
9436277 | Furst et al. | Sep 2016 | B2 |
9443289 | Xu et al. | Sep 2016 | B2 |
9451926 | Kinahan et al. | Sep 2016 | B2 |
9453898 | Nielsen et al. | Sep 2016 | B2 |
9504426 | Kyal et al. | Nov 2016 | B2 |
9606209 | Ernst et al. | Mar 2017 | B2 |
9607377 | Lovberg et al. | Mar 2017 | B2 |
9629595 | Walker | Apr 2017 | B2 |
9693710 | Mestha et al. | Jul 2017 | B2 |
9734589 | Yu et al. | Aug 2017 | B2 |
9779502 | Lovberg et al. | Oct 2017 | B1 |
20020082496 | Kuth | Jun 2002 | A1 |
20020087101 | Barrick et al. | Jul 2002 | A1 |
20020091422 | Greenberg et al. | Jul 2002 | A1 |
20020115931 | Strauss et al. | Aug 2002 | A1 |
20020180436 | Dale et al. | Dec 2002 | A1 |
20020188194 | Cosman | Dec 2002 | A1 |
20030063292 | Mostafavi | Apr 2003 | A1 |
20030088177 | Totterman et al. | May 2003 | A1 |
20030116166 | Anthony | Jun 2003 | A1 |
20030130574 | Stoyle | Jul 2003 | A1 |
20030195526 | Vilsmeir | Oct 2003 | A1 |
20040071324 | Norris et al. | Apr 2004 | A1 |
20040116804 | Mostafavi | Jun 2004 | A1 |
20040140804 | Polzin et al. | Jul 2004 | A1 |
20040171927 | Lowen et al. | Sep 2004 | A1 |
20050027194 | Adler et al. | Feb 2005 | A1 |
20050054910 | Tremblay et al. | Mar 2005 | A1 |
20050070784 | Komura et al. | Mar 2005 | A1 |
20050105772 | Voronka et al. | May 2005 | A1 |
20050107685 | Seeber | May 2005 | A1 |
20050137475 | Dold et al. | Jun 2005 | A1 |
20050148845 | Dean et al. | Jul 2005 | A1 |
20050148854 | Ito et al. | Jul 2005 | A1 |
20050265516 | Haider | Dec 2005 | A1 |
20050283068 | Zuccoloto et al. | Dec 2005 | A1 |
20060004281 | Saracen | Jan 2006 | A1 |
20060045310 | Tu et al. | Mar 2006 | A1 |
20060074292 | Thomson et al. | Apr 2006 | A1 |
20060241405 | Leitner et al. | Oct 2006 | A1 |
20070049794 | Glassenberg et al. | Mar 2007 | A1 |
20070093709 | Abernathie | Apr 2007 | A1 |
20070206836 | Yoon | Sep 2007 | A1 |
20070239169 | Plaskos et al. | Oct 2007 | A1 |
20070280508 | Ernst et al. | Dec 2007 | A1 |
20080039713 | Thomson et al. | Feb 2008 | A1 |
20080181358 | Van Kampen et al. | Jul 2008 | A1 |
20080183074 | Carls et al. | Jul 2008 | A1 |
20080212835 | Tavor | Sep 2008 | A1 |
20080221442 | Tolowsky et al. | Sep 2008 | A1 |
20080273754 | Hick et al. | Nov 2008 | A1 |
20080287728 | Hassan et al. | Nov 2008 | A1 |
20080287780 | Chase et al. | Nov 2008 | A1 |
20080317313 | Goddard et al. | Dec 2008 | A1 |
20090028411 | Pfeuffer | Jan 2009 | A1 |
20090052760 | Smith et al. | Feb 2009 | A1 |
20090185663 | Gaines, Jr. et al. | Jul 2009 | A1 |
20090187112 | Meir et al. | Jul 2009 | A1 |
20090209846 | Bammer | Aug 2009 | A1 |
20090253985 | Shachar et al. | Oct 2009 | A1 |
20090304297 | Adabala et al. | Dec 2009 | A1 |
20090306499 | Van Vorhis et al. | Dec 2009 | A1 |
20100054579 | Okutomi | Mar 2010 | A1 |
20100057059 | Makino | Mar 2010 | A1 |
20100059679 | Albrecht | Mar 2010 | A1 |
20100069742 | Partain et al. | Mar 2010 | A1 |
20100091089 | Cromwell et al. | Apr 2010 | A1 |
20100099981 | Fishel | Apr 2010 | A1 |
20100125191 | Sahin | May 2010 | A1 |
20100137709 | Gardner et al. | Jun 2010 | A1 |
20100148774 | Kamata | Jun 2010 | A1 |
20100149099 | Elias | Jun 2010 | A1 |
20100149315 | Qu | Jun 2010 | A1 |
20100160775 | Pankratov | Jun 2010 | A1 |
20100164862 | Sullivan | Jul 2010 | A1 |
20100165293 | Tanassi et al. | Jul 2010 | A1 |
20100167246 | Ghajar | Jul 2010 | A1 |
20100172567 | Prokoski | Jul 2010 | A1 |
20100177929 | Kurtz | Jul 2010 | A1 |
20100178966 | Suydoux | Jul 2010 | A1 |
20100179390 | Davis | Jul 2010 | A1 |
20100179413 | Kadour et al. | Jul 2010 | A1 |
20100183196 | Fu et al. | Jul 2010 | A1 |
20100191631 | Weidmann | Jul 2010 | A1 |
20100194879 | Pasveer | Aug 2010 | A1 |
20100198067 | Mahfouz | Aug 2010 | A1 |
20100198101 | Song | Aug 2010 | A1 |
20100198112 | Maad | Aug 2010 | A1 |
20100199232 | Mistry | Aug 2010 | A1 |
20100210350 | Walker | Aug 2010 | A9 |
20100214267 | Radivojevic | Aug 2010 | A1 |
20100231511 | Henty | Sep 2010 | A1 |
20100231692 | Perlman | Sep 2010 | A1 |
20100245536 | Huitema | Sep 2010 | A1 |
20100245593 | Kim | Sep 2010 | A1 |
20100251924 | Taylor | Oct 2010 | A1 |
20100253762 | Cheong | Oct 2010 | A1 |
20100268072 | Hall et al. | Oct 2010 | A1 |
20100277571 | Xu | Nov 2010 | A1 |
20100282902 | Rajasingham | Nov 2010 | A1 |
20100283833 | Yeh | Nov 2010 | A1 |
20100284119 | Coakley | Nov 2010 | A1 |
20100289899 | Hendron | Nov 2010 | A1 |
20100290668 | Friedman | Nov 2010 | A1 |
20100292841 | Wickham | Nov 2010 | A1 |
20100295718 | Mohamadi | Nov 2010 | A1 |
20100296701 | Hu | Nov 2010 | A1 |
20100302142 | French | Dec 2010 | A1 |
20100303289 | Polzin | Dec 2010 | A1 |
20100311512 | Lock | Dec 2010 | A1 |
20100321505 | Kokubun | Dec 2010 | A1 |
20100328055 | Fong | Dec 2010 | A1 |
20100328201 | Marbit | Dec 2010 | A1 |
20100328267 | Chen | Dec 2010 | A1 |
20100330912 | Saila | Dec 2010 | A1 |
20110001699 | Jacobsen | Jan 2011 | A1 |
20110006991 | Elias | Jan 2011 | A1 |
20110007939 | Teng | Jan 2011 | A1 |
20110007946 | Liang | Jan 2011 | A1 |
20110008759 | Usui | Jan 2011 | A1 |
20110015521 | Faul | Jan 2011 | A1 |
20110019001 | Rhoads | Jan 2011 | A1 |
20110025853 | Richardson | Feb 2011 | A1 |
20110038520 | Yui | Feb 2011 | A1 |
20110043631 | Marman | Feb 2011 | A1 |
20110043759 | Bushinsky | Feb 2011 | A1 |
20110050562 | Schoen | Mar 2011 | A1 |
20110050569 | Marvit | Mar 2011 | A1 |
20110050947 | Marman | Mar 2011 | A1 |
20110052002 | Cobb | Mar 2011 | A1 |
20110052003 | Cobb | Mar 2011 | A1 |
20110052015 | Saund | Mar 2011 | A1 |
20110054870 | Dariush | Mar 2011 | A1 |
20110057816 | Noble | Mar 2011 | A1 |
20110058020 | Dieckmann | Mar 2011 | A1 |
20110064290 | Punithakaumar | Mar 2011 | A1 |
20110069207 | Steinberg | Mar 2011 | A1 |
20110074675 | Shiming | Mar 2011 | A1 |
20110081000 | Gertner | Apr 2011 | A1 |
20110081043 | Sabol | Apr 2011 | A1 |
20110085704 | Han | Apr 2011 | A1 |
20110092781 | Gertner | Apr 2011 | A1 |
20110102549 | Takahashi | May 2011 | A1 |
20110105883 | Lake et al. | May 2011 | A1 |
20110105893 | Akins et al. | May 2011 | A1 |
20110115793 | Grycewicz | May 2011 | A1 |
20110115892 | Fan | May 2011 | A1 |
20110116683 | Kramer et al. | May 2011 | A1 |
20110117528 | Marciello et al. | May 2011 | A1 |
20110118032 | Zalewski | May 2011 | A1 |
20110133917 | Zeng | Jun 2011 | A1 |
20110142411 | Camp | Jun 2011 | A1 |
20110150271 | Lee | Jun 2011 | A1 |
20110157168 | Bennett | Jun 2011 | A1 |
20110157358 | Bell | Jun 2011 | A1 |
20110157370 | Livesey | Jun 2011 | A1 |
20110160569 | Cohen et al. | Jun 2011 | A1 |
20110172060 | Morales | Jul 2011 | A1 |
20110172521 | Zdeblick et al. | Jul 2011 | A1 |
20110175801 | Markovic | Jul 2011 | A1 |
20110175809 | Markovic | Jul 2011 | A1 |
20110175810 | Markovic | Jul 2011 | A1 |
20110176723 | Ali et al. | Jul 2011 | A1 |
20110180695 | Li | Jul 2011 | A1 |
20110181893 | MacFarlane | Jul 2011 | A1 |
20110182472 | Hansen | Jul 2011 | A1 |
20110187640 | Jacobsen | Aug 2011 | A1 |
20110193939 | Vassigh | Aug 2011 | A1 |
20110199461 | Horio | Aug 2011 | A1 |
20110201916 | Duyn et al. | Aug 2011 | A1 |
20110201939 | Hubschman et al. | Aug 2011 | A1 |
20110202306 | Eng | Aug 2011 | A1 |
20110205358 | Aota | Aug 2011 | A1 |
20110207089 | Lagettie | Aug 2011 | A1 |
20110208437 | Teicher | Aug 2011 | A1 |
20110216002 | Weising | Sep 2011 | A1 |
20110216180 | Pasini | Sep 2011 | A1 |
20110221770 | Kruglick | Sep 2011 | A1 |
20110229862 | Parikh | Sep 2011 | A1 |
20110230755 | MacFarlane et al. | Sep 2011 | A1 |
20110234807 | Jones | Sep 2011 | A1 |
20110234834 | Sugimoto | Sep 2011 | A1 |
20110235855 | Smith | Sep 2011 | A1 |
20110237933 | Cohen | Sep 2011 | A1 |
20110242134 | Miller | Oct 2011 | A1 |
20110244939 | Cammegh | Oct 2011 | A1 |
20110250929 | Lin | Oct 2011 | A1 |
20110251478 | Wieczorek | Oct 2011 | A1 |
20110255845 | Kikuchi | Oct 2011 | A1 |
20110257566 | Burdea | Oct 2011 | A1 |
20110260965 | Kim | Oct 2011 | A1 |
20110262002 | Lee | Oct 2011 | A1 |
20110267427 | Goh | Nov 2011 | A1 |
20110267456 | Adermann | Nov 2011 | A1 |
20110275957 | Bhandari | Nov 2011 | A1 |
20110276396 | Rathod | Nov 2011 | A1 |
20110279663 | Fan | Nov 2011 | A1 |
20110285622 | Marti | Nov 2011 | A1 |
20110286010 | Kusik et al. | Nov 2011 | A1 |
20110291925 | Israel | Dec 2011 | A1 |
20110293143 | Narayanan et al. | Dec 2011 | A1 |
20110293146 | Grycewicz | Dec 2011 | A1 |
20110298708 | Hsu | Dec 2011 | A1 |
20110298824 | Lee | Dec 2011 | A1 |
20110300994 | Verkaaik | Dec 2011 | A1 |
20110301449 | Maurer, Jr. | Dec 2011 | A1 |
20110301934 | Tardis | Dec 2011 | A1 |
20110303214 | Welle | Dec 2011 | A1 |
20110304541 | Dalal | Dec 2011 | A1 |
20110304650 | Campillo | Dec 2011 | A1 |
20110304706 | Porter | Dec 2011 | A1 |
20110306867 | Gopinadhan | Dec 2011 | A1 |
20110310220 | McEldowney | Dec 2011 | A1 |
20110310226 | McEldowney | Dec 2011 | A1 |
20110316994 | Lemchen | Dec 2011 | A1 |
20110317877 | Bell | Dec 2011 | A1 |
20120002112 | Huang | Jan 2012 | A1 |
20120004791 | Buelthoff | Jan 2012 | A1 |
20120007839 | Tsao et al. | Jan 2012 | A1 |
20120019645 | Maltz | Jan 2012 | A1 |
20120020524 | Ishikawa | Jan 2012 | A1 |
20120021806 | Naltz | Jan 2012 | A1 |
20120027226 | Desenberg | Feb 2012 | A1 |
20120029345 | Mahfouz et al. | Feb 2012 | A1 |
20120032882 | Schlachta | Feb 2012 | A1 |
20120033083 | Horvinger | Feb 2012 | A1 |
20120035462 | Maurer, Jr. et al. | Feb 2012 | A1 |
20120039505 | Vastide | Feb 2012 | A1 |
20120044363 | Lu | Feb 2012 | A1 |
20120045091 | Kaganovich | Feb 2012 | A1 |
20120049453 | Morichau-Beauchant et al. | Mar 2012 | A1 |
20120051588 | McEldowney | Mar 2012 | A1 |
20120051664 | Gopalakrishnan et al. | Mar 2012 | A1 |
20120052949 | Weitzner | Mar 2012 | A1 |
20120056982 | Katz | Mar 2012 | A1 |
20120057640 | Shi | Mar 2012 | A1 |
20120065492 | Gertner et al. | Mar 2012 | A1 |
20120065494 | Gertner et al. | Mar 2012 | A1 |
20120072041 | Miller | Mar 2012 | A1 |
20120075166 | Marti | Mar 2012 | A1 |
20120075177 | Jacobsen | Mar 2012 | A1 |
20120076369 | Abramovich | Mar 2012 | A1 |
20120081504 | Ng | Apr 2012 | A1 |
20120083314 | Ng | Apr 2012 | A1 |
20120083960 | Zhu | Apr 2012 | A1 |
20120086778 | Lee | Apr 2012 | A1 |
20120086809 | Lee | Apr 2012 | A1 |
20120092445 | McDowell | Apr 2012 | A1 |
20120092502 | Knasel | Apr 2012 | A1 |
20120093481 | McDowell | Apr 2012 | A1 |
20120098938 | Jin | Apr 2012 | A1 |
20120101388 | Tripathi | Apr 2012 | A1 |
20120105573 | Apostolopoulos | May 2012 | A1 |
20120106814 | Gleason et al. | May 2012 | A1 |
20120108909 | Slobounov et al. | May 2012 | A1 |
20120113140 | Hilliges | May 2012 | A1 |
20120113223 | Hilliges | May 2012 | A1 |
20120116202 | Bangera | May 2012 | A1 |
20120119999 | Harris | May 2012 | A1 |
20120120072 | Se | May 2012 | A1 |
20120120237 | Trepess | May 2012 | A1 |
20120120243 | Chien | May 2012 | A1 |
20120120277 | Tsai | May 2012 | A1 |
20120121124 | Bammer | May 2012 | A1 |
20120124604 | Small | May 2012 | A1 |
20120127319 | Rao | May 2012 | A1 |
20120133616 | Nishihara | May 2012 | A1 |
20120133889 | Bergt | May 2012 | A1 |
20120143029 | Silverstein | Jun 2012 | A1 |
20120143212 | Madhani | Jun 2012 | A1 |
20120147167 | Manson | Jun 2012 | A1 |
20120154272 | Hildreth | Jun 2012 | A1 |
20120154511 | Hsu | Jun 2012 | A1 |
20120154536 | Stoker | Jun 2012 | A1 |
20120154579 | Hanpapur | Jun 2012 | A1 |
20120156661 | Smith | Jun 2012 | A1 |
20120158197 | Hinman | Jun 2012 | A1 |
20120162378 | El Dokor et al. | Jun 2012 | A1 |
20120165964 | Flaks | Jun 2012 | A1 |
20120167143 | Longet | Jun 2012 | A1 |
20120169841 | Chemali | Jul 2012 | A1 |
20120176314 | Jeon | Jul 2012 | A1 |
20120184371 | Shum | Jul 2012 | A1 |
20120188237 | Han | Jul 2012 | A1 |
20120188371 | Chen | Jul 2012 | A1 |
20120194422 | El Dokor | Aug 2012 | A1 |
20120194517 | Izadi et al. | Aug 2012 | A1 |
20120194561 | Grossinger | Aug 2012 | A1 |
20120195466 | Teng | Aug 2012 | A1 |
20120196660 | El Dokor et al. | Aug 2012 | A1 |
20120197135 | Slatkine | Aug 2012 | A1 |
20120200676 | Huitema | Aug 2012 | A1 |
20120201428 | Joshi et al. | Aug 2012 | A1 |
20120206604 | Jones | Aug 2012 | A1 |
20120212594 | Nick Barns | Aug 2012 | A1 |
20120218407 | Chien | Aug 2012 | A1 |
20120218421 | Chien | Aug 2012 | A1 |
20120220233 | Teague | Aug 2012 | A1 |
20120224666 | Speller | Sep 2012 | A1 |
20120224743 | Rodriguez | Sep 2012 | A1 |
20120225718 | Zhang | Sep 2012 | A1 |
20120229643 | Chidanand | Sep 2012 | A1 |
20120229651 | Takizawa | Sep 2012 | A1 |
20120230561 | Qureshi | Sep 2012 | A1 |
20120235896 | Jacobsen | Sep 2012 | A1 |
20120238337 | French | Sep 2012 | A1 |
20120242816 | Cruz | Sep 2012 | A1 |
20120249741 | Maciocci | Oct 2012 | A1 |
20120253201 | Reinhold | Oct 2012 | A1 |
20120253241 | Levital et al. | Oct 2012 | A1 |
20120262540 | Rondinelli | Oct 2012 | A1 |
20120262558 | Boger | Oct 2012 | A1 |
20120262583 | Bernal | Oct 2012 | A1 |
20120268124 | Herbst et al. | Oct 2012 | A1 |
20120275649 | Cobb | Nov 2012 | A1 |
20120276995 | Lansdale | Nov 2012 | A1 |
20120277001 | Lansdale | Nov 2012 | A1 |
20120281093 | Fong | Nov 2012 | A1 |
20120281873 | Brown | Nov 2012 | A1 |
20120288142 | Gossweiler, III | Nov 2012 | A1 |
20120288143 | Ernst | Nov 2012 | A1 |
20120288852 | Willson | Nov 2012 | A1 |
20120289334 | Mikhailov | Nov 2012 | A9 |
20120289822 | Shachar et al. | Nov 2012 | A1 |
20120293412 | El Dokor | Nov 2012 | A1 |
20120293506 | Vertucci | Nov 2012 | A1 |
20120293663 | Liu | Nov 2012 | A1 |
20120294511 | Datta | Nov 2012 | A1 |
20120300961 | Moeller | Nov 2012 | A1 |
20120303839 | Jackson | Nov 2012 | A1 |
20120304126 | Lavigne | Nov 2012 | A1 |
20120307075 | Margalit | Dec 2012 | A1 |
20120307207 | Abraham | Dec 2012 | A1 |
20120314066 | Lee | Dec 2012 | A1 |
20120315016 | Fung | Dec 2012 | A1 |
20120319946 | El Dokor | Dec 2012 | A1 |
20120319989 | Argiro | Dec 2012 | A1 |
20120320178 | Siegert et al. | Dec 2012 | A1 |
20120320219 | David | Dec 2012 | A1 |
20120326966 | Rauber | Dec 2012 | A1 |
20120326976 | Markovic | Dec 2012 | A1 |
20120326979 | Geisert | Dec 2012 | A1 |
20120327241 | Howe | Dec 2012 | A1 |
20120327246 | Senior et al. | Dec 2012 | A1 |
20130002866 | Hampapur | Jan 2013 | A1 |
20130002879 | Weber | Jan 2013 | A1 |
20130002900 | Gossweiler, III | Jan 2013 | A1 |
20130009865 | Valik | Jan 2013 | A1 |
20130010071 | Valik | Jan 2013 | A1 |
20130013452 | Dennard | Jan 2013 | A1 |
20130016009 | Godfrey | Jan 2013 | A1 |
20130016876 | Wooley | Jan 2013 | A1 |
20130021434 | Ahiska | Jan 2013 | A1 |
20130021578 | Chen | Jan 2013 | A1 |
20130024819 | Rieffel | Jan 2013 | A1 |
20130030283 | Vortman et al. | Jan 2013 | A1 |
20130033640 | Lee | Feb 2013 | A1 |
20130033700 | Hallil | Feb 2013 | A1 |
20130035590 | Ma et al. | Feb 2013 | A1 |
20130035612 | Mason | Feb 2013 | A1 |
20130040720 | Cammegh | Feb 2013 | A1 |
20130041368 | Cunninghan | Feb 2013 | A1 |
20130049756 | Ernst et al. | Feb 2013 | A1 |
20130053683 | Hwang et al. | Feb 2013 | A1 |
20130057702 | Chavan | Mar 2013 | A1 |
20130064426 | Watkins, Jr. | Mar 2013 | A1 |
20130064427 | Picard | Mar 2013 | A1 |
20130065517 | Svensson | Mar 2013 | A1 |
20130066448 | Alonso | Mar 2013 | A1 |
20130066526 | Mondragon | Mar 2013 | A1 |
20130069773 | Li | Mar 2013 | A1 |
20130070201 | Shahidi | Mar 2013 | A1 |
20130070257 | Wong | Mar 2013 | A1 |
20130072787 | Wallace et al. | Mar 2013 | A1 |
20130076863 | Rappel | Mar 2013 | A1 |
20130076944 | Kosaka | Mar 2013 | A1 |
20130077823 | Mestha | Mar 2013 | A1 |
20130079033 | Gupta | Mar 2013 | A1 |
20130084980 | Hammontree | Apr 2013 | A1 |
20130088584 | Malhas | Apr 2013 | A1 |
20130093866 | Ohlhues et al. | Apr 2013 | A1 |
20130096439 | Lee | Apr 2013 | A1 |
20130102879 | MacLaren et al. | Apr 2013 | A1 |
20130102893 | Vollmer | Apr 2013 | A1 |
20130108979 | Daon | May 2013 | A1 |
20130113791 | Isaacs et al. | May 2013 | A1 |
20130211421 | Abovitz et al. | Aug 2013 | A1 |
20130281818 | Vija et al. | Oct 2013 | A1 |
20140073908 | Biber | Mar 2014 | A1 |
20140088410 | Wu | Mar 2014 | A1 |
20140148685 | Liu et al. | May 2014 | A1 |
20140159721 | Grodzki | Jun 2014 | A1 |
20140171784 | Ooi et al. | Jun 2014 | A1 |
20140343344 | Saunders | Nov 2014 | A1 |
20140378816 | Oh et al. | Dec 2014 | A1 |
20150085072 | Yan | Mar 2015 | A1 |
20150094597 | Mestha et al. | Apr 2015 | A1 |
20150094606 | Mestha et al. | Apr 2015 | A1 |
20150212182 | Nielsen et al. | Jul 2015 | A1 |
20150245787 | Kyal et al. | Sep 2015 | A1 |
20150257661 | Mestha et al. | Sep 2015 | A1 |
20150265187 | Bernal et al. | Sep 2015 | A1 |
20150265220 | Ernst et al. | Sep 2015 | A1 |
20150297120 | Son et al. | Oct 2015 | A1 |
20150297314 | Fowler | Oct 2015 | A1 |
20150316635 | Stehning et al. | Nov 2015 | A1 |
20150323637 | Beck et al. | Nov 2015 | A1 |
20150331078 | Speck et al. | Nov 2015 | A1 |
20150359464 | Oleson | Dec 2015 | A1 |
20150366527 | Yu et al. | Dec 2015 | A1 |
20160000383 | Lee et al. | Jan 2016 | A1 |
20160000411 | Raju et al. | Jan 2016 | A1 |
20160035108 | Yu et al. | Feb 2016 | A1 |
20160045112 | Weissler et al. | Feb 2016 | A1 |
20160073962 | Yu et al. | Mar 2016 | A1 |
20160091592 | Beall et al. | Mar 2016 | A1 |
20160166205 | Ernst et al. | Jun 2016 | A1 |
20160189372 | Lovberg et al. | Jun 2016 | A1 |
20160198965 | Mestha et al. | Jul 2016 | A1 |
20160228005 | Bammer et al. | Aug 2016 | A1 |
20160249984 | Janssen | Sep 2016 | A1 |
20160256713 | Saunders et al. | Sep 2016 | A1 |
20160262663 | MacLaren et al. | Sep 2016 | A1 |
20160287080 | Olesen et al. | Oct 2016 | A1 |
20160310093 | Chen | Oct 2016 | A1 |
20160310229 | Bammer et al. | Oct 2016 | A1 |
20160313432 | Feiweier et al. | Oct 2016 | A1 |
20170032538 | Ernst et al. | Feb 2017 | A1 |
20170038449 | Voigt et al. | Feb 2017 | A1 |
20170303859 | Robertson et al. | Oct 2017 | A1 |
20170319143 | Yu et al. | Nov 2017 | A1 |
20170345145 | Nempont et al. | Nov 2017 | A1 |
Number | Date | Country |
---|---|---|
100563551 | Dec 2009 | CN |
105392423 | Mar 2016 | CN |
106572810 | Apr 2017 | CN |
106714681 | May 2017 | CN |
29519078 | Mar 1996 | DE |
102004024470 | Dec 2005 | DE |
0904733 | Mar 1991 | EP |
1319368 | Jun 2003 | EP |
1354564 | Oct 2003 | EP |
1524626 | Apr 2005 | EP |
2515139 | Oct 2012 | EP |
2948056 | Dec 2015 | EP |
2950714 | Dec 2015 | EP |
03023838 | May 1991 | JP |
WO 9617258 | Jun 1996 | WO |
WO 9938449 | Aug 1999 | WO |
WO 0072039 | Nov 2000 | WO |
WO 03003796 | Jan 2003 | WO |
WO 2004023783 | Mar 2004 | WO |
WO 2005077293 | Aug 2005 | WO |
WO 2007025301 | Mar 2007 | WO |
WO 2007085241 | Aug 2007 | WO |
WO 2007136745 | Nov 2007 | WO |
WO 2009101566 | Aug 2009 | WO |
WO 2009129457 | Oct 2009 | WO |
WO 2010066824 | Jun 2010 | WO |
WO 2011047467 | Apr 2011 | WO |
WO 2011113441 | Sep 2011 | WO |
WO 2012046202 | Apr 2012 | WO |
WO 2013032933 | Mar 2013 | WO |
WO 2014005178 | Jan 2014 | WO |
WO 2014116868 | Jul 2014 | WO |
WO 2014120734 | Aug 2014 | WO |
WO 2015022684 | Feb 2015 | WO |
WO 2015042138 | Mar 2015 | WO |
WO 2015092593 | Jun 2015 | WO |
WO 2015148391 | Oct 2015 | WO |
WO 2016014718 | Jan 2016 | WO |
WO2017091479 | Jun 2017 | WO |
WO2017189427 | Nov 2017 | WO |
Entry |
---|
Armstrong et al., RGR-6D: Low-cost, high-accuracy measurement of 6-DOF Pose from a Single Image. Publication date unknown. |
Hoff et al., “Analysis of Head Pose Accuracy in Augmented Reality”, IEEE Transactions on Visualization and Computer Graphics 6, No. 4 (Oct.-Dec. 2000): 319-334. |
Katsuki, et al., “Design of an Artificial Mark to Determine 3D Pose by Monocular Vision”, 2003 IEEE International Conference on Robotics and Automation (Cat. No. 03CH37422), Sep. 14-19, 2003, pp. 995-1000 vol. 1. |
Kiebel et al., “MRI and PET coregistration-a cross validation of statistical parametric mapping and automated image registration”, Neuroimage 5(4):271-279 (1997). |
Lerner, “Motion correction in fMRI images”, Technion-Israel Institute of Technology, Faculty of Computer Science (Feb. 2006). |
Speck, et al., “Prospective real-time slice-by-slice Motion Correction for fMRI in Freely Moving Subjects”, Magnetic Resonance Materials in Physics, Biology and Medicine, 19(2), 55-61, published May 9, 2006. |
Yeo, et al., “Motion correction in fMRI by mapping slice-to-volume with concurrent field-inhomogeneity correction”, International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 752-760 (2004). |
Ashouri, H., L. et al., Unobtrusive Estimation of Cardiac Contractility and Stroke Volume Changes Using Ballistocardiogram Measurements on a High Bandwidth Force Plate, Sensors 2016, 16, 787; doi:10.3390/s16060787. |
Benchoff, Brian, “Extremely Precise Positional Tracking”, https://hackaday.com/2013/10/10/extremely-precise-positional-tracking/, printed on Sep. 16, 2017, in 7 pages. |
Communication pursuant to Article 94(3) EPC for application No. 14743670.3, which is an EP application related to the present application, dated Feb. 6, 2018. |
Extended European Search Report for application No. 14743670.3, which is an EP application related to the present application, dated Aug. 17, 2017. |
Extended European Search Report for application No. 15769296.3, which is an EP application related to the present application, dated Dec. 22, 2017. |
Extended European Search Report for application No. 15824707.2, which is an EP application related to the present application, dated Apr. 16, 2018. |
Gordon, J. W. Certain molar movements of the human body produced by the circulation of the blood. J. Anat. Physiol. 11, 533-536 (1877). |
Herbst et al., “Reproduction of Motion Artifacts for Performance Analysis of Prospective Motion Correction in MRI”, Magnetic Resonance in Medicine., vol. 71, No. 1, p. 182-190 (Feb. 25, 2013). |
Jochen Triesch, et al., “Democratic Integration: Self-Organized Integration of Adaptive Cues”, Neural Computation, vol. 13, No. 9, dated Sep. 1, 2001, pp. 2049-2074. |
Kim, Chang-Sei et al. “Ballistocardiogram: Mechanism and Potential for Unobtrusive Cardiovascular Health Monitoring”, Scientific Reports, Aug. 9, 2016. |
Maclaren et al., “Prospective Motion Correction in Brain Imaging: A Review”, Magnetic Resonance in Medicine, vol. 69, No. 3, pp. 621-636 (Mar. 1, 2013). |
Olesen et al., “Structured Light 3D Tracking System for Measuring Motions in PET Brain Imaging”, Proceedings of SPIE, The International Society for Optical Engineering (ISSN: 0277-786X), vol. 7625:76250X (2010). |
Olesen et al., “Motion Tracking in Narrow Spaces: A Structured Light Approach”, Lecture Notes in Computer Science (ISSN: 0302-9743)vol. 6363:253-260 (2010). |
Olesen et al., “Motion Tracking for Medical Imaging: A Nonvisible Structured Light Tracking Approach”, IEEE Transactions on Medical Imaging, vol. 31(1), Jan. 2012. |
Tarvainen, M.P. et al., “An advanced de-trending method with application to HRV analysis,” IEEE Trans. Biomed. Eng., vol. 49, No. 2, pp. 172-175, Feb. 2002. |
Wilm et al., "Accurate and Simple Calibration of DLP Projector Systems", Proceedings of SPIE, The International Society for Optical Engineering (ISSN: 0277-786X), vol. 8979 (2014). |
Wilm et al., “Correction of Motion Artifacts for Real-Time Structured Light”, R.R. Paulsen and K.S. Pedersen (Eds.): SCIA 2015, LNCS 9127, pp. 142-151 (2015). |
US 7,906,604, 10/2010, Bazakos (withdrawn) |
Aksoy et al., "Hybrid Prospective and Retrospective Head Motion Correction to Mitigate Cross-Calibration Errors", NIH Publication, Nov. 2012. |
Aksoy et al., "Real-Time Optical Motion Correction for Diffusion Tensor Imaging", Magnetic Resonance in Medicine (Mar. 22, 2011) 66: 366-378. |
Andrews et al., "Prospective Motion Correction for Magnetic Resonance Spectroscopy Using Single Camera Retro-Grate Reflector Optical Tracking", Journal of Magnetic Resonance Imaging (Feb. 2011) 33(2): 498-504. |
Angeles et al., “The Online Solution of the Hand-Eye Problem”, IEEE Transactions on Robotics and Automation, 16(6): 720-731 (Dec. 2000). |
Anishenko et al., “A Motion Correction System for Brain Tomography Based on Biologically Motivated Models.” 7th IEEE International Conference on Cybernetic Intelligent Systems, dated Sep. 9, 2008, in 9 pages. |
Armstrong et al., “RGR-3D: Simple, cheap detection of 6-DOF pose for tele-operation, and robot programming and calibration”, In Proc. 2002 Int. Conf. on Robotics and Automation, IEEE, Washington (May 2002). |
Bandettini, Peter A., et al., "Processing Strategies for Time-Course Data Sets in Functional MRI of the Human Brain", Magnetic Resonance in Medicine 30: 161-173 (1993). |
Barmet et al., "Spatiotemporal Magnetic Field Monitoring for MR", Magnetic Resonance in Medicine (Feb. 1, 2008) 60: 187-197. |
Bartels, LW, et al., “Endovascular interventional magnetic resonance imaging”, Physics in Medicine and Biology 48: R37-R64 (2003). |
Carranza-Herrezuelo et al., "Motion estimation of tagged cardiac magnetic resonance images using variational techniques", Elsevier, Computerized Medical Imaging and Graphics 34 (2010), pp. 514-522. |
Chou, Jack C. K., et al., “Finding the Position and Orientation of a Sensor on a Robot Manipulator Using Quaternions”, The International Journal of Robotics Research, 10(3): 240-254 (Jun. 1991). |
Cofaru et al., "Improved Newton-Raphson digital image correlation method for full-field displacement and strain calculation", Department of Materials Science and Engineering, Ghent University, St-Pietersnieuwstraat, Nov. 20, 2010. |
Ernst et al., "A Novel Phase and Frequency Navigator for Proton Magnetic Resonance Spectroscopy Using Water-Suppression Cycling", Magnetic Resonance in Medicine (Jan. 2011) 65(1): 13-7. |
Eviatar et al., “Real time head motion correction for functional MRI”, In: Proceedings of the International Society for Magnetic Resonance in Medicine (1999) 269. |
Forbes, Kristen P. N., et al., “Propeller MRI: Clinical Testing of a Novel Technique for Quantification and Compensation of Head Motion”, Journal of Magnetic Resonance Imaging 14: 215-222 (2001). |
Fulton et al., “Correction for Head Movements in Positron Emission Tomography Using an Optical Motion-Tracking System”, IEEE Transactions on Nuclear Science, vol. 49(1):116-123 (Feb. 2002). |
Glover, Gary H., et al., “Self-Navigated Spiral fMRI: Interleaved versus Single-shot”, Magnetic Resonance in Medicine 39: 361-368 (1998). |
Gumus et al., “Elimination of DWI signal dropouts using blipped gradients for dynamic restoration of gradient moment”, ISMRM 20th Annual Meeting & Exhibition, May 7, 2012. |
Herbst et al., "Preventing Signal Dropouts in DWI Using Continuous Prospective Motion Correction", Proc. Intl. Soc. Mag. Reson. Med. 19 (May 2011) 170. |
Herbst et al., "Prospective Motion Correction With Continuous Gradient Updates in Diffusion Weighted Imaging", Magnetic Resonance in Medicine (2012) 67:326-338. |
Horn, Berthold K. P., “Closed-form solution of absolute orientation using unit quaternions”, Journal of the Optical Society of America, vol. 4, p. 629-642 (Apr. 1987). |
International Preliminary Report on Patentability for Application No. PCT/US2015/022041, dated Oct. 6, 2016, in 8 pages. |
International Preliminary Report on Patentability for Application No. PCT/US2007/011899, dated Jun. 8, 2008, in 13 pages. |
International Search Report and Written Opinion for Application No. PCT/US2007/011899, dated Nov. 14, 2007. |
International Search Report and Written Opinion for Application No. PCT/US2014/012806, dated May 15, 2014, in 15 pages. |
International Search Report and Written Opinion for Application No. PCT/US2015/041615, dated Oct. 29, 2015, in 13 pages. |
International Preliminary Report on Patentability for Application No. PCT/US2014/013546, dated Aug. 4, 2015, in 9 pages. |
International Search Report and Written Opinion for Application No. PCT/US2015/022041, dated Jun. 29, 2015, in 9 pages. |
Josefsson et al., "A flexible high-precision video system for digital recording of motor acts through lightweight reflect markers", Computer Methods and Programs in Biomedicine, vol. 49:111-129 (1996). |
Kiruluta et al., “Predictive Head Movement Tracking Using a Kalman Filter”, IEEE Trans. On Systems, Man, and Cybernetics—Part B: Cybernetics, 27(2):326-331 (Apr. 1997). |
Maclaren et al., “Combined Prospective and Retrospective Motion Correction to Relax Navigator Requirements”, Magnetic Resonance in Medicine (Feb. 11, 2011) 65:1724-1732. |
Maclaren et al., "Navigator Accuracy Requirements for Prospective Motion Correction", Magnetic Resonance in Medicine (Jan. 2010) 63(1): 162-70. |
Maclaren, "Prospective Motion Correction in MRI Using Optical Tracking Tape", Book of Abstracts, ESMRMB (2009). |
Maclaren et al., “Measurement and correction of microscopic head motion during magnetic resonance imaging of the brain”, PLOS One, vol. 7(11):1-9 (2012). |
McVeigh et al., “Real-time, Interactive MRI for Cardiovascular Interventions”, Academic Radiology, 12(9): 1121-1127 (2005). |
Nehrke et al., “Prospective Correction of Affine Motion for Arbitrary MR Sequences on a Clinical Scanner”, Magnetic Resonance in Medicine (Jun. 28, 2005) 54:1130-1138. |
Norris et al., “Online motion correction for diffusion-weighted imaging using navigator echoes: application to RARE imaging without sensitivity loss”, Magnetic Resonance in Medicine, vol. 45:729-733 (2001). |
Ooi et al., “Prospective Real-Time Correction for Arbitrary Head Motion Using Active Markers”, Magnetic Resonance in Medicine (Apr. 15, 2009) 62(4): 943-54. |
Orchard et al., “MRI Reconstruction using real-time motion tracking: A simulation study”, Signals, Systems and Computers, 42nd Annual Conference IEEE, Piscataway, NJ, USA (Oct. 26, 2008). |
Park, Frank C. and Martin, Bryan J., "Robot Sensor Calibration: Solving AX=XB on the Euclidean Group", IEEE Transactions on Robotics and Automation, 10(5): 717-721 (Oct. 1994). |
PCT Search Report from the International Searching Authority, dated Feb. 28, 2013, in 16 pages, regarding International Application No. PCT/US2012/052349. |
Qin et al., “Prospective Head-Movement Correction for High-Resolution MRI Using an In-Bore Optical Tracking System”, Magnetic Resonance in Medicine (Apr. 13, 2009) 62: 924-934. |
Schulz et al., “First Embedded In-Bore System for Fast Optical Prospective Head Motion-Correction in MRI”, Proceedings of the 28th Annual Scientific Meeting of the ESMRMB (Oct. 8, 2011) 369. |
Shiu et al., “Calibration of Wrist-Mounted Robotic Sensors by Solving Homogeneous Transform Equations of the Form AX=XB”, IEEE Transactions on Robotics and Automation, 5(1): 16-29 (Feb. 1989). |
Tremblay et al., “Retrospective Coregistration of Functional Magnetic Resonance Imaging Data using External monitoring”, Magnetic Resonance in Medicine 53:141-149 (2005). |
Tsai et al., "A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration", IEEE Transactions on Robotics and Automation, 5(3): 345-358 (Jun. 1989). |
Wang, Ching-Cheng, “Extrinsic Calibration of a Vision Sensor Mounted on a Robot”, IEEE Transactions on Robotics and Automation, 8(2):161-175 (Apr. 1992). |
Ward et al., “Prospective Multiaxial Motion Correction for fMRI”, Magnetic Resonance in Medicine 43:459-469 (2000). |
Welch et al., "Spherical Navigator Echoes for Full 3D Rigid Body Motion Measurement in MRI", Magnetic Resonance in Medicine 47:32-41 (2002). |
Zaitsev, M., et al., "Prospective Real-Time Slice-by-Slice 3D Motion Correction for EPI Using an External Optical Motion Tracking System", Proc. Intl. Soc. Mag. Reson. Med. 11:517 (2004). |
Zaitsev et al., "Magnetic resonance imaging of freely moving objects: Prospective real-time motion correction using an external optical motion tracking system", NeuroImage 31 (Jan. 29, 2006) 1038-1050. |
Number | Date | Country |
---|---|---|
20170143271 A1 | May 2017 | US |
Number | Date | Country |
---|---|---|
62333023 | May 2016 | US |
62332402 | May 2016 | US |
62258915 | Nov 2015 | US |