Videographic display of real-time medical treatment

Information

  • Patent Grant
  • Patent Number
    10,561,861
  • Date Filed
    Wednesday, May 2, 2012
  • Date Issued
    Tuesday, February 18, 2020
Abstract
Systems and methods for creation of a videographic representation of a real-time medical treatment by a processor operating in conjunction with a treatment device and an imaging device are described. The representation may include real-time imaging data, a representation of real-time treatment data, and contours of relevant anatomy, among other things. Related systems and techniques are also described.
Description
TECHNICAL FIELD

The present disclosure relates to systems and methods for creation of a videographic representation of a real-time medical treatment performed by a treatment device in conjunction with an imaging device.


BACKGROUND

During certain medical treatments, for example, surgery or radiation therapy, movement of portions of the body being treated can decrease the effectiveness of a treatment or even cause harmful results. For example, movement of a tumor during radiation therapy can cause the radiation beam to miss the tumor and hit healthy tissue. Thus, it would be advantageous to provide visual confirmation or documentation of medical treatments, so as to enable convenient assessment of treatment accuracy.


SUMMARY

Creation of a videographic display of a real-time treatment by a treatment device operating in conjunction with an imaging device is described. An initial image of a treatment region can be obtained. One or more contours of relevant anatomy in the initial image can be provided for or calculated. Real-time images of the treatment region and real-time treatment data can be obtained during a medical treatment. One or more real-time contours of relevant anatomy in a plurality of the real-time images can be determined and a videographic display can be created. The videographic display can include the real-time images, corresponding real-time contours and a representation of the real-time treatment data. Related apparatus, systems and techniques are also described.


In one aspect, real-time images of a treatment region and real-time treatment data can be obtained during a medical treatment. One or more real-time contours of relevant anatomy in a plurality of the real-time images can be determined. A videographic display of a plurality of the real-time images can be provided for. The videographic display can include the corresponding real-time contours and a representation of the real-time treatment data.


The real-time images can be obtained from a magnetic resonance imaging system, and the real-time contours can be determined using deformable image registration. In one implementation, the medical treatment can be radiation therapy and the real-time treatment data can comprise one or more of: firing status of one or more radiation beams of a radiation therapy device, location of one or more radiation beams, shape of one or more radiation beams, intensity of radiation associated with one or more radiation beams, and delivered radiation dose. The representation of the real-time treatment data in the videographic display can include one or more of: a graphical representation of a beam for one or more radiation beams when the one or more radiation beams are on, an indication of accumulated dose deposition, and textual indications of the amount of accumulated deposited dose.


Initial and real-time images can be obtained from one or more of: a magnetic resonance imaging system, a positron emission tomography scanner, a computed tomography scanner and an X-ray machine. The real-time contours can be determined using deformable image registration.


Real-time treatment data can comprise at least one of surgical instrument location information, surgical incision location information, and a graphical representation of at least one of an instrument captured in the real-time images and a portion of a medical professional captured in the real-time images. The videographic display can be provided on a remote device and can be three dimensional.


In another aspect, a system can comprise an imaging device, a treatment device, a display, and a processor. The imaging device can be configured to acquire real-time images of a treatment region during a medical treatment. The treatment device can be configured to acquire real-time treatment data during the medical treatment. The processor can be configured to determine real-time contours of relevant anatomy in a plurality of real-time images and to output to the display a videographic display of a plurality of the real-time images. The videographic display can include the corresponding real-time contours and a representation of the real-time treatment data.


The imaging device can be a magnetic resonance imaging system and the processor can determine the real-time contours using deformable image registration. The treatment device can be a radiation therapy device having one or more radiation beams, and the real-time treatment data can comprise one or more of: firing status of one or more radiation beams, location of one or more radiation beams, shape of one or more radiation beams, intensity of radiation associated with one or more radiation beams, and delivered radiation dose. The representation of the real-time treatment data in the videographic display can include one or more of: a graphical representation of a beam for one or more radiation beams when the radiation beam is on, an indication of accumulated dose deposition, and textual indications of the amount of accumulated deposited dose.


The imaging device can be selected from: a magnetic resonance imaging system, a positron emission tomography scanner, a computed tomography scanner and an X-ray machine. The system can further comprise a second imaging device that can be configured to acquire a portion of the images of the treatment region.


The processor can determine the real-time contours using deformable image registration. The real-time treatment data can comprise at least one of surgical instrument location information, surgical incision location information, and a graphical representation of at least one of an instrument that is captured in the real-time images and a portion of a medical professional that is captured in the real-time images. The display can be on a remote device and can be three dimensional. The videographic display can be provided over a web interface to a social media site.


These and other features, aspects, and advantages of the present disclosure will become better understood with reference to the following description and claims.





BRIEF DESCRIPTION OF DRAWINGS

Features, aspects, and implementations of the disclosure are described in conjunction with the attached drawings, in which:



FIG. 1 is a simplified diagram illustrating aspects of a system consistent with implementations of the current subject matter;



FIG. 2 illustrates an example of a treatment device operating in conjunction with an exemplary imaging device;



FIG. 3 illustrates another example treatment device operating in conjunction with an exemplary imaging device;



FIG. 4 illustrates yet another example of a treatment device operating in conjunction with an imaging device;



FIG. 5 is a simplified diagram illustrating aspects of a method consistent with implementations of the current subject matter; and



FIG. 6 illustrates a videographic display consistent with implementations of the current subject matter.





DETAILED DESCRIPTION

The subject matter described herein provides many advantages, some of which are noted below. The creation of a videographic display of real-time medical treatment allows visual documentation of the treatment to be shared with a doctor, patient, family member, caretaker, etc. The viewer will be able to see, via a videographic display, portions of the body that were treated, and the operation of a treatment device acting on those body portions to treat the patient (for example, incisions by a robotic surgery scalpel, radiation by a radiation therapy device, etc.). Accordingly, the videographic display can provide evidence of whether the patient was treated properly, thereby providing information and potential psychological satisfaction to the patient or other viewers. The videographic display can also be used to educate and to assess whether additional treatments or medical care may be required. Such real-time videographic display of treatment can also promote efficiency among clinicians, as their roles in treatment can be recorded and shared with the patient and other individuals.



FIG. 1 is a simplified diagram illustrating aspects of a system 100 consistent with implementations of the current subject matter. The system 100 can include at least a treatment device 102, an imaging device 104, a processor 106, and a display 108. The treatment device 102 can be a radiation therapy device, one or more robotic surgery arms, one or more surgical scalpels, or any other suitable treatment devices, for example, high intensity focused ultrasound (HIFU) ablation devices, cryo-ablation devices, laser ablation devices, RF ablation devices, and catheters or other devices for delivering treatments such as brachytherapy, stents, embolic devices, vascular grafts, suture markers, and orthopedic devices including but not limited to screws and plates. The imaging device 104 can be a magnetic resonance imaging (MRI) system, a positron emission tomography (PET) scanner, a computed tomography (CT) scanner, an X-ray machine, other imaging devices or a combination of imaging devices. The processor 106 can include a computer that includes at least one microprocessor and a machine-readable medium that can store instructions, or the processor may comprise any other computer components configured to implement the systems and methods of the present disclosure. The display 108 can include a terminal device, such as a desktop computer, laptop, tablet computer, cellular phone, television, holograph generator or any other display medium. In some implementations, the terminal device can be a remote device, remote from the treatment device 102 and the imaging device 104. Each of the arrows 110, 112, and 114 can separately characterize either a wired or wireless connection, such as cellular network, internet, local area network, or wide area network, and each connection may include multiple connections and may pass through and interact with additional elements such as servers, cloud computing devices and the like. Real-time treatment data passes over connection 110 from the treatment device 102 to the processor 106 and real-time image data passes through connection 112 from imaging device 104 to processor 106. Information required to create a videographic display travels across connection 114 from the processor 106 to the display 108.
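
To make the FIG. 1 data flow concrete, the following is a minimal sketch in Python of how the connections 110, 112, and 114 might be wired together at processor 106. All type and function names here (TreatmentSample, ImagingDevice, processor_step, and so on) are hypothetical illustrations, not part of the patent.

```python
# Hypothetical sketch of the FIG. 1 data flow; all names are invented.
from dataclasses import dataclass
from typing import Protocol, Tuple
import numpy as np

@dataclass
class TreatmentSample:
    """Real-time treatment data arriving over connection 110."""
    timestamp: float
    payload: dict

class ImagingDevice(Protocol):
    """Source of real-time image data over connection 112 (e.g., an MRI system)."""
    def next_frame(self) -> Tuple[float, np.ndarray]: ...

class TreatmentDevice(Protocol):
    """Source of real-time treatment data over connection 110."""
    def next_sample(self) -> TreatmentSample: ...

def processor_step(imager: ImagingDevice, treater: TreatmentDevice) -> dict:
    """One iteration of processor 106: pair the latest frame with the latest
    treatment sample; the result is what travels over connection 114 to display 108."""
    t_img, frame = imager.next_frame()
    sample = treater.next_sample()
    return {"time": t_img, "frame": frame, "treatment": sample.payload}
```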



FIG. 2 illustrates an example of a treatment device 102 operating in conjunction with an imaging device 104 as described in the present disclosure. The treatment device 102 in this example is a radiation therapy device 206 mounted on a rotatable gantry 216. The imaging device 104 in this example is a magnetic resonance imaging (MRI) system 208. Radiation therapy device 206 can include one or more radiation beam sources, such as a cobalt-60 source or a linear accelerator, which can be controlled to transmit radiation to a predetermined anatomy of a patient 212. The radiation therapy device 206 can further include one or more multi-leaf collimators to shape the radiation beams transmitted to patient 212. The radiation beam or beams may thus be turned on and off, can be rotated on the gantry 216, and can have their shape and intensity modified by multi-leaf collimators or other devices.



FIG. 3 illustrates an additional example of a treatment device 102 operating in conjunction with an imaging device 104 as described in the present disclosure. The treatment device 102 in this example is a robotic surgery device 306. The imaging device 104 in this example is a magnetic resonance imaging (MRI) system 308. The robotic surgery device 306 can be controlled to move with respect to the body of a patient 310 so as to operate on the patient 310.



FIG. 4 illustrates an additional example of a treatment device 102 operating in conjunction with an imaging device 104 as described in the present disclosure. The treatment device 102 in this example is a surgical scalpel 406. The imaging device 104 in this example is a magnetic resonance imaging (MRI) system 412. Surgical scalpel 406 can be configured to be used by a clinician 408 on a portion of the body of patient 410 to perform a surgical procedure.


Further details regarding the exemplary implementations of FIGS. 2-4 will be understood with reference to the below explanation of an exemplary method of the present disclosure.



FIG. 5 is a simplified diagram illustrating aspects of an exemplary process 500 that may be used to implement the method of the present disclosure. For example, an initial image of a treatment region may be obtained from imaging device 104 at 502. One or more contours of relevant anatomy may then be provided or calculated in the initial image at 504. As an example, contours may include the boundaries of a tumor to be treated and also the boundaries of nearby anatomy that must be avoided in the treatment. These contours may be developed manually by a clinician. In an alternate implementation, the contours can be developed using automated computer programs.
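
As one illustration of the automated alternative at step 504, the sketch below derives candidate initial contours with off-the-shelf image processing from scikit-image (Otsu thresholding followed by marching squares). This is only one possible approach under assumed conditions; in practice a clinician would review or replace the result.

```python
# Illustrative automated contouring for step 504 (approach and names assumed).
import numpy as np
from skimage import filters, measure

def initial_contours(initial_image: np.ndarray) -> list:
    """Return candidate anatomy boundaries as (N, 2) arrays of (row, col)
    points, derived by Otsu thresholding plus marching squares."""
    mask = initial_image > filters.threshold_otsu(initial_image)
    return measure.find_contours(mask.astype(float), 0.5)
```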


After the medical treatment begins, real-time image data from the treatment region can be obtained using imaging device 104 at 506. Additionally, real-time treatment data can be obtained from treatment device 102 at 508.


Real-time treatment data for the implementation depicted in FIG. 2 (wherein treatment device 102 is a radiation therapy device 206) may include one or more of: the firing status of the one or more radiation beams, the angular location of the beam(s), the shape of the beam(s), the intensity of radiation associated with the beam(s), the delivered radiation dose, and the like. Such treatment data can be acquired through, for example, sensors associated with the radiation source, the rotatable gantry, the leaves on the multi-leaf collimator(s), and the control system of the radiation therapy device. Treatment data could also be gathered from the imaging device 104 itself. Other data could be gathered as well, as would be understood by those skilled in the art.
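
One way to organize such a stream of radiation therapy treatment data is a timestamped record per sample, as in the hypothetical sketch below; the field names and units are assumptions for illustration and do not come from the patent.

```python
# Hypothetical record for one sample of the FIG. 2 treatment data stream;
# field names and units are assumptions for illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class RadiationSample:
    timestamp: float                    # seconds since treatment start
    beam_on: List[bool]                 # firing status, one entry per beam
    gantry_angle_deg: float             # angular location of the beam(s)
    mlc_leaf_positions_mm: List[float]  # beam shape via collimator leaves
    intensity: float                    # radiation intensity of the beam(s)
    delivered_dose_gy: float            # cumulative delivered dose
```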


Real-time treatment data for the implementation depicted in FIG. 3 (wherein treatment device 102 is a robotic surgery device 306) may include one or more of: the location and trajectory of the robotic surgery device 306, coordinates (for example, Cartesian coordinates, spherical coordinates, or cylindrical coordinates) of the incision into the body of the patient 310, the shape of the operative implement of robotic surgery device 306, audio and/or visual recordings of one or more clinicians involved in the surgery, and the like. Such treatment data can be acquired through, for example, sensors associated with the robotic surgery device 306 that record its trajectory, location, or change in shape, or through a graphic or videographic camera or other imaging modality used in conjunction with the robotic surgery device 306. Treatment data could also be gathered from the imaging device 104. Other data could be gathered as well, as would be understood by those skilled in the art.


Real-time treatment data for the implementation depicted in FIG. 4 (wherein treatment device 102 is a surgical scalpel 406) may include one or more of: the location and trajectory of the surgical scalpel 406, coordinates (for example, Cartesian coordinates, spherical coordinates, or cylindrical coordinates) of the incision into the body of the patient 410, audio and/or visual recordings of one or more clinicians involved in the surgery, and the like. Such treatment data can be acquired through, for example, sensors associated with the surgical scalpel 406 that record its trajectory or location, or through a graphic or videographic camera or other imaging modality used in conjunction with the surgical scalpel 406. Treatment data could also be gathered from the imaging device 104. Other data could be gathered as well, as would be understood by those skilled in the art.


At 510 in FIG. 5, real-time contours of the relevant anatomy in the real-time images may be determined. The real-time contours can be determined using deformable image registration, registering each real-time image against the initial image and contours acquired at 502 and 504, or against previously acquired real-time images and their contours. Contours may also be developed using other image processing techniques, individually or in combination, such as edge detection, pattern recognition, and the like.
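
A hedged sketch of step 510 using the demons deformable registration filter from SimpleITK follows. This is one common implementation of deformable image registration, not necessarily the one contemplated by the disclosure, and the file names are placeholders.

```python
# Sketch of step 510 with SimpleITK's demons filter (one common deformable
# registration method; file names are placeholders).
import SimpleITK as sitk

initial = sitk.ReadImage("initial.nii", sitk.sitkFloat32)          # step 502
initial_mask = sitk.ReadImage("initial_mask.nii", sitk.sitkUInt8)  # step 504
frame = sitk.ReadImage("frame_0042.nii", sitk.sitkFloat32)         # step 506

demons = sitk.DemonsRegistrationFilter()
demons.SetNumberOfIterations(50)
demons.SetStandardDeviations(1.5)        # Gaussian smoothing of the field
field = demons.Execute(frame, initial)   # displacement field: frame -> initial

# Warp the initial contour mask onto the real-time frame to obtain the
# real-time contour (step 510); nearest neighbor preserves label values.
tx = sitk.DisplacementFieldTransform(field)
live_mask = sitk.Resample(initial_mask, frame, tx,
                          sitk.sitkNearestNeighbor, 0, sitk.sitkUInt8)
```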


A videographic display of a plurality of the real-time images can be created at 512 in FIG. 5. The display can include the corresponding real-time contours of the plurality of the real-time images and a representation of the real-time treatment data. Display 108 may be a terminal device, such as a desktop computer, laptop, tablet computer, cellular phone, television, holograph generator or any device capable of displaying a video. The display 108 may be located with the treatment device 102 and imaging device 104 or it may be remote from that location. As previously mentioned, the videographic display may utilize a wired connection or a wireless connection, such as cellular network, internet, local area network, wide area network, metropolitan area network, Bluetooth network, infrared network, Zigbee wireless network, Wibree network, and the like, and each connection may include multiple connections and may pass through and interact with additional elements such as servers and cloud computing devices. In one example, the videographic display may be provided over a web interface to a site such as a social media site where individuals can post, share and comment on various documented medical treatments.


In one implementation, the components of the treatment data and image data can be assembled together at the processor 106 such that a videographic file can be sent to display 108. In an alternate implementation, the components of the treatment data and image data can be sent separately from the processor 106 to display 108 where the separately received components can be assembled to form the videographic display. The assembly can be performed based upon selections by a viewer of the videographic display.
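
As a sketch of the first variant, assembling already-composited frames into a single videographic file at processor 106 could look like the following; it assumes the imageio package with its ffmpeg plugin, which is an implementation choice rather than anything specified in the disclosure.

```python
# Sketch: assemble composited frames into a videographic file at processor 106.
# Assumes the imageio package and its ffmpeg plugin are available.
import imageio.v2 as imageio

def write_video(frames, path="treatment.mp4", fps=4):
    """frames: iterable of uint8 RGB arrays, already overlaid with
    contours and a representation of the treatment data."""
    with imageio.get_writer(path, fps=fps) as writer:
        for frame in frames:
            writer.append_data(frame)
```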


In one implementation, acquired treatment data can be overlaid on respective real-time images to form a set of final images, and then images of the set can be combined to form the videographic display. This overlaying can be performed using a correlation, in time and space, of treatment data and corresponding image data.
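
The correlation in time can be as simple as nearest-timestamp matching between the image stream and the treatment data stream, as in this illustrative sketch (a simplification; real systems may need clock synchronization and interpolation):

```python
# Illustrative nearest-timestamp correlation of frames and treatment samples.
import bisect
from typing import List

def correlate(frame_times: List[float], sample_times: List[float]) -> List[int]:
    """For each frame timestamp, return the index of the treatment sample
    closest in time; sample_times must be non-empty and sorted ascending."""
    matches = []
    for t in frame_times:
        i = bisect.bisect_left(sample_times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(sample_times)]
        matches.append(min(candidates, key=lambda j: abs(sample_times[j] - t)))
    return matches
```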


At least a portion (for example, a predetermined number) of images of the set of final images may be combined to form the videographic display. The predetermined number can vary depending on factors such as the communication channel utilized (for example, direct wire connection, cellular network, internet, local area network, wide area network, metropolitan area network, Bluetooth network, etc.), the available bandwidth of the communication channel, the display type (for example, laptop, desktop computer, tablet computer, cellular phone, wireless device) and other factors. In one example, the predetermined number when the display 108 is a mobile phone can be lower than the predetermined number when the display 108 is a high processing capability computer. In some implementations, the predetermined number can be determined by a clinician based on a personal preference of the clinician or the patient.
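
A policy for choosing the predetermined number might resemble the following sketch; the thresholds and display categories are invented for illustration and would in practice be tuned or overridden by clinician preference.

```python
# Invented policy for choosing how many frames to send; thresholds are
# illustrative, not from the disclosure.
def frames_to_send(total_frames: int, bandwidth_kbps: float,
                   display_type: str) -> int:
    scale = 1.0 if display_type in ("desktop", "laptop") else 0.25
    if bandwidth_kbps < 500:   # e.g., a congested cellular link
        scale = min(scale, 0.1)
    return max(1, int(total_frames * scale))
```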



FIG. 6 illustrates one example of a videographic display 600 that can be created utilizing the method of the present disclosure with a radiation therapy device as the particular treatment device. It is understood that numerous other types of displays can be created based upon the teachings herein. The videographic display 600 can include a graphic or videographic representation of anatomy 602 of a patient. In the example of FIG. 6, the anatomy 602 includes the lungs, heart, spine and abdomen. The anatomy 602 can also include, for example, a tumor 603 that is being treated.


The videographic display 600 may be two or three dimensional. In one implementation, the multidimensional data in videographic display 600 can be a combination of the data obtained by the processor 106 from the imaging device 104 and treatment device 102. The two dimensional image data for different orientations of the patient can be obtained by processor 106 from the imaging device 104 and treatment device 102, and this two dimensional data can be combined to form the three dimensional data. The combining may be performed using reconstruction algorithms, such as three-dimensional reconstruction, triangulation, and the like. Some such reconstruction algorithms can involve at least one of: matching pixels in different 2D images, sampling a 3D space, determining a consistency function that correlates one or more image parameters (for example, color, anatomical texture, and the like) based on statistical techniques such as variance and covariance, mapping each pixel in each 2D image to a vector in 3D, and like techniques. When the videographic display 600 has more than two dimensions, a button such as 604 can be provided that can be pressed by a user to view the treatment from different angles; alternatively, a swipe on a touch screen could perform the same function. In one implementation, a user can zoom in or out of the multidimensional display with buttons such as 606 and 608.


Real-time contours enclosing different anatomical structures or other areas, such as a region of anatomy targeted for treatment, may be included in the videographic display. Contours may be represented in different colors. For example, the display may include a contour 610 around tumor 603, which may be red in color, while neighboring structures may be enclosed by a blue contour. Various structures may also include internal shading if desired, which may be partially translucent. In the example where the treatment device 102 is a radiation therapy device, the display may include an additional contour 612 representing the location of the planned treatment beam (which typically extends up to some distance outside of the tumor 603).


In one implementation, the videographic display 600 can include an option through which a user can select the contours that the user (for example, a clinician or a user of display 108) may desire to view. For example, if the user desires to view only the contour for the tumor, the user can select just that contour.


Treatment data may be included in videographic display 600. For example, as shown in the FIG. 6 example utilizing a radiation treatment device, beam representations 614 can be included. The beam representations may turn on and off or change shape, angle, or intensity as the video progresses, corresponding to the operation of the actual beams during treatment (as determined by the acquired real-time treatment data). The treatment data can be made translucent so that the real-time image data is still visible. The beam representations may depict the beams turning on and off in line with the planned treatment regimen, or turning on and off as a result of the target tumor 603 moving beyond the contour 612 that represents the location of the planned treatment beam (as may occur as a result of patient motion or normal organ motion). The radiation beam treatment data may be depicted in numerous other ways. Another example representation of treatment data includes an indication of accumulated dose deposition. For example, as dose accumulates over the course of the treatment, the region in which the dose is deposited may turn darker in color.
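
Translucency of the beam representations 614 can be achieved with ordinary alpha blending, as in the sketch below; the blend weight and beam color are illustrative choices, not values from the patent.

```python
# Alpha-blended beam representation 614; blend weight and color are
# illustrative assumptions.
import numpy as np

def overlay_beam(frame_rgb: np.ndarray, beam_mask: np.ndarray,
                 beam_on: bool, alpha: float = 0.35) -> np.ndarray:
    """Tint pixels under beam_mask while the beam is on, leaving the
    underlying real-time image visible through the translucent overlay."""
    out = frame_rgb.astype(float)
    if beam_on:
        beam_color = np.array([255.0, 220.0, 0.0])  # translucent yellow
        out[beam_mask] = (1 - alpha) * out[beam_mask] + alpha * beam_color
    return np.clip(out, 0, 255).astype(np.uint8)
```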


The videographic display 600 may also include an information display 622 for displaying additional data such as a patient identifier (ID), the color of various contours, and additional treatment data. Examples of additional treatment data include total deposited dose, treatment time, beam on time, etc. In some implementations, the displayed amount of deposited dose can update in real time as dose accumulates during the treatment.


Other treatment data can be included in videographic display 600 when other treatment devices are used. For example, when treatment device 102 is a robotic surgery device 306 or surgical scalpel 406, a representation of the scalpel or surgery device can be depicted. Alternatively, a representation of where an incision is taking place can be depicted, or the representation of treatment data may simply be visible from the imaging data itself. The display may also be configured to depict all or a portion of a medical professional performing the treatment. Numerous other types of treatment data or combinations of treatment data and other data can be depicted. In some implementations, the user can be provided an option to select which set of treatment data the user desires to view and also to select whether or not to depict any other data or even the imaging data.


Although a few implementations have been described in detail above, other modifications are possible. For example, the method depicted in FIG. 5 and described herein may not require the particular order shown, or sequential order, to achieve desirable results.


While various implementations in accordance with the disclosed principles have been described above, it should be understood that they have been presented by way of example only, and are not limiting. Thus, the breadth and scope of the invention(s) should not be limited by any of the above-described exemplary implementations, but should be defined only in accordance with the claims and their equivalents issuing from this disclosure. The present disclosure contemplates that the calculations disclosed in the implementations herein may be performed in a number of ways, applying the same concepts taught herein, and that such calculations are equivalent to the implementations disclosed. Furthermore, the above described advantages are not intended to limit the application of any issued claims to processes and structures accomplishing any or all of the advantages.


Additionally, section headings shall not limit or characterize the invention(s) set out in any claims that may issue from this disclosure. Specifically, and by way of example, although the headings refer to a “Technical Field,” such claims should not be limited by the language chosen under this heading to describe the so-called technical field. Further, the description of a technology in the “Background” is not to be construed as an admission that such technology is prior art to any invention(s) in this disclosure. Neither is the “Summary” to be considered as a characterization of the invention(s) set forth in issued claims. Furthermore, any reference to this disclosure in general or use of the word “invention” in the singular is not intended to imply any limitation on the scope of the claims set forth below. Multiple inventions may be set forth according to the limitations of the multiple claims issuing from this disclosure, and such claims accordingly define the invention(s), and their equivalents, that are protected thereby.

Claims
  • 1. A method comprising: emitting one or more radiation beams of ionizing radiation from a radiation beam source controlled by a radiation treatment system to a treatment region according to a planned treatment regimen; obtaining a plurality of real-time images of the treatment region using an imaging device that is a magnetic resonance imaging system while the radiation beam source emits the one or more radiation beams; obtaining real-time treatment data relating to the one or more radiation beams; determining one or more real-time contours of relevant anatomy being irradiated in the plurality of the real-time images using an image processing technique; and providing for a videographic display of one or more of the plurality of the real-time images, the videographic display including: one or more corresponding real-time contours enclosing at least one anatomical structure; and one or more beam representations generated based at least on the real-time treatment data relating to the one or more radiation beams, wherein the one or more beam representations depict the one or more radiation beams turning on or off during progression of the videographic display based on the obtained real-time treatment data.
  • 2. The method of claim 1, wherein the image processing technique is deformable image registration.
  • 3. The method of claim 1, wherein the videographic display is provided on a remote device.
  • 4. The method of claim 1, wherein the videographic display is three dimensional.
  • 5. The method of claim 1, wherein the one or more beam representations change angle during progression of the videographic display based on the obtained real-time treatment data.
  • 6. The method of claim 1, wherein the one or more beam representations change shape during progression of the videographic display based on the obtained real-time treatment data.
  • 7. The method of claim 1, wherein the one or more beam representations are translucent so that the videographic display of the one or more of the plurality of the real-time images is visible through the one or more beam representations.
  • 8. A system comprising: a radiation beam source configured to emit to a treatment region one or more radiation beams of ionizing radiation according to a planned treatment regimen, the radiation beam source further configured to acquire real-time treatment data relating to the one or more radiation beams; a magnetic resonance imaging system configured to acquire a plurality of real-time images of the treatment region while the one or more radiation beams are emitted from the radiation beam source; and a processor programmed to determine one or more real-time contours of relevant anatomy being irradiated in the plurality of the real-time images using an image processing technique and to output, to a display, a videographic display of one or more of the plurality of the real-time images, the videographic display including: one or more corresponding real-time contours enclosing at least one anatomical structure; and one or more beam representations generated based at least on the real-time treatment data relating to the one or more radiation beams, wherein the one or more beam representations depict the one or more radiation beams turning on or off during progression of the videographic display based on the obtained real-time treatment data.
  • 9. The system of claim 8, wherein the processor is further programmed to implement the image processing technique using deformable image registration.
  • 10. The system of claim 8, wherein the display is on a remote device.
  • 11. The system of claim 8, wherein the processor is further programmed to implement a three dimensional videographic display.
  • 12. The system of claim 8, wherein the processor is further programmed to output a depiction of the one or more beam representations: changing angle during progression of the videographic display based on the obtained real-time treatment data; or changing shape during progression of the videographic display based on the obtained real-time treatment data.
  • 13. The system of claim 8, wherein the processor is further programmed to output a depiction of the one or more beam representations being translucent so that the videographic display of the one or more of the plurality of real-time images is visible through the one or more beam representations.
  • 14. A computer program product comprising a non-transitory, machine-readable medium storing instructions which, when executed by at least one programmable processor, cause the at least one programmable processor to perform operations comprising: obtaining a plurality of real-time images of a treatment region as acquired with an imaging device that is a magnetic resonance imaging system while a radiation beam source emits one or more radiation beams; obtaining real-time treatment data relating to the one or more radiation beams; obtaining one or more real-time contours of relevant anatomy being irradiated in the plurality of the real-time images; and providing for a videographic display of one or more of the plurality of the real-time images, the videographic display including: one or more corresponding real-time contours enclosing at least one anatomical structure; and one or more beam representations generated based at least on the real-time treatment data relating to the one or more radiation beams, wherein the one or more beam representations depict the one or more radiation beams turning on or off during progression of the videographic display based on the obtained real-time treatment data.
  • 15. The computer program product of claim 14, the videographic display further including a depiction of the one or more beam representations changing angle during progression of the videographic display based on the obtained real-time treatment data.
  • 16. The computer program product of claim 14, the videographic display further including a depiction of the one or more beam representations changing shape during progression of the videographic display based on the obtained real-time treatment data.
  • 17. The computer program product of claim 14, wherein the videographic display is on a remote device.
  • 18. The computer program product of claim 14, wherein the at least one programmable processor is further programmed to implement a three dimensional videographic display.
  • 19. The computer program product of claim 14, wherein the one or more beam representations are translucent so that the videographic display of the one or more of the plurality of the real-time images is visible through the one or more beam representations.
US Referenced Citations (286)
Number Name Date Kind
3428307 Hunter Feb 1969 A
4019059 Brundin Apr 1977 A
4233662 LeMay Nov 1980 A
4481657 Larsson Nov 1984 A
4589126 Augustsson May 1986 A
4694837 Blakeley Sep 1987 A
4771785 Duer Sep 1988 A
4851778 Kaufman Jul 1989 A
5027818 Bova Jul 1991 A
5039867 Nishihara Aug 1991 A
5117829 Miller Jun 1992 A
5216255 Weidlich Jun 1993 A
5317616 Swerdloff May 1994 A
5327884 Hardy Jul 1994 A
5328681 Kito Jul 1994 A
5332908 Weidlich Jul 1994 A
5351280 Swerdloff Sep 1994 A
5365927 Roemer Nov 1994 A
5373844 Smith Dec 1994 A
5377678 Dumoulin Jan 1995 A
5378989 Barber Jan 1995 A
5391139 Edmundson Feb 1995 A
5412823 Sitta May 1995 A
5442675 Swerdloff Aug 1995 A
5443068 Cline Aug 1995 A
5458125 Schweikard Oct 1995 A
5511549 Legg Apr 1996 A
5513238 Leber Apr 1996 A
5537452 Shepherd Jul 1996 A
5538494 Matsuda Jul 1996 A
5547454 Horn Aug 1996 A
5555283 Shiu Sep 1996 A
5596619 Carol Jan 1997 A
5602892 Llacer Feb 1997 A
5602982 Llacer Feb 1997 A
5647361 Damadian Jul 1997 A
5659281 Pissanetzky Aug 1997 A
5722411 Suzuki Mar 1998 A
5724400 Swerdloff Mar 1998 A
5734384 Yanof Mar 1998 A
5740225 Nabatame Apr 1998 A
5748700 Shepherd May 1998 A
5751781 Brown May 1998 A
5757881 Hughes May 1998 A
5790996 Narfstrom Aug 1998 A
5802136 Carol Sep 1998 A
5815547 Shepherd Sep 1998 A
5851182 Sahadevan Dec 1998 A
5894503 Shepherd Apr 1999 A
5993373 Nonaka Nov 1999 A
6038283 Carol Mar 2000 A
6052430 Siochi Apr 2000 A
6094760 Nonaka Aug 2000 A
6104779 Shepherd Aug 2000 A
6112112 Gilhuijs Aug 2000 A
6125335 Simon Sep 2000 A
6144875 Schweikard Nov 2000 A
6175761 Frandsen Jan 2001 B1
6198957 Green Mar 2001 B1
6207952 Kan Mar 2001 B1
6223067 Vilsmeier Apr 2001 B1
6240162 Hernandez-Guerra May 2001 B1
6260005 Yang Jul 2001 B1
6314159 Siochi Nov 2001 B1
6330300 Siochi Dec 2001 B1
6349129 Siochi Feb 2002 B1
6366798 Green Apr 2002 B2
6381486 Mistretta Apr 2002 B1
6385286 Fitchard May 2002 B1
6385477 Werner May 2002 B1
6393096 Carol May 2002 B1
6405072 Cosman Jun 2002 B1
6411675 Llacer Jun 2002 B1
6414487 Anand Jul 2002 B1
6422748 Shepherd Jul 2002 B1
6424856 Vilsmeier Jul 2002 B1
6466813 Shukla Oct 2002 B1
6487435 Mistretta Nov 2002 B2
6504899 Pugachev Jan 2003 B2
6512813 Krispel Jan 2003 B1
6512942 Burdette et al. Jan 2003 B1
6516046 Frohlich Feb 2003 B1
6526123 Ein-Gal Feb 2003 B2
6527443 Vilsmeier Mar 2003 B1
6542767 McNichols Apr 2003 B1
6546073 Lee Apr 2003 B1
6560311 Shepard May 2003 B1
6564084 Allred May 2003 B2
6570475 Lvovsky May 2003 B1
6584174 Schubert Jun 2003 B2
6594516 Steckner Jul 2003 B1
6600810 Hughes Jul 2003 B1
6609022 Vilsmeier Aug 2003 B2
6611700 Vilsmeier Aug 2003 B1
6618467 Ruchala Sep 2003 B1
6657391 Ding Dec 2003 B2
6661870 Kapatoes Dec 2003 B2
6708054 Shukla Mar 2004 B2
6719683 Frohlich Apr 2004 B2
6724922 Vilsmeier Apr 2004 B1
6728336 Bortfeld Apr 2004 B2
6731970 Schlossbauer May 2004 B2
6735277 McNutt May 2004 B2
6757355 Siochi Jun 2004 B1
6772002 Schmidt Aug 2004 B2
6778850 Adler Aug 2004 B1
6792074 Erbel Sep 2004 B2
6849129 Bilz et al. Feb 2005 B2
6853704 Collins Feb 2005 B2
6859660 Vilsmeier Feb 2005 B2
6862469 Bucholz Mar 2005 B2
6865253 Blumhofer Mar 2005 B2
6865411 Erbel Mar 2005 B2
6879714 Hutter Apr 2005 B2
6885886 Bauch Apr 2005 B2
6891375 Goto May 2005 B2
6891924 Yoda May 2005 B1
6898456 Erbel May 2005 B2
6915005 Ruchala Jul 2005 B1
6937696 Mostafavi Aug 2005 B1
6947582 Vilsmeier Sep 2005 B1
6965847 Wessol Nov 2005 B2
6980679 Jeung Dec 2005 B2
6999555 Morf Feb 2006 B2
7012385 Kulish Mar 2006 B1
7046762 Lee May 2006 B2
7046765 Wong May 2006 B2
7046831 Ruchala May 2006 B2
7050845 Vilsmeier May 2006 B2
7095823 Topolnjak Aug 2006 B2
7096055 Schweikard Aug 2006 B1
7123758 Jeung Oct 2006 B2
7130372 Kusch Oct 2006 B2
7154991 Earnst Dec 2006 B2
7162005 Bjorkholm Jan 2007 B2
7166852 Saracen Jan 2007 B2
7171257 Thomson Jan 2007 B2
7180366 Roos Feb 2007 B2
7191100 Mostafavi Mar 2007 B2
7204640 Fu Apr 2007 B2
7221733 Takai May 2007 B1
7227925 Mansfield Jun 2007 B1
7231075 Raghavan Jun 2007 B2
7231076 Fu Jun 2007 B2
7260426 Schweikard Aug 2007 B2
7266175 Romesberg Sep 2007 B1
7266176 Allison Sep 2007 B2
7289599 Seppi Oct 2007 B2
7298819 Dooley Nov 2007 B2
7302038 Mackie Nov 2007 B2
7315636 Kuduvalli Jan 2008 B2
7317782 Bjorkholm Jan 2008 B2
7318805 Schweikard Jan 2008 B2
7324626 Vilsmeier Jan 2008 B2
7327865 Fu Feb 2008 B2
7366278 Fu Apr 2008 B2
7394081 Okazaki Jul 2008 B2
7403638 Jeung Jul 2008 B2
7412029 Myles Aug 2008 B2
7415095 Wofford Aug 2008 B2
7423273 Clayton Sep 2008 B2
7426318 Fu Sep 2008 B2
7444178 Goldbach Oct 2008 B2
7463823 Birkenbach Dec 2008 B2
7471813 Ulmer Dec 2008 B2
7477776 Lachner Jan 2009 B2
7480399 Fu Jan 2009 B2
7505037 Wang Mar 2009 B2
7505617 Fu Mar 2009 B2
7522779 Fu Apr 2009 B2
7558617 Vilsmeier Jul 2009 B2
7570987 Raabe Aug 2009 B2
7577474 Vilsmeier Aug 2009 B2
7589326 Mollov Sep 2009 B2
7634122 Bertram Dec 2009 B2
7636417 Bjorkholm Dec 2009 B2
7638752 Partain Dec 2009 B2
7657304 Mansfield Feb 2010 B2
7688998 Tuma Mar 2010 B2
7728311 Gall Jun 2010 B2
7741624 Sahadevan Jun 2010 B1
7785358 Lach Aug 2010 B2
7901357 Boctor Mar 2011 B2
7902530 Sahadevan Mar 2011 B1
7907987 Dempsey Mar 2011 B2
7957507 Cadman Jun 2011 B2
8139714 Sahadevan Mar 2012 B1
8190233 Dempsey May 2012 B2
8214010 Courtney Jul 2012 B2
8331531 Fahrig Dec 2012 B2
8460195 Courtney Jun 2013 B2
8812077 Dempsey Aug 2014 B2
8836332 Shvartsman Sep 2014 B2
8983573 Carlone Mar 2015 B2
9114253 Dempsey Aug 2015 B2
20010049475 Bucholz Dec 2001 A1
20020046010 Wessol Apr 2002 A1
20020087101 Barrick Jul 2002 A1
20020091315 Spetz Jul 2002 A1
20020150207 Kapatoes Oct 2002 A1
20020151786 Shukla Oct 2002 A1
20020193685 Mate et al. Dec 2002 A1
20030011451 Katznelson Jan 2003 A1
20030086526 Clark May 2003 A1
20030112922 Burdette et al. Jun 2003 A1
20030155530 Adnani Aug 2003 A1
20030181804 Gagnon Sep 2003 A1
20030219098 McNutt Nov 2003 A1
20040106869 Tepper Jun 2004 A1
20040254448 Amies Dec 2004 A1
20040254773 Zhang Dec 2004 A1
20050020917 Scherch Jan 2005 A1
20050053267 Mostafavi Mar 2005 A1
20050054916 Mostafavi Mar 2005 A1
20050143965 Failla Jun 2005 A1
20050197564 Dempsey Sep 2005 A1
20050201516 Ruchala Sep 2005 A1
20050254623 Kamath Nov 2005 A1
20060058636 Wemple Mar 2006 A1
20060074292 Thomson Apr 2006 A1
20060170679 Wang Aug 2006 A1
20060193441 Cadman Aug 2006 A1
20060280287 Esham Dec 2006 A1
20060291621 Yan Dec 2006 A1
20070003021 Guertin Jan 2007 A1
20070016014 Hara Jan 2007 A1
20070038058 West et al. Feb 2007 A1
20070043286 Lu Feb 2007 A1
20070197908 Ruchala Aug 2007 A1
20070244386 Steckner Oct 2007 A1
20080033287 Schwarze et al. Feb 2008 A1
20080093567 Gall Apr 2008 A1
20080123927 Miga May 2008 A1
20080177138 Courtney Jul 2008 A1
20080208036 Amies Aug 2008 A1
20080235052 Node-Langlois Sep 2008 A1
20080303457 Maltz Dec 2008 A1
20090060130 Wilkens Mar 2009 A1
20090129545 Adler May 2009 A1
20090129659 Deutschmann May 2009 A1
20090149735 Fallone Jun 2009 A1
20090161826 Gertner Jun 2009 A1
20090171184 Jenkins Jul 2009 A1
20090175418 Sakurai Jul 2009 A1
20090264768 Courtney Oct 2009 A1
20100033186 Overweg Feb 2010 A1
20100056900 Whitcomb Mar 2010 A1
20100113911 Dempsey May 2010 A1
20100119032 Yan et al. May 2010 A1
20100239066 Fahrig Sep 2010 A1
20100312095 Jenkins Dec 2010 A1
20110012593 Shvartsman Jan 2011 A1
20110051893 McNutt Mar 2011 A1
20110118588 Kornblau May 2011 A1
20110121832 Shvartsman May 2011 A1
20110218420 Carlone Sep 2011 A1
20110237859 Kuhn Sep 2011 A1
20110241684 Dempsey Oct 2011 A1
20110284757 Butuceanu Nov 2011 A1
20120022363 Dempsey Jan 2012 A1
20120070056 Krueger Mar 2012 A1
20120150017 Yamaya et al. Jun 2012 A1
20120165652 Dempsey Jun 2012 A1
20120253172 Loeffler Oct 2012 A1
20130066135 Rosa Mar 2013 A1
20130086163 Neff Apr 2013 A1
20130090549 Meltsner Apr 2013 A1
20130245425 Dempsey Sep 2013 A1
20130296687 Dempsey Nov 2013 A1
20130345556 Courtney Dec 2013 A1
20140003023 Weibler Jan 2014 A1
20140112453 Prince Apr 2014 A1
20140121495 Dempsey May 2014 A1
20140135615 Krulp May 2014 A1
20140263990 Kawrykow Sep 2014 A1
20140266206 Dempsey Sep 2014 A1
20140266208 Dempsey Sep 2014 A1
20140275963 Shvartsman Sep 2014 A1
20140330108 Dempsey Nov 2014 A1
20140336442 Keppel Nov 2014 A1
20140347053 Dempsey Nov 2014 A1
20150065860 Shvartsman Mar 2015 A1
20150077118 Shvartsman Mar 2015 A1
20150154756 Gerganov Jun 2015 A1
20150165233 Dempsey Jun 2015 A1
20150185300 Shvartsman Jul 2015 A1
Foreign Referenced Citations (44)
Number Date Country
1612713 May 2005 CN
1669599 Sep 2005 CN
1946339 Apr 2007 CN
101000689 Jul 2007 CN
101267858 Sep 2008 CN
101268474 Sep 2008 CN
101278361 Oct 2008 CN
101443819 May 2009 CN
102369529 Mar 2012 CN
102472830 May 2012 CN
102641561 Aug 2012 CN
3828639 Mar 1989 DE
2359905 Aug 2011 EP
2424430 Jan 2013 EP
2839894 Nov 2003 FR
2393373 Mar 2004 GB
63-294839 Dec 1988 JP
2001517132 Oct 2001 JP
2002186676 Jul 2002 JP
2002522129 Jul 2002 JP
2005103295 Apr 2005 JP
2007526036 Sep 2007 JP
2009501043 Jan 2009 JP
2009511222 Mar 2009 JP
2009160309 Jul 2009 JP
2009538195 Nov 2009 JP
2010269067 Dec 2010 JP
9932189 Jul 1999 WO
02072190 Sep 2002 WO
WO-03008986 Jan 2003 WO
2004024235 Mar 2004 WO
2005081842 Sep 2005 WO
2006007277 Jan 2006 WO
2006097274 Sep 2006 WO
WO-2007007276 Jan 2007 WO
2007014106 Feb 2007 WO
2007045076 Apr 2007 WO
2007126842 Nov 2007 WO
WO-2008013598 Jan 2008 WO
2009155700 Dec 2009 WO
WO-2010103644 Sep 2010 WO
2010113050 Oct 2010 WO
2011008969 Jan 2011 WO
2012164527 Dec 2012 WO
Non-Patent Literature Citations (48)
Entry
English machine translation of JP63-294839 (1988), as provided by the Japanese Patent Office.
Hong et al. “Interventional navigation for abdominal therapy based on simultaneous use of MRI and ultrasound.” Medical and Biological Engineering and Computing. (2006). 44(12):1127-1134.
Partial International Search Report issued in International Application No. PCT/US2013/039009, dated Oct. 18, 2013.
Lagendijk J. J. et al. “MRI guided radiotherapy: A MRI based linear accelerator.” Radiotherapy & Oncology. vol. 56, No. Supplement 1. Sep. 2000. (Sep. 2000):S60-S61. XP008012866. 19th Annual Meeting of the European Society for Therapeutic Radiology and Oncology. Istanbul, Turkey. Sep. 19-23, 2000.
Balter, James M., et al. ‘Accuracy of a Wireless Localization System for Radiotherapy’ Int. J. Radiation Oncology Biol. Phys., vol. 61, No. 3, pp. 933-937, Nov. 1, 2004, Elsevier Inc., USA.
Baro, J et al. ‘Penelope: An algorithm for Monte Carlo simulation of the penetration and energy loss of electrons and positrons in matter’ Nuclear Instruments and Methods in Physics Research B 100 (1995) 31-46, Elsevier Science B.V.
Bernier, Jacques et al. ‘Radiation oncology: a century of achievements’ Nature Reviews—Cancer, vol. 4, Sep. 2004. pp. 737-747.
Buchanan, Roger ‘Cobalt on the way out’ British Medical Journal, vol. 292, Feb. 1, 1986. p. 290.
Chng, N. et al. ‘Development of inverse planning and limited angle CT reconstruction for cobalt-60 tomotherapy’ Proceedings of 51st Annual Meeting of Canadian Organization of Medical Physicists and the Canadian College of Physicists in Medicine, 2005, McMaster University, Hamilton Ontario. Medical Physics, 2005, pp. 2426, Abstract Only.
De Poorter J. et al. ‘Noninvasive MRI Thermometry with the Proton Resonance Frequency (PRF) Method: In Vivo Results in Human Muscle.’ Magnetic Resonance in Medicine. Academic Press, Duluth, vol. 33, No. 1, Jan. 1995. pp. 74-81. XP000482971.
EP App. No. 10195476.6; Extended EP Search Report dated Jul. 4, 2011.
EP App. No. 10800553.9; Extended EP Search Report dated Oct. 17, 2013.
Extended European Search Report in European Patent Application No. EP11850577, dated Jul. 9, 2014.
Goitein, Michael. ‘Organ and Tumor Motion: An Overview.’ Seminars in Radiation Oncology. vol. 14, No. 1 Jan. 2004: pp. 2-9.
Goldberg, S. Nahum; G. Scott Gazelle, and Peter R. Mueller. ‘Thermal Ablation Therapy for Focal Malignancy: A Unified Approach to Underlying Principles, Techniques, and Diagnostic Imaging Guidance’ Amer. J. of Roentgenology, vol. 174, Feb. 2000 pp. 323-331 XP002431995.
Hajdok, George. ‘An Investigation of Megavoltage Computed Tomography Using a Radioactive Cobalt-60 Gamma Ray Source for Radiation Therapy Treatment Verification.’ Thesis. May 2002. 150 pages.
International Search Report and Written Opinion dated Apr. 13, 2012, for corresponding international application No. PCT/US2011/066605.
Jaffray, David A., et al. ‘Flat-Panel Cone Beam Computed Tomography for Image-Guided Radiation Therapy’ Int. J. Radiation Oncology Biol. Phys., vol. 53, No. 5, pp. 1337-1349, Apr. 3, 2002, Elsevier Science Inc., USA.
Jursinic, Paul et al. ‘Characteristics of secondary electrons produced by 6, 10 and 24 MV x-ray beams’ Phys. Med. Biol. 41 (1996) 1499-1509, United Kingdom.
Khan, Faiz M., ‘The Physics of Radiation Therapy (second edition)’, Lippincott Williams & Wilkins. Chapter 13. 1985. pp. 323-332.
Lagendijk et al., ‘MRI/linac integration’, Radiotherapy and Oncology, Elsevier, Ireland, (Nov. 26, 2007), vol. 86, No. 1, doi:10.1016/J.RADONC.2007.10.034, ISSN 0167-8140, pp. 25-29, XP022423061, Year 2008.
Langen, K.M. et al. ‘Organ Motion and its Management.’ Int J. Radiation Oncology Biol. Phys., vol. 50, No. 1, pp. 265-278. 2001. Elsevier Science Inc., USA.
Liang, J. and D. Yan. ‘Reducing Uncertainties in Volumetric Image Based Deformable Organ Registration.’ Medical Physics, vol. 30, No. 8, 2003, pp. 2116-2122.
Lopez, Mike R. et al. ‘Relativistic Magnetron Driven by a Microsecond E-Beam Accelerator with a Ceramic Insulator’ IEEE Transactions on Plasma Science vol. 32, No. 3, Jun. 2004. 10 pages.
Lurie, D.J., PhD. ‘Free radical imaging’ The British Journal of Radiology. 74 (2001). pp. 782-784.
Macura, Katarzyna J., MD, PhD. ‘Advancements in Magnetic Resonance-Guided Robotic Interventions in the Prostate’. Top Magn Reson Imaging. vol. 19, No. 6. Dec. 2008. pp. 297-304.
Mah et al., “Measurement of intrafractional prostate motion using magnetic resonance imaging,” Int. J. Radiation Oncology Biol. Phys., vol. 54, No. 2, pp. 568-575, 2002.
Medtronic, Inc. ‘Image-Guided Surgery Overview’. 2010.
Mozer, Pierre C, MD, PhD. ‘Robotic Image-Guided Needle Interventions of the Prostate’. Reviews in Urology. vol. 11, No. 1. 2009. pp. 7-15.
Muntener, Michael, MD et al. ‘Transperineal Prostate Intervention: Robot for fully Automated MR Imaging—System Description and Proof of Principle in a Canine Model’. Radiology. vol. 247, No. 2. May 2008. pp. 543-549.
Overweg et al. ‘System for MRI guided Radiotherapy.’ Proc. Intl. Soc. Mag. Reson. Med. 17(2009):594.
Patriciu, Alexandru, et al., ‘Automatic Brachytherapy Seed Placement Under MRI Guidance’. IEEE Transactions on Biomedical Engineering. vol. 54, No. 8. Aug. 2007. pp. 1-8.
PCT App. No. PCT/US2010/042156; International Search Report and Written Opinion dated Sep. 10, 2010, dated Sep. 14, 2010.
Raaijmakers, A.J.E. et al. ‘Integrating a MRI scanner with a 6 MV radiotherapy accelerator: dose increase at tissue-air interfaces in a lateral magnetic field due to returning electrons.’ Phys. Med. Biol. 50 (2005) pp. 1363-1376.
Raaymakers, B.W. et al. ‘Integrating a MRI scanner with a 6 MV radiotherapy accelerator: dose deposition in a transverse magnetic field’, Phys. Med. Biol. 49 (2004) 4109-4118.
Schreiner, John; Kerr, Andrew; Salomons, Greg; Dyck, Christine, and Hajdok, George, ‘The Potential for Image Guided Radiation Therapy with Cobalt-60 Tomotherapy’, MICCAI 2003, LNCS 2879, pp. 449-456, 2003.
Schreiner, L. John, et al. ‘The role of Cobalt-60 in modern radiation therapy: Dose delivery and image guidance’. Journal of Medical Physics, vol. 34, No. 3, 2009, 133-136.
Sempau, Josep et al. ‘DPM, a fast, accurate Monte Carlo code optimized for photon and electron radiotherapy treatment planning dose calculations.’ Phys. Med. Biol. 45 (2000) pp. 2263-2291, Printed in the UK.
Sherouse, George W. et al. ‘Virtual Simulation in the Clinical Setting: Some Practical Considerations’, Int. J. Radiation Oncology Biol. Phys. vol. 19, pp. 1059-1065, Apr. 26, 1990, Pergamon Press, USA.
St. Aubin et al., ‘Magnetic decoupling on the linac in a low field biplanar linac-MR system’, Med. Phys. 37 (9), Sep. 2010, pp. 4755-4761.
Stoianovici, Dan, et al. MRI Stealth Robot for Prostate Interventions. Minimally Invasive Therapy. 2007. pp. 241-248.
Tamada and Kose. ‘Two-Dimensional Compressed Sensing Using the Cross-sampling Approach for Low-Field MRI Systems.’ IEEE Transactions on Medical Imaging. vol. 33, No. 9. Sep. 2014. pp. 1905-1912.
Tokuda, J. et al. ‘Real-Time Organ Motion Tracking and Fast Image Registration System for MRI-Guided Surgery.’ Systems and Computers in Japan Scripta Technica USA. vol. 37, No. 1. Jan. 2006: 83-92. Database Inspec [Online]. The Institution of Electrical Engineers, Stevenage, GB; Jan. 2006.
Tokuda, Junichi; Morikawa, Shigehiro; Dohi, Takeyoshi; Hata, Nobuhiko; Motion Tracking in MR-Guided Liver Therapy by Using Navigator Echoes and Projection Profile Matching, 2004. vol. 11. No. 1. pp. 111-120.
Warrington, Jim et al. ‘Cobalt 60 Teletherapy for Cancer. A Revived Treatment Modality for the 21st Century’, 2002 The Institution of Electrical Engineers, pp. 19/1-19/19.
Wazer, David E. et al. ‘Principles and Practice of Radiation Oncology (fifth edition).’, Wolters Kluwer/Lippincott Williams & Wilkins. 2008. 2 pages.
Webb, S. ‘The physical basis of IMRT and inverse planning’ The British Journal of Radiology, 76 (2003), 678-689, 2003 The British Institute of Radiology.
Webb, Steve, ‘Intensity-modulated radiation therapy using only jaws and a mask: II. A simplified concept of relocatable single-bixel attenuators’, published May 22, 2002, Institute of Physics Publishing, Physics in Medicine and Biology, Phys. Med. Biol. 47 (2002) 1869-1879.
Related Publications (1)
Number Date Country
20130296687 A1 Nov 2013 US