This disclosure relates to devices and methods for acquiring image data of a patient within a bore of a radiation therapy system. Acquired patient images may be used to determine and/or monitor the location and/or position of a patient inside and outside the bore.
During a radiation treatment session, a patient is placed on a patient platform which moves the patient into a beam path or treatment plane of a therapeutic radiation source of a radiation therapy system. Radiation is applied to the patient in a prescribed manner based on a treatment plan. In many radiation therapy systems, both the patient platform and the therapeutic radiation source can move relative to each other so that radiation can be delivered to tumors while limiting radiation delivery to surrounding healthy tissue.
While the movement of the patient platform and/or other radiation therapy system components may be calculated or predicted in advance, patient movements are often difficult to predict. For example, patients may change their position on the platform voluntarily or involuntarily (e.g., breathing, fidgeting, shifting due to discomfort, etc.). Because treatment plans are calculated based on assumed patient position(s), location(s), and/or patient geometry (e.g., size, shape, etc.), patient movement can impact the quality and efficacy of a treatment session. For example, if radiation is applied without accounting for changes in patient position, location, and/or geometry relative to the therapeutic radiation source, radiation may be delivered to non-target regions and target regions may not receive the radiation dose prescribed by the treatment plan.
Typically, the position and/or location of a patient in a radiation therapy system is monitored using one or more cameras mounted on the walls or ceilings of the treatment system bunker. Images from these cameras may be viewed by a practitioner to identify any large changes in position and/or location. However, depending on the geometry of the radiation therapy system, views of the patient may be obstructed. For example, robotic arms, a partial-ring gantry, or a full-ring gantry may prevent a camera from acquiring a clear view of the patient. Some systems include a camera that is mounted at the foot of the patient platform. In some radiation therapy systems, the therapeutic radiation source may be located within a bore, and as the patient is moved deeper into the bore, ceiling- and/or wall-mounted cameras are less able to provide a clear view of the patient. As such, improvements to monitoring changes in patient position and/or location are desirable.
Disclosed herein are systems and methods for real-time monitoring of patient position and/or location during a radiation treatment session. Images acquired of a patient during a treatment session may be used to calculate the patient's position and/or location with respect to the components of the radiation therapy system. Depending on the characteristics of the changes in patient position and/or location, the radiation therapy system may determine whether to adjust radiation delivery. Real-time monitoring of patient position and/or location may confirm that the patient is correctly positioned (e.g., set up) at the start of a treatment session and throughout the duration of the treatment session. Patient position and/or location information calculated from image data acquired during the treatment session may also help prevent collisions between the patient and components of the radiation therapy system. Some of the radiation therapy systems described herein may comprise a controller that is configured to generate a patient 3-D model (e.g., full or partial 3-D model) and/or patient 2-D surface throughout the duration of a treatment session based on the images acquired during the treatment session. While the examples described herein may refer to generating a 3-D patient model, it should be understood that similar methods and/or devices may be used to generate a 2-D patient surface.
One variation of a radiation therapy system may comprise a rotatable gantry having a longitudinal bore therethrough, and a patient-monitoring imaging system comprising one or more sensors (such as a camera or any image detector) that may be attached to the rotatable gantry within the bore. The sensors may acquire image data as the gantry rotates and may generate a 360° view of the patient. Image data from the one or more sensors may be used to calculate the current location of a patient relative to radiation therapy system components and to compare the current location of a patient with previous patient locations (e.g., from previous scans or sessions, from earlier in the same treatment session, etc.). For example, image data may be used to register the patient in the same coordinate system as the radiation therapy system. Image data may also be used to detect undesirable patient movement during a radiation treatment session, diagnostic CT scan, or localization CT scan (e.g., any one or more imaging modalities used to acquire one or more treatment planning images). Patient position and/or location data acquired by the rotating sensors that are located within the bore region of the radiation therapy system may be used to confirm patient position before and during radiation treatment sessions to help reduce setup variations and facilitate delivery of the prescribed radiation dose to tumor regions. Patient position and/or location data may optionally be used to help guide movement of the patient platform and/or generate a notification of any possible patient-system collisions.
One variation of a radiation therapy system may comprise a circular gantry comprising a stationary frame and a rotatable ring coupled to the stationary frame, a therapeutic radiation source coupled to the rotatable ring, an optical imaging system coupled to the rotatable ring, wherein the optical imaging system is configured to acquire images of a patient area during rotation of the ring, and a controller in communication with the optical imaging system, wherein the controller is configured to process the acquired images to calculate information relating to the radiation therapy system. The optical imaging system may comprise an image sensor having a field-of-view. The image sensor may be located adjacent to the therapeutic radiation source. Alternatively or additionally, the optical imaging system may further comprise an emission source located on the rotatable ring and configured to illuminate the field-of-view of the image sensor. The therapeutic radiation source may have a treatment plane defined by its field-of-view and the optical imaging system may have an imaging plane defined by its field-of-view, and the treatment plane may be coplanar with the imaging plane or may be distinct from the imaging plane. The optical imaging system may comprise a laser profiler, and/or a time-of-flight reflectometer, and/or a monoscopic or stereoscopic camera. In some variations, the optical imaging system has a field-of-view that includes the patient area that is outside of the bore. The optical imaging system may be located within the bore and may have a field-of-view that includes the patient area that is inside the bore.
In some variations, the rotatable ring may be configured to rotate at a rate of at least about 10 RPM, and may be configured to rotate at a rate of at least about 60 RPM and even up to 200 RPM. The rotatable ring may define a longitudinal bore and the optical imaging system may be located within the bore. The system may further comprise a housing disposed over the circular gantry, where the housing may comprise a longitudinal lumen that corresponds with the bore and may comprise a light-transmitting region along an inner surface of the housing that corresponds with a field-of-view of the optical imaging system. The light-transmitting region may be a strip along an inner circumference of the bore and/or may comprise two or more windows located along an inner circumference of the bore. The system may comprise a patient platform that is movable within the patient area.
In some variations, the controller may be configured to generate a 3-D model of a patient within the patient area at a specified time interval using images acquired at the specified time interval. The controller may be configured to compare the 3-D model of the patient at the specified time interval and a 3-D model of the patient at an earlier time interval to identify a positional difference in the 3-D models. For example, the identified positional difference in the 3-D models may correspond with the patient's respiratory cycle and optionally, the controller may be configured to identify the portion of the respiratory cycle with which the identified difference corresponds and may cease activation of the therapeutic radiation source if the identified portion of the respiratory cycle does not match a predetermined activation portion of the respiratory cycle. Alternatively or additionally, the controller may be configured to compare the identified difference in the 3-D models with a predetermined movement threshold and to cease activation of the therapeutic radiation source if the identified difference exceeds the movement threshold. The controller may be configured to identify surface points on the patient 3-D model and to track the position of the identified surface points over time. The controller may be configured to generate a 3-D model of the patient platform and to compare the generated 3-D model of the patient platform with a reference 3-D model of the patient platform to calculate a deviation from the reference 3-D model. The reference 3-D model of the patient platform may be a 3-D model of the patient platform generated before the patient is loaded on the platform. The controller may adjust the position of the patient platform to compensate for the calculated deviation. Optionally, the controller may be configured to generate a surface model of the patient at a specified time interval using images acquired at the specified time interval.
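As a non-limiting illustration, the comparison of successive 3-D patient models against a movement threshold might be sketched as follows; the point-cloud representation with point-to-point correspondence, the function names, and the millimeter units are assumptions made for illustration only and are not prescribed by this disclosure.

```python
import numpy as np

def max_surface_displacement(model_prev: np.ndarray, model_curr: np.ndarray) -> float:
    """Largest per-point displacement between two 3-D patient models,
    assuming both are N x 3 arrays of corresponding surface points."""
    return float(np.linalg.norm(model_curr - model_prev, axis=1).max())

def should_cease_radiation(model_prev: np.ndarray, model_curr: np.ndarray,
                           movement_threshold_mm: float) -> bool:
    """Cease activation of the therapeutic radiation source if the identified
    positional difference exceeds the predetermined movement threshold."""
    return max_surface_displacement(model_prev, model_curr) > movement_threshold_mm
```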
In some variations, the controller may be configured to compare the identified difference in the 3-D models with a motion envelope and to generate a notification if the identified difference exceeds the motion envelope. The motion envelope may be determined based on whether the therapeutic radiation source is emitting radiation, and the motion envelope may have a first range of motion when the therapeutic radiation source is not emitting radiation and a second range of motion when the therapeutic radiation source is emitting radiation, e.g., the first range of motion may be greater than the second range of motion. Alternatively or additionally, the motion envelope may be determined based on whether a patient platform movable within the patient region is moving or stationary, and the motion envelope may have a first range of motion when the patient platform is moving and a second range of motion when the patient platform is stationary, e.g., the first range of motion may be greater than the second range of motion.
In some variations, the controller may be configured to calculate a range of motion of a patient located on the patient platform based on the acquired image data. Some systems may comprise a display and the controller may be configured to output a graphic representing the calculated range of motion to the display. The controller may be further configured to compare the calculated range of motion with a motion envelope and to generate a notification if the calculated range of motion exceeds the motion envelope. For example, the motion envelope may be determined based on whether the therapeutic radiation source is emitting radiation, and the motion envelope has a first range of motion when the therapeutic radiation source is not emitting radiation and has a second range of motion when the therapeutic radiation source is emitting radiation, e.g., the first range of motion may be greater than the second range of motion. The motion envelope may be determined based on whether the patient platform is moving or stationary, and the motion envelope has a first range of motion when the patient platform is moving and has a second range of motion when the patient platform is stationary, e.g., the first range of motion may be greater than the second range of motion.
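One way the state-dependent motion envelope described above might be selected is sketched below; the specific ranges of motion are placeholder values and would, in practice, be set on a per-patient, per-clinician, and/or per-clinic basis.

```python
def select_motion_envelope_mm(beam_on: bool, platform_moving: bool) -> float:
    """Return an allowable range of motion (mm). Per the description, the
    envelope is tighter while the therapeutic radiation source is emitting
    and tighter while the patient platform is stationary."""
    base = 3.0 if beam_on else 10.0              # placeholder values
    return base * (1.5 if platform_moving else 1.0)

def motion_notification(calculated_range_of_motion_mm: float,
                        beam_on: bool, platform_moving: bool) -> bool:
    """True if a notification should be generated."""
    envelope = select_motion_envelope_mm(beam_on, platform_moving)
    return calculated_range_of_motion_mm > envelope
```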
In some variations, the controller is configured to process the acquired images to calculate information relating to the radiation therapy system, and the information may comprise positional information of a patient within the patient area. The system may comprise a patient platform that is movable within the patient area, and the controller may be configured to calculate a difference between positional information of the patient at a specified time point and reference positional information of the patient, and calculate a set of patient platform adjustments that compensate for the patient positional information difference. The controller may be further configured to move the patient platform in accordance with the calculated set of patient platform adjustments. The controller may be configured to compare positional information of the patient and positional information of the rotatable ring and to generate a notification if an overlap is detected between patient positional information and rotatable ring positional information. In some variations, the information relating to the radiation therapy system comprises positional information of the patient platform and/or positional information of a patient positioning device located in the patient area.
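A minimal sketch of a platform adjustment that compensates for a patient positional difference is given below; it assumes the positional information can be reduced to a rigid 3-D offset (e.g., a centroid shift), which is only one of many possible representations.

```python
import numpy as np

def platform_adjustment(patient_position: np.ndarray,
                        reference_position: np.ndarray) -> np.ndarray:
    """Translation (e.g., along IEC-X, -Y, -Z) that moves the platform so the
    patient returns toward the reference position, assuming a rigid shift."""
    difference = patient_position - reference_position
    return -difference  # move the platform opposite to the patient's drift
```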
Some variations of a patient and/or system monitoring imaging system may comprise an optical detector comprising a plurality of sensor pixels, and the controller may be configured to calibrate the optical detector by identifying faulty sensor pixels and to generate a notification when a faulty sensor pixel is identified. Identifying faulty sensor pixels may comprise recording output signals from each sensor pixel, calculating a difference between the output signal of each sensor pixel and its respective expected output signal, and tagging a sensor pixel as faulty if its calculated difference exceeds its respective pixel error threshold. The pixel error threshold for each sensor pixel may include a maximum acceptable pixel intensity difference. Calculating a difference between the output signal of each sensor pixel and its respective expected output signal may comprise comparing the noise level of the output signal of each sensor pixel with a threshold noise level, and the pixel error threshold may include a maximum acceptable pixel noise level. The controller may be further configured to substitute the output signal of an identified faulty sensor pixel with a signal derived by interpolating output signals of non-faulty sensor pixels located within a predetermined distance from the identified faulty sensor pixel. Optionally, any of the systems described herein may comprise a calibration phantom with predetermined dimensions and having one or more surface contour indicia with predetermined 3-D geometric markings having predetermined width, height, and length values. For example, at least one of the patient platform, a portion of the patient area, and a calibration phantom may have at least one calibration reference marking that has one or more predetermined width, and/or height, and/or length values.
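The faulty-pixel identification and substitution described above might be sketched as follows; the per-pixel intensity-difference threshold and the simple neighborhood mean used as the interpolation are illustrative assumptions.

```python
import numpy as np

def find_faulty_pixels(measured: np.ndarray, expected: np.ndarray,
                       error_threshold: np.ndarray) -> np.ndarray:
    """Tag a sensor pixel as faulty if the difference between its output and
    its expected output exceeds its respective pixel error threshold."""
    return np.abs(measured - expected) > error_threshold

def substitute_faulty_pixels(image: np.ndarray, faulty: np.ndarray,
                             window: int = 1) -> np.ndarray:
    """Replace each faulty pixel with the mean of non-faulty neighbors within
    `window` pixels (a simple stand-in for the interpolation described above)."""
    repaired = image.astype(float).copy()
    rows, cols = image.shape
    for r, c in zip(*np.where(faulty)):
        r0, r1 = max(0, r - window), min(rows, r + window + 1)
        c0, c1 = max(0, c - window), min(cols, c + window + 1)
        patch, good = image[r0:r1, c0:c1], ~faulty[r0:r1, c0:c1]
        if good.any():
            repaired[r, c] = patch[good].mean()
    return repaired
```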
Alternatively or additionally, the controller may be configured to calculate a pixel degradation rate for each sensor pixel by calculating changes in the calculated difference of each sensor pixel output signal over a specified interval of time, and to generate a detector replacement notification if the pixel degradation rate exceeds a threshold degradation rate. For example, calculating the pixel degradation rate may comprise generating a regression model using the calculated changes of each sensor pixel over the specified interval of time to generate a prediction of a time duration when each sensor pixel will exceed its respective pixel error threshold.
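A degradation-rate estimate of the kind described above might be sketched with a simple linear regression over each pixel's error history; the linear model and the variable names are assumptions for illustration.

```python
import numpy as np

def pixel_degradation_rate(times: np.ndarray, errors: np.ndarray) -> float:
    """Slope of a linear fit to a pixel's error history (error per unit time)."""
    slope, _ = np.polyfit(times, errors, 1)
    return float(slope)

def predicted_failure_time(times: np.ndarray, errors: np.ndarray,
                           pixel_error_threshold: float) -> float:
    """Predicted time at which the pixel error will exceed its threshold
    (infinity if the error is not trending upward)."""
    slope, intercept = np.polyfit(times, errors, 1)
    return (pixel_error_threshold - intercept) / slope if slope > 0 else float("inf")

def detector_replacement_notification(times: np.ndarray, errors: np.ndarray,
                                      threshold_degradation_rate: float) -> bool:
    """True if the pixel degradation rate exceeds the threshold degradation rate."""
    return pixel_degradation_rate(times, errors) > threshold_degradation_rate
```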
Also described herein are methods for generating a 3-D patient model. One variation of a method may comprise positioning a patient on a patient platform of a radiation therapy system, the radiation therapy system further comprising a circular gantry having a rotatable ring about a bore, a therapeutic radiation source coupled to the rotatable ring, and an optical imaging system coupled to the rotatable ring inside the bore, acquiring 2-D images of the patient and the patient platform using the optical imaging system while rotating the optical imaging system, generating a 3-D model of the patient and the patient platform using the acquired 2-D images, and generating a 3-D model of the patient by subtracting a pre-calculated 3-D model of the patient platform from the generated 3-D model of the patient and the patient platform. The patient platform may be movable within a patient area that is at least partly located within the bore. Acquiring 2-D images may comprise acquiring 2-D images during at least one rotation of the ring about the patient area and may optionally comprise acquiring 2-D images at four or more pre-defined locations about the patient area. The 3-D model of the patient platform may be calculated based on position sensor data and patient platform geometry. Generating a 3-D model of the patient and the patient platform may comprise depth-sensing methods, such as stereovision methods, and/or laser scanning or triangulation methods, and/or time-of-flight methods, and/or projected light methods.
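As one non-limiting way to perform the subtraction step above, the combined reconstruction and the pre-calculated platform model could be represented as boolean occupancy grids on a shared voxel lattice; that representation is an assumption made for illustration only.

```python
import numpy as np

def patient_model(patient_and_platform: np.ndarray,
                  platform_model: np.ndarray) -> np.ndarray:
    """Subtract the pre-calculated 3-D model of the patient platform from the
    combined patient-and-platform reconstruction, leaving only the patient
    (and any accessories resting on the platform). Both inputs are boolean
    voxel occupancy grids defined on the same lattice."""
    return np.logical_and(patient_and_platform, np.logical_not(platform_model))
```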
Also disclosed herein are methods for generating a collision notification for a radiation therapy system. One variation of a method may comprise deriving a 3-D model of a rotatable ring of a radiation therapy system based on positional sensor data, generating a 3-D patient model using any of the methods described above, comparing the generated 3-D patient model with the 3-D model of the rotatable ring to determine a location of the patient relative to the rotatable ring, and if the generated 3-D patient model is located within a pre-determined safety margin of the 3-D model of the rotatable ring, generating a notification to alert an operator of the radiation therapy system. The method may further comprise generating a notification to stop motion of the patient platform if the generated 3-D patient model is located within a pre-determined collision margin of the 3-D model of the rotatable ring. The generated 3-D patient model may be a partial 3-D patient model that includes patient position data acquired from a subset of circumferential locations about the patient area.
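A brute-force sketch of the margin check described above is shown below; the point-array representation of the patient and ring models and the margin values are illustrative assumptions, and a practical implementation would likely use a spatial index rather than an all-pairs distance computation.

```python
import numpy as np

def min_patient_to_ring_distance(patient_pts: np.ndarray, ring_pts: np.ndarray) -> float:
    """Minimum distance between the 3-D patient model and the 3-D model of the
    rotatable ring, each given as an N x 3 array of points."""
    diffs = patient_pts[:, None, :] - ring_pts[None, :, :]
    return float(np.sqrt((diffs ** 2).sum(axis=2)).min())

def collision_check(patient_pts: np.ndarray, ring_pts: np.ndarray,
                    safety_margin_mm: float = 50.0,
                    collision_margin_mm: float = 10.0) -> str:
    """Return the action suggested by the margin comparison."""
    d = min_patient_to_ring_distance(patient_pts, ring_pts)
    if d < collision_margin_mm:
        return "stop platform motion"
    if d < safety_margin_mm:
        return "notify operator"
    return "clear"
```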
Described herein are systems and methods for monitoring the position and/or location of a patient inside and/or outside a bore of a radiation therapy system. The systems and methods disclosed herein may also be used for calculating information relating to the radiation therapy system. A radiation therapy system may comprise a gantry rotatable about a bore having an interior volume, a therapeutic radiation source coupled to the rotatable gantry, a patient platform that is movable within the interior volume of the bore, and a patient-monitoring imaging system coupled to the rotatable gantry such that the imaging system has a field-of-view that includes at least a portion of the volume of the bore and acquires image data as the gantry rotates. The patient-monitoring imaging system may comprise an optical imaging device such as a camera. The sensors or camera of the patient-monitoring imaging system may be mounted on the rotatable gantry in a bore region of the radiation therapy system. In some variations, the entire field-of-view of the patient-monitoring imaging system may be within the bore volume, while in other variations, the field-of-view of the imaging system may include both the interior volume of the bore and a region located outside of the bore. Alternatively, the field-of-view of the patient-monitoring imaging system may include only a region outside of the bore. A radiation therapy system may also comprise a housing disposed over the rotatable gantry such that the volume of the bore is bounded by an exterior portion of the housing. The image data acquired by the imaging system may be used to calculate information relating to the radiation therapy system. Examples of information relating to the radiation therapy system and/or information that may be calculated based on image data acquired by the monitoring imaging system may include, but are not limited to, patient positioning device position relative to the therapy system components, patient position and/or motion relative to the therapy system components, patient platform position, and the like.
Some radiation therapy systems may comprise a non-rotating patient-monitoring imaging system. The stationary patient-monitoring imaging system may be coupled to the exterior portion of the housing instead of the rotatable gantry on which the therapeutic radiation source is mounted. For example, the imaging system may be located on the exterior portion of the housing at the edge of the bore and have a field-of-view that includes a region outside of the bore (e.g., at the top or 12:00 position) and/or the bore volume. The stationary patient-monitoring imaging system may be located on the exterior portion of the housing within the bore (or along the length of the bore) such that the imaging system has a field-of-view that includes the bore volume. For the purposes of this document, a patient-monitoring imaging system with a sensor located in the bore and/or rotatable about the bore volume may be referred to as an in-bore imaging system.
Patient-monitoring imaging systems for any of the radiation therapy systems described herein may comprise imaging devices of different imaging modalities. In some variations, the imaging system may comprise an optical imaging device, such as one or more light emitters (e.g., emitting light in the visible spectrum, infrared wavelength range, and/or UV wavelength range), such as scanning lasers (e.g., laser profiler), and/or time-of-flight reflectometers, and/or monoscopic or stereoscopic cameras, optical detectors having one or more CCD sensors and/or CMOS sensors, and the like. Alternatively or additionally, imaging systems may comprise ultrasound systems, and/or X-ray systems, and/or CT systems, and/or terahertz and backscatter X-ray imaging systems, and the like. Other types of imaging devices that may be included in any of the patient-monitoring imaging systems described herein include infrared cameras, video cameras with speckle patterns, radiofrequency (RF) tracking systems comprising a transponder and receiver, projection cameras, ultrasonic transmitter (e.g., an ultrasonic transmitter array) and ultrasonic range sensor, a laser displacement sensor having an emitter and a sensor head, any range-measuring system, and the like. While the variations and features described below and depicted in the drawings relate to an optical imaging system, it should be understood that any one or more of the features and methods described herein may pertain to any other imaging modality or system, such as any non-ionizing imaging modality comprising one or more of the imaging devices listed above.
Also described herein are methods for generating 3-D models of a patient (and any patient-positioning devices or accessories) and/or determining the location and/or position of the patient relative to the components of the radiation therapy system. Some methods may comprise comparing a 3-D model generated during an imaging or treatment session with a 3-D model generated during patient set up (e.g., just before the session) to confirm that the patient has not deviated substantially from the initial set up position. Some methods may also determine whether the patient is on a “collision course” with any components of the radiation therapy system, and generate a notification to inform the operator of any possible collision. For example, a method may comprise comparing the location and/or position of a patient with the location of the radiation therapy system components and generating a notification if the location and/or position of the patient is within a pre-determined safety margin of the radiation therapy system. A patient 3-D model may be compared with a machine 3-D model to define a patient platform motion allowance or envelope within which the platform may move without bringing the patient into contact with any radiation therapy system components.
The motion allowance or envelope may vary depending on the position and/or motion of the patient platform. The patient platform may be configured to continuously move through the treatment plane while therapeutic radiation is emitted to the patient, and/or may be configured to stop in the treatment plane at a discrete location while therapeutic radiation is emitted to the patient. In the latter variation, the therapeutic radiation emission may be paused while the patient platform is moved to the next discrete location and may resume after the platform is stopped at the next discrete location. Such discrete stepping of the patient platform through the treatment plane may be referred to as beam station delivery, where therapeutic radiation is emitted when the patient platform is stopped at one of a plurality of beam stations during a treatment session. The motion allowance or envelope may be smaller or tighter when the platform is stopped as compared to when the platform is moving. Additionally or alternatively, the motion allowance or tolerance may be smaller during therapeutic radiation emission as compared to when therapeutic radiation is paused. In the variation where radiation may be delivered to the patient while the patient platform moves continuously through the treatment beam (e.g., helical delivery), the same motion allowance or tolerance may apply throughout the treatment session. Optionally, if the speed of the patient platform varies during a treatment session, the motion allowance may also vary during the treatment session. For example, the motion allowance or tolerance may increase as the speed of the platform increases, and the motion allowance or tolerance may decrease as the speed of the platform decreases. If the patient position or motion exceeds the motion envelope and/or acceptable range of motion, a notification (e.g., a visual or graphical output to the system display and/or an audio output) may be generated to alert the clinician. If the patient position and/or motion calculated based on the acquired image data indicates a significant change (e.g., on the order of a quarter of a body length or more, motions consistent with the patient getting off the platform, high-frequency motion suggestive of involuntary muscle motion or a seizure), the therapy system may automatically pause radiation delivery until the clinician can confirm the state of the patient.
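The speed- and state-dependent allowance, and the pause on an out-of-allowance motion, might be combined as in the following sketch; the base allowance and the speed-scaling factor are placeholder values rather than prescribed limits.

```python
def motion_allowance_mm(beam_on: bool, platform_speed_mm_s: float,
                        base_allowance_mm: float = 5.0) -> float:
    """Allowance that tightens during radiation emission and grows with
    platform speed (illustrative scaling only)."""
    allowance = base_allowance_mm * (0.5 if beam_on else 1.0)
    return allowance * (1.0 + 0.1 * platform_speed_mm_s)

def should_pause_delivery(measured_motion_mm: float, beam_on: bool,
                          platform_speed_mm_s: float) -> bool:
    """Pause radiation delivery (and notify the clinician) if the measured
    patient motion exceeds the current motion allowance."""
    return measured_motion_mm > motion_allowance_mm(beam_on, platform_speed_mm_s)
```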
In some variations, the optical imaging device may acquire a greater quantity of image data when the platform is stopped, which may facilitate the precise detection of patient motion, and may acquire a smaller quantity of image data when the platform is moving. Optionally, the system may be configured to allocate computational resources or processor bandwidth to the calculation of a patient 3-D model when therapeutic radiation is paused (e.g., while the patient platform is moving between beam stations), which may result in a high-resolution 3-D model. During the emission of therapeutic radiation (e.g., while the patient platform is stopped at a beam station), computational resources may be prioritized for controlling and monitoring the therapeutic radiation source, and a portion of the remaining computational resources may be allocated for calculating a low-resolution 3-D model.
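One way to express the resource-allocation policy described above is sketched below; the task names and resolution labels are illustrative only.

```python
def imaging_task(beam_on: bool) -> dict:
    """Allocate imaging computation: rebuild a high-resolution 3-D model while
    therapeutic radiation is paused (e.g., while the platform moves between
    beam stations), and limit work to motion tracking with a low-resolution
    model during emission."""
    if not beam_on:
        return {"task": "rebuild_3d_model", "resolution": "high"}
    return {"task": "track_motion_and_update_model", "resolution": "low"}
```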
The patient-monitoring systems and methods described herein may facilitate precise and dynamic monitoring of patient position and location. For radiation therapy systems having a bore or tunnel through which a patient may be moved in the course of a radiation session, it may be challenging to monitor the patient using imaging devices mounted on the ceiling and/or wall of the radiation therapy system bunker. The field-of-view of these bunker-mounted imaging devices may not include a significant portion of the internal volume of the bore, and therefore, the ability to monitor the location and/or position of a patient inside the bore may be limited, especially as the patient is moved deeper into the bore or tunnel and is obscured by the housing of the radiation therapy system. The inability to view the patient is especially pronounced for therapy systems that have long bores (e.g., having a length of over about 100 cm, or over about 120 cm, or over about 150 cm) and/or have closed bores. An example of such a radiation therapy system (100) is depicted in
Systems
A radiation therapy system may comprise a circular gantry having a stationary frame and a rotatable ring coupled to the stationary frame, a therapeutic radiation source coupled to the rotatable ring, and an imaging system configured to acquire image data of a patient during a treatment session. The rotatable ring may have a circumference and a longitudinal length (e.g., parallel to its axis of rotation). The therapeutic radiation source may be mounted at a specified circumferential location and a specified longitudinal location. Circumferential locations may be designated, for example, by an angle from about 0 degrees to about 360 degrees, where the 0°/360° position is at the top of the gantry (e.g., the 12:00 position) and the 180° position is at the bottom of the gantry (e.g., the 6:00 position). The longitudinal location may be specified as a distance from an edge of an opening of the ring, for example, if the length of the rotatable ring is LR, a longitudinal location may be designated as a distance from about 0 units to about LR units from the edge. A patient treatment region may comprise a bore that extends along the length of the rotatable ring (e.g., along IEC-Y axis), around which the ring rotates. Similar to the rotatable ring, the bore may also have a bore longitudinal length and a bore circumference. The length of the bore may correspond to the length of the rotatable ring and/or may be longer than the length of the ring. The radiation therapy system may further comprise a patient platform that is movable within the bore of the patient treatment region. The radiation therapy system may further comprise a housing that encloses the circular gantry and the therapeutic radiation source in an internal volume of the housing. The surface of the housing may form the sidewalls of the bore, and may, for example, circumscribe the bore forming a tunnel through which the patient platform may be moved during a patient scan and/or treatment session. Optionally, the radiation therapy system may comprise one or more PET detectors and/or a kV CT imaging system.
The patient-monitoring imaging system may comprise one or more sensors or cameras of a variety of imaging modalities that are mounted on the gantry or housing of the radiation therapy system. A patient-monitoring imaging system may comprise infrared, visible, and/or ultraviolet light sensors (e.g., optical detectors having one or more CCD sensors and/or CMOS sensors), ultrasound sensors, radar sensors such as terahertz sensors (from about 100 GHz to about 800 GHz), MRI sensors, and/or x-ray detectors (e.g., configured to detect backscattered x-rays and/or low intensity x-rays). Imaging systems may, for example, comprise one or more time-of-flight reflectometers, and/or one or more monoscopic or stereoscopic cameras. A patient-monitoring imaging system may comprise an infra-red laser profiler, which may include a camera used in conjunction with a linear scanner to generate a 3-D patient profile (e.g., Ultra-High Speed In-line Profilometer LJ-V7000 series by KEYENCE©). An imaging system may comprise one or more light emitters located on the gantry or housing such that the emitted light illuminates a patient (and any patient positioning devices) within the bore of the therapy system and one or more light detectors or cameras located on the gantry or housing to measure the light that is reflected and/or transmitted through the patient (and any patient positioning devices). The imaging system may be configured to acquire image data using structured light surface mapping. An imaging system configured for structured light scanning may comprise one or more emitters that project a pattern of light on the patient (e.g., stripes and/or a grid) and one or more cameras or detectors or sensors that capture the distortion of the pattern of light on the patient. The data from the one or more cameras or detectors located at various locations about the patient treatment area or bore may be transmitted to a controller or processor that reconstructs the surface of the patient using laser scanning or triangulation methods. The one or more emitters may emit light of any desired wavelength, including infrared light, and/or red light, and/or green light and/or blue light, and/or white light, etc. Other types of imaging devices that may be included in any of the patient-monitoring imaging systems described herein include infrared cameras, video cameras with speckle patterns, radiofrequency cameras, projection cameras, and the like. Imaging devices may comprise sensors (e.g., amorphous silicon (a-Si) sensors) that are radiation hard and/or able to operate in the presence of elevated levels of ionizing radiation. A patient-monitoring imaging system may comprise a controller for processing image data acquired by the one or more sensors or cameras. The controller may be in communication with the one or more sensors via one or more data buses or wires. The patient-monitoring controller may be the same as the radiation therapy system controller or may be a separate controller, as may be desirable. The patient-monitoring system controller may comprise one or more memories that are configured to store raw and/or processed image data as well as patient and machine models.
In some variations, a radiation therapy system may comprise one or more patient-monitoring imaging systems coupled to the rotatable ring of the circular gantry and configured to acquire patient image data while the ring is rotating (e.g., during an imaging and/or treatment session). In some variations, the therapeutic radiation source and the patient-monitoring imaging system may both be mounted on the rotatable ring, while other variations may have multiple rotatable rings and the therapeutic radiation source and imaging system may be mounted on separate rotatable rings. Each rotatable ring may define a rotational plane, and the rotational planes of the separate rings may be coplanar or may not be coplanar. A patient-monitoring imaging system of the radiation therapy system may be mounted at any longitudinal location and/or any circumferential location on the rotatable ring. In some variations, a rotatable patient-monitoring imaging system may be located at the same circumferential location as the therapeutic radiation source, but at a different longitudinal location (which may be described as the patient-monitoring imaging system being non-coplanar with the therapeutic radiation source). In other variations, the patient-monitoring imaging system may be located at a different circumferential location from the therapeutic radiation source, but at the same longitudinal location (which may be described as the patient-monitoring imaging system being coplanar with the therapeutic radiation source). Alternatively, the patient-monitoring imaging system may be located at a different circumferential location and longitudinal location from the therapeutic radiation source. The circumferential location of the imaging system relative to the therapeutic radiation source may be selected to help reduce the exposure of the imaging system sensors to ionizing radiation from the radiation source. For example, the imaging system may be located at the same circumferential location as the therapeutic radiation source so that the radiation beams emitted by the radiation source are directed away from the imaging system. Imaging systems with radiation-hard or radiation-robust sensor elements may be located at regions of elevated radiation levels (e.g., opposite a therapeutic radiation source or about 180 degrees from a therapeutic radiation source) while imaging systems with radiation-sensitive or radiation damage-prone sensors may be located at regions with lowered radiation levels (e.g., at the same circumferential location as a therapeutic radiation source). While the examples of radiation therapy systems described and depicted herein show a single patient-monitoring imaging system, radiation therapy systems may have a plurality of imaging systems located at various circumferential and longitudinal locations along a gantry ring or bore. The imaging systems may be mounted on a common gantry or may be mounted on separate gantries (e.g., gantries that may be moved independently of each other). For example, the radiation therapy system may have a first rotatable gantry and a second rotatable gantry that is separate from the first rotatable gantry, and the therapeutic radiation source may be mounted to the first gantry while the imaging system may be mounted to the second gantry. The axes of rotation for the first and second gantries may be collinear.
Several system parameters may affect the quality and quantity of acquired imaging data, and such parameters may be adjusted in order to ensure a sufficient quantity of imaging data is acquired for generating 3-D patient models or 2-D patient surface renderings in real-time. Examples of parameters may include (1) the gantry rotation rate; (2) the number of imaging sensors (for example, sensors placed 180° or 90° apart to effectively double or quadruple the acquisition rate); (3) the IEC-Y field-of-view (FOV) of the imaging sensor; (4) the couch-top velocity; and (5) the number of beam stations and/or the spacing between each beam station. For example, a patient-monitoring imaging system for a radiation therapy system with a slower rotating gantry (e.g., less than about 30 RPM, less than about 20 RPM, less than about 10 RPM) may comprise a longer IEC-Y FOV imaging sensor or may comprise a plurality of imaging sensors with a relatively shorter IEC-Y FOV (e.g., such that their cumulative FOV approximates the FOV of an imaging sensor with a relatively longer IEC-Y FOV). In some variations, a patient-monitoring imaging system may be used to check patient positioning and breathing during a CT scan. A CT scan typically involves breath-hold techniques to help reduce motion artifacts (especially for CT scans of the chest region), where typical breathing has frequency components between 0.1 Hz and 0.3 Hz. For a gantry capable of rotating at a speed of about 60 RPM, a patient-monitoring imaging system comprising a single sensor or camera with an IEC-Y FOV twice the table travel distance per rotation may acquire sufficient imaging data to capture patient position and confirm proper breathing. For example, rotating an imaging system optical sensor at about 60 RPM or faster may allow for the acquisition of image data to detect patient position changes or uncontrolled breathing. A radiation therapy system comprising a gantry rotating at a slower rate (e.g., about 10 RPM, about 15 RPM, about 20 RPM, about 30 RPM) may comprise an imaging system having two sensors or cameras that are mounted across from each other (e.g., about 180° apart), or orthogonal to each other (e.g., about 90° apart), which increases the effective frame rate to capture patient breathing motion similar to a single-sensor imaging system rotating at about twice the rate. Rotating the sensor(s) or camera(s) of an imaging system may allow for the acquisition of image data for confirming patient movement and generating 3-D patient models. Generating patient 3-D models in real-time may help to mitigate artifacts encountered in 2-D images (and/or images acquired by an imaging system located outside of the radiation therapy system bore), such as parallax error, limited patient treatment area or bore views, effects of ambient light, contrast variation, limited depth of focus, and/or target contrast distortions.
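The relationship among gantry rotation rate, couch speed, sensor count, and the IEC-Y FOV suggested above can be sketched as simple arithmetic; the factor of two (each patient region seen roughly twice per rotation) and the proportional split across sensors are assumptions drawn from the example in this paragraph.

```python
def table_travel_per_rotation_mm(couch_speed_mm_s: float, gantry_rpm: float) -> float:
    """Couch-top travel during one gantry rotation."""
    return couch_speed_mm_s * (60.0 / gantry_rpm)

def min_iec_y_fov_mm(couch_speed_mm_s: float, gantry_rpm: float,
                     n_sensors: int = 1) -> float:
    """IEC-Y FOV per sensor so that the cumulative FOV is about twice the
    table travel distance per rotation (illustrative relationship)."""
    return 2.0 * table_travel_per_rotation_mm(couch_speed_mm_s, gantry_rpm) / n_sensors
```

For instance, under these assumptions a couch speed of 5 mm/s and a gantry rotating at 60 RPM give 5 mm of couch travel per rotation, so a single sensor would need an IEC-Y FOV of roughly 10 mm, while two sensors mounted about 180° apart would each need roughly half that.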
The system may be configured to analyze or process image data to generate 3-D models (e.g., surface models or volumes) and/or measure patient and/or couch motion at varying degrees of precision at various times during a treatment session. For example, patient location, motion, and/or geometry data may be calculated with a greater degree of accuracy and precision when the therapeutic radiation source is emitting radiation (e.g., beam-on) than when the therapeutic radiation source is not emitting radiation (e.g., beam-off). In some variations, the system controller may be configured to generate a high-resolution and/or high-precision 3-D model during beam-off periods of time, and during beam-on, to calculate the incremental changes from the high-resolution 3-D model (e.g., motion, location, and/or geometry changes). During beam-on, the imaging system may continue to acquire image data but the controller may not use this data for 3-D model generation (which may be computationally intensive) and may instead use this data for monitoring motion and/or positional changes (which may be less computationally intensive than generating a 3-D model). This may help to relieve the computational load on the controller during beam-on, when computational resources (e.g., controller processor bandwidth) may be allocated for monitoring and controlling radiation emission from the therapeutic radiation source. Motion and/or position data calculated during beam-on may be compared with a predetermined motion envelope that includes a range of motion deemed acceptable during the delivery of radiation and/or may be calculated based on 3-D models and/or image data of the patient taken during the acquisition of treatment planning images. The acceptable motion envelope and/or range of motion during radiation delivery may be specified on a per-patient, per-clinician, and/or per-clinic basis. As described herein, the acceptable motion envelope may vary depending on whether the therapeutic radiation source is emitting radiation or not, and/or whether the patient platform is moving or not. At beam-off, those computational resources may then be shifted back to generating 3-D models and/or high-precision calculations using the acquired image data. Optionally, the controller may be configured to receive user input regarding the quality (e.g., resolution, precision, etc.) of the 3-D model such that higher or lower quality models may be generated as desired. The system may optionally comprise a display, and the motion envelope and/or range of motion and/or 3-D models and/or image data may be graphically represented on the display.
A rotatable patient-monitoring imaging system may be mounted at a longitudinal location of the rotatable ring such that the cumulative field-of-view (FOV) over all of the sensors or cameras is entirely within the internal volume of the bore (i.e., does not extend outside of the bore). For example, the imaging system may be mounted on the rotatable ring or housing in a central portion of the bore (e.g., where the longitudinal length of the bore is LB, the longitudinal location may be LB/2) and/or may be longitudinally offset from the location of the therapeutic radiation source. The cumulative FOV of the patient-monitoring imaging system may not overlap with a portion of the irradiation field of the therapeutic radiation source. Alternatively, the cumulative FOV of the patient-monitoring imaging system may overlap with a portion of the irradiation field of the therapeutic radiation source. For example, in some variations, about half of the cumulative FOV of the patient-monitoring imaging system may overlap with the irradiation field of the therapeutic radiation source. The proportion of the irradiation field that overlaps with the FOV of the patient-monitoring imaging system may be about 50% or more. In some variations, the FOV of the patient-monitoring system may be at least twice as large as the irradiation field of the therapeutic radiation source.
Alternatively or additionally, a rotatable imaging system may be mounted on the rotatable ring at or near the longitudinal location of an edge or opening of the bore. For example, an imaging system may be mounted on the rotatable ring at a longitudinal location that is just within the bore or just outside the bore. In some variations, an imaging system may be located at or near a bore opening, e.g., at the edge of the bore opening. The cumulative FOV of a patient-monitoring imaging system that is mounted at the periphery of a bore (either just inside or just outside the bore opening) may include regions that are within the bore volume and regions that are outside of the bore. For example, an imaging system may be mounted on the rotatable ring at a longitudinal location such that about half of the FOV of the imaging system is within the bore and about half of the FOV is outside of the bore. Alternatively, the FOV of the patient-monitoring imaging system may be mostly within the bore (e.g., about 60% or more, about 70% or more, about 85% or more, about 90% or more, about 95% or more, about 99% or more) or mostly outside of the bore (e.g., about 60% or more, about 70% or more, about 85% or more, about 90% or more, about 95% or more, about 99% or more).
One or more regions of the radiation system housing may be transparent or translucent, allowing for the passage of light, so that an optical patient-monitoring imaging system mounted on the rotatable gantry enclosed within the housing can acquire images of the patient, platform and/or patient positioning devices within the bore volume (e.g., the patient treatment region). The portions of the housing that are transparent or translucent may correspond to the movement trajectory of the sensors or cameras of the patient-monitoring imaging system. For example, in a radiation therapy system having an optical patient-monitoring imaging system mounted on a rotatable ring of a circular gantry enclosed in a housing, the movement trajectory of the one or more cameras may include at least a portion of the circumference of the bore. For a continuously rotating gantry, the one or more cameras may acquire image data from around the entire circumference (i.e., 0°-360°) of the bore. Alternatively, for a limited-rotating gantry where the gantry may move in arcs that are a subset of the entire circumference (e.g., arcs that span from about 0° to about 180°, arcs that span from about 0° to about 270°, etc.), the one or more cameras may acquire image data at positions along the arc. A radiation therapy system comprising an optical imaging system mounted on a rotatable gantry may comprise a housing having a transparent portion or window that wraps around at least a portion of the circumference of the bore. The location, size, and shape of the transparent portion or window may correspond to the movement trajectory of the optical imaging system as it rotates with the gantry, as well as the field-of-view of the optical imaging system. For example, the transparent portion of the housing may circumscribe the bore of a radiation therapy system, which may be preferred in a system with a continuously rotatable bore. Alternatively, the transparent portion of the housing may extend along an arc along the bore that corresponds to the arc trajectory of the rotatable gantry. In some variations, the transparent portion of the housing may correspond to predetermined locations at which the cameras of the patient-monitoring imaging system acquire images. While patient-monitoring imaging systems may be configured to continuously acquire image data while the ring of the circular gantry rotates (e.g., may comprise a high-bandwidth and/or high-data rate bus to transfer the acquired image data to a controller processor), imaging systems may also be configured to acquire image data at pre-determined or select gantry locations. For example, an imaging system may be configured to acquire image data when the one or more sensors or cameras are located at four circumferential locations (e.g., 0°/360°, 90°, 180°, 270°) as the gantry rotates. Accordingly, the housing may have four transparent portions or windows at four locations that correspond to the four circumferential locations. Similarly, an imaging system may be configured to acquire image data when the one or more sensors or cameras are located at two circumferential locations (e.g., 0°/360° and 180°), or at three circumferential locations (e.g., 0°/360°, 120°, and 240°) as the gantry rotates. Accordingly, the housing may have two or three transparent portions or windows, at two or three locations that correspond to the two or three circumferential locations, respectively. 
The transparent portion(s) or window(s) of the housing may be made of any transparent or translucent material, for example, acrylic. Some radiation therapy systems may comprise more than one optical patient-monitoring imaging system mounted on more than one rotatable gantry. For example, a system may comprise a first rotatable gantry for a CT system and a second rotatable gantry for a PET system, with a first imaging system mounted on the first rotatable gantry and a second imaging system mounted on the second rotatable gantry. The first and second rotatable gantries may each define rotational planes that may be coplanar or not coplanar (e.g., separated by a distance in the IEC-Y direction). The housing of a radiation therapy system having two rotatable optical imaging systems may comprise a first transparent portion corresponding to the movement trajectory of the first imaging system and a second transparent portion corresponding to the movement trajectory of the second imaging system, where the first and second transparent portions may be separated by a distance along the IEC-Y axis. For example, each transparent portion may comprise a clear window strip that allows optical imaging sensors of the first and second optical imaging systems to acquire optical data as the gantry rotates. Radiation therapy systems having patient-monitoring imaging systems comprising terahertz sensors may comprise a housing with one or more portions or windows made of a material that is transparent to the wavelength of the terahertz sensors. Examples of such terahertz-transparent materials may include silicon, and polymers such as polytetrafluoroethylene (PTFE, also known as Teflon), TPX (polymethylpentene film), polypropylene (PP), polyethylene (PE), and high-density PE (HDPE).
One variation of a radiation system housing comprising a transparent or translucent region is depicted in
Alternatively, as depicted in
While the patient-monitoring imaging systems of the radiation therapy systems in
Moreover, image data (which may be used for 2-D image reconstruction and/or 3-D patient modeling) acquired while rotating the imaging system about the patient setup region may help facilitate registration of the patient with the radiation therapy system coordinate system before a treatment session, and may form a baseline patient position and/or location with which images acquired during the treatment session may be compared. The patient may be monitored throughout the session to confirm that they have not substantially deviated from the initial set up configuration. If the patient-monitoring imaging system detects a substantial deviation from the initial set up configuration or position (e.g., movement beyond a pre-determined threshold), a notification or alert may be generated. In some variations, the notification may be transmitted visually (e.g., to a display, lighting, etc.) and/or audibly (e.g., one or more tones) to the system operator. The operator may then pause the session in order to assess the condition of the patient and the system, and determine whether to proceed with the session (e.g., imaging and/or treatment). For example, if the operator is able to identify and address any patient and/or system issues, the session may be continued. If the issue(s) cannot be resolved, the session may be terminated. The imaging data acquired during a session may be stored and/or transmitted to a treatment planning system, and can be used to determine whether a radiation therapy treatment plan should be updated or adjusted in order to accommodate changes in patient geometry (e.g., size and shape) and/or patient positioning accessories for future sessions. For example, a radiation therapy treatment plan may be adjusted to account for increases or decreases in patient size, and/or patient positioning accessories included during treatment setup but not anticipated during treatment planning, and/or patient breathing patterns.
In some variations, a radiation therapy system may comprise a patient-monitoring imaging system that is stationary and does not rotate about the patient treatment region and/or bore. In some variations, a stationary patient-monitoring imaging system may comprise one or more non-rotating sensor(s) or camera(s) mounted on the housing of the radiation therapy system inside the bore. Imaging data acquired by an in-bore stationary patient-monitoring imaging system may be used to generate a 2-D surface map of a patient within the bore. Alternatively or additionally, a stationary imaging system may comprise one or more non-rotating sensor(s) or camera(s) mounted on the system housing just outside of the bore, where the FOV includes the patient treatment setup region. In some variations, the entire FOV of a stationary patient-monitoring imaging system may be external to the bore, or at least more than about 50% of the FOV may be external to the bore.
While radiation therapy systems described above and herein are depicted as having a single patient-monitoring imaging system, it should be understood that a radiation therapy system may have more than one patient-monitoring imaging system. For example, some variations may have a first patient-monitoring imaging system that acquires position and/or location data of a patient region before it crosses the treatment plane, and a second patient-monitoring imaging system that acquires position and/or location data of the patient region after it crosses the treatment plane. One or both of the first and second patient-monitoring imaging systems may be rotatable. In some variations, a first patient-monitoring imaging system may be mounted on the rotatable gantry while a second patient-monitoring imaging system is stationary. A first patient-monitoring imaging system may be mounted on the rotatable gantry and located within a bore region of the system such that its FOV includes the bore volume, and a second patient-monitoring imaging system may be mounted on the rotatable gantry and located outside the bore region of the system (e.g., at the bore entrance opening, just outside the bore entrance opening) such that its FOV includes the patient setup region. A radiation therapy system may have one or more of the patient-monitoring imaging systems described herein, in any combination as may be desirable.
Optionally, visual markers projected onto, or worn by, the patient may provide reference points for image processing and/or reconstruction (2-D and/or 3-D) methods. For example, one or more retroreflectors or optical fiducials may be attached to the patient. Optionally, a patient may wear fitted clothing during a treatment planning or diagnostic imaging session and wear the same fitted clothing during a treatment session to help promote a more uniform visual field (e.g., uniform texture, spatial frequency, colors, etc.) to facilitate image processing by the patient-monitoring imaging system. Alternatively or additionally, an imaging system may comprise one or more emitters that may project structured light patterns or striped light patterns onto the patient, where contours of the patient may be highlighted or accentuated by distortions in the light patterns.
One example of a radiation therapy system that may comprise any of the imaging systems described herein is depicted in
The imaging systems disclosed herein are not limited to monitoring patient position and/or location for radiation therapy. For example, the imaging systems described herein may be used in CT, PET, and/or MRI scanning systems that do not have therapeutic radiation sources to monitor patient position and/or location across multiple scan sessions. For example, patient position and/or location data from a previous imaging session may be used to help set up and register the patient for a subsequent imaging session. This may facilitate consistent patient positioning for each of a plurality of imaging sessions, and may help the alignment, comparison, and/or stitching of images acquired during different scan sessions. Alternatively or additionally, the imaging systems disclosed herein may be included in imaging systems (e.g., CT, PET, and/or MRI scanning systems) that are used to acquire images for radiation therapy treatment planning. In one variation, a scan of the patient on a patient platform (optionally, with patient fixation devices that may be used during a treatment session) may be acquired at the time the treatment planning images are acquired. The image data acquired from the scan may be used to generate a patient plan profile (e.g., a 3-D model such as a surface profile), which may include the position, geometry (e.g., anatomical size and shape), and/or any patient motion during the acquisition of treatment planning images. After the treatment plan is developed and the patient is set up for the treatment session (and with any patient fixation devices that were used in treatment planning) in a system that has a therapeutic radiation source and any of the patient-monitoring imaging systems described herein, image data may be acquired and then processed to generate a patient treatment profile. The patient treatment profile may be compared with the patient plan profile to evaluate whether the differences are acceptable for proceeding with radiation delivery and/or whether radiation delivery needs to be modified to compensate for any differences. The range of acceptable positional and/or geometrical differences of the patient and/or patient platform (e.g., registration envelope tolerance) may be determined by a clinician and/or clinic, and may optionally depend on patient-specific parameters such as their disease-state, size, age, and the like. The differences between the patient treatment profile and the patient plan profile may represent baseline, acceptable differences, and patient scans, 3-D models, and/or motions during the treatment session may be compared with the baseline level of differences to determine whether the patient continues to be in position (i.e., remains in their set-up or designated positions) for radiation delivery.
Optionally, any of the monitoring imaging systems described herein may comprise one or more calibration phantoms that may be used at system setup to confirm correct system installation, and/or may be used on a daily, weekly, monthly, and/or yearly basis to confirm that the imaging devices, sensors, and detectors (e.g., optical imaging devices, detectors, and/or sensors) are performing within an acceptable tolerance. A calibration phantom may comprise surface contour indicia having 3-D geometric structures or calibration reference markings of varying (but known or predetermined) radii of curvature, widths, heights, and lengths, some of which may be in the range of the radii of curvature, widths, heights, and lengths of a patient's body. Some calibration phantoms may be sized and shaped to mimic the size and contours of an adult patient or a child patient, and/or a patient with male anatomical features or a patient with female anatomical features. The surface contours of a phantom may have patterns of curves or ridges with known spatial characteristics (e.g., length or width, radius of curvature, spacing or distances between raised or protruding features, etc.), and may comprise, for example, steps (e.g., a saw-tooth pattern with different tooth angles, spacing between steps, etc.), concave portions, convex portions, and/or different raised or recessed shapes (e.g., circular, cylindrical, rectangular, etc.). The calibration phantom may be built into the patient platform at a non-treatment region (e.g., at either end of the platform, and/or at an underside of the platform) and/or may be removable from the patient platform (i.e., may be temporarily affixed to the patient platform or simply placed along indicia or other markings on the platform). For example, an underside region of the patient platform (i.e., the surface of the platform that is opposite the patient-contacting surface) may comprise one or more surface contour indicia (e.g., calibration reference markings) with known geometric parameters so that for each revolution of the camera, the surface contour(s) derived from the acquired image data of the platform underside may be compared with the known surface contour(s) of the surface contour indicia. If the difference between the calculated surface contour(s) and the surface contour indicia exceeds a threshold, a notification may be generated indicating that the imaging system may need to be re-calibrated or serviced, and/or that the imaging system detector(s) may need to be replaced. By incorporating the calibration phantom as a fixture on the underside of the patient platform, the calibration and/or registration of the imaging system may be monitored regularly throughout a treatment and/or imaging session (e.g., checking calibration/registration at a rate of once every revolution, once every 2 revolutions, once every 5 revolutions, once every 10 revolutions, once every second, once every 2 seconds, once every 5 seconds, once every 10 seconds, once every 30 seconds, once every minute, once every 3 minutes, once every 5 minutes, once every 10 minutes, etc.), and if the imaging system is no longer calibrated or registered within the desired tolerance, the user may be notified.
The imaging system may rotate about a phantom while acquiring image data, and then generate a 3-D model using the acquired image data. The generated 3-D model may be compared with the known geometry and location of the phantom and any differences may be recorded. The system may then determine whether the differences are within the uncertainty limit and/or an acceptable error range. Depending on the nature of the differences, the imaging system may adjust calibration factors (e.g., gain and/or sensitivity of the detector(s), corrections or offsets for distance measurements, or alignment of the detector(s) with one or more light sources, etc.) to reduce those differences. Alternatively or additionally, the inner surface of the bore region of the system housing and/or the patient platform may comprise one or more of the surface contours described above that may be used to calibrate the imaging system. For example, the surface of the bore that is located within the FOV of the imaging system (e.g., along the inner circumference of the bore) and/or a non-treatment region of a patient platform may comprise raised or recessed regions with known spatial and geometric characteristics and/or other such surface landmarks or fiducials that may be distinguished or extracted by the monitoring imaging system. The spatial and geometric characteristics of these surface contours calculated based on the acquired image data may be compared against the known properties of these contours to identify any discrepancies. If the identified discrepancies exceed an error threshold, then a notification may be generated indicating that the imaging system (e.g., the detector(s) or camera(s)) may need to be serviced or replaced. Images of fiducials or landmarks along the inner surface of a gantry housing within the bore may be acquired continuously during a treatment or imaging session as the detector rotates around the bore, and when a discrepancy is identified, a notification may be generated.
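A minimal sketch of the contour-based calibration check described above is given below, assuming the known contour dimensions, the measured values, and a 1 mm error threshold are available as simple dictionaries; all names and values are illustrative, not taken from the actual system.

```python
# Known, manufactured contour dimensions (assumed values, in millimeters).
KNOWN_CONTOURS_MM = {"step_height": 10.0, "ridge_spacing": 25.0, "radius": 40.0}
ERROR_THRESHOLD_MM = 1.0  # assumed tolerance

def check_calibration(measured_contours_mm):
    """Return notifications for contours whose measured dimensions deviate
    from the known values by more than the error threshold."""
    notifications = []
    for name, known in KNOWN_CONTOURS_MM.items():
        measured = measured_contours_mm.get(name)
        if measured is None:
            continue
        if abs(measured - known) > ERROR_THRESHOLD_MM:
            notifications.append(
                f"Imaging system may need re-calibration or service: "
                f"{name} measured {measured:.2f} mm vs known {known:.2f} mm")
    return notifications

print(check_calibration({"step_height": 10.2, "ridge_spacing": 27.1, "radius": 39.9}))
```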
Methods
Image data acquired by any of the imaging systems described herein may be used to generate 2-D and/or 3-D patient models. For example, imaging data acquired by a rotatable patient-monitoring imaging system comprising a rotatable sensor may be used to generate a 3-D model of the patient, the patient platform or couch, and any positioning or immobilization devices. The generated 3-D model may be used to monitor patient movements, including respiratory motion, to help tailor the delivery of therapeutic radiation. For example, a 3-D model generated based on image data acquired by an imaging system with a FOV that includes the internal bore volume (and/or comprises sensor(s) or camera(s) that are located in the bore of the system) may provide precise and accurate position data that may be used to adjust the fluence and/or timing of radiation pulses (e.g., respiratory gating). Image data acquired by an in-bore imaging system, whether it be a stationary or a rotatable imaging system, may also help confirm that patient positioning devices or fixtures remain in their designated locations even after the patient has been advanced into the system bore. For example, 3-D models may be used to confirm that a patient has not changed their position, and/or location, and/or geometry in the course of the session, and/or to generate motion allowances or envelopes or ranges for the movement of the patient platform. The methods described herein may be used to monitor patient position and/or radiation therapy system position(s) for helical radiation delivery and/or beam station delivery. Furthermore, patient scans or models of varying degrees of resolution or quality may be calculated at various time points during a treatment session. For example, during beam-on, acquired image data may be used to monitor patient motion and/or incremental changes from a previously-generated 3-D model, and during beam-off, acquired image data may be used to generate an updated 3-D model.
One method for generating a 2-D and/or 3-D patient model may comprise rotating the imaging system about the patient treatment region and/or patient setup region at least once. As the imaging system rotates, it acquires images of the treatment region and/or setup region. Some methods may comprise continuous rotation and continuous image acquisition. Alternatively or additionally, some methods may comprise stepped rotation and image acquisition. For example, the imaging system may rotate to predetermined locations about the circumference of the gantry (e.g., at the 0°, 90°, 180°, and 270° circumferential locations), stop at those locations, and acquire one or more images at each location before advancing to the next location. Alternatively, the gantry may continuously rotate without stopping at any predetermined locations, but the imaging system may be configured to acquire images when located at predetermined locations. As described previously, the housing of a radiation therapy system may comprise one or more transparent portions that correspond with the motion trajectory of the imaging system, and/or the predetermined locations where imaging system sensors acquire patient images.
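The stepped acquisition sequence described above could be organized roughly as follows. The helper functions for gantry motion and detector read-out are hypothetical placeholders, and the angle list simply mirrors the example circumferential locations mentioned above.

```python
ACQUISITION_ANGLES_DEG = [0, 90, 180, 270]  # predetermined circumferential locations

def rotate_gantry_to(angle_deg):            # placeholder for motion control
    print(f"gantry stopped at {angle_deg} deg")

def acquire_image(angle_deg):               # placeholder for the detector read-out
    return {"angle_deg": angle_deg, "pixels": []}

def stepped_acquisition():
    """Stop at each predetermined gantry angle and acquire an image there."""
    images = []
    for angle in ACQUISITION_ANGLES_DEG:
        rotate_gantry_to(angle)
        images.append(acquire_image(angle))
    return images

frames = stepped_acquisition()
print(f"acquired {len(frames)} image slices")
```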
After acquiring 2-D image data, the method may comprise generating a 3-D model of the patient and any other items in the patient treatment region (and/or setup region) based on the 2-D image data. For example, a 3-D reconstruction method may comprise depth sensing algorithms that calculate depth information from the image information of a 2-D image. Suitable approaches may include stereovision, laser scanning or triangulation, time-of-flight, and projected light methods.
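As a worked example of one of the reconstruction approaches listed above, the sketch below computes depth by laser triangulation from a known camera-emitter baseline, the camera focal length, and the observed pixel offset of the projected spot. The formula and the numbers are generic textbook assumptions rather than parameters of the systems described herein.

```python
def triangulation_depth(baseline_mm: float, focal_length_px: float,
                        pixel_offset_px: float) -> float:
    """Depth z = (baseline * focal_length) / pixel offset (disparity)."""
    if pixel_offset_px <= 0:
        raise ValueError("pixel offset must be positive")
    return baseline_mm * focal_length_px / pixel_offset_px

# Example: 60 mm baseline, 800 px focal length, 4 px observed offset.
print(f"estimated depth: {triangulation_depth(60.0, 800.0, 4.0):.1f} mm")
```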
After a 3-D model of the patient treatment region and/or setup region is generated (which may include data pertaining to both the patient and the platform), the known 3-D machine model of the patient platform may be subtracted from the 3-D model of the treatment region and/or setup region to derive the 3-D patient model, which may include a 3-D model, position, and/or location of the patient and any accessories attached to the patient or the patient platform. In some variations, if the FOV of the imaging system is relatively narrow (e.g., the size of the FOV is less than about half a body length of a patient), a 3-D patient model may be generated based on a series of partial 3-D models acquired sequentially as the patient is advanced into the radiation therapy system and the gantry rotates about the patient. Multiple partial 3-D patient models may be added or registered together to form a patient 3-D model that approximates the geometry, position, and/or location of the patient.
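The platform subtraction and partial-model registration steps described above might be approximated, for point-cloud representations, as in the sketch below. The distance threshold, the fixed couch step, and the function names are illustrative assumptions, not the system's actual pipeline.

```python
import numpy as np

def subtract_platform(region_points: np.ndarray, platform_points: np.ndarray,
                      distance_mm: float = 2.0) -> np.ndarray:
    """Keep only points farther than distance_mm from any known platform point."""
    d = np.sqrt(((region_points[:, None, :] - platform_points[None, :, :]) ** 2).sum(axis=2))
    return region_points[d.min(axis=1) > distance_mm]

def merge_partial_models(partials, couch_step_mm: float) -> np.ndarray:
    """Shift each partial model by the cumulative couch travel (IEC-Y) and stack."""
    shifted = [p + np.array([0.0, i * couch_step_mm, 0.0]) for i, p in enumerate(partials)]
    return np.vstack(shifted)

# Example: remove stand-in platform points from a combined point cloud.
platform = np.random.default_rng(5).uniform(-50, 0, size=(100, 3))
combined = np.vstack([platform, np.random.default_rng(6).uniform(10, 60, size=(100, 3))])
patient_only = subtract_platform(combined, platform)
print("patient points retained:", len(patient_only))
```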
Some methods may comprise calculating or updating the patient 3-D model throughout the session and comparing the patient model to the system/machine 3-D model. The system/machine 3-D model at a particular time point may be determined based on the initial 3-D model (which is known at the time of manufacture) and the cumulative movements of the various components (e.g., platform, rotatable gantry, etc.) during the session. Continuous updates and re-registration of the patient 3-D model may be especially useful when the patient is advanced deeper into the bore. For example, an emission guided radiation therapy treatment session may comprise acquiring an initial set of imaging data using a rotatable optical patient-monitoring imaging system, acquiring a kV CT image of a patient on the patient platform, moving the patient into the bore, acquiring a PET image of the patient, and applying therapeutic radiation to the patient as the patient is moved through the bore. As the patient moves through the radiation therapy system, a rotatable optical patient-monitoring imaging system may continuously acquire image data to generate and update patient 3-D models.
In some variations, an updated patient 3-D model may be used to confirm that the patient has not deviated from their position during setup. For example, a cross-section of the patient 3-D model may be calculated to map the location of the patient within the cross-section of the bore. Such a cross-section of a patient 3-D model may be calculated based on images acquired of a patient wearing form-fitted clothing during the treatment planning scan, if required. In some variations, the clothing may be substantially transparent to terahertz or lower-energy X-ray backscatter sensors. At a later imaging scan and/or treatment session, the patient may wear the same form-fitted clothing, if required, and an updated cross-section of the patient 3-D model may be calculated and compared to the previous cross-section. If the patient is in a similar position as they were during the planning scan, the scan and/or treatment session may proceed. In some variations, an optical imaging system (or any non-ionizing imaging system) may acquire data and a patient 3-D model may be generated before the patient is exposed to any ionizing radiation. For example, after a patient is set up on the platform, an optical imaging system may acquire data and generate the patient 3-D model before a patient CT image is acquired. If the patient 3-D model indicates that the patient position and/or location and/or geometry is similar to the patient position and/or location and/or geometry during treatment planning, the radiation therapy system may then perform a CT scan and/or proceed with radiation delivery. Optionally, a 3-D model of the patient's face may be used to verify their identity (using various facial recognition techniques) prior to scanning and/or treatment.
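The cross-section comparison described above can be thought of as a tolerance check between two sampled contours. The sketch below assumes the cross-sections are expressed as radial distances from the bore axis at matching angles, with an illustrative 5 mm tolerance; none of these details are taken from the actual system.

```python
import numpy as np

def cross_sections_match(planning_radii_mm: np.ndarray,
                         treatment_radii_mm: np.ndarray,
                         tolerance_mm: float = 5.0) -> bool:
    """True if the patient contour's radial distance from the bore axis differs
    from the planning contour by no more than tolerance_mm at every angle."""
    return bool(np.all(np.abs(treatment_radii_mm - planning_radii_mm) <= tolerance_mm))

angles = np.linspace(0, 2 * np.pi, 180, endpoint=False)
planning = 150 + 20 * np.cos(angles)          # idealized planning contour (mm)
treatment = planning + np.random.default_rng(1).normal(0, 1.0, angles.shape)
print("Proceed with scan/treatment:", cross_sections_match(planning, treatment))
```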
In some variations, an updated patient 3-D model may be used to monitor patient motion in the bore. A patient 3-D model generated based on image data acquired in real-time may be used to register the patient to a volume image such as a kV CT image and/or to detect patient motion that may trigger a pause in radiation delivery. Patient 3-D models generated using image data acquired during a treatment session may optionally be compared with patient 3-D models acquired during the acquisition of treatment planning images, and this comparison may be used to determine whether to adjust the treatment plan and/or radiation delivery to compensate for any changes in patient positioning and/or geometry. In some variations, certain changes in the patient 3-D model may merit adjustments to the radiation delivery. For example, preset thresholds may define when there is a substantial difference in the patient 3-D model compared to a reference model (such as a patient 3-D model generated at an earlier time point, and/or at the start of a treatment session), and the radiation therapy system may generate one or more notifications indicating the nature of the patient change. For example, the radiation therapy system may generate a notification indicating that the patient has moved, and/or that the patient is in a particular phase of the breathing cycle (and optionally, whether the patient is in a breathing cycle phase suitable for radiation delivery), etc. If the patient position and/or location and/or any characteristics of the patient 3-D model have changed beyond the preset thresholds, the radiation therapy system may prompt the operator to pause radiation delivery or may optionally automatically pause radiation delivery.
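The threshold-based decision logic described above might look, in simplified form, like the sketch below. The threshold value, the breathing-phase flag, and the notification strings are assumptions for illustration only.

```python
def evaluate_patient_change(max_surface_shift_mm: float,
                            breathing_phase_ok: bool,
                            shift_threshold_mm: float = 3.0) -> str:
    """Map a measured surface shift and breathing-phase state to a notification."""
    if max_surface_shift_mm > shift_threshold_mm:
        return "PAUSE: patient has moved beyond the preset threshold"
    if not breathing_phase_ok:
        return "HOLD: patient not in a breathing-cycle phase suitable for delivery"
    return "CONTINUE: patient position within tolerance"

print(evaluate_patient_change(1.2, True))
print(evaluate_patient_change(4.8, True))
```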
In some variations, an updated patient 3-D model may be used for defining a patient platform or couch motion allowance or envelope for tumor position corrections. During a patient localization session just after the patient is set up on the platform in preparation for a treatment session, kV CT images may be acquired to identify any tumor position corrections needed prior to treatment, and image data may be acquired and a patient 3-D model generated. The radiation therapy system can compensate for tumor position errors by moving (e.g., rolling, pivoting, shifting, tilting, etc.) the patient platform so that the tumor is in a desired location or orientation for radiation delivery. The patient 3-D model may be used to calculate the motion allowance of the patient platform that is available for making any tumor position corrections safely (e.g., without moving the platform into contact with the radiation therapy system). For example, the patient 3-D model may be compared with the machine 3-D model during the patient localization session, and a platform motion allowance may be calculated by identifying the bore space available for safe platform motion. This may allow the radiation therapy system to tailor platform motion allowances or motion ranges to a patient's size and geometry, which may allow for greater degrees of freedom and motion for tumor position correction. Optionally, patient motion allowances and/or motion envelopes, which include an acceptable range of motion, may vary depending on whether the patient platform is moving and/or whether the therapeutic radiation source is emitting radiation. For example, the patient platform motion allowance and the patient motion allowance may be tighter during beam-on than during beam-off. In beam station delivery, the patient platform motion allowance and the patient motion allowance may be tighter when the patient platform is stopped at a beam station (i.e., in anticipation of radiation delivery) than when the patient platform is moving between beam stations. The differences between the motion allowances or ranges between beam-on and beam-off (and/or between when the platform is moving or stationary) may be determined or selected on a patient-specific basis by a clinician and/or a clinic.
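As a simplified illustration of the platform motion allowance calculation described above, the sketch below bounds the lateral shift available inside a cylindrical bore given a patient point cloud and a safety margin. The bore radius, the margin, and the coordinate conventions are assumptions, not the actual method.

```python
import numpy as np

def lateral_motion_allowance_mm(patient_points: np.ndarray,
                                bore_radius_mm: float,
                                safety_margin_mm: float = 10.0) -> float:
    """Largest lateral shift that keeps every patient point inside the bore
    while preserving the safety margin (0 if no clearance remains)."""
    radial = np.sqrt(patient_points[:, 0] ** 2 + patient_points[:, 2] ** 2)
    clearance = bore_radius_mm - safety_margin_mm - radial.max()
    return max(clearance, 0.0)

pts = np.random.default_rng(2).uniform(-180, 180, size=(500, 3))
print(f"lateral allowance: {lateral_motion_allowance_mm(pts, 425.0):.1f} mm")
```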
Some optical imaging systems may comprise a controller that is configured to mitigate image distortions that may arise from rapid rotation of the optical detector. Image distortions may include, but are not limited to, image wobble (e.g., the “jello effect”), skew, spatial aliasing (e.g., causing objects in the FOV to appear smeared), and/or temporal aliasing (e.g., strobe-like effects). The optical imaging system controller may be configured to employ one or more of various methods for rolling shutter rectification/correction and/or image stabilization, and/or may be configured to combine image data acquired over multiple revolutions to counteract or reduce the effect of these motion-related artifacts. Alternatively or additionally, some variations of radiation therapy systems may comprise calibration reference markings and/or structures located on stationary components, such as the inner surface of the gantry housing in the bore region. Some variations may comprise calibration reference markings and/or structures located on the top side (i.e., patient side) of the patient platform. These and other calibration reference markings and/or structures (such as any reference markings on the underside of the patient platform) may be used alone or together to correct for image distortions. Optionally, a gyroscope (e.g., a MEMS gyroscope) may be mounted near or adjacent to the imaging system detectors and the vibration data from the gyroscope may be used with a compensatory algorithm to correct for motion-related artifacts.
In some variations, the controller may optionally be configured to sort or bin image data based on the gantry angle from which the image data was acquired. For example, image data collected from gantry angles where the detector FOV includes the top portion of the patient platform (i.e., the patient side) may be binned and analyzed separately from image data collected from gantry angles where the detector FOV includes the bottom portion of the patient platform (i.e., the underside of the platform). Image data collected from the platform underside may not provide much, if any, information pertinent to patient positioning and motion, but may instead be used to measure platform sag (i.e., deflection of the patient platform due to the cantilever effect). Image data acquired from either side of the platform (i.e., on the right and left side of a patient) may optionally be used to measure platform sag. In variations where the patient platform underside has calibration or registration reference markings or contours, image data collected from the platform underside may be used to confirm that the imaging system is functioning within specified tolerances. Image data collected from the top (i.e., patient-side) of the platform may be used to monitor patient positioning and/or motion. During therapeutic radiation source beam-on, when patient position should remain as steady as possible (i.e., where the motion envelope is relatively tight or small), close monitoring of the patient position may be prioritized over system calibration. For example, during beam-on, the imaging system may acquire image data from the top side of the platform and not the underside of the platform, which may increase the data acquisition and processing bandwidth allocated to calculating models and/or images that depict the position and/or motion of the patient. During beam-off, image data acquisition from the underside of the platform may resume in order to confirm system calibration and/or calculate platform sag/deflections.
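A minimal sketch of the gantry-angle binning described above is given below. The angular ranges used to distinguish patient-side views from underside views are illustrative assumptions and would depend on the actual detector and gantry geometry.

```python
def bin_frames_by_gantry_angle(frames):
    """frames: iterable of (gantry_angle_deg, image) tuples.
    Angles near the top of the bore (here, 315-45 deg) are treated as viewing
    the patient side; angles near the bottom (135-225 deg) as the underside."""
    top, underside, other = [], [], []
    for angle, image in frames:
        a = angle % 360
        if a >= 315 or a <= 45:
            top.append(image)          # used for patient position/motion monitoring
        elif 135 <= a <= 225:
            underside.append(image)    # used for calibration checks and sag measurement
        else:
            other.append(image)        # lateral views, e.g., for sag measurement
    return top, underside, other

top, under, lateral = bin_frames_by_gantry_angle([(0, "f0"), (180, "f1"), (90, "f2")])
print(len(top), len(under), len(lateral))   # 1 1 1
```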
The imaging system controller may also be configured to combine and/or stitch images of a patient and/or patient platform across multiple slices as the patient moves (e.g., continuously or in steps) along IEC-Y through the therapeutic radiation beam plane. In beam station delivery, each image slice may correspond to a particular beam station, and to generate a model of the patient along IEC-Y (i.e., along the length or height of the patient, to help generate a full-body or partial-body length model of the patient), the controller may acquire multiple image slices that are then stitched together. The width (i.e., thickness) of each slice may be a width (or other dimension) of the detector FOV. For example, the FOV for an optical detector may be from about 1 cm to about 20 cm or more, e.g., about 1 cm, about 2 cm, about 3 cm, about 5 cm, about 10 cm, or more, resulting in an image slice width having the same or similar dimensions. Some optical detectors may have a FOV of upwards of 20 cm, for example, from about 20 cm to about 100 cm, e.g., about 50 cm, about 60 cm, about 75 cm, about 80 cm, or more. In some variations, the FOV may be greater than the distance between individual beam stations so that the images may have some overlap (e.g., about 10% overlap, about 20% overlap, about 25% overlap, about 30% overlap, etc.), which may facilitate stitching multiple image slices together. In some variations, the patient may remain at a beam station for multiple revolutions of the imaging system, which may help provide a greater degree of patient position and motion detail. One variation of a method for stitching two (or more) images may comprise aligning the images such that they are on the same coordinate system, identifying one or more distinctive features that are common to both images and/or identifying image pixels in the two images that correspond to each other, and adjusting the position of the two images relative to each other such that the common or corresponding features in the two images overlap. In some variations, the imaging system controller may be configured to stitch or otherwise combine images acquired while the platform is moving, and may be configured to detect discontinuities between images (e.g., discontinuous regions for which image data was not acquired or where the position change(s) of a common feature between two images is not captured) and to generate a notification to the user to re-scan the patient to bridge the discontinuities. For example, discontinuities may occur while scanning the chest/abdomen region, where breathing motions (e.g., respiratory cycle) may introduce image data variabilities that may require additional image data (e.g., additional scans of the same patient chest region). Such breathing artifacts may be at least partially mitigated by instructing the patient to perform a breath hold during image acquisition.
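The stitching step described above can be reduced, in its simplest form, to estimating the translation between two overlapping slices from a shared feature and flagging a discontinuity when that translation departs from the offset expected from the couch travel. The sketch below assumes the shared feature's pixel coordinates have already been identified; the tolerance and coordinate values are illustrative.

```python
import numpy as np

def stitch_offset(feature_xy_in_a, feature_xy_in_b):
    """Translation that maps image B's coordinates onto image A's coordinates,
    given the same physical feature's pixel location in each image."""
    return np.asarray(feature_xy_in_a, float) - np.asarray(feature_xy_in_b, float)

def has_discontinuity(offset_px, expected_offset_px, tolerance_px=15.0):
    """Flag a gap when the measured offset departs from the offset expected
    from the couch travel between slices (e.g., due to breathing motion)."""
    return float(np.linalg.norm(offset_px - np.asarray(expected_offset_px, float))) > tolerance_px

offset = stitch_offset((412.0, 90.0), (112.0, 88.0))
print("offset (px):", offset, "discontinuity:", has_discontinuity(offset, (300.0, 0.0)))
```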
Imaging systems may have multiple scan modes, and optionally, the scan mode may be selected based on the radiation delivery mode. For example, an imaging system may have a first mode where image data is acquired only when the patient platform is stopped, and a second mode where image data is continuously acquired regardless of patient platform motion. The first mode may be selected if image data of a single patient region is desired, and multiple images (e.g., axial images) may be acquired of the same patient region (e.g., across multiple revolutions of an optical detector around the patient while the platform is stationary). The first mode may also be selected for beam-station radiation delivery (e.g., step-and-shoot), where multiple images or image slices are acquired over multiple beam stations with some overlap (e.g., 25% overlap) between images. Image data may be acquired only when the platform is stopped at one of a series of beam stations, and the multiple image slices may be stitched together to build up a patient and/or system model along IEC-Y. The second mode of image acquisition may be selected for helical radiation delivery where radiation is delivered while the patient platform is in motion (e.g., continuously in motion). In some variations, the second mode of image acquisition may be selected for beam-station delivery, where image data may be acquired both when the platform is stationary and when the platform is moving.
In some variations, an updated patient 3-D model may be used for generating collision notifications. The updated patient 3-D models may be continuously compared with the current system/machine 3-D model. Based upon this comparison, a notification may be generated indicating whether a collision between the patient, patient platform, or positioning devices and the system housing (or any other components of the therapy system) is likely, optionally with instructions for avoiding such a collision. In some variations, collision checks may be performed with a pre-defined safety margin, and if the patient 3-D model is located within the safety margin, then a collision notification or alert may be generated.
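As a hedged illustration of the collision check described above, the sketch below raises an alert when the minimum distance between a patient point cloud and a machine point cloud falls inside a pre-defined safety margin; the point-cloud representations and the 20 mm margin are assumptions for illustration.

```python
import numpy as np

def collision_alert(patient_points: np.ndarray, machine_points: np.ndarray,
                    safety_margin_mm: float = 20.0) -> bool:
    """True if the minimum patient-to-machine distance is inside the margin."""
    d = np.sqrt(((patient_points[:, None, :] - machine_points[None, :, :]) ** 2).sum(axis=2))
    return bool(d.min() < safety_margin_mm)

rng = np.random.default_rng(3)
patient = rng.uniform(-150, 150, size=(300, 3))
machine = rng.uniform(200, 400, size=(300, 3))
if collision_alert(patient, machine):
    print("WARNING: patient model within collision safety margin")
else:
    print("No collision risk detected")
```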
Calibration and Quality Assurance Methods
The system and patient monitoring imaging systems described herein may be calibrated at the time the systems are installed, and/or may be calibrated periodically according to a schedule or as desired (e.g., to troubleshoot any system errors or notifications that may have been generated during a treatment session). One variation of a calibration method may comprise moving a calibration phantom into the FOV of the monitoring imaging system, acquiring image data using the detectors and/or sensors and/or cameras of the monitoring imaging system, generating a model of the calibration phantom (e.g., a 3-D model, 2-D images, surface model, etc.), comparing the generated model of the calibration phantom with a reference model of the calibration phantom, and generating a notification (e.g., a visual or audio notification) that indicates whether the difference between the generated model and the reference model is within a registration envelope tolerance. Optionally, the method may comprise installing or placing the calibration phantom onto the platform (unless the phantom is built into the platform). Comparing the generated and reference models may comprise comparing their dimensions (e.g., length, width, height, depth, radius of curvature of each contour), relative position of different geometrical structures or landmarks along the surface, shapes and sizes of the geometrical structures or landmarks along the surface, and the like. The registration envelope tolerance may be determined by a clinician or may adhere to the standards of a clinic or hospital. If the difference between the generated model and the reference model is within the registration envelope tolerance, the system may generate a PASS notification. If the difference between the generated model and the reference model is outside of the registration envelope tolerance, the system may generate a FAIL or ERROR notification.
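The PASS/FAIL comparison described above might be expressed, at its simplest, as the sketch below, which compares matching dimensional metrics of the generated and reference phantom models against a registration envelope tolerance; the metric names and the tolerance value are illustrative assumptions.

```python
def calibration_status(generated, reference, tolerance_mm: float = 0.5) -> str:
    """Compare matching dimensional metrics (mm) of the generated and reference
    phantom models; any difference beyond the tolerance yields FAIL."""
    for key, ref_value in reference.items():
        if abs(generated.get(key, float("inf")) - ref_value) > tolerance_mm:
            return f"FAIL: {key} outside registration envelope tolerance"
    return "PASS"

reference_model = {"length": 300.0, "width": 120.0, "ridge_radius": 15.0}
generated_model = {"length": 300.2, "width": 119.8, "ridge_radius": 15.1}
print(calibration_status(generated_model, reference_model))
```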
Since the monitoring imaging system is located in close proximity to a high-energy radiation source (i.e., the therapeutic radiation source), the sensitivity and/or precision of the detectors, as well as each of the sensor pixels of the detectors, may degrade over time. Daily, and/or weekly, and/or monthly quality assurance and/or calibration checks may provide information to a clinician and/or system administrator as to whether the detectors or sensors need to be replaced. Such checks may identify and/or count and/or locate sensor pixels on a detector that are faulty (e.g., elevated noise levels, non-responsive, drifting gain levels, intensity variations, etc.), and when the controller is analyzing the acquired image data, the data output from faulty pixels may be discarded or replaced with nearby pixel data outputs (e.g., by interpolation). In some variations, the output signals of each sensor pixel may be compared with expected output signals of that sensor pixel, and if the difference exceeds a pixel error threshold, that sensor pixel may be tagged as faulty. Similarly, noise levels (e.g., signal-to-noise ratio or SNR) may be calculated for each sensor pixel and if the noise levels exceed a maximum acceptable pixel noise level, that sensor pixel may be tagged as faulty.
One variation of a method for a quality assurance check of an imaging system that has at least one detector having one or more sensor pixels may comprise calibrating the gain and/or offset of the detector with nothing located in the patient treatment region (or FOV of the detector), acquiring N sets of image data using the detector as it rotates about the patient treatment region, and, for each sensor pixel, calculating a difference from nearest neighbors (NND) and a standard deviation (STD) over the N sets of recorded image data. The method may comprise comparing each sensor pixel's calculated NND and/or STD values with a threshold NND and/or STD value. The threshold NND and/or STD values may be determined based upon the amount of variance that a clinician and/or technician deems acceptable, and/or may be derived by averaging all of the NND values (or STD values) over all of the sensor pixels in the detector. If the calculated NND and/or STD values of a sensor pixel exceed the threshold NND and/or STD values, that sensor pixel may be marked or tagged as a faulty sensor pixel. Optionally, the quality assurance or calibration method may comprise checking whether the sensor pixels adjacent or near the faulty sensor pixels are also tagged as faulty, and if not, the output signal of the faulty sensor pixel may be substituted with a signal derived by interpolating the output signals of the nearby non-faulty sensor pixels. However, if the sensor pixels surrounding an identified faulty pixel have also been tagged as faulty and interpolation is intractable, then the method may comprise generating a notification such as an “ERROR (cannot interpolate)” signal, which may be a graphical element output to a display and/or an audio alert. The method may also comprise counting and storing a cumulative number of faulty sensor pixels in a detector, and if that number exceeds a predetermined limit, then the method may comprise generating a notification such as an “ERROR (number of faulty pixels exceeds limit)” signal, which may be a graphical element output to a display and/or an audio alert. In some variations, any error signal may indicate that the imaging system detector needs to be replaced, while in other variations, only certain error signals may require that the detector be replaced. For example, if the notification indicates that the number of faulty pixels has exceeded the limit, then the clinician and/or technician may be instructed to replace the detector. However, if the notification indicates that a faulty sensor pixel may not be interpolated, but the number of faulty pixels has not exceeded the limit (and the number of faulty sensor pixels that cannot be interpolated has not exceeded a predetermined limit), a clinician and/or technician may decide to proceed to test whether the image data is sufficient for calculating a 3-D model and/or for monitoring patient and/or platform motion. For example, the method may comprise moving a calibration phantom into the FOV of the monitoring imaging system, for example, by installing or placing the calibration phantom onto the platform, and acquiring image data using the detectors and/or sensors and/or cameras of the monitoring imaging system. The 3-D model of the surface contours and/or geometry of the phantom may be calculated based on the acquired image data and compared to a reference 3-D model of the phantom, and if the difference is within an error tolerance, the method may comprise generating a notification that the imaging system has a PASS status.
If the difference is outside of the error tolerance, the method may comprise generating a notification that the imaging system has a FAIL status.
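A simplified sketch of the per-pixel NND/STD check described above is given below: each pixel's temporal standard deviation and its difference from the mean of its spatial neighbors are compared against thresholds, and flagged pixels are replaced by a neighbor average as a simplified stand-in for interpolation from non-faulty neighbors. The thresholds, the 4-pixel neighborhood, and the synthetic data are assumptions for illustration only.

```python
import numpy as np

def qa_check(frames: np.ndarray, nnd_threshold: float, std_threshold: float):
    """frames: array of shape (N, rows, cols) acquired with an empty FOV."""
    mean_img = frames.mean(axis=0)        # per-pixel mean over the N frames
    std_img = frames.std(axis=0)          # per-pixel temporal standard deviation
    # Mean of the 4-connected spatial neighbors (edges handled by padding).
    padded = np.pad(mean_img, 1, mode="edge")
    neighbor_mean = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                     padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    nnd_img = np.abs(mean_img - neighbor_mean)           # nearest-neighbor difference
    faulty = (nnd_img > nnd_threshold) | (std_img > std_threshold)
    # Simplified stand-in for interpolation: replace faulty pixels with the
    # mean of their spatial neighbors.
    corrected = np.where(faulty, neighbor_mean, mean_img)
    return faulty, corrected

rng = np.random.default_rng(4)
frames = rng.normal(100.0, 1.0, size=(20, 8, 8))
frames[:, 3, 4] += 25.0                                  # simulate a drifted pixel
faulty, corrected = qa_check(frames, nnd_threshold=10.0, std_threshold=5.0)
print("faulty pixel count:", int(faulty.sum()))
```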
The calibration methods described herein may be performed daily, weekly, monthly, and/or yearly according to a schedule, and/or whenever the clinician wishes to troubleshoot or test the imaging system (e.g., in response to one or more error notifications). In variations where the radiation therapy system patient platform comprises one or more calibration reference indicia, these calibration methods may be performed as often as every single revolution, if desired. As described above, in some variations, the controller may be configured to monitor the calibration frequently (e.g., once every revolution, once every 2 revolutions, once every 5 revolutions, once every 10 revolutions, once every second, once every 2 seconds, once every 5 seconds, once every 10 seconds, once every 30 seconds, once every minute, once every 3 minutes, once every 5 minutes, once every 10 minutes, etc.) during therapeutic radiation source beam-off, and may be configured to stop monitoring the calibration during beam-on.
An imaging system may be configured to calculate the rate at which the imaging system detector(s) are degrading in order to provide an estimate of when the detector(s) may need to be replaced or serviced. Detector degradation may be represented by drifts or shifts in detector sensor pixel gain and/or offset, and optionally, the noise levels of the gain and/or offset. For example, increased or widened noise levels or uncertainties may indicate that the imaging system detector is less reliable, and when detector sensor pixel gain and/or offset values have deviated past an acceptable (pre-determined) threshold or limit, notifications or warnings may be generated to advise servicing or replacement. For example, there may be various levels of thresholds or limits for sensor pixel gain, offset, and/or noise values, and as the detector degrades, notifications indicating increasing levels of urgency and/or increasing levels of degradation severity may be generated (e.g., the detector is showing moderate levels of quality degradation, then increased levels of quality degradation, then severe levels of quality degradation, then unacceptable levels of degradation, etc.). When a sensor pixel is entirely unresponsive to a light input, the system may generate a FAIL notification and may optionally lock operation of the imaging system and/or radiotherapy system until the detector is replaced or serviced/re-calibrated. In some variations, the imaging system controller may be configured to calculate the degradation rate of detector sensor pixels using calibration data acquired over time (e.g., a specified interval of time), where the pixel calibration may be represented by a gain and offset. The controller may be configured to calculate changes in sensor pixel noise over one or more sensor pixels of a detector and to monitor the rate of noise increase over increments of time. The controller may be configured to output an estimated length of time remaining until the detector is recommended for servicing or replacement by calculating the rate of a degradation factor (e.g., gain drift, offset drift, noise level increase, uncertainty increase) and extrapolating the time it would take for the value/level of that degradation factor to reach a threshold level indicating that the detector quality degradation is unacceptable and the detector needs to be serviced or replaced. For example, the controller may be configured to calculate the rate at which the noise level of a detector (e.g., one or more of the sensor pixels on the detector) increases, and to extrapolate or otherwise predict or estimate the time it would take for the noise level to reach a maximum threshold level. The estimated remaining length of time for the detector before it exceeds the noise threshold and/or its “expiration date” may be output to a system display for viewing by the clinician or user. Alternatively or additionally, a controller may calculate a pixel degradation rate by fitting a regression model to the acquired image data. While some methods may comprise estimating the remaining lifetime until detector replacement or servicing based on sensor pixel noise levels (e.g., signal-to-noise ratio or SNR), some methods comprise estimating the remaining lifetime by measuring the differences between the expected sensor pixel output and the actual sensor pixel output when scanning a calibration phantom.
For example, calculating the differences between the actual output signal of each sensor pixel and its respective expected output signal may comprise calculating the difference between intensity values of the output signal and intensity values of the respective expected output signal, and the pixel degradation rate or regression model may be calculated based on changes in the calculated intensity differences over the specified interval of time. Alternatively or additionally, the imaging system may comprise one or more radiation measurement devices (i.e., dosimeters and the like) and may provide a recommendation to service or replace the imaging system detectors once a threshold level of radiation has been emitted by the therapeutic radiation source. Notifications of detector degradation and/or recommendations for servicing and/or replacing components (e.g., detector(s)) of the imaging system may comprise audio alerts and/or graphical notifications output to the system display.
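As an illustration of the extrapolation described above, the sketch below fits a linear trend to periodic noise measurements and estimates how long until the noise level crosses the maximum acceptable threshold; the linear model, the measurement values, and the threshold are assumptions for illustration only.

```python
import numpy as np

def days_until_threshold(check_days: np.ndarray, noise_levels: np.ndarray,
                         max_noise: float) -> float:
    """Linear fit of noise vs. time; return days from the last check until the
    extrapolated noise crosses max_noise (inf if noise is not increasing)."""
    slope, intercept = np.polyfit(check_days, noise_levels, 1)
    if slope <= 0:
        return float("inf")
    crossing_day = (max_noise - intercept) / slope
    return max(crossing_day - check_days[-1], 0.0)

days = np.array([0, 30, 60, 90, 120], dtype=float)       # days since installation
noise = np.array([1.0, 1.1, 1.25, 1.4, 1.5])              # measured noise levels
print(f"estimated days until service: {days_until_threshold(days, noise, 2.5):.0f}")
```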
System Controller
The radiation therapy systems and/or patient-monitoring imaging systems described herein may each comprise a controller having a processor and one or more memories. In some variations, the radiation therapy system and the patient-monitoring imaging system(s) may share a controller, as may be desirable. A controller may comprise one or more processors and one or more machine-readable memories in communication with the one or more processors. The controller may be connected to a radiation therapy system and/or a patient-monitoring imaging system and/or other systems by wired or wireless communication channels. In some variations, the controller may be coupled to a patient platform or disposed on a trolley or medical cart adjacent to the patient and/or operator.
The controller may be implemented consistent with numerous general purpose or special purpose computing systems or configurations. Various exemplary computing systems, environments, and/or configurations that may be suitable for use with the systems and devices disclosed herein may include, but are not limited to software or other components within or embodied on personal computing devices, network appliances, servers or server computing devices such as routing/connectivity components, portable (e.g., hand-held) or laptop devices, multiprocessor systems, microprocessor-based systems, and distributed computing networks.
Examples of portable computing devices include smartphones, personal digital assistants (PDAs), cell phones, tablet PCs, phablets (personal computing devices that are larger than a smartphone, but smaller than a tablet), wearable computers taking the form of smartwatches, portable music devices, and the like.
Processor
In some embodiments, a processor may be any suitable processing device configured to run and/or execute a set of instructions or code and may include one or more data processors, image processors, graphics processing units, physics processing units, digital signal processors, and/or central processing units. The processor may be, for example, a general purpose processor, Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), or the like. The processor may be configured to run and/or execute application processes and/or other modules, processes and/or functions associated with the system and/or a network associated therewith. The underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies like complementary metal-oxide semiconductor (CMOS), bipolar technologies like emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, or the like.
Memory
In some embodiments, memory may include a database and may be, for example, a random access memory (RAM), a memory buffer, a hard drive, an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a read-only memory (ROM), Flash memory, etc. The memory may store instructions to cause the processor to execute modules, processes, and/or functions associated with the system, such as one or more treatment plans, imaging data acquired by one or more patient-monitoring imaging systems (e.g., during a previous treatment session and/or current treatment session, real-time imaging data), updated or adapted treatment plans, updated or adapted dose delivery instructions, radiation therapy system instructions (e.g., that may direct the operation of the gantry, therapeutic radiation source, multi-leaf collimator, and/or any other components of a radiation therapy system and/or patient-monitoring imaging system), and image and/or data processing associated with radiation delivery and/or patient-monitoring.
Some embodiments described herein relate to a computer storage product with a non-transitory computer-readable medium (also may be referred to as a non-transitory processor-readable medium) having instructions or computer code thereon for performing various computer-implemented operations. The computer-readable medium (or processor-readable medium) is non-transitory in the sense that it does not include transitory propagating signals per se (e.g., a propagating electromagnetic wave carrying information on a transmission medium such as space or a cable). The media and computer code (also may be referred to as code or algorithm) may be those designed and constructed for the specific purpose or purposes. Examples of non-transitory computer-readable media include, but are not limited to, magnetic storage media such as hard disks, floppy disks, and magnetic tape; optical storage media such as Compact Disc/Digital Video Discs (CD/DVDs); Compact Disc-Read Only Memories (CD-ROMs), and holographic devices; magneto-optical storage media such as optical disks; solid state storage devices such as a solid state drive (SSD) and a solid state hybrid drive (SSHD); carrier wave signal processing modules; and hardware devices that are specially configured to store and execute program code, such as Application-Specific Integrated Circuits (ASICs), Programmable Logic Devices (PLDs), Read-Only Memory (ROM), and Random-Access Memory (RAM) devices. Other embodiments described herein relate to a computer program product, which may include, for example, the instructions and/or computer code disclosed herein.
A user interface may serve as a communication interface between an operator or clinician and the radiation therapy system. The user interface may comprise an input device and an output device (e.g., touch screen and display) and be configured to receive input data and output data from one or more of the support arm, external magnet, sensor, delivery device, input device, output device, network, database, and server. Sensor data from one or more sensors may be received by the user interface and output visually, audibly, and/or through haptic feedback by one or more output devices. As another example, operator control of an input device (e.g., joystick, keyboard, touch screen) may be received by the user interface and then processed by the processor and memory for the user interface to output a control signal to one or more support arms, external magnets, intracavity devices, and delivery devices.
Some variations of a radiation therapy system and/or patient-monitoring imaging system may comprise a display device that may allow an operator to view graphical and/or textual representations of fluence maps, and/or dose distributions, and/or regions of interest, and/or volumes of interest, and/or patient anatomical images (e.g., geometry, motion, position, location relative to radiation therapy system components), and/or patient data (e.g., physiological and/or biological), and the like. In some variations, an output device may comprise a display device including at least one of a light emitting diode (LED), liquid crystal display (LCD), electroluminescent display (ELD), plasma display panel (PDP), thin film transistor (TFT), organic light emitting diodes (OLED), electronic paper/e-ink display, laser display, and/or holographic display.
Communication
In some embodiments, a radiation therapy system and/or patient-monitoring imaging system may be in communication with other computing devices via, for example, one or more networks, each of which may be any type of network (e.g., wired network, wireless network). A wireless network may refer to any type of digital network that is not connected by cables of any kind. Examples of wireless communication in a wireless network include, but are not limited to cellular, radio, satellite, and microwave communication. However, a wireless network may connect to a wired network in order to interface with the Internet, other carrier voice and data networks, business networks, and personal networks. A wired network is typically carried over copper twisted pair, coaxial cable and/or fiber optic cables. There are many different types of wired networks including wide area networks (WAN), metropolitan area networks (MAN), local area networks (LAN), Internet area networks (IAN), campus area networks (CAN), global area networks (GAN), like the Internet, and virtual private networks (VPN). Hereinafter, network refers to any combination of wireless, wired, public and private data networks that are typically interconnected through the Internet, to provide a unified networking and information access system.
Cellular communication may encompass technologies such as GSM, PCS, CDMA or GPRS, W-CDMA, EDGE or CDMA2000, LTE, WiMAX, and 5G networking standards. Some wireless network deployments combine networks from multiple cellular networks or use a mix of cellular, Wi-Fi, and satellite communication. In some embodiments, the systems, apparatuses, and methods described herein may include a radiofrequency receiver, transmitter, and/or optical (e.g., infrared) receiver and transmitter to communicate with one or more devices and/or networks.
This application is a continuation of U.S. patent application Ser. No. 16/191,131, filed on Nov. 14, 2018, which claims priority to U.S. Provisional Patent Application Ser. No. 62/585,772, filed on Nov. 14, 2017, each of which is hereby incorporated by reference in its entirety.
20060173294 | Ein-Gal | Aug 2006 | A1 |
20060182326 | Schildkraut et al. | Aug 2006 | A1 |
20060193435 | Hara et al. | Aug 2006 | A1 |
20060237652 | Kimchy et al. | Oct 2006 | A1 |
20070003010 | Guertin | Jan 2007 | A1 |
20070003123 | Fu et al. | Jan 2007 | A1 |
20070014391 | Mostafavi et al. | Jan 2007 | A1 |
20070023669 | Hefetz et al. | Feb 2007 | A1 |
20070025496 | Brown et al. | Feb 2007 | A1 |
20070025513 | Ghelmansarai | Feb 2007 | A1 |
20070043289 | Adair | Feb 2007 | A1 |
20070053491 | Schildkraut et al. | Mar 2007 | A1 |
20070055144 | Neustadter et al. | Mar 2007 | A1 |
20070133749 | Mazin et al. | Jun 2007 | A1 |
20070153969 | Maschke | Jul 2007 | A1 |
20070164239 | Terwilliger et al. | Jul 2007 | A1 |
20070165779 | Chen et al. | Jul 2007 | A1 |
20070167801 | Webler et al. | Jul 2007 | A1 |
20070211857 | Urano et al. | Sep 2007 | A1 |
20070221869 | Song | Sep 2007 | A1 |
20070237290 | Mostafavi | Oct 2007 | A1 |
20070265230 | Rousso et al. | Nov 2007 | A1 |
20070265528 | Xu et al. | Nov 2007 | A1 |
20070270693 | Fiedler et al. | Nov 2007 | A1 |
20080002811 | Allison | Jan 2008 | A1 |
20080031404 | Khamene et al. | Feb 2008 | A1 |
20080043910 | Thomas | Feb 2008 | A1 |
20080095416 | Jeung et al. | Apr 2008 | A1 |
20080103391 | Dos Santos Varela | May 2008 | A1 |
20080128631 | Suhami | Jun 2008 | A1 |
20080130825 | Fu et al. | Jun 2008 | A1 |
20080152085 | Saracen et al. | Jun 2008 | A1 |
20080156993 | Weinberg et al. | Jul 2008 | A1 |
20080164875 | Haworth et al. | Jul 2008 | A1 |
20080203309 | Frach et al. | Aug 2008 | A1 |
20080205588 | Kim | Aug 2008 | A1 |
20080214927 | Cherry et al. | Sep 2008 | A1 |
20080217541 | Kim | Sep 2008 | A1 |
20080230705 | Rousso et al. | Sep 2008 | A1 |
20080240352 | Brahme et al. | Oct 2008 | A1 |
20080251709 | Cooke et al. | Oct 2008 | A1 |
20080253516 | Hui et al. | Oct 2008 | A1 |
20080262473 | Kornblau et al. | Oct 2008 | A1 |
20080272284 | Rietzel | Nov 2008 | A1 |
20080273659 | Guertin et al. | Nov 2008 | A1 |
20080298536 | Ein-Gal | Dec 2008 | A1 |
20090003655 | Wollenweber | Jan 2009 | A1 |
20090086909 | Hui et al. | Apr 2009 | A1 |
20090088622 | Mostafavi | Apr 2009 | A1 |
20090116616 | Lu et al. | May 2009 | A1 |
20090131734 | Neustadter et al. | May 2009 | A1 |
20090169082 | Mizuta et al. | Jul 2009 | A1 |
20090236532 | Frach et al. | Sep 2009 | A1 |
20090256078 | Mazin | Oct 2009 | A1 |
20090285357 | Khamene et al. | Nov 2009 | A1 |
20090309046 | Balakin | Dec 2009 | A1 |
20100010343 | Daghighian et al. | Jan 2010 | A1 |
20100040197 | Maniawski et al. | Feb 2010 | A1 |
20100049030 | Saunders et al. | Feb 2010 | A1 |
20100054412 | Brinks et al. | Mar 2010 | A1 |
20100063384 | Kornblau et al. | Mar 2010 | A1 |
20100065723 | Burbar et al. | Mar 2010 | A1 |
20100067660 | Maurer, Jr. | Mar 2010 | A1 |
20100069742 | Partain et al. | Mar 2010 | A1 |
20100074400 | Sendai | Mar 2010 | A1 |
20100074498 | Breeding et al. | Mar 2010 | A1 |
20100166274 | Busch et al. | Jul 2010 | A1 |
20100176309 | Mackie et al. | Jul 2010 | A1 |
20100198063 | Huber et al. | Aug 2010 | A1 |
20100237259 | Wang | Sep 2010 | A1 |
20100266099 | Busch et al. | Oct 2010 | A1 |
20100276601 | Duraj et al. | Nov 2010 | A1 |
20110006212 | Shchory et al. | Jan 2011 | A1 |
20110044429 | Takahashi et al. | Feb 2011 | A1 |
20110073763 | Subbarao | Mar 2011 | A1 |
20110092814 | Yamaya et al. | Apr 2011 | A1 |
20110105895 | Kornblau et al. | May 2011 | A1 |
20110105897 | Kornblau et al. | May 2011 | A1 |
20110118588 | Kornblau et al. | May 2011 | A1 |
20110198504 | Eigen | Aug 2011 | A1 |
20110200170 | Nord et al. | Aug 2011 | A1 |
20110210261 | Maurer, Jr. | Sep 2011 | A1 |
20110211665 | Maurer, Jr. et al. | Sep 2011 | A1 |
20110215248 | Lewellen et al. | Sep 2011 | A1 |
20110215259 | Iwata | Sep 2011 | A1 |
20110272600 | Bert et al. | Nov 2011 | A1 |
20110297833 | Takayama | Dec 2011 | A1 |
20110301449 | Maurer, Jr. | Dec 2011 | A1 |
20110309252 | Moriyasu et al. | Dec 2011 | A1 |
20110309255 | Bert et al. | Dec 2011 | A1 |
20110313231 | Guertin et al. | Dec 2011 | A1 |
20110313232 | Balakin | Dec 2011 | A1 |
20120035470 | Kuduvalli et al. | Feb 2012 | A1 |
20120068076 | Daghighian | Mar 2012 | A1 |
20120076269 | Roberts | Mar 2012 | A1 |
20120083681 | Guckenburger et al. | Apr 2012 | A1 |
20120138806 | Holmes et al. | Jun 2012 | A1 |
20120161014 | Yamaya et al. | Jun 2012 | A1 |
20120174317 | Saracen et al. | Jul 2012 | A1 |
20120189102 | Maurer, Jr. et al. | Jul 2012 | A1 |
20120213334 | Dirauf et al. | Aug 2012 | A1 |
20120230464 | Ling et al. | Sep 2012 | A1 |
20120245453 | Tryggestad et al. | Sep 2012 | A1 |
20120318989 | Park et al. | Dec 2012 | A1 |
20120323117 | Neustadter et al. | Dec 2012 | A1 |
20130025055 | Saracen et al. | Jan 2013 | A1 |
20130060134 | Eshima et al. | Mar 2013 | A1 |
20130092842 | Zhang et al. | Apr 2013 | A1 |
20130109904 | Siljamaki | May 2013 | A1 |
20130111668 | Wiggers et al. | May 2013 | A1 |
20130193330 | Wagadarikar et al. | Aug 2013 | A1 |
20130266116 | Abenaim et al. | Oct 2013 | A1 |
20130279658 | Mazin | Oct 2013 | A1 |
20130327932 | Kim et al. | Dec 2013 | A1 |
20130343509 | Gregerson et al. | Dec 2013 | A1 |
20140029715 | Hansen et al. | Jan 2014 | A1 |
20140104051 | Breed | Apr 2014 | A1 |
20140107390 | Brown et al. | Apr 2014 | A1 |
20140110573 | Wang et al. | Apr 2014 | A1 |
20140163368 | Rousso et al. | Jun 2014 | A1 |
20140184197 | Dolinsky | Jul 2014 | A1 |
20140193336 | Rousso et al. | Jul 2014 | A1 |
20140217294 | Rothfuss et al. | Aug 2014 | A1 |
20140224963 | Guo et al. | Aug 2014 | A1 |
20140228613 | Mazin et al. | Aug 2014 | A1 |
20140239204 | Orton et al. | Aug 2014 | A1 |
20140257096 | Prevrhal et al. | Sep 2014 | A1 |
20140341351 | Berwick et al. | Nov 2014 | A1 |
20140355735 | Choi | Dec 2014 | A1 |
20140371581 | Mostafavi et al. | Dec 2014 | A1 |
20150018673 | Rose et al. | Jan 2015 | A1 |
20150035942 | Hampton et al. | Feb 2015 | A1 |
20150060685 | Maad et al. | Mar 2015 | A1 |
20150076357 | Frach | Mar 2015 | A1 |
20150078528 | Okada | Mar 2015 | A1 |
20150126801 | Matteo et al. | May 2015 | A1 |
20150131774 | Maurer, Jr. et al. | May 2015 | A1 |
20150168567 | Kim et al. | Jun 2015 | A1 |
20150177394 | Dolinsky et al. | Jun 2015 | A1 |
20150190658 | Yu | Jul 2015 | A1 |
20150265852 | Meir et al. | Sep 2015 | A1 |
20150276947 | Hoenk et al. | Oct 2015 | A1 |
20150285922 | Mintzer et al. | Oct 2015 | A1 |
20150301201 | Rothfuss et al. | Oct 2015 | A1 |
20160023019 | Filiberti | Jan 2016 | A1 |
20160073977 | Mazin | Mar 2016 | A1 |
20160097866 | Williams | Apr 2016 | A1 |
20160146949 | Frach et al. | May 2016 | A1 |
20160155228 | Sakata et al. | Jun 2016 | A1 |
20160206203 | Yu et al. | Jul 2016 | A1 |
20160209515 | Da Silva Rodrigues et al. | Jul 2016 | A1 |
20160219686 | Nakayama et al. | Jul 2016 | A1 |
20160266260 | Preston | Sep 2016 | A1 |
20160273958 | Hoenk et al. | Sep 2016 | A1 |
20160287347 | Meier | Oct 2016 | A1 |
20160299240 | Cho et al. | Oct 2016 | A1 |
20160325117 | Arai | Nov 2016 | A1 |
20160361566 | Larkin et al. | Dec 2016 | A1 |
20160374632 | David | Dec 2016 | A1 |
20170014648 | Mostafavi | Jan 2017 | A1 |
20170036039 | Gaudio | Feb 2017 | A1 |
20170052266 | Kim et al. | Feb 2017 | A1 |
20170065834 | Liu | Mar 2017 | A1 |
20170082759 | Lyu et al. | Mar 2017 | A1 |
20170199284 | Silari et al. | Jul 2017 | A1 |
20170220709 | Wan | Aug 2017 | A1 |
20170242136 | O'Neill et al. | Aug 2017 | A1 |
20170281975 | Filiberti et al. | Oct 2017 | A1 |
20180133508 | Pearce et al. | May 2018 | A1 |
20180292550 | Xu et al. | Oct 2018 | A1 |
20190070437 | Olcott et al. | Mar 2019 | A1 |
20190126069 | Nord et al. | May 2019 | A1 |
20190357859 | Mazin | Nov 2019 | A1 |
20200164230 | Larkin et al. | May 2020 | A1 |
20200215355 | Olcott et al. | Jul 2020 | A1 |
20200222724 | Mazin et al. | Jul 2020 | A1 |
20200368551 | Bassalow et al. | Nov 2020 | A1 |
20200368557 | Harper et al. | Nov 2020 | A1 |
20210196212 | Mazin | Jul 2021 | A1 |
20210267683 | Brown | Sep 2021 | A1 |
20210327560 | Carmi | Oct 2021 | A1 |
20220093285 | Burns | Mar 2022 | A1 |
20220096867 | Mazin et al. | Mar 2022 | A1 |
20220143422 | Harper | May 2022 | A1 |
20220193451 | Duval et al. | Jun 2022 | A1 |
20220296929 | Laurence, Jr. et al. | Sep 2022 | A1 |
20220342095 | Olcott et al. | Oct 2022 | A1 |
20220395707 | Laurence, Jr. et al. | Dec 2022 | A1 |
20230218928 | Maolinbay | Jul 2023 | A1 |
20230256268 | Olcott et al. | Aug 2023 | A1 |
20230337991 | Mazin | Oct 2023 | A1 |
20230393292 | Olcott et al. | Dec 2023 | A1 |
Number | Date | Country |
---|---|---|
1681436 | Oct 2005 | CN |
1799509 | Jul 2006 | CN |
1960780 | May 2007 | CN |
101297759 | Nov 2008 | CN |
101378805 | Mar 2009 | CN |
101803929 | Aug 2010 | CN |
101970043 | Feb 2011 | CN |
103071241 | May 2013 | CN |
103648392 | Mar 2014 | CN |
103650095 | Mar 2014 | CN |
103932789 | Jul 2014 | CN |
105073188 | Nov 2015 | CN |
106461801 | Feb 2017 | CN |
10-2008-053321 | May 2010 | DE |
10-2013-205606 | Oct 2014 | DE |
0 437 434 | Jul 1995 | EP |
0 817 978 | Aug 2001 | EP |
0 984 393 | Mar 2007 | EP |
1 762 177 | Mar 2007 | EP |
1 501 604 | Dec 2009 | EP |
1 898 234 | Apr 2010 | EP |
2 188 815 | Nov 2011 | EP |
2 535 086 | Dec 2012 | EP |
2 687 259 | Jan 2014 | EP |
2 872 913 | Feb 2016 | EP |
2 874 702 | Sep 2016 | EP |
1 664 752 | Jun 2017 | EP |
3 720 554 | Jul 2021 | EP |
2839894 | Nov 2003 | FR |
69634119 | Feb 2006 | GB |
2513596 | Nov 2014 | GB |
208396 | Dec 2010 | IL |
H-01-156830 | Jun 1989 | JP |
H-09-122110 | May 1997 | JP |
9-189769 | Jul 1997 | JP |
H-11-290466 | Oct 1999 | JP |
2002-263090 | Sep 2002 | JP |
2003-534823 | Nov 2003 | JP |
2005-261941 | Sep 2005 | JP |
2007-502166 | Feb 2007 | JP |
2007-507246 | Mar 2007 | JP |
2008-107326 | May 2008 | JP |
2008-173184 | Jul 2008 | JP |
2008-173299 | Jul 2008 | JP |
2009-544101 | Dec 2009 | JP |
2010-500910 | Jan 2010 | JP |
2011-508654 | Mar 2011 | JP |
2011-514213 | May 2011 | JP |
2012-042344 | Mar 2012 | JP |
2012-129984 | Jul 2012 | JP |
2012-254146 | Dec 2012 | JP |
2013-257320 | Dec 2013 | JP |
2013-545560 | Dec 2013 | JP |
2014-521370 | Aug 2014 | JP |
2017-199876 | Nov 2017 | JP |
9520013 | Feb 1997 | NL |
WO-8910090 | Nov 1989 | WO |
WO-9522241 | Aug 1995 | WO |
WO-0015299 | Mar 2000 | WO |
WO-2004017832 | Mar 2004 | WO |
WO-2005018734 | Mar 2005 | WO |
WO-2005018735 | Mar 2005 | WO |
WO-2005110495 | Nov 2005 | WO |
WO-2006051531 | May 2006 | WO |
WO-2006086765 | Aug 2006 | WO |
WO-2007045076 | Apr 2007 | WO |
WO-2007094002 | Aug 2007 | WO |
WO-2007120674 | Oct 2007 | WO |
WO-2007124760 | Nov 2007 | WO |
WO-2008019118 | Feb 2008 | WO |
WO-2008024463 | Feb 2008 | WO |
WO-2008127368 | Oct 2008 | WO |
WO-2009114117 | Sep 2009 | WO |
WO-2010015358 | Feb 2010 | WO |
WO-2010018477 | Feb 2010 | WO |
WO-2010109585 | Sep 2010 | WO |
WO-2010110255 | Sep 2010 | WO |
WO-2012135771 | Oct 2012 | WO |
WO-2013168043 | Nov 2013 | WO |
WO-2015038832 | Mar 2015 | WO |
WO-2015042510 | Mar 2015 | WO |
WO-2015103564 | Jul 2015 | WO |
WO-2015134953 | Sep 2015 | WO |
WO-2015161036 | Oct 2015 | WO |
WO-2016097977 | Jun 2016 | WO |
WO-2016203822 | Dec 2016 | WO |
WO-2017220116 | Dec 2017 | WO |
Entry |
---|
Black, Q.C. et al. (2004). “Defining a Radiotherapy Target with positron emission tomography,” Int. J. Radiation Oncology Biol. Phys. 60:1272-1282. |
Chang, J.Y. et al. (2008). “Image-guided radiation therapy for non-small cell lung cancer,” J. Thorac. Oncol. 3(2):177-186. |
Chen, Y. et al. (2011). "Dynamic tomotherapy delivery," Med. Phys. 38:3013-3024. |
Corrected Notice of Allowability dated Jan. 29, 2020, for U.S. Appl. No. 16/100,054, filed Aug. 9, 2018, 4 pages. |
Corrected Notice of Allowability dated Feb. 3, 2021, for U.S. Appl. No. 16/425,416, filed May 29, 2019, 2 pages. |
Corrected Notice of Allowability dated May 17, 2022, for U.S. Appl. No. 16/191,131, filed Nov. 14, 2018, 8 pages. |
Corrected Notice of Allowability dated Feb. 14, 2023, for U.S. Appl. No. 17/697,828, filed Mar. 17, 2022, 4 pages. |
Corrected Notice of Allowability dated Feb. 23, 2023, for U.S. Appl. No. 17/697,828, filed Mar. 17, 2022, 2 pages. |
Corrected Notice of Allowability dated Mar. 16, 2023, for U.S. Appl. No. 17/203,532, filed Mar. 16, 2021, 2 pages. |
Dieterich, S. et al. (2003). “Skin respiratory motion tracking for stereotactic radiosurgery using the CyberKnife,” Elsevier Int'l Congress Series 1256:130-136. |
Erdi, Y.E. (2007). “The use of PET for radiotherapy,” Curr. Medical Imaging Reviews 3(1):3-16. |
Extended European Search Report dated Oct. 7, 2015, for European Application No. 12 763 280.0, filed on Mar. 30, 2012, 11 pages. |
Extended European Search Report dated Mar. 31, 2017, for European Application No. 09 719 473.2, filed on Mar. 9, 2009, 8 pages. |
Extended European Search Report dated Jun. 9, 2020, for EP Application No. 17 871 349.1, filed on Nov. 15, 2017, 6 pages. |
Extended European Search Report dated Oct. 30, 2020, for EP Application No. 20 179 036.7, filed on Mar. 9, 2009, 12 pages. |
Extended European Search Report dated May 26, 2021, for EP Application No. 18 832 571.6, filed on Jul. 11, 2018, 9 pages. |
Extended European Search Report dated Mar. 30, 2022, for EP Application No. 21 195 331.0, filed on Nov. 15, 2017, 11 pages. |
Fan, Q. et al. (2012). "Emission Guided Radiation Therapy for Lung and Prostate Cancers: A Feasibility Study on a Digital Patient," Med. Phys. 39(11):7140-7152. |
Fan, Q. et al. (2013). “Toward a Planning Scheme for Emission Guided Radiation Therapy (EGRT): FDG Based Tumor Tracking in a Metastatic Breast Cancer Patient,” Med. Phys. 40(8): 12 pages. |
Final Office Action dated Aug. 15, 2012, for U.S. Appl. No. 13/209,275, filed Aug. 12, 2011, 8 pages. |
Final Office Action dated Aug. 10, 2021, for U.S. Appl. No. 16/887,896, filed May 29, 2020, 66 pages. |
Final Office Action dated Jan. 11, 2022, for U.S. Appl. No. 16/191,131, filed Nov. 14, 2018, 25 pages. |
Freeman, T. (2015). “Radiotherapy needs preclinical research,” Medical Physics Web, 4 total pages. |
Galvin, J.M. (2018). “The multileaf collimator—A complete guide,” 17 total pages. |
Gibbons, J.P. (2004). “Dose calculation and verification for tomotherapy,” 2004 ACMP Meeting, Scottsdale, AZ., 71 total pages. |
Glendinning, A.G. et al. (2001). "Measurement of the response of Gd2O2S:Tb phosphor to 6 MV x-rays," Phys. Med. Biol. 46:517-530. |
Handsfield, L.L. et al. (2014). “Phantomless patient-specific TomoTherapy QA via delivery performance monitoring and a secondary Monte Carlo dose calculation,” Med. Phys. 41:101703-1-101703-9. |
International Search Report dated May 4, 2009, for PCT Application No. PCT/US2009/01500, filed on Mar. 9, 2009, 3 pages. |
International Search Report dated Mar. 7, 2018, for PCT Application No. PCT/US2017/061848, filed on Nov. 15, 2017, 4 pages. |
International Search Report dated Oct. 2, 2018, for PCT Application No. PCT/US2018/041700, filed on Jul. 11, 2018, 2 pages. |
International Search Report dated Oct. 24, 2018, for PCT Application No. PCT/US2018/046132, filed on Aug. 9, 2018, 2 pages. |
International Search Report dated Mar. 13, 2018, for PCT Application No. PCT/US2017/061855, filed on Nov. 15, 2017, 4 pages. |
International Search Report dated Jun. 20, 2018, for PCT Application No. PCT/US2018/025252, filed on Mar. 29, 2018, 2 pages. |
International Search Report dated Jan. 30, 2019, for PCT Application No. PCT/US2018/061099, filed on Nov. 14, 2018, 4 pages. |
Kapatoes, J.M. et al. (2001). “A feasible method for clinical delivery verification and dose reconstruction in tomotherapy,” Med. Phys. 28:528-542. |
Keall, P.J. et al. (2001). “Motion adaptive x-ray therapy: a feasibility study,” Phys. Med. Biol. 46:1-10. |
Keyence (2019). Ultra-high speed in-line profilometer—LJV7000 series, 3 total pages. |
Kim, H. et al. (2009). “A multi-threshold method for the TOF-PET Signal Processing,” Nucl. Instrum. Meth. Phys. Res. A. 602:618-621. |
Krouglicof, N. et al. (2013). “Development of a Novel PCB-Based Voice Coil Actuator for Opto-Mechatronic Applications,” presented at IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, Nov. 3-7, 2013, pp. 5834-5840. |
Langen, K.M. et al. (2010). “QA for helical tomotherapy: report of the AAPM Task Group 148,” Med. Phys. 37:4817-4853. |
Li, X. et al. (2016). “Timing calibration for Time-of-Flight PET using positron-emitting isotopes and annihilation targets,” IEEE Transactions on Nuclear Science 63:1351-1358. |
Lu, W. (2009). “Real-time motion-adaptive-optimization (MAO) in tomotherapy,” Phys. Med. Biol. 54:4373-4398. |
Lu, W. (2008). “Real-time motion-adaptive delivery (MAD) using binary MLC: I. Static beam (topotherapy) delivery,” Phys. Med. Biol. 53:6491-6511. |
Mackie, T.R. et al. (Nov.-Dec. 1993). “Tomotherapy: A New Concept for the Delivery of Dynamic Conformal Radiotherapy,” Med. Phys. 20(6):1709-1719. |
McMahon, R. et al. (2008). “A real-time dynamic-MLC control algorithm for delivering IMRT to targets undergoing 2D rigid motion in the beam's eye view,” Med. Phys. 35:3875-3888. |
Mazin, S. R. et al. (2010). “Emission-Guided Radiation Therapy: Biologic Targeting and Adaptive Treatment,” Journal of American College of Radiology 7(12):989-990. |
Non-Final Office Action dated Jan. 10, 2011, for U.S. Appl. No. 12/367,679, filed Feb. 9, 2009, 9 pages. |
Non-Final Office Action dated Feb. 28, 2012, for U.S. Appl. No. 13/209,275, filed Aug. 12, 2011, 8 pages. |
Non-Final Office Action dated Sep. 19, 2013, for U.S. Appl. No. 13/895,255, filed May 15, 2013, 8 pages. |
Non-Final Office Action dated Jan. 7, 2020, for U.S. Appl. No. 15/814,222, filed Nov. 15, 2017, 13 pages. |
Non-Final Office Action dated Oct. 5, 2020, for U.S. Appl. No. 16/887,896, filed May 29, 2020, 62 pages. |
Non-Final Office Action dated Nov. 3, 2020, for U.S. Appl. No. 16/818,325, filed Mar. 13, 2020, 9 pages. |
Non-Final Office Action dated Mar. 12, 2021, for U.S. Appl. No. 16/887,896, filed May 29, 2020, 64 pages. |
Non-Final Office Action dated Apr. 26, 2021, for U.S. Appl. No. 16/191,131, filed Nov. 14, 2018, 28 pages. |
Non-Final Office Action dated Jul. 5, 2022, for U.S. Appl. No. 17/203,532, filed Mar. 16, 2021, 13 pages. |
Non-Final Office Action dated Dec. 14, 2022, for U.S. Appl. No. 16/887,852, filed May 29, 2020, 12 pages. |
Non-Final Office Action dated Jan. 17, 2023, for U.S. Appl. No. 17/837,900, filed Jun. 10, 2022, 12 pages. |
Notice of Allowance dated Jul. 25, 2011, for U.S. Appl. No. 12/367,679, filed Feb. 9, 2009, 7 pages. |
Notice of Allowance dated Apr. 9, 2014, for U.S. Appl. No. 13/895,255, filed May 15, 2013, 7 pages. |
Notice of Allowance dated Oct. 27, 2015, for U.S. Appl. No. 14/278,973, filed May 15, 2014, 8 pages. |
Notice of Allowance dated Mar. 27, 2013, for U.S. Appl. No. 13/209,275, filed Aug. 12, 2011, 9 pages. |
Notice of Allowance dated Oct. 5, 2017, for U.S. Appl. No. 14/951,194, filed Nov. 24, 2015, 11 pages. |
Notice of Allowance dated Apr. 4, 2019, for U.S. Appl. No. 15/807,383, filed Nov. 8, 2017, 11 pages. |
Notice of Allowance dated Dec. 4, 2019, for U.S. Appl. No. 16/100,054, filed Aug. 9, 2018, 13 pages. |
Notice of Allowance dated Apr. 10, 2020, for U.S. Appl. No. 16/033,125, filed Jul. 11, 2018, 18 pages. |
Notice of Allowance dated Apr. 30, 2020, for U.S. Appl. No. 15/814,222, filed Nov. 15, 2017, 10 pages. |
Notice of Allowance dated Jan. 12, 2021, for U.S. Appl. No. 16/425,416, filed May 29, 2019, 13 pages. |
Notice of Allowance dated Feb. 22, 2021, for U.S. Appl. No. 16/818,325, filed Mar. 13, 2020, 7 pages. |
Notice of Allowance dated Dec. 22, 2021, for U.S. Appl. No. 16/887,896, filed May 29, 2020, 11 pages. |
Notice of Allowance dated Apr. 29, 2022, for U.S. Appl. No. 16/191,131, filed Nov. 14, 2018, 11 pages. |
Notice of Allowance dated Jul. 12, 2022, for U.S. Appl. No. 17/238,113, filed Apr. 22, 2021, 9 pages. |
Notice of Allowance dated Aug. 1, 2022, for U.S. Appl. No. 17/238,113, filed Apr. 22, 2021, 8 pages. |
Notice of Allowance dated Dec. 15, 2022, for U.S. Appl. No. 17/203,532, filed Mar. 16, 2021, 8 pages. |
Notice of Allowance dated Feb. 7, 2023, for U.S. Appl. No. 17/697,828, filed Mar. 17, 2022, 10 pages. |
North Shore LIJ (2008). IMRT treatment plans: Dosimetry measurements & monitor units validation, 133 total pages. |
Olivera, G.H. et al. (2000). “Modifying a plan delivery without re-optimization to account for patient offset in tomotherapy,” Proceedings of the 22nd Annual EMBS International Conference, Jul. 23-28, 2000, Chicago, IL, pp. 441-444. |
Papanikolaou, N. et al. (2010). "MU-Tomo: Independent dose validation software for helical tomotherapy," J. Cancer Sci. Ther. 2:145-152. |
Parodi, K. (2015). “Vision 20/20: Positron emission tomography in radiation therapy planning, delivery, and monitoring,” Med. Phys. 42:7153-7168. |
Partial Supplementary European Search Report dated Jun. 25, 2015, for European Application No. 12 763 280.0, filed on Mar. 30, 2012, 6 pages. |
Prabhakar, R. et al. (2007). “An Insight into PET-CT Based Radiotherapy Treatment Planning,” Cancer Therapy (5):519-524. |
Schleifring (2013). Slip Ring Solutions—Technology, 8 total pages. |
Tashima, H. et al. (2012). “A Single-Ring Open PET Enabling PET Imaging During Radiotherapy,” Phys. Med. Biol. 57(14):4705-4718. |
TomoTherapy® (2011). TOMOHD Treatment System, Product Specifications, 12 total pages. |
Varian Medical Systems (2004). “Dynamic Targeting™ Image-Guided Radiation Therapy—A Revolution in Cancer Care,” Business Briefing: US Oncology Review, Abstract only, 2 pages. |
ViewRay's MRIDIAN LINAC enables radiosurgery with MRI vision for cancer therapy (2017). YouTube video located at https://www.youtube.com/watch?v=zm3g-BISYDQ, PDF of video screenshot provided. |
Wang, D. et al. (2006). “Initial experience of FDG-PET/CT guided IMRT of head-and-neck carcinoma,” Int. J. Radiation Oncology Biol. Phys. 65:143-151. |
Wikipedia (2016). “Scotch yoke,” Retrieved from https://en.wikipedia.org/wiki/Scotch_yoke, 3 pages. |
Willoughby, T. et al. (2012). “Quality assurance for nonradiographic radiotherapy localization and positioning systems: Report of task group 147,” Med. Phys. 39:1728-1747. |
Written Opinion of the International Searching Authority dated May 4, 2009, for PCT Application No. PCT/US2009/01500, filed on Mar. 9, 2009, 5 pages. |
Written Opinion of the International Searching Authority dated Jul. 20, 2012, for PCT Patent Application No. PCT/US2012/031704, filed on Mar. 30, 2012, 10 pages. |
Written Opinion of the International Searching Authority dated Mar. 7, 2018, for PCT Application No. PCT/US2017/061848, filed on Nov. 15, 2017, 5 pages. |
Written Opinion of the International Searching Authority dated Oct. 2, 2018, for PCT Application No. PCT/US2018/041700, filed on Jul. 11, 2018, 19 pages. |
Written Opinion of the International Searching Authority dated Oct. 24, 2018, for PCT Application No. PCT/US2018/046132, filed on Aug. 9, 2018, 7 pages. |
Written Opinion of the International Searching Authority dated Mar. 13, 2018, for PCT Application No. PCT/US2017/061855, filed on Nov. 15, 2017, 6 pages. |
Written Opinion of the International Searching Authority dated Jun. 20, 2018, for PCT Application No. PCT/US2018/025252, filed on Mar. 29, 2018, 12 pages. |
Written Opinion of the International Searching Authority dated Jan. 30, 2019, for PCT Application No. PCT/US2018/061099, filed on Nov. 14, 2018, 11 pages. |
Yamaya, T. et al. (2008). “A proposal of an open PET geometry,” Physics in Med. and Biology 53:757-773. |
Final Office Action dated Sep. 19, 2023, for U.S. Appl. No. 17/837,900, filed Jun. 10, 2022, 16 pages. |
Non-Final Office Action dated Aug. 3, 2023, for U.S. Appl. No. 18/053,874, filed Nov. 9, 2022, 8 pages. |
Notice of Allowance dated Jun. 30, 2022, for U.S. Appl. No. 16/582,286, filed Sep. 25, 2019, 10 pages. |
Notice of Allowance dated Jul. 21, 2022, for U.S. Appl. No. 16/582,286, filed Sep. 25, 2019, 7 pages. |
Corrected Notice of Allowability mailed on Jan. 30, 2024, for U.S. Appl. No. 16/887,852, filed May 29, 2020, 2 pages. |
Extended European Search Report mailed on Jan. 24, 2024, for EP Application No. 23 160 060.2, filed Mar. 9, 2009, 12 pages. |
Non-Final Office Action mailed on Dec. 13, 2023, for U.S. Appl. No. 18/056,188, filed Nov. 16, 2022, 7 pages. |
Non-Final Office Action mailed on Jan. 16, 2024, for U.S. Appl. No. 18/178,431, filed Mar. 3, 2023, 16 pages. |
Notice of Allowance mailed on Dec. 22, 2023, for U.S. Appl. No. 18/053,874, filed Nov. 9, 2022, 8 pages. |
Notice of Allowance mailed on Dec. 28, 2023, for U.S. Appl. No. 16/887,852, filed May 29, 2020, 9 pages. |
Notice of Allowance mailed on Feb. 7, 2024, for U.S. Appl. No. 18/311,134, filed May 2, 2023, 11 pages. |
Number | Date | Country | |
---|---|---|---|
20220395707 A1 | Dec 2022 | US |
Number | Date | Country | |
---|---|---|---|
62585772 | Nov 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16191131 | Nov 2018 | US |
Child | 17852067 | US |