Treating a patient for a tumor can be a complex endeavor that may span an extended period of time. For example, treatments may include surgical procedures, radiation, or chemotherapy and may occur over the course of several months or years. Moreover, a physician may perform multiple image scans, such as CT or MRI, over time to visualize changes in the tumor as different treatments are administered. To decide on future treatments based on the size and location of the tumor and its change in size over time, the physician may need to monitor and compare the results of past treatments by reviewing historical scans and images. In addition, it may be desirable for multiple physicians to collaborate. For example, a group of physicians may meet regularly or occasionally to review a patient's case and decide on next steps and future treatment. However, compiling, organizing, and reviewing this information in a collaborative manner may be tedious, inefficient, time consuming, and prone to error.
Provided are a plurality of example embodiments, including, but not limited to, a method for tracking a tumor, comprising:
Also provided are additional example embodiments, some, but not all of which, are described hereinbelow in more detail.
In the accompanying drawings, structures are illustrated that, together with the detailed description provided below, describe exemplary embodiments of the claimed invention. Like elements are identified with the same reference numerals. It should be understood that elements shown as a single component may be replaced with multiple components, and elements shown as multiple components may be replaced with a single component. The drawings are not to scale and the proportion of certain elements may be exaggerated for the purpose of illustration.
The following acronyms and definitions will aid in understanding the detailed description:
VR—Virtual Reality—A three-dimensional, computer-generated environment which can be explored and interacted with by a person in varying degrees.
HMD—Head Mounted Display refers to a headset which can be used in VR environments. It may be wired or wireless. It may also include one or more add-ons such as headphones, a microphone, an HD camera, an infrared camera, hand trackers, positional trackers, etc.
SNAP Model—A SNAP case refers to a 3D texture or 3D objects created using one or more scans of a patient (CT, MR, fMR, DTI, etc.) in DICOM file format. It also includes different segmentation presets for filtering specific intensity ranges and coloring others in the 3D texture. It may also include 3D objects placed in the scene, including 3D shapes to mark specific points or anatomy of interest, 3D labels, 3D measurement markers, 3D arrows for guidance, and 3D surgical tools. Surgical tools and devices have been modeled for education and patient-specific rehearsal, particularly for appropriately sizing aneurysm clips.
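The segmentation presets described above can be illustrated with a minimal sketch: each preset maps an intensity range in the scan volume to a display color, and voxels outside every range are filtered out. The `Preset` class, the toy volume, and the color names below are illustrative assumptions, not details from the specification.

```python
# Hypothetical sketch of applying SNAP-style segmentation presets to a volume;
# Preset and apply_presets are illustrative names, not from the specification.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Preset:
    """Maps an intensity range [lo, hi] to a display color."""
    lo: int
    hi: int
    color: str

def apply_presets(volume, presets):
    """Color each voxel with the first matching preset, or None (filtered out)."""
    colored = []
    for slice_ in volume:
        colored.append([
            [next((p.color for p in presets if p.lo <= v <= p.hi), None) for v in row]
            for row in slice_
        ])
    return colored

# Two 2x2 slices of intensity values (e.g., Hounsfield-like units).
volume = [
    [[100, 400], [900, 50]],
    [[350, 120], [60, 1200]],
]
presets = [Preset(300, 1000, "bone-white"), Preset(80, 299, "tissue-red")]

colored = apply_presets(volume, presets)
print(colored[0][0])  # colors for the first row of the first slice
```

In this sketch, intensities 300–1000 render as "bone-white", 80–299 as "tissue-red", and everything else is filtered out, mirroring the "filtering specific ranges and coloring others" behavior.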
MD6DM—Multi Dimension full spherical virtual reality, 6 Degrees of Freedom Model. The MD6DM provides a graphical simulation environment which enables the physician to experience, plan, perform, and navigate an intervention in a full spherical virtual reality environment.
Fly-Through—Also referred to as a tour, this describes a perspective view of a virtual reality environment while moving through the virtual reality environment along a defined path.
A surgery rehearsal and preparation tool previously described in U.S. Pat. No. 8,311,791, incorporated in this application by reference, has been developed to convert static CT and MRI medical images into dynamic and interactive multi-dimensional full spherical virtual reality, six (6) degrees of freedom models (“MD6DM”) based on a prebuilt SNAP model that can be used by physicians to simulate medical procedures in real time. The MD6DM provides a graphical simulation environment which enables the physician to experience, plan, perform, and navigate the intervention in a full spherical virtual reality environment. In particular, the MD6DM gives the surgeon the capability to navigate using a unique multidimensional model, built from traditional two-dimensional patient medical scans, that gives spherical virtual reality 6 degrees of freedom (i.e., linear: x, y, z; and angular: yaw, pitch, roll) in the entire volumetric spherical virtual reality model.
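The six degrees of freedom enumerated above can be sketched as a pose combining a linear position (x, y, z) with angular orientation (yaw, pitch, roll). The `Pose` class below is a hypothetical illustration of this parameterization, not the patented navigation code; it derives a unit view direction from yaw and pitch under the common z-up convention.

```python
# Minimal sketch of a 6-degrees-of-freedom pose: linear x, y, z plus angular
# yaw, pitch, roll. The Pose class and axis conventions are assumptions.
import math

class Pose:
    def __init__(self, x=0.0, y=0.0, z=0.0, yaw=0.0, pitch=0.0, roll=0.0):
        self.x, self.y, self.z = x, y, z           # linear degrees of freedom
        self.yaw, self.pitch, self.roll = yaw, pitch, roll  # angular (radians)

    def forward(self):
        """Unit view direction from yaw (about the z axis) and pitch (elevation)."""
        cy, sy = math.cos(self.yaw), math.sin(self.yaw)
        cp, sp = math.cos(self.pitch), math.sin(self.pitch)
        return (cy * cp, sy * cp, sp)

# A viewer standing at (1, 2, 3) and turned 90 degrees, now looking along +y.
pose = Pose(x=1.0, y=2.0, z=3.0, yaw=math.pi / 2)
fx, fy, fz = pose.forward()
print(round(fx, 6), round(fy, 6), round(fz, 6))
```

Because position and orientation vary independently, the viewer can move to any point in the volume and look in any direction, which is the navigation freedom the MD6DM description refers to.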
The MD6DM is rendered in real time by an image generator using a SNAP model built from the patient's own data set of medical images, including CT, MRI, DTI, etc., and is patient specific. A representative brain model, such as Atlas data, can be integrated to create a partially patient specific model if the surgeon so desires. The model gives a 360° spherical view from any point on the MD6DM. Using the MD6DM, the viewer is positioned virtually inside the anatomy and can look and observe both anatomical and pathological structures as if standing inside the patient's body. The viewer can look up, down, over the shoulders, etc., and will see native structures in relation to each other, exactly as they are found in the patient. Spatial relationships between internal structures are preserved and can be appreciated using the MD6DM.
The algorithm of the MD6DM rendered by the image generator takes the medical image information and builds it into a spherical model, a complete continuous real time model that can be viewed from any angle while “flying” inside the anatomical structure. In particular, after the CT, MRI, etc. takes a real organism and deconstructs it into hundreds of thin slices built from thousands of points, the MD6DM reverts it to a 3D model by representing a 360° view of each of those points from both the inside and outside.
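One way to understand the reversion from discrete slices to a continuous model is that the slice stack can be sampled at any fractional coordinate by interpolating between neighboring voxels. The `trilinear` helper below is an illustrative sketch of that idea under assumed `vol[z][y][x]` indexing; it is not the patented reconstruction algorithm.

```python
# Hedged sketch: treat a stack of slices as a continuous volume by trilinear
# interpolation, so a viewpoint anywhere inside the anatomy can be sampled.
# trilinear() and the vol[z][y][x] layout are assumptions for illustration.

def trilinear(vol, x, y, z):
    """Sample volume vol[z][y][x] at fractional coordinates (x, y, z)."""
    x0, y0, z0 = int(x), int(y), int(z)
    x1 = min(x0 + 1, len(vol[0][0]) - 1)
    y1 = min(y0 + 1, len(vol[0]) - 1)
    z1 = min(z0 + 1, len(vol) - 1)
    fx, fy, fz = x - x0, y - y0, z - z0

    def lerp(a, b, t):
        return a * (1 - t) + b * t

    # Interpolate along x on each of the four edges, then along y, then z.
    c00 = lerp(vol[z0][y0][x0], vol[z0][y0][x1], fx)
    c10 = lerp(vol[z0][y1][x0], vol[z0][y1][x1], fx)
    c01 = lerp(vol[z1][y0][x0], vol[z1][y0][x1], fx)
    c11 = lerp(vol[z1][y1][x0], vol[z1][y1][x1], fx)
    return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz)

vol = [[[0, 10], [20, 30]], [[40, 50], [60, 70]]]  # a 2x2x2 slice stack
print(trilinear(vol, 0.5, 0.5, 0.5))  # center of the cube: 35.0
```

Sampling the exact center returns the mean of the eight surrounding voxels, showing how in-between viewpoints are filled in continuously rather than snapping to the nearest slice.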
Described herein is a tumor tracking system, leveraging an image generator and a MD6DM model, for tracking and monitoring a status of a tumor over time. In particular, the tumor tracking system enables a physician, or a group of physicians in collaboration, to visualize and interact with an integrated representation of data relating to a patient's tumor and associated treatment history, and also to plan future treatments of the tumor. An integrated and interactive tumor board generated by the tumor tracking system provides a single interface that compiles and organizes information from multiple sources to enable efficient tumor tracking and treatment planning.
In one example, the tumor tracking computer 102 receives patient tumor data 104 from a local data source 106 located proximate to or integrated with the tumor tracking computer 102. In another example, the tumor tracking computer 102 receives patient tumor data 104 from a network data source 108 via a network 120 such as an enterprise network or the Internet. In one example, the network data source 108 may be a third-party data source such as an EMR provider.
The tumor tracking computer 102 integrates the patient tumor data 104 into a tumor board 108 for physician 110 interaction. The tumor tracking computer 102 further includes an image generator (not shown) for rendering a MD6DM using a received SNAP model. In one example, the tumor tracking computer 102 also synchronizes the patient tumor data 104 such that as the physician 110 interacts with a first portion of the patient tumor data 104 via the tumor board 108, the remaining portion of the patient tumor data 104 is updated and presented via the tumor board 108 accordingly. In one example, the tumor board 108 provides for a read-only experience in which the physician 110 may only visualize the patient tumor data 104 without making any changes to the data. In another example, the tumor board 108 provides an experience in which the physician 110 may make changes to the patient tumor data 104, such as updating a patient health record, adding a note, recommending a treatment plan, etc.
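The synchronization behavior described above, where an interaction with one portion of the tumor board updates the remaining portions, can be sketched with a simple observer pattern. `TumorBoard` and `Panel` are hypothetical names for illustration; the specification does not prescribe this design.

```python
# Illustrative observer-pattern sketch of tumor board synchronization: an
# interaction with one panel is propagated to every other registered panel.
# TumorBoard and Panel are assumed names, not from the specification.

class Panel:
    def __init__(self, name):
        self.name = name
        self.current = None  # the selection this portion is currently showing

    def update(self, selection):
        self.current = selection  # re-render this portion for the new selection

class TumorBoard:
    def __init__(self):
        self.panels = []

    def register(self, panel):
        self.panels.append(panel)

    def select(self, source, selection):
        """An interaction with one panel updates all the other panels."""
        for panel in self.panels:
            if panel is not source:
                panel.update(selection)

board = TumorBoard()
timeline, imaging, notes = Panel("timeline"), Panel("imaging"), Panel("notes")
for p in (timeline, imaging, notes):
    board.register(p)

# The physician picks a scan date on the timeline; the other portions follow.
board.select(timeline, "2021-03-15 MRI")
print(imaging.current, notes.current)
```

A read-only mode, as described for one example, could be implemented by simply not exposing any mutating operations on the underlying patient tumor data while still propagating view selections.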
The tumor tracking computer 102 presents the tumor board 108 to the physician 110 on a display 112 for viewing and interaction via suitable user input mechanisms. In another example, the tumor tracking computer 102 may present the tumor board 108 to the physician 110 via a HMD 114 for a more immersive experience. In one example, the tumor tracking computer 102 may present the tumor board 108 to a remote physician 116 via a remote display 118 over the network 120 to enable collaboration. In such an example, the tumor tracking computer 102 may present the identical tumor board 108, including any interactions with it by any of the physicians, to all of the physicians simultaneously, so that they share the same view for efficient collaboration.
In order to facilitate the review and analysis of the tumor 204 over time, the tumor board 200 includes a timeline for navigating historical data relating to the tumor 204. In one example, the timeline may be divided into multiple components. For example, as illustrated, the tumor board 200 includes a treatment timeline 210 and an imaging and surgery timeline 212. It should be appreciated that the number of timelines may be expanded or combined as suitable to appropriately provide a physician with an effective interactive experience for visualizing the data.
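The multiple timeline components described above can be combined into one chronological view when a unified presentation is desired. The sketch below assumes each timeline is a list of `(date, event)` tuples; the event names and dates are illustrative.

```python
# Minimal sketch of merging a treatment timeline with an imaging/surgery
# timeline into one chronological view; tuples and event names are assumptions.
from datetime import date

treatments = [(date(2021, 1, 10), "chemotherapy cycle 1"),
              (date(2021, 4, 2), "radiation course")]
imaging_surgery = [(date(2021, 2, 20), "MRI scan"),
                   (date(2021, 3, 15), "resection surgery")]

def merged_timeline(*timelines):
    """Combine any number of (date, event) timelines, sorted chronologically."""
    events = [e for t in timelines for e in t]
    return sorted(events, key=lambda e: e[0])

merged = merged_timeline(treatments, imaging_surgery)
for when, what in merged:
    print(when.isoformat(), what)
```

Keeping the timelines as separate lists and merging on demand matches the note that the number of timelines may be expanded or combined as suitable for the physician's interactive experience.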
As further illustrated in
Similarly, as illustrated in
Selecting a medical image from the detailed medical imaging view 408 causes the MD6DM viewer 502, as illustrated in
In one example, the tumor tracking computer 102 has artificial intelligence capabilities and is able to make recommendations, via the tumor viewer 508 in the tumor board, for how to treat a patient's tumor. For example, the tumor tracking computer 102 may learn from historical tumor data and develop its own algorithms regarding which combinations of radiation, chemotherapies, and surgeries are most effective for treating certain types of tumors. The tumor tracking computer 102 may assess the current tumor being tracked and evaluated and make a treatment recommendation based on its own AI algorithms.
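As one hedged illustration of how such a recommendation could work, a simple nearest-neighbor lookup over historical tumor records can vote on the treatment that worked for the most similar past cases. The feature set (volume and growth rate), the records, and the `recommend` function below are assumptions for the sketch; the specification does not specify the AI algorithm.

```python
# Hedged sketch of an AI recommendation via k-nearest-neighbor voting over
# historical tumor records; features, records, and names are illustrative.

# Historical records: (tumor_volume_cm3, growth_rate_pct_per_month, treatment_that_worked)
history = [
    (2.0, 1.0, "radiation"),
    (8.5, 4.0, "surgery then chemotherapy"),
    (3.1, 0.5, "radiation"),
    (9.2, 5.5, "surgery then chemotherapy"),
]

def recommend(volume, growth_rate, k=3):
    """Recommend the most common treatment among the k most similar past tumors."""
    ranked = sorted(history,
                    key=lambda r: (r[0] - volume) ** 2 + (r[1] - growth_rate) ** 2)
    votes = [r[2] for r in ranked[:k]]
    return max(set(votes), key=votes.count)

print(recommend(8.9, 5.0))  # a large, fast-growing tumor
```

A production system would of course learn from far richer features (imaging, pathology, prior response), but the structure, similarity over historical cases followed by an aggregated recommendation, is one plausible reading of the described capability.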
The tumor viewer 508 includes a tumor timeline feature 506, which upon selection, generates a tumor timeline window 600 as illustrated in
In one example, as illustrated in
In one example, as illustrated in
Processor 1002 processes instructions, via memory 1004, for execution within computer 1000. In an example embodiment, multiple processors along with multiple memories may be used.
Memory 1004 may be volatile memory or non-volatile memory. Memory 1004 may be a computer-readable medium, such as a magnetic disk or optical disk. Storage device 1006 may be a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, a tape device, a flash memory, phase change memory, or another similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. A computer program product can be tangibly embodied in a computer readable medium such as memory 1004 or storage device 1006.
Computer 1000 can be coupled to one or more input and output devices such as a display 1014, a printer 1016, a scanner 1018, a mouse 1020, and a HMD 1022.
As will be appreciated by one of skill in the art, the example embodiments may be actualized as, or may generally utilize, a method, system, computer program product, or a combination of the foregoing. Accordingly, any of the embodiments may take the form of specialized software comprising executable instructions stored in a storage device for execution on computer hardware, where the software can be stored on a computer-usable storage medium having computer-usable program code embodied in the medium.
Databases may be implemented using commercially available computer applications, such as open source solutions such as MySQL, or closed solutions like Microsoft SQL that may operate on the disclosed servers or on additional computer servers. Databases may utilize relational or object oriented paradigms for storing data, models, and model parameters that are used for the example embodiments disclosed above. Such databases may be customized using known database programming techniques for specialized applicability as disclosed herein.
Any suitable computer usable (computer readable) medium may be utilized for storing the software comprising the executable instructions. The computer usable or computer readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer readable medium would include the following: an electrical connection having one or more wires; a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a compact disc read-only memory (CDROM), or other tangible optical or magnetic storage device; or transmission media such as those supporting the Internet or an intranet.
In the context of this document, a computer usable or computer readable medium may be any medium that can contain, store, communicate, propagate, or transport the program instructions for use by, or in connection with, the instruction execution system, platform, apparatus, or device, which can include any suitable computer (or computer system) including one or more programmable or dedicated processor/controller(s). The computer usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to the Internet, wireline, optical fiber cable, local communication busses, radio frequency (RF) or other means.
Computer program code having executable instructions for carrying out operations of the example embodiments may be written by conventional means using any computer language, including but not limited to, an interpreted or event driven language such as BASIC, Lisp, VBA, or VBScript, or a GUI embodiment such as Visual Basic, a compiled programming language such as FORTRAN, COBOL, or Pascal, an object oriented, scripted or unscripted programming language such as Java, JavaScript, Perl, Smalltalk, C++, Object Pascal, or the like, artificial intelligence languages such as Prolog, a real-time embedded language such as Ada, or even more direct or simplified programming using ladder logic, an Assembler language, or directly programming using an appropriate machine language.
To the extent that the term “includes” or “including” is used in the specification or the claims, it is intended to be inclusive in a manner similar to the term “comprising” as that term is interpreted when employed as a transitional word in a claim. Furthermore, to the extent that the term “or” is employed (e.g., A or B) it is intended to mean “A or B or both.” When the applicants intend to indicate “only A or B but not both” then the term “only A or B but not both” will be employed. Thus, use of the term “or” herein is the inclusive, and not the exclusive use. See, Bryan A. Garner, A Dictionary of Modern Legal Usage 624 (2d. Ed. 1995). Also, to the extent that the terms “in” or “into” are used in the specification or the claims, it is intended to additionally mean “on” or “onto.” Furthermore, to the extent the term “connect” is used in the specification or claims, it is intended to mean not only “directly connected to,” but also “indirectly connected to” such as connected through another component or components.
While the present application has been illustrated by the description of embodiments thereof, and while the embodiments have been described in considerable detail, it is not the intention of the applicants to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the application, in its broader aspects, is not limited to the specific details, the representative apparatus and method, and illustrative examples shown and described. Accordingly, departures may be made from such details without departing from the spirit or scope of the applicant's general inventive concept.
This application claims the benefit of U.S. provisional patent application Ser. No. 63/105,089, filed on Oct. 23, 2020, incorporated herein by reference.
Number | Date | Country
---|---|---
63105089 | Oct 2020 | US