System and method for performing computerized simulations for image-guided procedures using a patient specific model

Information

  • Patent Grant
  • Patent Number
    8,543,338
  • Date Filed
    Tuesday, March 17, 2009
  • Date Issued
    Tuesday, September 24, 2013
Abstract
Embodiments of the invention are directed to a method of performing computerized simulations of image-guided procedures. The method may include producing a digital image-based model of an anatomical structure based on medical image data, producing, based on the image-based model and extrapolated data, an extended model that represents the anatomical structure and adjacent anatomical regions not included in the medical image data, displaying a graphical representation of the extended model and performing a computerized simulation of an image-guided procedure using the extended model.
Description
BACKGROUND OF THE INVENTION

Clinical practice is replacing, wherever possible, traditional open surgical procedures with less invasive techniques that require, however, indirect, image-based feedback. In image-guided procedures, such as vascular catheterization, angioplasty and stent placement, the physician needs to identify anatomical structures in images. These procedures are difficult to master without extensive training, yet training on humans may be fatal. Accordingly, simulation systems for image-guided procedures may train a physician without unnecessary risk and may also serve as a pre-operative planning tool or a post-operative assessment tool. Most simulation systems are based on predefined models of anatomical structures and are not patient specific. Accordingly, such systems cannot be used for accurately planning an operation on a particular patient prior to performing the operation, or for post-operative assessment of the operation. A more progressive simulation system is a patient-specific simulation system that uses patient-specific medical image data.





BRIEF DESCRIPTION OF THE DRAWINGS

The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanied drawings in which:



FIG. 1 shows an exemplary system for simulating an image-guided procedure according to embodiments of the invention;



FIG. 2 shows an illustration of an exemplary anatomical structure helpful in understanding embodiments of the present invention;



FIG. 3 is a flowchart diagram illustrating a method for simulating an image-guided procedure according to some embodiments of the present invention;



FIGS. 4A and 4B show graphical illustrations of an exemplary 3D digital model produced according to embodiments of the invention;



FIG. 5 is a flowchart diagram illustrating an exemplary method for generating an extended model for client-specific simulation for image-guided procedures according to some embodiments of the present invention;



FIG. 6 is a graphical illustration of an exemplary 3D generic model for extended models according to embodiments of the invention; and



FIGS. 7A and 7B show graphical illustrations of an exemplary 3D digital model produced according to embodiments of the invention.





It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.


DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION

In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, modules, units and/or circuits have not been described in detail so as not to obscure the invention.


Embodiments of the invention are directed to patient-specific computerized simulations for image-guided procedures. The method may include producing a digital model of an anatomical structure based on medical image data received from a scan of a subject. The subject may be, for example, a patient that is about to undergo an image-guided procedure. Usually the medical image data received, for example, from a CT system or any other suitable imaging system does not include or cover the entire area of interest for simulation but rather a more limited area or region. For example, a scan performed prior to a stenting procedure may typically cover only areas in close vicinity to the area about to be treated. As will be understood by a person skilled in the art, a scan of a patient is usually performed as a diagnostic tool to diagnose, for example, pathology in a specific anatomical region, and therefore such a scan usually covers only the area that is diseased, damaged or modified and its immediate vicinity. An operation or procedure for treating the pathology may require passage through other anatomical regions not covered by the scan. A physician may wish to practice the entire procedure prior to performing the operation and/or to perform post-procedure assessment. Accordingly, in order to enable a comprehensive simulation of the procedure, an anatomical model of additional regions may be desired.


According to embodiments of the invention, the method may include producing an extended model based on the digital model and extrapolated data representing adjacent sections of the anatomical structure not included in the medical image data. A graphical representation of the extended model may be displayed on a monitor, and the extended model may be viewed with additional information, such as models of tools. A physician may perform a computerized simulation of the image-guided procedure using the extended model as a rehearsal prior to the actual surgery.


The medical image data may be patient-specific medical images obtained from an imaging system such as computed tomography (CT), CT-fluoroscopy, fluoroscopy, magnetic resonance imaging (MRI), ultrasound, positron emission tomography (PET) and X-ray. Embodiments of the invention may use the medical image data as input to produce a 3D or 4D model of the anatomical structure, organ or system, and further an extended model including regions that are not present in the medical images obtained from the imaging system.


The 3D model generated based on the image data may be, for example, a polygonal mesh representing the three-dimensional (3D) surfaces of the anatomical structure, a voxel mask of the structure volume or a patch surface, such as a 2D B-spline or the like. For ease of explanation, embodiments of the invention are described with respect to polygonal meshes. It should, however, be understood by a person skilled in the art that the invention is not limited to such models and that other models are within the scope of the invention.


Reference is made to FIG. 1 showing an exemplary system 100 for simulating an image-guided procedure according to embodiments of the invention. System 100 may include an input unit 105, an output unit 120, a model generation unit 110, a simulation unit 115 and a management unit 135. System 100 may further include a memory 130 and a controller 131. Input unit 105 may receive medical image data and may deliver the medical image data to model generation unit 110 for producing a patient-specific model for simulation. The model may include extended portions representing areas of the anatomical structures that were not represented in the medical image data used as input data. When a user performs a simulation, for example as a pre-procedure for an image-guided procedure, using simulation unit 115, a graphical representation of the model and the simulation process may be displayed on a monitor (not shown) of output unit 120.


Input unit 105 may interface with or receive medical image data from an imaging system (not shown) such as X-ray system, CT system, MRI system and/or an ultrasound imaging system. Input unit 105 may include a mouse, a keyboard, a touch screen or pad or any suitable input devices. Alternatively or additionally, input unit 105 may include a wired or wireless network interface card (NIC) that may receive data, for example, from the imaging system. According to some embodiments, input unit 105 may communicate with a system or server storing medical images such as a picture archiving communication system (PACS) and may obtain any relevant imaging information, data or parameters from such a system, server or application.


Model generation unit 110 may include components or modules for generating a digital model and its graphical representation, e.g., a 3D or 4D anatomical model of an anatomical structure, such as an organ, vessel system or any other area of interest within the body. The model may be generated by model generation unit 110 according to information received from an imaging system, for example, a medical image received from a CT system via input unit 105. Simulation unit 115 may include components for generating a simulation of an image-guided procedure. According to embodiments of the invention, system 100 may further include a force feedback device (not shown) to simulate the perception of force when manipulating physical tools within the body.


Output unit 120 may include components for interfacing with a display screen to enable visual output or optionally a speaker or another audio device to enable audible output. Output unit 120 may include one or more displays, speakers and/or any other suitable output devices. Output unit 120 may communicate with any other component or unit of system 100 and accordingly may enable such units to communicate with external systems. Units 105, 110, 115 and 120 may be or may comprise software, hardware, firmware or any suitable combination thereof.


Controller 131 may be any suitable controller or processing unit, e.g., a central processing unit processor (CPU). Memory 130 may be any suitable memory component, device, chip or system and may store applications or other executable codes that may be executed by controller 131. For example, applications or modules implementing model generation and/or simulation may be loaded into memory 130 and executed by controller 131.


It will be recognized that system 100 as described herein is an exemplary system. According to embodiments of the invention, system 100 may be implemented on a single computational device or alternatively, in a distributed configuration, on two or more different computational devices. For example, model generation unit 110 may operate on a first computational device and managed by a first management unit whereas simulation unit 115 may operate on another computational device and managed by a second management unit that communicates with the first management unit. In another exemplary embodiment, management unit 135 may operate on a computational device, model generation unit 110 may operate on a second computational device and simulation unit 115 may operate on a third computational device.


Reference is made to FIG. 2 showing an exemplary illustration of an anatomical structure of a cardiovascular system helpful in understanding embodiments of the invention. When performing an image-guided procedure simulation, such as a carotid stenting simulation, the user may practice navigation in the vasculature while experimenting with various catheters. Typically, the scan of the patient for diagnostic purposes is performed only in the direct vicinity of the lesion. Accordingly, a model for simulation that uses such scans as input data may not encompass all the areas through which the physician should navigate to reach the lesion.


The illustration of FIG. 2 shows a top region 220 representing an area that was scanned by an exemplary imaging system and a bottom region 230 that was not scanned. Accordingly, the medical image data would include only the image of region 220. According to embodiments of the invention, in order to perform a full and/or comprehensive simulation of the procedure, an extended model for simulation may be produced representing both the scanned region 220 and the un-scanned region 230. According to embodiments of the invention, the portion of the model representing the scanned region may be generated from the medical image data of a specific subject, and the model may be extrapolated to include a representation of the un-scanned region. The extrapolation may include selecting, from a set of pre-designed generic models representing anatomical structures, the one that best matches the desired extended portion.


For example, in an angioplasty procedure, such as carotid stenting, in order to reach the left vertebral artery at point 235 where a lesion may exist, a catheter may be inserted into the aorta at the lower part of the patient's body and guided through point 240 on the way to the left vertebral artery. Accordingly, an extended model that includes the aorta may be desired for the simulated procedure.


Reference is made to FIG. 3, which is an exemplary flowchart describing a method for simulating an image-guided procedure according to some embodiments of the present invention. Reference is additionally made to FIGS. 4A and 4B showing graphical illustrations of an exemplary 3D digital model produced according to embodiments of the invention. As shown by box 310, the method may include receiving medical image data of a subject. The medical image data may be received from an imaging or scanning system such as, for example, CT or MRI scanners. As shown by box 315, the method may include processing the received image data and generating, based on the processed data, a patient-specific model of the anatomical structure depicted in the medical image. A model that includes only the anatomical structure represented in the medical image data is also termed herein a basic model or an image-based model. FIG. 4A shows an exemplary image-based model 410 processed from CT data of a portion of the vasculature.


According to exemplary embodiments, the processing may include segmenting the medical image data. Segmentation involves partitioning the image domain into non-overlapping regions that correspond to distinct anatomical structures and identifying the desired anatomical structures in the medical images. Other segmentation techniques, such as soft, probabilistic or Bayesian segmentation, may enable overlapping regions. It will be recognized that embodiments of the invention are not limited in that respect, and any applicable segmentation or soft segmentation method may be used.


The segmentation process may be implemented using fully-automated algorithms, semi-automated algorithms with minimal input from the user, tailor-made algorithms where the user can explicitly specify the desired segmentation, or manual segmentation, using for example CAD tools. The output of the segmentation process includes identifying a portion of the image data as a set of voxels (a mask) that represents the desired anatomical structure volume. The processing of the data may further include generating the discretized surfaces of this volume, also referred to as a boundary representation (B-rep). These surfaces are commonly represented by polygonal meshes, although embodiments of the invention are not limited in this respect and other representations, such as spline surface patches, a constructive solid geometry (CSG) representation or any mix thereof are likewise possible.
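The voxel-mask output of segmentation can be illustrated with a minimal sketch. The threshold values, array shapes and function name below are illustrative assumptions, not taken from the patent; a real pipeline would use far more sophisticated segmentation algorithms:

```python
import numpy as np

def segment_by_threshold(volume, lo, hi):
    """Return a binary voxel mask of intensities within [lo, hi].

    A minimal stand-in for the segmentation step: real pipelines would
    add region growing, morphology, or model-based refinement.
    """
    return (volume >= lo) & (volume <= hi)

# Synthetic 3-D "scan": a bright block inside a dark background.
vol = np.zeros((8, 8, 8))
vol[2:6, 2:6, :] = 300.0           # simulated contrast-enhanced vessel
mask = segment_by_threshold(vol, 200.0, 400.0)
print(mask.sum())                   # number of voxels in the structure -> 128
```

Surface extraction (e.g., marching cubes) would then turn such a mask into the polygonal mesh the text describes.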


The processing may further include computation of center-lines of approximately tubular sections of the polygonal mesh, e.g., blood vessels, intestines or colon. In the exemplary image-based or basic model of FIG. 4A, the center-lines represent the center-lines of different vessels. The vessel center-lines may be defined by cubic splines, e.g., Catmull-Rom splines or any other suitable mathematical function. The processing may further include, additionally or alternatively, computation of the radii of the various vessels.
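The Catmull-Rom representation of a center-line can be sketched as follows. The sample points are invented for illustration; the key property is that the curve passes through the interior control points, which suits center-lines sampled at discrete locations:

```python
import numpy as np

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a Catmull-Rom spline segment at parameter t in [0, 1].

    The curve interpolates p1 at t=0 and p2 at t=1; p0 and p3 only
    shape the end tangents.
    """
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    return 0.5 * ((2.0 * p1)
                  + (-p0 + p2) * t
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t ** 2
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t ** 3)

# Four samples along a vessel center-line (x, y, z).
pts = [(0, 0, 0), (1, 1, 0), (2, 1, 1), (3, 0, 1)]
print(catmull_rom(*pts, 0.0))   # interpolates p1 -> [1. 1. 0.]
print(catmull_rom(*pts, 1.0))   # interpolates p2 -> [2. 1. 1.]
```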


According to some embodiments, generating the image-based model of the anatomical structure depicted in the medical image may include the process of registration. Registration may be defined as the alignment of multiple data sets into a single coordinate system such that the spatial locations of corresponding points would coincide. The segmented portions of the anatomical structure may be properly located and rotated with respect to other models representing the anatomy of the patient. Such other models may be generic or patient-specific models as described herein. For example, registering a simulated blood vessel may include properly locating it with respect to digital models of the bone structure, internal organs or lungs of the patient. Registration may enable presenting a real-life image of the subject being treated. Registration may further enable rotating or otherwise relocating or repositioning in space a simulated model while preserving the relative position of organs, sections, regions or parts in the simulated model.
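In the rigid case, registration as defined above reduces to applying a rotation and translation to every vertex so corresponding points coincide. A minimal sketch, assuming the transform is already known (estimating it from landmark pairs, e.g. with the Kabsch algorithm, is a separate problem not covered here):

```python
import numpy as np

def register_points(points, rotation, translation):
    """Apply a rigid transform (rotation, then translation) to N x 3 points.

    The rotation matrix and translation vector are assumed given;
    they would normally come from a registration algorithm.
    """
    return points @ np.asarray(rotation).T + np.asarray(translation)

# Rotate vessel vertices 90 degrees about the z axis, then shift along x.
rot = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
verts = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
print(register_points(verts, rot, [10.0, 0.0, 0.0]))
```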


According to some embodiments, the procedures, tasks and/or functions associated with the segmentation, generation of polygonal meshes, and computation of the centerlines and/or radii may be performed in the segmentation stage or at simulation start-up by either model generation unit 110 or simulation unit 115. It should be understood by a person skilled in the art that although the operations described herein are typically performed, other implementations may exist. For example, simulation unit 115 may generate a model using the set of voxels as input without performing surface rendering and/or computation of centerlines or radii and/or registration. It will be recognized that embodiments of the invention are not limited by the exemplary method or system described for generating an image-based basic model of an anatomical structure, and other methods may be used for generating such a model without departing from the scope of the invention.


Returning to FIG. 3, according to embodiments of the invention, the method may include generating or calculating an extended model of the anatomical structure by extrapolating boundaries of the basic model depicting the medical image data (box 320). Then, the method may include performing a patient-specific simulation of an image-guided procedure according to the extended model (box 325). According to embodiments of the invention, parts, sections or regions missing from medical image data of a subject may be simulated or modeled by extrapolating a model calculated from the medical image data. FIG. 4B shows an extended model produced by identifying boundaries in basic model 410 of FIG. 4A, extrapolating the basic model 410 and adding extrapolated sections 420, 421 as explained in detail with respect to FIGS. 5, 6, 7A and 7B.


Reference is made to FIG. 5, which is an exemplary flowchart describing a method of generating an extended model for patient-specific simulation of an image-guided procedure according to embodiments of the invention. Reference is additionally made to FIGS. 6, 7A and 7B showing graphical illustrations of exemplary polygonal meshes produced according to embodiments of the invention. Embodiments of the invention may be applicable to various anatomical structures, systems, sections, regions, organs or parts thereof. Accordingly, the method described herein with reference to FIG. 5 may be applicable to generating an extended model of any such anatomical structure. However, for the sake of simplicity and clarity, the discussion below refers, as an exemplary illustration, to the vasculature. It will be noted that embodiments of the invention are not limited in this respect. For ease of explanation, embodiments of the invention are described with respect to polygonal meshes. It should, however, be understood by a person skilled in the art that the invention is not limited to such models and that other models are within the scope of the invention.


According to the exemplary embodiment of FIG. 5, the input to the extrapolation phase is an image-based model represented as a polygonal mesh of the 3D surfaces, properly registered and rotated, and further including the vessel centerlines represented by splines. As shown by box 510, the method may include identifying one or more boundaries of the polygonal mesh, such as mesh 410 of FIG. 4A or mesh 710 of FIG. 7A. For example, the boundaries of the polygonal mesh may be detected by detecting edges associated with only one polygon. The detected edges may be grouped into one or more connectivity components, where each component is associated with an open boundary of a single vessel that may potentially require extrapolation.
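The boundary-detection rule described above (an edge belonging to exactly one polygon lies on an open boundary) can be sketched for a triangle mesh. The function name and toy mesh are illustrative:

```python
from collections import Counter

def boundary_edges(triangles):
    """Return edges that belong to exactly one triangle (open boundaries).

    Each triangle is a tuple of three vertex indices; an edge shared by
    two triangles is interior, while an edge seen only once lies on an
    open end of the mesh.
    """
    count = Counter()
    for a, b, c in triangles:
        for e in ((a, b), (b, c), (c, a)):
            count[tuple(sorted(e))] += 1
    return [e for e, n in count.items() if n == 1]

# Two triangles sharing edge (1, 2): the four remaining edges are open.
tris = [(0, 1, 2), (1, 3, 2)]
print(sorted(boundary_edges(tris)))   # -> [(0, 1), (0, 2), (1, 3), (2, 3)]
```

Grouping the returned edges into connected chains would then yield the per-vessel connectivity components the text mentions.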


According to an exemplary embodiment of the invention, the method may include selecting for each boundary that requires extrapolation a generic pre-designed model represented for example as a polygonal mesh, such as mesh 610 of FIG. 6, to be used as a model for the extended section (box 515). The generic model may be selected from a set of pre-designed models to best match the open boundary edge according to certain parameters and rules. The generic model may be represented by a polygonal mesh to be used as an approximation or estimation of a region or section not included in an image-based model generated according to the medical image data.


The selected generic model, e.g., polygonal mesh, may be retrieved from a database, library or any other repository of generic meshes. Such a repository may store a large number of polygonal meshes or other models, each corresponding, for example, to a particular vessel at a particular anatomical location and having known physical characteristics such as length and radius. According to embodiments of the invention, a repository of pre-generated models or polygonal meshes may be accessed in order to obtain a polygonal mesh closest to or best matching an open boundary of the polygonal mesh representing the basic or image-based model. For example, polygonal mesh 610 may be chosen as a basis or starting point for generating an extended section 720 for section 710 at boundary 715 as illustrated in FIGS. 7A and 7B. The selection of the best match may be performed manually or automatically based on a predefined set of geometric attributes of the segmented model. Alternatively, the selection may be made by using both manual and automated operations. According to embodiments of the invention, the system may send a user a recommendation to select a specific generic model and the user may then manually confirm or deny the recommendation.
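One plausible reading of the automated best-match selection is a scoring function over geometric attributes of each repository entry. The library entries, attribute names and weights below are illustrative assumptions, not the patent's data:

```python
import math

def select_generic_mesh(boundary_radius, boundary_pos, library):
    """Pick the library entry whose radius and anatomical location best
    match an open boundary; lower score is better."""
    def score(entry):
        dr = abs(entry["radius"] - boundary_radius)
        dp = math.dist(entry["position"], boundary_pos)
        return dr + 0.1 * dp       # weight radius mismatch more heavily
    return min(library, key=score)

# A toy two-entry repository of generic vessel meshes.
library = [
    {"name": "aortic_arch",    "radius": 12.0, "position": (0.0, 0.0, 0.0)},
    {"name": "carotid_branch", "radius": 3.0,  "position": (0.0, 5.0, 20.0)},
]
best = select_generic_mesh(3.2, (1.0, 4.0, 19.0), library)
print(best["name"])                 # -> carotid_branch
```

A production system would score against many more attributes (branching pattern, length, curvature) and, as the text notes, could present the top match as a recommendation for the user to confirm or deny.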


Then, as shown by box 525, the exemplary embodiment may include positioning the selected mesh in alignment with the boundary. An exemplary alignment method may utilize centerlines associated with a library of predefined extrapolation meshes and may include computing a tangent vector to the centerline at the edge of the segmented region near the boundary. Such a tangent vector may be computed with relation to a point at the detected edge or boundary. Next, a point on a centerline of the predefined or generated polygonal mesh may be located. Such a point may have a tangent vector that is close or identical to the tangent vector computed at the boundary of the simulated organ. Using points with similar tangent vectors may enable positioning the predefined polygonal mesh in the correct orientation with respect to the simulated organ or the polygonal mesh representing it.
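The tangent-matching idea can be sketched by taking finite-difference tangents along a sampled center-line and choosing the point whose unit tangent has the largest dot product with the boundary tangent. Function names and the sample polyline are illustrative:

```python
import numpy as np

def unit_tangents(centerline):
    """Finite-difference unit tangents along an N x 3 center-line polyline."""
    d = np.gradient(centerline, axis=0)
    return d / np.linalg.norm(d, axis=1, keepdims=True)

def best_alignment_index(boundary_tangent, generic_centerline):
    """Index on the generic center-line whose tangent best matches the
    tangent at the segmented model's open boundary (maximum dot product)."""
    t = unit_tangents(generic_centerline)
    return int(np.argmax(t @ boundary_tangent))

# Boundary tangent points along +z; the generic line bends from x toward z.
generic = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 1], [2, 0, 3], [2, 0, 5]], float)
idx = best_alignment_index(np.array([0.0, 0.0, 1.0]), generic)
print(idx)                          # -> 3 (first point heading straight up)
```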


Then, as shown by box 530, the exemplary embodiment may include connecting the centerline of the polygonal mesh representing the image-based model with the centerline of the selected predefined polygonal mesh as illustrated by FIG. 7A. The connection of the centerlines may be done in accordance with the computed tangent values as described herein. The centerlines may be joined, bonded or welded such that the points with close, similar or identical tangent value coincide to ensure that the basic model and the added section may be similarly positioned or orientated in a 3D space.


Next, as shown by box 535, the exemplary embodiment may include removing redundant sections. As the registration of the boundary of the basic model, represented by the basic polygonal mesh, is known, a virtual box, sphere or other structured volume may be defined such that it contains the polygonal mesh. Accordingly, parts of the added section extending beyond or outside such a virtual volume may be removed as redundant. Alternatively, only vertices of the selected mesh which lie inside such a virtual box or volume may be kept for further processing as described herein.
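The vertex-keeping variant of the redundancy-removal step can be sketched as an axis-aligned box test (names and coordinates are illustrative; a full implementation would also rebuild the triangles that reference removed vertices):

```python
import numpy as np

def clip_to_box(vertices, box_min, box_max):
    """Keep only vertices inside an axis-aligned virtual box."""
    v = np.asarray(vertices)
    inside = np.all((v >= box_min) & (v <= box_max), axis=1)
    return v[inside]

verts = np.array([[0.5, 0.5, 0.5],    # inside the unit box
                  [2.0, 0.0, 0.0],    # outside (x too large)
                  [-1.0, 0.2, 0.3]])  # outside (x negative)
kept = clip_to_box(verts, [0.0, 0.0, 0.0], [1.0, 1.0, 1.0])
print(len(kept))                       # -> 1
```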


As shown by box 540, the exemplary embodiment may include scaling the radius of the selected generic mesh to better match the radius of the basic polygonal mesh representing the basic model. The shape of the selected predefined polygonal mesh may not fit or perfectly match that of the boundary or open end to which it is connected. Moreover, although aligned or orientated as described herein, the centerlines of the polygonal mesh representing the basic model and the selected polygonal mesh representing the extended section may still be spatially apart. Additionally, for example, in the case of a simulated blood vessel, a cross-section of the predefined, extrapolated section being matched to an open end may not be a perfect circle, nor is its centerline guaranteed to be an exact mathematical centerline of a cylinder. Accordingly, a different scaling factor, possibly one for every angle around the centerline, may be applied.
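The per-angle scaling can be sketched on a single cross-section ring. It is assumed here, for illustration only, that ring vertices and target radii are ordered by angle around the center-line, one target value per vertex:

```python
import numpy as np

def scale_ring(ring, center, target_radii):
    """Scale each vertex of a cross-section ring radially about `center`
    so its distance from the center matches the target radius for its
    angular slot; this allows a different scale factor per angle."""
    ring = np.asarray(ring, float)
    center = np.asarray(center, float)
    offsets = ring - center
    radii = np.linalg.norm(offsets, axis=1, keepdims=True)
    return center + offsets * (np.asarray(target_radii).reshape(-1, 1) / radii)

# An elliptical ring reshaped to match a circular boundary of radius 2.
ring = [(3.0, 0.0, 0.0), (0.0, 1.0, 0.0), (-3.0, 0.0, 0.0), (0.0, -1.0, 0.0)]
scaled = scale_ring(ring, (0.0, 0.0, 0.0), [2.0, 2.0, 2.0, 2.0])
print(np.linalg.norm(scaled, axis=1))   # -> [2. 2. 2. 2.]
```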


Then, as shown by box 545, the exemplary embodiment may include welding the surface of the selected polygonal mesh with the surface of the basic polygonal mesh. For example, the basic polygonal mesh and the selected predefined polygonal mesh may be joined such that a continuous surface is produced, as illustrated by FIG. 7B. Vertices from the boundary of the basic polygonal mesh and from the corresponding selected mesh of the extended section may be joined and represented by a single vertex, while redundant vertices may be removed from the model. The exemplary method described above may be repeated for each open end, boundary or edge of the basic model that needs to be extended.
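The welding step, collapsing coincident vertex pairs into a single vertex, can be sketched as follows (a minimal illustration; re-indexing of the extended section's triangles via the returned map is left implicit):

```python
import numpy as np

def weld_vertices(verts_a, verts_b, tol=1e-6):
    """Merge two vertex lists, collapsing pairs closer than `tol` into a
    single vertex. Returns the combined vertex array plus, for each
    vertex of `verts_b`, its index in the combined array (used to
    re-index the triangles of the added section)."""
    verts_a = np.asarray(verts_a, float)
    combined = list(verts_a)
    remap = []
    for v in np.asarray(verts_b, float):
        d = np.linalg.norm(verts_a - v, axis=1)
        j = int(np.argmin(d))
        if d[j] < tol:
            remap.append(j)             # coincident: reuse existing vertex
        else:
            remap.append(len(combined))
            combined.append(v)
    return np.array(combined), remap

a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]  # first vertex coincides with a[1]
merged, remap = weld_vertices(a, b)
print(len(merged), remap)               # -> 3 [1, 2]
```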


It should be understood by a person skilled in the art that the predefined generic models representing the extended anatomy may be represented in many formats, different from or similar to that of the basic model. According to some embodiments, the representation of the extended regions may differ from that of the basic model. Accordingly, the model representing an extrapolated region may be converted to another format or representation to comply with the basic model. The extrapolated region in the converted representation may then be welded or otherwise connected to the basic model.


Alternatively, according to other embodiments of the invention, a polygonal mesh or another suitable model representing the extrapolated section may be generated procedurally. The procedural generation of the extrapolated sections or regions may include identifying key attributes of the basic model such as the boundaries to be extended, relevant radii, centerlines and the like. For example, centerlines of a generated basic model of a blood vessel may be extrapolated according to a first set of rules. A non-limiting example of the first set of rules may include rules directed to smoothly curving the extrapolation towards a pre-determined direction. Then, the radius function of the simulated blood vessel model may be extrapolated, for example, by using a second set of rules. A non-limiting example of the second set of rules may include varying the radius smoothly from the identified value of the basic model toward a predefined value (e.g. 1 cm) for a predefined length and then smoothly changing the radius to zero at a predefined distance from the tip (e.g. 2 mm from the tip). Then, using the extrapolated centerline and radius function, a unified polygonal mesh representing both the basic model and its extrapolation may be generated.
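The second rule set, blending the radius from the boundary value toward a predefined value and then tapering to zero near the tip, can be sketched as a piecewise smooth radius function. The specific lengths and the smoothstep blend are illustrative defaults, not prescribed by the text:

```python
def smoothstep(t):
    """Cubic ease between 0 and 1 with zero slope at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def extrapolated_radius(s, r0, r_target=1.0, blend_len=3.0,
                        total_len=10.0, tip_len=0.2):
    """Radius (cm) at arc-length s along a procedurally extended vessel.

    Blends smoothly from the radius r0 identified at the boundary
    toward a predefined target value, then tapers smoothly to zero
    over the last `tip_len` of the extended section.
    """
    if s >= total_len:
        return 0.0
    r = r0 + (r_target - r0) * smoothstep(s / blend_len)
    if s > total_len - tip_len:
        r *= 1.0 - smoothstep((s - (total_len - tip_len)) / tip_len)
    return r

print(extrapolated_radius(0.0, 0.4))    # -> 0.4  (boundary radius)
print(extrapolated_radius(5.0, 0.4))    # -> 1.0  (predefined value reached)
print(extrapolated_radius(10.0, 0.4))   # -> 0.0  (closed tip)
```

Sweeping cross-section rings of this radius along the extrapolated center-line would then produce the unified polygonal mesh described above.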


Embodiments of the invention may include an article such as a computer or processor readable medium, or a computer or processor storage medium, such as for example a memory, a disk drive, or a USB flash memory, encoding, including or storing instructions, e.g., computer-executable instructions, which when executed by a processor or controller, carry out methods disclosed herein.


Although embodiments of the invention are not limited in this regard, the terms “plurality” and “a plurality” as used herein may include, for example, “multiple” or “two or more”. The terms “plurality” or “a plurality” may be used throughout the specification to describe two or more components, devices, elements, units, parameters, or the like.


Unless explicitly stated, the method embodiments described herein are not constrained to a particular order or sequence. Additionally, some of the described method embodiments or elements thereof can occur or be performed at the same point in time or overlapping points in time. As known in the art, an execution of an executable code segment such as a function, task, sub-task or program may be referred to as execution of the function, program or other component.


Although embodiments of the invention are not limited in this regard, discussions utilizing terms such as, for example, “processing,” “computing,” “calculating,” “determining,” “establishing”, “analyzing”, “checking”, or the like, may refer to operation(s) and/or process(es) of a computer, a computing platform, a computing system, or other electronic computing device, that manipulate and/or transform data represented as physical (e.g., electronic) quantities within the computer's registers and/or memories into other data similarly represented as physical quantities within the computer's registers and/or memories or other information storage medium that may store instructions to perform operations and/or processes.


While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents may occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.

Claims
  • 1. A method of performing computerized simulations of image-guided procedures, the method comprising: producing a digital image-based model of an anatomical structure in a patient based on medical image data obtained from the patient, the digital image-based model including a first 3D polygonal mesh representing the medical image data;producing, based on the image-based model and data representing anatomical structures not included in the medical image data, an extended model that represents the anatomical structure in the patient and adjacent anatomical regions not included in the medical image data, wherein producing the extended model comprises: automatically selecting from a set of pre-designed generic 3D polygonal meshes a second 3D polygonal mesh representing anatomical structures not included in the medical image data, andpositioning the second 3D polygonal mesh in alignment with a boundary of the first 3D polygonal mesh;displaying a graphical representation of the extended model; andperforming a computerized simulation of an image-guided procedure using the extended model.
  • 2. The method of claim 1, wherein the medical image data is received from a scan of a patient and the image-based model is a patient-specific model related to the patient.
  • 3. The method of claim 1, wherein producing the image-based model comprises producing the first 3D polygonal mesh.
  • 4. The method of claim 3, wherein producing the extended model comprises: identifying a boundary of the first 3D polygonal mesh; automatically selecting the second 3D polygonal mesh such that it represents an extended section that best matches the boundary, based on spatial location and geometric parameters; and combining the boundary of the first 3D polygonal mesh with the second 3D polygonal mesh.
  • 5. The method of claim 4, wherein combining the boundary comprises: computing a centerline of a tubular section associated with the boundary; and connecting the centerline of the tubular section of the first 3D polygonal mesh with a centerline of the second 3D polygonal mesh.
  • 6. The method of claim 5, further comprising: removing digital data associated with a redundant section; and scaling the second 3D polygonal mesh such that a radius of the second 3D polygonal mesh matches a radius of the first 3D polygonal mesh.
  • 7. The method of claim 1, wherein producing the extended model comprises: identifying a boundary of the image-based model; identifying attributes at the boundary; generating a procedurally-generated model based on one or more of the attributes as an extension of the image-based model, wherein the extended model includes both the image-based model and the procedurally-generated model.
  • 8. The method of claim 7, wherein identifying the attributes at the boundary includes determining a value for a radius at the boundary and defining a centerline of a section associated with the boundary.
  • 9. The method of claim 1, wherein producing the extended model comprises: identifying a boundary of the image-based model, the image-based model being represented in a first format; identifying attributes at the boundary; and selecting a predefined model to be the extended model, the predefined model being represented in a second format different than the first format.
  • 10. The method of claim 9, further comprising: converting the second format of the model into the first format.
  • 11. The method of claim 1, wherein producing the image-based model comprises: segmenting the medical image data; identifying a portion of the image data that represents a desired volume of the anatomical structure; and generating a boundary-representation of the desired volume.
  • 12. The method of claim 1, wherein the medical image data is received from a CT or MRI scanner.
  • 13. The method of claim 1, wherein the anatomical structure is a blood vessel.
  • 14. A system for simulating an image-guided procedure, the system comprising: a model generation unit to produce a digital image-based model of an anatomical structure in a patient based on received medical image data obtained from the patient and to produce, based on the image-based model and data representing anatomical structures not included in the medical image data, an extended model that represents the anatomical structure in the patient and adjacent anatomical regions not included in the medical image data, wherein the digital image-based model includes a first 3D polygonal mesh representing the medical image data and wherein producing the extended model comprises: automatically selecting from a set of pre-designed generic 3D polygonal meshes a second 3D polygonal mesh representing anatomical structures not included in the medical image data, and positioning the second 3D polygonal mesh in alignment with a boundary of the first 3D polygonal mesh; and a simulation unit to display a graphical representation of the extended model and to perform a computerized simulation of an image-guided procedure using the extended model.
  • 15. The system of claim 14, wherein the medical image data is received from a scan of a patient and the image-based model is a patient-specific model related to the patient.
  • 16. A non-transitory computer readable storage medium having instructions stored thereon that when executed by a computing device result in: producing a digital image-based model of an anatomical structure in a patient based on medical image data obtained from the patient, the digital image-based model including a first 3D polygonal mesh representing the medical image data; producing, based on the image-based model and data representing anatomical structures not included in the medical image data, an extended model that represents the anatomical structure and adjacent anatomical regions not included in the medical image data, wherein producing the extended model comprises: automatically selecting from a set of pre-designed generic 3D polygonal meshes a second 3D polygonal mesh representing anatomical structures not included in the medical image data, and positioning the second 3D polygonal mesh in alignment with a boundary of the first 3D polygonal mesh; displaying a graphical representation of the extended model; and performing a computerized simulation of an image-guided procedure using the extended model.
  • 17. The computer readable storage medium of claim 16, wherein the instructions that result in producing the extended model further comprise instructions that when executed by the computing device result in: identifying a boundary of the first 3D polygonal mesh; automatically selecting the second 3D polygonal mesh such that it represents an extended section that best matches the boundary, based on spatial location and geometric parameters; and combining the boundary of the first 3D polygonal mesh with the second 3D polygonal mesh.
  • 18. The computer readable storage medium of claim 17, wherein the instructions that result in combining the boundary further comprise instructions that when executed by the computing device result in: computing a centerline of a tubular section associated with the boundary; and connecting the centerline of the tubular section of the first 3D polygonal mesh with a centerline of the second 3D polygonal mesh.
  • 19. The computer readable storage medium of claim 18, further comprising instructions that when executed by the computing device result in: removing digital data associated with a redundant section; and scaling the second 3D polygonal mesh such that a radius of the second 3D polygonal mesh matches a radius of the first 3D polygonal mesh.
  • 20. The computer readable storage medium of claim 16, wherein the instructions that result in producing the image-based model of the anatomical structure further comprise instructions that when executed by the computing device result in: segmenting the medical image data; identifying a portion of the image data that represents a desired volume of the anatomical structure; and generating a boundary-representation of the desired volume.
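
The mesh-extension steps recited in claims 4 through 6 — identify a boundary of the patient-specific mesh, automatically select the best-matching generic mesh by spatial location and geometric parameters, then scale and align the selection so its radius and inlet match the boundary — can be illustrated with a minimal sketch. The data representation below (vertex tuples, a library of dictionaries with `inlet_center` and `inlet_radius`) is hypothetical; the patent does not prescribe any particular format:

```python
import math

def boundary_attributes(ring):
    """Compute the center point and mean radius of an open boundary ring
    of the patient-specific mesh (a list of (x, y, z) vertices)."""
    n = len(ring)
    center = tuple(sum(v[i] for v in ring) / n for i in range(3))
    radius = sum(math.dist(v, center) for v in ring) / n
    return center, radius

def select_generic_mesh(boundary_center, boundary_radius, library):
    """Pick the pre-designed generic mesh whose inlet best matches the
    boundary, scoring by spatial distance plus radius mismatch."""
    def score(entry):
        return (math.dist(entry["inlet_center"], boundary_center)
                + abs(entry["inlet_radius"] - boundary_radius))
    return min(library, key=score)

def align_and_scale(entry, boundary_center, boundary_radius):
    """Scale the generic mesh so its inlet radius matches the boundary
    radius, then translate its inlet center onto the boundary center."""
    s = boundary_radius / entry["inlet_radius"]
    ic = entry["inlet_center"]
    return [tuple(boundary_center[i] + s * (v[i] - ic[i]) for i in range(3))
            for v in entry["vertices"]]
```

In a full implementation the alignment would also rotate the generic mesh so the centerlines of the two tubular sections connect smoothly, and would remove the redundant overlapping section, as claims 5 and 6 describe.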
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a Continuation-in-part application of U.S. application Ser. No. 12/224,314, filed Aug. 22, 2008, which is a National Phase Application of PCT International Application No. PCT/IL2008/000056, International filing date Jan. 13, 2008, claiming priority of U.S. Provisional Patent Application No. 60/880,415, filed Jan. 16, 2007.

US Referenced Citations (390)
Number Name Date Kind
1959490 Mistelski May 1934 A
3024539 Rider Mar 1962 A
3263824 Jones et al. Aug 1966 A
3406601 Clifford Oct 1968 A
3490059 Paulsen et al. Jan 1970 A
3517446 Corlyon et al. Jun 1970 A
3520071 Abrahamson et al. Jul 1970 A
3573444 Kawabata et al. Apr 1971 A
3579842 Scher May 1971 A
3704529 Cioppa Dec 1972 A
3722108 Chase Mar 1973 A
3739276 Dornberger Jun 1973 A
3775865 Rowan Dec 1973 A
3789518 Chase Feb 1974 A
3795061 Sarnoff et al. Mar 1974 A
3795150 Eckhardt Mar 1974 A
3814145 Gott et al. Jun 1974 A
3861065 Courtenay et al. Jan 1975 A
3875488 Crocker et al. Apr 1975 A
3919691 Noll Nov 1975 A
3945593 Schanzer Mar 1976 A
3991490 Markman Nov 1976 A
4024473 Edge et al. May 1977 A
4024873 Antoshkiw et al. May 1977 A
4033331 Guss et al. Jul 1977 A
4078317 Wheatley et al. Mar 1978 A
4089494 Anderson et al. May 1978 A
4115755 Cotton Sep 1978 A
4136554 Larson Jan 1979 A
4148014 Burson Apr 1979 A
4162582 McGraw et al. Jul 1979 A
4177984 Douglas et al. Dec 1979 A
4182054 Wise et al. Jan 1980 A
4183249 Anderson Jan 1980 A
4227319 Guy et al. Oct 1980 A
4236685 Kissel Dec 1980 A
4250636 Horwitz Feb 1981 A
4250887 Dardik et al. Feb 1981 A
4262549 Schwellenbach Apr 1981 A
4264312 Cianci Apr 1981 A
4276702 Horwitz Jul 1981 A
4307539 Klein Dec 1981 A
4333070 Barnes Jun 1982 A
4334216 Lacroix Jun 1982 A
4360345 Hon Nov 1982 A
4398889 Lam et al. Aug 1983 A
4427388 Hope Jan 1984 A
4436188 Jones Mar 1984 A
4459113 Boscaro Gatti et al. Jul 1984 A
4464117 Foerst Aug 1984 A
4478407 Manabe Oct 1984 A
4481001 Graham et al. Nov 1984 A
4504233 Galus et al. Mar 1985 A
4513235 Acklam et al. Apr 1985 A
4545390 Leary Oct 1985 A
4550617 Fraignier et al. Nov 1985 A
4551101 Neumann Nov 1985 A
4573452 Greenberg Mar 1986 A
4599070 Hladky et al. Jul 1986 A
4604016 Joyce Aug 1986 A
4605373 Rosen Aug 1986 A
4632341 Repperger et al. Dec 1986 A
4642055 Saliterman Feb 1987 A
4646742 Packard et al. Mar 1987 A
4654648 Herrington et al. Mar 1987 A
4655673 Hawkes Apr 1987 A
4659313 Kuster et al. Apr 1987 A
4667182 Murphy May 1987 A
4688983 Lindbom Aug 1987 A
4706006 Solomon Nov 1987 A
4708650 Holewinski et al. Nov 1987 A
4708656 de Vries et al. Nov 1987 A
4712101 Culver Dec 1987 A
4713007 Alban Dec 1987 A
4726772 Amplatz Feb 1988 A
4733214 Andresen Mar 1988 A
4742815 Ninan et al. May 1988 A
4748984 Patel Jun 1988 A
4751662 Crosbie Jun 1988 A
4757302 Hatakeyama et al. Jul 1988 A
4769763 Trieb et al. Sep 1988 A
4775289 Kazerooni Oct 1988 A
4782327 Kley et al. Nov 1988 A
4786892 Kubo et al. Nov 1988 A
4789340 Zikria Dec 1988 A
4794384 Jackson Dec 1988 A
4795296 Jau Jan 1989 A
4797104 Laerdal et al. Jan 1989 A
4803413 Kendig et al. Feb 1989 A
4820162 Ross Apr 1989 A
4823634 Culver Apr 1989 A
4825875 Ninan et al. May 1989 A
4839838 LaBiche et al. Jun 1989 A
4857881 Hayes Aug 1989 A
4860215 Seraji Aug 1989 A
4865423 Doi Sep 1989 A
4867685 Brush et al. Sep 1989 A
4868549 Affinito et al. Sep 1989 A
4870964 Bailey, Jr. et al. Oct 1989 A
4874998 Hollis, Jr Oct 1989 A
H703 Repperger et al. Nov 1989 H
4879556 Duimel Nov 1989 A
4881324 Khinchuk Nov 1989 A
4885565 Embach Dec 1989 A
4887966 Gellerman Dec 1989 A
4891764 McIntosh Jan 1990 A
4896554 Culver Jan 1990 A
4907796 Roel-Rodriguez Mar 1990 A
4907970 Meenen Mar 1990 A
4907973 Hon Mar 1990 A
4909232 Carella Mar 1990 A
4912638 Pratt Mar 1990 A
4930770 Baker Jun 1990 A
4934694 McIntosh Jun 1990 A
4940234 Ishida et al. Jul 1990 A
4949119 Moncrief et al. Aug 1990 A
4955654 Tsuchihashi et al. Sep 1990 A
4961138 Gorniak Oct 1990 A
4961267 Herzog Oct 1990 A
4964097 Wang et al. Oct 1990 A
4975546 Craig Dec 1990 A
4982618 Culver Jan 1991 A
4982918 Kaye Jan 1991 A
4998916 Hammerslag et al. Mar 1991 A
5004391 Burdea Apr 1991 A
5007300 Siva Apr 1991 A
5009598 Bennington Apr 1991 A
5018922 Yoshinada et al. May 1991 A
5019761 Kraft May 1991 A
5021982 Crosbie et al. Jun 1991 A
5022384 Freels et al. Jun 1991 A
5033352 Kellogg et al. Jul 1991 A
5044956 Behensky et al. Sep 1991 A
5048508 Storz Sep 1991 A
5057078 Foote et al. Oct 1991 A
5062594 Repperger Nov 1991 A
5072361 Davis et al. Dec 1991 A
5077769 Franciose Dec 1991 A
5078152 Bond et al. Jan 1992 A
5086296 Clark Feb 1992 A
5103404 McIntosh Apr 1992 A
5104328 Lounsbury Apr 1992 A
5112228 Zouras May 1992 A
5116051 Moncrief et al. May 1992 A
5116180 Fung et al. May 1992 A
5125843 Holloway Jun 1992 A
5126948 Mitchell et al. Jun 1992 A
5135488 Foote et al. Aug 1992 A
5139261 Openiano Aug 1992 A
5142931 Menahem Sep 1992 A
5143505 Burdea et al. Sep 1992 A
5146566 Hollis, Jr. et al. Sep 1992 A
5149270 McKeown Sep 1992 A
5153716 Smith Oct 1992 A
5158459 Edelberg Oct 1992 A
5167159 Lucking Dec 1992 A
5171299 Heitzmann et al. Dec 1992 A
5177473 Drysdale Jan 1993 A
5180351 Ehrenfried Jan 1993 A
5181181 Glynn Jan 1993 A
5184306 Erdman et al. Feb 1993 A
5184319 Kramer Feb 1993 A
5185561 Good et al. Feb 1993 A
5186629 Rohen Feb 1993 A
5189355 Larkins et al. Feb 1993 A
5191320 MacKay Mar 1993 A
5193963 McAffee et al. Mar 1993 A
5196017 Silva et al. Mar 1993 A
5197003 Moncrief et al. Mar 1993 A
5203563 Loper, III Apr 1993 A
5204600 Kahkoska Apr 1993 A
5209131 Baxter May 1993 A
5209661 Hildreth et al. May 1993 A
5212473 Louis May 1993 A
5215523 Williams et al. Jun 1993 A
5220260 Schuler Jun 1993 A
5222893 Hardesty Jun 1993 A
5223776 Radke et al. Jun 1993 A
5228356 Chuang Jul 1993 A
5240417 Smithson et al. Aug 1993 A
5243266 Kasagami et al. Sep 1993 A
5246007 Frisbie et al. Sep 1993 A
5247432 Ueda Sep 1993 A
5252068 Gryder Oct 1993 A
5252070 Jarrett Oct 1993 A
5257462 Buttermann Nov 1993 A
5259626 Ho Nov 1993 A
5259894 Sampson Nov 1993 A
5264768 Gregory et al. Nov 1993 A
5265034 Breckenridge et al. Nov 1993 A
5269519 Malone Dec 1993 A
5275174 Cook Jan 1994 A
5275565 Moncrief Jan 1994 A
5279309 Taylor Jan 1994 A
5279563 Brucker et al. Jan 1994 A
5280265 Kramer et al. Jan 1994 A
5283970 Aigner Feb 1994 A
5286203 Fuller et al. Feb 1994 A
5295694 Levin Mar 1994 A
5296846 Ledley Mar 1994 A
5296871 Paley Mar 1994 A
5305203 Raab Apr 1994 A
5309140 Everett, Jr. et al. May 1994 A
5311422 Loftin et al. May 1994 A
5313230 Venolia et al. May 1994 A
5313568 Wallace et al. May 1994 A
5314339 Aponte May 1994 A
5317689 Nack et al. May 1994 A
5318533 Adams et al. Jun 1994 A
5324260 O'Neill et al. Jun 1994 A
5327790 Levin et al. Jul 1994 A
5334027 Wherlock Aug 1994 A
5335557 Yasutake Aug 1994 A
5344354 Wiley Sep 1994 A
5353242 Crosbie et al. Oct 1994 A
5354162 Burdea et al. Oct 1994 A
5355148 Anderson Oct 1994 A
5364271 Aknin et al. Nov 1994 A
5366376 Copperman et al. Nov 1994 A
5368484 Copperman et al. Nov 1994 A
5368487 Medina Nov 1994 A
5368565 DeLong Nov 1994 A
5370535 Prendergast Dec 1994 A
5371778 Yanof et al. Dec 1994 A
5379663 Hara Jan 1995 A
5382885 Salcudean et al. Jan 1995 A
5384460 Tseng Jan 1995 A
5385549 Lampropoulos et al. Jan 1995 A
5389865 Jacobus et al. Feb 1995 A
5396267 Bouton Mar 1995 A
5397308 Ellis et al. Mar 1995 A
5397323 Taylor et al. Mar 1995 A
5399091 Mitsumoto Mar 1995 A
5402801 Taylor Apr 1995 A
5403191 Tuason Apr 1995 A
5412189 Cragun May 1995 A
5412880 Raab May 1995 A
5414337 Schuler May 1995 A
5423754 Cornelius et al. Jun 1995 A
5425644 Szinicz Jun 1995 A
5425709 Gambale Jun 1995 A
5428748 Davidson et al. Jun 1995 A
5429140 Burdea et al. Jul 1995 A
5430665 Jin et al. Jul 1995 A
5436640 Reeves Jul 1995 A
5445166 Taylor Aug 1995 A
5451924 Massimino et al. Sep 1995 A
5459382 Jacobus et al. Oct 1995 A
5461711 Wang et al. Oct 1995 A
5467441 Stone et al. Nov 1995 A
5467763 McMahon et al. Nov 1995 A
5470232 Kelso et al. Nov 1995 A
5473235 Lance et al. Dec 1995 A
5482051 Reddy et al. Jan 1996 A
5492530 Fischell et al. Feb 1996 A
5506605 Paley Apr 1996 A
5512919 Araki Apr 1996 A
5515078 Greschler et al. May 1996 A
5524637 Erickson Jun 1996 A
5541831 Thomas Jul 1996 A
5542672 Meredith Aug 1996 A
5542676 Howe, Jr. et al. Aug 1996 A
5547382 Yamasaki et al. Aug 1996 A
5548694 Frisken Aug 1996 A
5553198 Wang et al. Sep 1996 A
5559412 Schuler Sep 1996 A
5565840 Thorner et al. Oct 1996 A
5575761 Hajianpour Nov 1996 A
5577981 Jarvik Nov 1996 A
5580249 Jacobsen et al. Dec 1996 A
5584701 Lampotang et al. Dec 1996 A
5587937 Massie et al. Dec 1996 A
5591924 Hilton Jan 1997 A
5592401 Kramer Jan 1997 A
5599301 Jacobs et al. Feb 1997 A
5600348 Bartholow et al. Feb 1997 A
5607157 Nagashima Mar 1997 A
5607308 Copperman et al. Mar 1997 A
5609485 Bergman et al. Mar 1997 A
5609607 Hechtenberg et al. Mar 1997 A
5616030 Watson Apr 1997 A
5623582 Rosenberg Apr 1997 A
5625551 Mitarai et al. Apr 1997 A
5625576 Massie et al. Apr 1997 A
5629594 Jacobus et al. May 1997 A
5631861 Kramer May 1997 A
5631973 Green May 1997 A
5643087 Marcus et al. Jul 1997 A
5651775 Walker et al. Jul 1997 A
5657429 Wang et al. Aug 1997 A
5661253 Aoki Aug 1997 A
5661667 Rueb et al. Aug 1997 A
5666473 Wallace Sep 1997 A
5669818 Thorner et al. Sep 1997 A
5676157 Kramer Oct 1997 A
5680590 Parti Oct 1997 A
5684722 Thorner et al. Nov 1997 A
5691898 Rosenberg et al. Nov 1997 A
5694013 Stewart et al. Dec 1997 A
5695500 Taylor et al. Dec 1997 A
5701140 Rosenberg et al. Dec 1997 A
5709219 Chen et al. Jan 1998 A
5716016 Iwade et al. Feb 1998 A
5720619 Fisslinger Feb 1998 A
5724264 Rosenberg et al. Mar 1998 A
5731804 Rosenberg Mar 1998 A
5736978 Hasser et al. Apr 1998 A
5739811 Rosenberg et al. Apr 1998 A
5742278 Chen et al. Apr 1998 A
5749853 O'Donnell et al. May 1998 A
5755577 Gillio May 1998 A
5766016 Sinclair et al. Jun 1998 A
5769640 Jacobus et al. Jun 1998 A
5771181 Moore et al. Jun 1998 A
5776050 Chen et al. Jul 1998 A
5776126 Wilk et al. Jul 1998 A
5781172 Engel et al. Jul 1998 A
5797900 Madhani et al. Aug 1998 A
5800179 Bailey Sep 1998 A
5805140 Rosenberg et al. Sep 1998 A
5806521 Morimoto et al. Sep 1998 A
5807377 Madhani et al. Sep 1998 A
5808665 Green Sep 1998 A
5810007 Holupka et al. Sep 1998 A
5821920 Rosenberg et al. Oct 1998 A
5831408 Jacobus et al. Nov 1998 A
5844392 Peurach et al. Dec 1998 A
5882206 Gillio Mar 1999 A
5889670 Schuler et al. Mar 1999 A
5889672 Schuler et al. Mar 1999 A
5930741 Kramer Jul 1999 A
5945978 Holmes Aug 1999 A
5956484 Rosenberg et al. Sep 1999 A
5986643 Harvill et al. Nov 1999 A
5999185 Kato et al. Dec 1999 A
6004134 Marcus et al. Dec 1999 A
6024576 Bevirt et al. Feb 2000 A
6037927 Rosenberg Mar 2000 A
6038488 Barnes et al. Mar 2000 A
6042555 Kramer et al. Mar 2000 A
6047080 Chen et al. Apr 2000 A
6050962 Kramer et al. Apr 2000 A
6059506 Kramer May 2000 A
6062865 Bailey May 2000 A
6062866 Prom May 2000 A
6084587 Tarr et al. Jul 2000 A
6088017 Tremblay et al. Jul 2000 A
6104379 Petrich et al. Aug 2000 A
6110130 Kramer Aug 2000 A
6111577 Zilles et al. Aug 2000 A
6120465 Guthrie et al. Sep 2000 A
6148280 Kramer Nov 2000 A
6160489 Perry et al. Dec 2000 A
6162190 Kramer Dec 2000 A
6195592 Schuler Feb 2001 B1
6219032 Rosenberg et al. Apr 2001 B1
6222523 Harvill et al. Apr 2001 B1
6239784 Holmes May 2001 B1
6275213 Tremblay et al. Aug 2001 B1
6323837 Rosenberg Nov 2001 B1
6413229 Kramer et al. Jul 2002 B1
6428490 Kramer et al. Aug 2002 B1
6497672 Kramer Dec 2002 B2
6538634 Chui et al. Mar 2003 B1
RE38242 Engel et al. Sep 2003 E
6876891 Schuler et al. Apr 2005 B1
6885361 Harvill et al. Apr 2005 B1
7215326 Rosenberg May 2007 B2
7681579 Schwartz Mar 2010 B2
20020072814 Schuler et al. Jun 2002 A1
20020107573 Steinberg Aug 2002 A1
20020168618 Anderson et al. Nov 2002 A1
20030032876 Chen et al. Feb 2003 A1
20030069719 Cunningham et al. Apr 2003 A1
20040009459 Anderson et al. Jan 2004 A1
20040015070 Liang et al. Jan 2004 A1
20040086175 Parker et al. May 2004 A1
20040234933 Dawson et al. Nov 2004 A1
20050196740 Moriyama Sep 2005 A1
20060173338 Ma et al. Aug 2006 A1
20060211940 Antonelli et al. Sep 2006 A1
20060290695 Salomie Dec 2006 A1
20070027733 Bolle et al. Feb 2007 A1
20070043285 Schwartz Feb 2007 A1
20070049817 Preiss et al. Mar 2007 A1
20070148625 Biltz et al. Jun 2007 A1
20070231779 Santhanam et al. Oct 2007 A1
20090018808 Bronstein et al. Jan 2009 A1
20090310847 Matsuzaki et al. Dec 2009 A1
20110092804 Schoenefeld et al. Apr 2011 A1
Foreign Referenced Citations (30)
Number Date Country
0 147 516 Mar 1988 EP
0 265 011 Apr 1988 EP
0 393 683 Oct 1990 EP
0 456 103 Nov 1991 EP
0 489 469 Jun 1992 EP
0 316 763 Aug 1992 EP
0 567 215 Oct 1993 EP
0 571 827 Dec 1993 EP
0 624 861 Nov 1994 EP
0 626 634 Nov 1994 EP
0 623 066 Jul 1997 EP
0 632 709 Mar 2002 EP
2592514 Dec 1985 FR
2 195 808 Apr 1988 GB
2 252 656 Aug 1992 GB
2 288 686 Oct 1995 GB
03-98080 Apr 1991 JP
WO9106935 May 1991 WO
WO 9111775 Aug 1991 WO
WO 9304625 Mar 1993 WO
WO 9308517 Apr 1993 WO
WO 9314483 Jul 1993 WO
WO 9318475 Sep 1993 WO
WO 9425948 Nov 1994 WO
WO 9502233 Jan 1995 WO
WO9510080 Apr 1995 WO
WO 9532459 Nov 1995 WO
WO 9616389 May 1996 WO
WO9628800 Sep 1996 WO
WO 9938141 Jul 1999 WO
Non-Patent Literature Citations (163)
Entry
Levy (2001) ACM SIGGRAPH, Aug. 12-17, Los Angeles, CA, pp. 417-424.
International Search Report for International Publication No. PCT/IL2010/00172, date of mailing Jun. 17, 2010.
Wierzbicki et al. Four-Dimensional Modeling of the Heart for Image Guidance of Minimally Invasive Cardiac Surgeries. Medical Imaging 2004: Visualization, Image-Guided Procedures, and Display. vol. 5367, May 2004 pp. 302-311 XP-002512099.
Gering et al. An Integrated Visualization System for Surgical Planning and Guidance Using Image Fusion and Interventional Imaging. Medical Image Computing and Computer Assisted Intervention—MICCAI '99 Lecture Notes in Computer Science, LNCS, Springer, Berlin, DE. vol. 1679, Jan. 1, 2006 pp. 809-820 XP019036236 MIT AI Laboratory, Cambridge MA, USA Bringham & Women's Hospital, Harvard Medical School, Boston MA, USA.
Torsten Butz et al. Pre- and Intra-operative Planning and Simulation of Percutaneous Tumor Ablation Medical Image Computing and Computer Assisted Intervention—MICCAI 2000 Lecture Notes in Computer Science, LNCS, Springer, Berlin, DE. vol. 1935, Feb. 11, 2004, pp. 317-326, XP019001272.
Nakao M et al. Haptic reproduction and interactive visualization of a beating heart for cardiovascular surgery simulation. International Journal of Medical Informatics, Elsevier Scientific Publishers, Shannon, Ireland. vol. 68, No. 1-3 Dec. 18, 2002.
J. S. Denson and S. Abrahamson A Computer-Controlled Patient Simulator Apr. 21, 1969—vol. 208, No. 3, pp. 504-508 LA, USA.
D. Hon Ixion's Realistic Medical Simulations Virtual Reality World, vol. 2, No. 4 Jul. /Aug. 1994 pp. 58-62.
B.G Jackson L.B Rosenberg Force Feedback and Medical Simulation IOS Press and Ohmsha Jan. 19-22, 1995 pp. 147-151—CA, USA.
U.G. Kuhnapfel Realtime Graphical Computer Simulation for Endoscopic Surgery Symposium: Medicine Meets Virtual Reality II Jan. 27-30,1994 San Diego, CA, USA.
K.T. McGovern et al. The Virtual Clinic™, A Virtual Reality Surgical Simulator Symposium: Medicine Meets Virtual Reality II pp. 151-157 Jan. 27-30, 1994 San-Diego CA, USA.
M.D. Noar N. Soehendra Endoscopy Simulation Training Devices Endoscopy 1992, vol. 24 pp. 159-166 Georg Thieme Verlag Stuttgart. New York.
M.D. Noar Robotics Interactive Simulation of ERCP Sphincterotomy and EGD, Endoscopy 1992, vol. 24, pp. 539-541 Supplement 2 Georg Thieme Verlag Stuttgart. New York.
O.C Zienkiewicz The Finite Element Method McGraw-Hill Book Company (UK) Limited, 1977, pp. 677-757.
Barfield et al. Virtual Environments and Advanced Interface Design 1995 pp. 358-414.
Kuhnapfel et al. Endosurgery Simulations with KISMET Virtual Reality World, pp. 165-171 1995.
Massie et al. The PHANTOM Haptic Interface: A Device for Probing Virtual Objects Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, IL Nov 1994.
Bailie Gastrointestinal Endoscopy: Time for Change Scott Med J. Feb. 1989; 34 (1): 389-90.
Iwata et al. Volume Haptization IEEE 1993, pp. 16-18.
Anon. VR in Medicine VR News: April, 1996 vol. 5, Issue 3.
Office Action for U.S. Appl. No. 12/224,314 dated Apr. 29, 2011.
US Final Office Action for U.S. Appl. No. 12/224,314, mailed on Dec. 1, 2011.
Vidal, “Simulation of Image Guided Needle Puncture: Contribution to Real-time Ultrasound and Fluoroscopic Rendering, and Volume Haptic Rendering”, School of Computer Science, Bangor University, United Kingdom Jan. 1, 2008, pp. 1-230.
Office Action issued by the United States Patent and Trademark Office for U.S. Appl. No. 12/224,314 dated May 31, 2012.
Office Action issued by the United States Patent and Trademark Office for U.S. Appl. No. 13/015,343 dated Jun. 7, 2012.
Notice of Allowance issued by the United States Patent and Trademark Office for U.S. Appl. No. 12/224,314 dated Feb. 19, 2013.
Yoshitaka Adachi et al. Intermediate Interpretation for Stiff Virtual Objects Proceedings of the Virtual Reality Annual International Symposium (VRAIS '95) Technical Research Center, Suzuki Motor Corporation, Yokohama, Japan.
Peter K Allen et al. Acquisition and Interpretation of 3-D Sensor Data from Touch Dept. of Computer Science, Columbia University, NY CH2813-4/89/0000/0033/$01.00 1989 IEEE pp. 33-40.
Fumihito Arai et al. Intelligent Assistance in Operation of Active Catheter for Minimum Invasive Surgery. Nagoya University—Nagoya, Japan Kinjo University—Nagoya, Japan IEEE International Workshop on Robot and Human Communication 0-7803-2002-6/94 $4.00 1994 IEEE.
Daniel Bachofen et al. Enhancing the Visual Realism of Hysteroscopy Simulation Book Series: Studies in Health Technology and Informatics—Book Medicine meets Virtual Reality 14: Accelerating Change in Health Care: Next Medical Toolkit vol. 119/2005 pp. 31-36.
J. Baille et al. Use of Computer Graphics Simulation for Teaching of Flexible Sigmoidoscopy. Duke University Medical Center, North Carolina, USA. Endoscopy 3 vol. 23 May 1991 pp. 126-129.
David Baraff An Introduction to Physically Based Modeling: Rigid Body Simulation II—Nonpenetration Constraints Robotics Institute Carnegie Mellon Institute pp. 32-68 1997.
Adelstein et al. ASME Symposium 1992 Design and Implementation of a Force Reflecting Manipulandum for Manual Control Research. CA, USA.
J.H. Anderson et al. Da Vinci: A Vascular Catheterization and Interventional Radiology-Based Training and Patient Pretreatment Planning Simulator (Abstract) JVIR Supplement, Journal of Vascular and Interventional Radiology, vol. 7, No. 1, Part 2. Jan.-Feb. 1996 Washington, US.
J. Batter and F. Brooks, Jr. Grope-1: A Computer Display to the Sense of Feel 1972 North Carolina, USA.
M. Bostrom et al. Design of an Interactive Lumbar Puncture Simulator With Tactile Feedback IEEE Neutral Network Counsel Virtual Reality Annual International Symposium Conference Sep. 18-22, 1993; Seattle, Washington U.S.
M. Bostrom Design of Hardware for Simulating Lumbar Puncture with Force Feedback Thayer School of Engineering, Dartmouth College. Mar. 17, 1993.
F. P. Brooks, et al. Project Grope—Haptic Displays for Scientific Visualization ACM, Computer Graphics, vol. 24, No. 4. Aug. 1990—Chapel Hill NC, USA.
Burdea et al. A Distributed Virtual Environment with Dextrous Force Feedback Informatique '92, International Conference Interface to Real and Virtual Worlds, Rutgers University EC2 Conference Mar. 23-27, 1992, NJ, USA.
J. Capowski, Remote Manipulators as a Computer Input Device University Microfilms, A XEROX Company, Ann Arbor, Michigan UMI Dissertation Services. 1971—Michigan USA.
D. Gillies and C. Williams, London UK An Interactive Graphic Simulator for the Teaching of Fibrendoscopic Techniques Eurographics '87 Elsevier Science Publishers B.V North Holland pp. 127-138.
Gillies, Haritsis and Williams Computer Simulation for Teaching Endoscopic Procedures Endoscopy, Supplement II, vol. 24, Jul. 1992. pp. 455-550.
A. Haritsis D. Gillies CH. Williams (Eurographics) Realistic Generation and Real Time Animation of Images of the Human Colon Computer Graphics Forum vol. II No. 3, conference issue—Sep. 7-11, 1992. NNC Blackwell.
A. Haritsis D. Gillies CH. Williams Computer Simulation: New Horizons in Endoscopy Teaching Hellenic Journal of Gastroenterology 1992 pp. 54-63 London UK.
G. Higgins et al. Virtual Reality Surgery: Implementation of a Coronary Angioplasty Training Simulator. University Medical Press, San Francisco, 1995. pp. 379-383.
D. Hon Ixion's Laparoscopic Surgical Skills Simulator Symposium: Medicine Meets Virtual Reality II Jan. 27-30, 1994 San Diego, USA.
H. Iwata Artificial Reality with Force-feedback: Development of Desktop Virtual Space with Compact Master Manipulator. ACM SIGGRAPH 1990 Computer Graphics & Interactive Techniques vol. 24, No. 4. pp. 165-170 Aug. 6-10, 1990.
P.J. Kilpatrick Kilpatrick Thesis 1976 pp. 11-27 The Use of a Kinesthetic Supplement in an Interactive Graphics System. The University of North Carolina, USA.
Kotoku et al. A Force Display System for Virtual Environments and its Evaluation International Workshop on Robot and Human Communication IEEE Sep. 1-3, 1992 pp. 246-251—Ibaraki, Japan.
U.G.Kuhnapfel et al. Endo surgery simulations with KISMET: a flexible tool for surgical instrument design, operation room planning and VR technology based abdominal surgery training. Virtual Reality World '95, Conference Stuttgart, Germany Computerwoche Verlag, 1995. pp. 165-171.
B. Marcus Feedback Technology and Virtual Environments Jul. 1-3, 1992—1992 International Conference on Artificial Reality and Telexistence (ICAT 1992) pp. 87-95.
Mark et al. Adding Force Feedback to Graphics Systems: Issues and Solutions Aug. 4-9, 1996 ACM SIGGRAPH 1996 Computer Graphics Proceedings, Annual Conference Chapel Hill. North Carolina, USA.
T.H. Massie Design of a Three Degree of Freedom Force-Reflecting Haptic Interface MIT, USA Thesis—pp. 6-38 May 18, 1993 Submitted May 17, 1993.
D. Meglan Making Surgical Simulation Real ACM SIGGRAPH Computer Graphics pp. 37-39 Nov. 1996 Rockville, MD, USA.
Meglan et al. The Teleos Virtual Environment Toolkit for Simulation-Based Surgical Education Interactive Technology and the New Paradigm for Healthcare Proceeding of MMVR 3, IOS Press and Ohmsha pp. 346-351. Jan. 17-20, 1996 San-Diego CA, USA.
J. R. Merril The Future of Virtual Reality, Medicine, and the Information Superhighway Journal of Knowledge Engineering & Technology, vol. 7, No. 1 Spring 1994 pp. 33-35 MD, USA.
Merril et al. Photorealistic Interactive Three-Dimensional Graphics in Surgical Simulation Interactive Technology and the New Paradigm for Healthcare Proceeding of MMVR 3, IOS Press and Ohmsha pp. 244-252 Jan. 19-22, 1995 San Diego, USA.
Merril et al. Surgical Simulation Using Virtual Reality Technology: Design, Implementation, and Implications. Surgical Technology International III 1994 pp. 53-60. Published by Universal Medical Press, CA, USA.
Merril et al. Virtual Heart Surgery—Trade Show and Medical Education 1994 Virtual Reality World pp. 55-57 Jul./Aug. 1994 MD, USA.
Merril et al Cyber Surgery—Cutting Costs, Sewing Benefits The Virtual Reality Special Report, Miller Freedman Inc. Summer 1994 pp. 39-42 MD, USA.
Minsky et al. Feeling and Seeing: Issues in Force Display ACM 1990 pp. 235-243 CA, USA.
A. M. Noll Man-Machine Tactile Communication Polytechnic Institute of Brooklyn, Jun. 1971, pp. IV-XIII and 1-87.
Ernest M. Otani Software Tools for Dynamic and Kinematic Modeling of Human Emotion Department of Computer & Information Science Technical Reports (CIS) University of Pennsylvania, Jul. 1989, pp. 1-74.
M. Ouh-Young Force Display in Molecular Docking UNC, The University of North Carolina at Chapel Hill 1990, pp. 1-369.
J. Peifer, et al. Medicine Meets Virtual Reality, Health Care in the Information Age Applied Virtual Reality for Simulation of Endoscopic Retrograde Cholangio-Pancreatography IOM Press, Proceedings of Medicine Meets Virtual Reality 4, San Diego, California, Jan. 17-20, 1996, pp. 36-42.
S. Pieper et al. Stereoscopic Displays and Applications II Virtual environment system for simulation of leg surgery SPIE vol. 1457, Feb. 25-27, 1991, pp. 188-197.
S. Pieper et al. Interactive Graphics for Plastic Surgery: A task-level analysis and Implementation 1992 ACM Computer Graphics Special Issue on 1992 Symposium on Interactive 3D Graphics, Cambridge, MA Mar. 29-Apr. 1, 1992, pp. 127-134.
D. Popa Simulation of Lumbar Puncture Procedure using Force Feedback in Virtual Environments Thayer School of Engineering, Dartmouth College, Hanover, New Hampshire, Jun. 1994, pp. 1-134.
Preminger et al. Medicine Meets Virtual Reality, Health Care in the Information Age Virtual Reality Surgical Simulation in Endoscopic Urologic Surgery IOS Press, Proceedings of Medicine Meets Virtual Reality 4, San Diego, California, Jan. 17-20, 1996, Chapter 19, pp. 157-163.
L.B Rosenberg, B.G Jackson Foot-Based Interfaces to Virtual Environments Using the Immersion Interface Box (TM) Virtual Reality and Persons With Disabilities, Second Annual International Conference, Jun. 8-10, 1994, pp. 145-148.
L.B Rosenberg “Virtual Fixtures”—Perceptual overlays enhance operator performance in telepresence tasks Stanford University, Aug. 1994. pp. 1-214.
M. A. Russo The Design and Implementation of a Three Degree of Freedom Force Output Joystick MIT, May 11, 1990. pp. 1-131.
Salisbury et al. Haptic Rendering: Programming Touch Interaction with Virtual Objects Symposium on Interactive 3D Graphics, 1995 ACM, pp. 123-130.
S. S. Saliterman A Computerized Simulator for Critical-Care Training: New Technology for Medical Education Scientific session of the Mayo Medical School Alumni Society, Nov. 4, 1989, pp. 968-978.
B. Schmult et al. Application Areas for a Force-Feedback Joystick DSC vol. 49. Advances in Robotics, Mechatronics, and Haptic Interfaces ASME 1993, pp. 47-54.
Singh et al. Design of an Interactive Lumbar Puncture Simulator With Tactile Feedback IEEE International Conference on Robotics and Automation, May 8-13, 1994, pp. 1734-1739.
M. Stanley and J. Colgate Computer Simulation of Interacting Dynamic Mechanical Systems using Distributed Memory Parallel Processors ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Nov. 8-13, 1992, pp. 55-61.
Sharon A. Stansfield Visually-Guided Haptic Object Recognition University of Pennsylvania 1987 UMI, pp. 1-216.
I. Sutherland The Ultimate Display Proceedings of the IFIP Congress 1965, pp. 506-508.
D. Terzopoulos and D. Metaxas Dynamic 3D Models with Local and Global Deformations: Deformable Superquadrics IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 13, No. 7, Aug. 30, 1990, pp. 703-714.
Williams, Baillie, Gillies, Borislow and Cotton Teaching Gastrointestinal Endoscopy by Computer Simulation: a Prototype for Colonoscopy and ERCP Gastrointestinal Endoscopy vol. 36, No. 1., 1990, pp. 49-54.
C. M. Winey III Computer Simulated Visual and Tactile Feedback as an aid to Manipulator and Vehicle Control MIT, Jul. 31, 1981, pp. 1-132.
O.C. Zienkiewicz The Finite Element Method McGraw Hill Book Company (UK) Limited, 1977, pp. 677-757.
Beth A. Marcus Hands on: Haptic Feedback in Surgical Simulation Exos, Inc., Jan. 27-30, 1994, pp. SIMB 004163-SIMB 004174.
Virtual Reality and Medicine the Cutting Edge SIG Advanced Applications, Inc. Conference and Exhibition, Sep. 8-11, 1994, The New York Hilton.
Daane et al. A $100 Surgical Simulator for the IBM PC Interactive Technology and the New Paradigm for Healthcare Jan. 1995, pp. 79-80.
Strutz et al. 3-D Guided Endoscopic Surgery of Paranasal Sinuses Surgical Technology International IV, Oct. 1995, pp. 195-197.
Stone Haptic Human-Computer Interaction—Haptic Feedback: A Brief History from Telepresence to Virtual Reality Haptic Human-Computer Interaction, First International Workshop, Glasgow, UK Proceedings. Aug. 31-Sep. 1, 2000.
Loftin et al. A Virtual Environment for Laparoscopic Surgical Training Medicine Meets Virtual Reality II: Interactive Technology & Healthcare, Jan. 1994.
Durrani et al. Advanced Endoscopic Imaging: 3-D Laparoscopic Endoscopy Surgical Technology International III, Oct. 1994.
Johnston et al. Assessing a Virtual Reality Surgical Skills Simulator Stud Health Technol Inform. 1996; 29:608-17.
Barfield et al Virtual Environments and Advanced Interface Design 1995 pp. 358-414.
Bejczy et al. Controlling Remote Manipulators Through Kinesthetic Coupling Computers in Mechanical Engineering Jul. 1983, pp. 48-60.
Beer-Gable Computer Assisted Training in Endoscopy (C.A.T.E.): From a Simulator to a Learning Station. Endoscopy 1992; 24: suppl. 2: pp. 534-538.
Kuenhapfel et al. CAD-Based Graphical Computer Simulation in Endoscopic Surgery Institut für Angewandte Informatik, Kernforschungszentrum Karlsruhe, Germany, Oct. 1994.
Campos et al. A Robotic Haptic System Architecture University of Pennsylvania, Dept. of Computer & Information Science Technical Report No. MS-CIS-00-51 1990.
Merril et al. Changing the Focus of Surgical Training Virtual Reality World, Mar./Apr. 1995, pp. 56-60.
Szabo et al. Choreographed Instrument Movements During Laparoscopic Surgery: Needle Driving, Knot Tying, and Anastomosis Techniques. Medicine Meets Virtual Reality II; Interactive Technology & Healthcare, Jan. 1994. pp. 216-217.
Dumay Cybersurgery Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994.
Greenleaf DataGlove and Datasuit: Virtual Reality Technology Applied to the Measurement of Human Movement. Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994, pp. 63-69.
Burdea et al. Dextrous Telerobotics with Force Feedback—An Overview, Part 1: Human Factors Rutgers—The State University of New Jersey, Dept. of Electrical and Computer Engineering, Robotica (1991) vol. 9, pp. 171-178.
Online reference dated May 31, 1995, updates chapter 13 of the AutoCAD Release 13 Developer's Guide dated Apr. 14, 1995.
Christensen Bringing Telematics Into Health Care in The European Communities Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994, 21-23.
Marcus et al. Exos Research on Master Controllers for Robotic Devices NASA Technical Reports Server NTRS, pp. 238-245, Jan. 1, 1992.
Merril VR for Medical Training and Trade Show "Fly-Paper" Virtual Reality World, May/Jun. 1994, pp. 53-57.
Baumann et al. Force Feedback for Virtual Reality Based Minimally Invasive Surgery Simulator Medicine Meets Virtual Reality IV: Health Care in the Information Age, Jan. 1996.
Jason P. Fritz Haptic Rendering Techniques for Scientific Visualization Thesis at University of Delaware, Fall 1996.
Rosenberg et al. A Haptic Interface for Virtual Simulation of Endoscopic Surgery Medicine Meets Virtual Reality IV: Health Care in the Information Age, Jan. 1996 pp. 371-387.
Ho et al. IGES and PDES, The Current Status of Product Data Exchange Dept. of Computer Science, Univ. of Mo-Rolla, Rolla MO, 1988 IEEE, pp. 210-216.
Hooper The Interactive Assembly and Computer Animation of Reconfigurable Robotic Systems Mechanical Engineering Dept. The University of Texas at Austin. 1990.
Rosenberg Louis B. Human Interface Hardware for Virtual Laparoscopic Surgery Interactive Technology and the New Paradigm for Health Care Immersion Corp. Santa Clara, CA. Chapter 49, p. 322, Jan. 1995.
Funda et al. Image-guided Command and Control of a Surgical Robot Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994.
Jaramaz et al. Integrating Finite Element Analysis Into Pre-Operative Surgical Planning and Simulation of Total Joint Replacement Surgery Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994, pp. 34-37.
Merck & Co. An Introduction to the Robotic Endoscopy Simulator 1989.
Filerman et al. Issues in the Design of Tactile Input Devices for Mechanical CAD Systems Massachusetts Institute of Technology, Artificial Intelligence Laboratory 1989.
Hon Ixion's Laparoscopic Surgical Skills Simulator Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994.
Kilpatrick The use of a Kinesthetic Supplement in an Interactive Graphics System Xerox University Microfilms 1976.
Kuhnapfel et al. Endosurgery Simulations with KISMET Virtual Reality World, pp. 165-171 1995.
Immersion Corporation Laparoscopic Impulse Engine Impulse Engine 2000™ Software Development Kit (ver. 1.0)(Immersion) Immersion Corporation—Version 1.0 Mar. 1995.
McKensie et al. Lasers in Surgery and Medicine Wessex Regional Medical Physics Service and Department of Otolaryngology, vol. 29, No. 6, pp. 619-641 1984.
Massie et al. The PHANTOM Haptic Interface: A Device for Probing Virtual Objects Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, IL Nov. 1994.
Poston et al. The Medical Reality Sculptor Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994. pp. 174-176.
Satava Medical Virtual Reality: The Current Status of the Future Medicine Meets Virtual Reality IV: Health Care in the Information Age, Jan. 1996.
Merril et al. Virtual Reality for Trade Shows and Individual Physician Training Virtual Reality Systems, pp. 40-44 Spring 2004.
Flynn Virtual Reality and Virtual Spaces Find a Niche in Real Medicine; Simulated Surgery on a Computer—This Won't Hurt. New York Times Jun. 5, 1995.
Hannaford et al. Performance Evaluation of a Six-Axis Generalized Force-Reflecting Teleoperator IEEE May/Jun. 1991, vol. 21, No. 3 pp. 620-633.
Merril Presentation Material: Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994.
Immersion Human Interface Corporation Immersion PROBE and Personal Digitizer Programmer's Technical Reference Manual: Immersion Probe and Personal Digitizer May 19, 1994.
Durlach Psychophysical Considerations in The Design of Human-Machine Interfaces for Teleoperator and Virtual-Environment Systems Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994 pp. 45-47.
Hubner et al. Real-Time Volume Visualization of Medical Image Data for Diagnostic and Navigational Purposes in Computer Aided Surgery Proc., Computer Assisted Radiology, CAR'96 Paris, pp. 751-756 Jun. 26-29, 1996.
Merril et al. Revealing the Mysteries of the Brain with VR Virtual Reality Special Report, Winter 1994, pp. 61-65.
Neisius et al. Robotic Telemanipulator for Laparoscopy 1995 IEEE-EMBC and CMBEC Theme 5: Neuromuscular Systems/Biomechanics, pp. 1199-1200, 1995.
Medical World News Virtual Reality Shapes Surgeon's Skills Medical World News, Feb. 1994, pp. 26-27.
Hon Tactile and Visual Simulation: A Realistic Endoscopy Experience Medicine Meets Virtual Reality: Discovering Applications for 3-D Multi-Media Interactive Technology in the Health Sciences, Jun. 4-7, 1992.
Johnson Tactile Feedback Enhancement to Laparoscopic Tools Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994.
Fischer et al. Tactile Feedback for Endoscopic Surgery Interactive Technology and the New Paradigm for Healthcare, Jan. 1995.
Peine et al. A Tactile Sensing and Display System for Surgical Applications Interactive Technology and the New Paradigm for Healthcare, Jan. 1995 pp. 283-288.
Hunter et al. Teleoperated Microsurgical Robot and Associated Virtual Environment Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994.
Holler et al. Telepresence Systems for Application in Minimally Invasive Surgery Medicine Meets Virtual Reality II. Interactive Technology & Healthcare, Jan. 1994.
Satava Virtual Reality Surgical Simulator: The First Steps Medicine Meets Virtual Reality: Discovering Applications for 3-D Multi-Media Interactive Technology in the Health Sciences—Jun. 4-7, 1992.
Frolich et al. The Responsive Workbench: A Virtual Working Environment for Physicians Interactive Technology and the New Paradigm for Healthcare, Jan. 1995, pp. 118-119.
Doyle et al. The Virtual Embryo: VR Applications in Human Developmental Anatomy Medicine Meets Virtual Reality II: Interactive Technology & Healthcare, Jan. 1994, pp. 38-41.
Baillie Gastrointestinal Endoscopy: Time for Change Scott Med J. Feb. 1989; 34 (1): 389-90.
Song et al. Tissue Cutting in Virtual Environments Interactive Technology and the New Paradigm for Healthcare, Jan. 1995.
Gyeong-Jae et al. Tissue Cutting in Virtual Environments Interactive Technology and the New Paradigm for Healthcare, Jan. 1995, pp. 359-364.
Sukthankar Towards Virtual Reality of “Tissue Squeezing”: A Feasibility Study Medicine Meets Virtual Reality II: Interactive Technology & Healthcare, Jan. 1994, pp. 182-186.
Adachi Touch and Trace on the Free-Form Surface of Virtual Object Proceedings of IEEE Virtual Reality Annual International Symposium—Sep. 18-22, 1993 Seattle, WA pp. 162-168.
CH Products CH Products Virtual Pilot Control Yoke 1993.
Hoffman Virtual Reality and the Medical Curriculum: Integrating Extant and Emerging Technologies Medicine Meets Virtual Reality II: Interactive Technology & Healthcare, Jan. 1994 pp. 73-76.
Burdea et al. Virtual Reality Technology Chap. 6, pp. 221-242. Wiley-Interscience 2003.
Iwata et al. Volume Haptization IEEE 1993, pp. 16-18.
Anon. VR in Medicine VR News; Apr. 1996 vol. 5, Issue 3.
Ota et al. Virtual Reality in Surgical Education Comput Biol Med., Mar. 1995, 25(2): 127-37.
MacDonald et al. Virtual Reality Technology Applied to Anesthesiology Interactive Technology and the New Paradigm for Healthcare, Jan. 1995.
Bell et al. The Virtual Reality Modeling Language, version 1.0 Specification 1996.
Merril Why I Simulate Surgery . . . Virtual Reality World, Nov./Dec. 1994, pp. 54-57.
Related Publications (1)
Number Date Country
20090177454 A1 Jul 2009 US
Provisional Applications (1)
Number Date Country
60880415 Jan 2007 US
Continuation in Parts (1)
Number Date Country
Parent 12224314 US
Child 12405954 US